
Deep-fake attacks on the rise | By Haya Fatima Sehgal



In a world already filled with a sense of doubt, espionage and artificially hyped sensationalism, does the recent Deep-fake controversy in Pakistan hold much value?

This is not meant to be a triggering statement, but rather a rhetorical question posed for a moment of self-reflection.

With so much already presented as “fake” or manipulated, one should fully understand that, with the advent of such technology, this was bound to happen.

Recent cases of notable personalities becoming the objects of character assassination also beg another question: how much is being done to protect the rights of the average civilian?

One can only imagine how such technology can be used for malicious attacks, with anybody being a target.

By definition, a Deep-fake is seemingly realistic imagery, video or audio generated by artificial intelligence.

Deep-fake attacks, however, are insidious campaigns that use such files with the clear intent to malign and defame an opponent; specifically, manipulated media is used for the character assassination of an individual, an organization or even a group.

Culturally and universally, our ethical values hold that any manipulation aimed at the character assassination of another person is wrong.

But we also know that the same technology has been put to use as a source of entertainment. We saw an image of Jinnah, our founding father, come to life through AI technology.

There are satire and comedy accounts on YouTube that specialize in producing Deep-fake content to entertain global audiences.

But where does the fine line between entertainment and mischief lie, and who defines it?

Diverging from its original intent, Deep-fake technology has recently been used to target political personalities around the world.

About a decade ago, audio recordings were already being attributed to various individuals, while in most cases it was claimed that they had been manipulated.

With Deep-fakes being called ‘Photoshop on steroids’, media manipulation has become simpler through easily accessible apps that anybody can use.

Interestingly enough, vilifying a human being has been an art for eons. Whether the aim is corporate espionage or a smear campaign, character assassination has become a key component for causing disruption in various spheres of interaction.

Such character assassination campaigns will certainly be the next element that cybercrime laws must address.

Individuals could easily be targeted, given that the technology to reliably distinguish what is real from what is not may not yet be available here.

Taking the evolving technology landscape into consideration, Pakistan has developed a cybercrime unit that oversees complex and essential cyber security matters.

One of the most common issues prevalent in Pakistan is the manipulation of photos and videos to blackmail the vulnerable, such as women and children.

It is the threat of punishment by law that deters most deviant behaviour. Pakistan does have laws that provide protection against defamation and cybercrime, though they are still evolving.

However, many of these provisions have yet to be further defined. The FIA has also said that there are several penalties for perverse propaganda and defamation under the new cybercrime laws.

As Deep-fake attacks become more common, it is the people affected who will have to face public opinion, which now acts as their judge and jury.

More literature must be written to address this subject, which is now a global cyber security concern.

Public perception still overrules the facts 90% of the time, despite advances in technological evidence.

How many lives, reputations and business or social relationships have ended because of such videos and audios, or what one would call plain old spy games with mala fide intent?

The problem has never been the technology itself. Technology is often termed the ‘instrument of the devil’, a popular adage that is wrongly worded.

Nothing is really an instrument of the devil except man himself, who exploits these instruments for his own nefarious purposes.

—The writer is a contributing columnist based in Islamabad.

 
