
Deep-fake attacks on the rise | By Haya Fatima Sehgal


In a world already filled with a sense of doubt, espionage and artificially hyped sensationalism, does the recent Deep-fake controversy in Pakistan hold much value?

This is not meant to be a triggering statement, but rather a rhetorical question posed for a moment of self-reflection.

With so much already presented as “fake” or manipulated, one should fully understand that, with the advent of such technology, this was bound to happen.

The recent targeting of notable personalities for character assassination also begs another question: how much is being done to protect the rights of the average civilian?

One can only imagine how such technology can be used for malicious attacks, with anybody being a target.

By definition, Deep-fake is a form of artificial intelligence used to create seemingly realistic images, videos and audio.

Deep-fake attacks, however, are insidious campaigns that use such files with a clear intent to malign and defame an opponent; specifically, manipulated media is used for the character assassination of an individual, organization or even a group.

Culturally and universally, ethical values hold that any type of manipulation aimed at the character assassination of another is wrong.

But we also know that the same technology has been used as a source of entertainment. We saw an image of Jinnah, our founding father, come to life via AI technology.

There are satire and comedy accounts on YouTube that specialize in producing Deep-fake content to entertain global audiences.

But where does the fine line between entertainment and mischief lie, and who defines it?

Diverging from its original intent, Deep-fake technology has recently been utilized to target political personalities around the world.

Over a decade ago, audio recordings were already being attributed to various people, and in most such cases manipulation was alleged.

With Deep-fake being called ‘Photoshop on steroids’, media manipulation has become simpler through easily accessible apps that anybody can use.

Interestingly enough, vilifying a human being has been an art for eons. Whether in corporate espionage or smear campaigns, character assassination has become a key component in disrupting various interactions.

Such character assassination campaigns will certainly be the next element that cybercrime laws must address.

People could easily be targeted, given that the technology to reliably distinguish what is real from what is not may not yet be available here.

Taking the evolving technology landscape into consideration, Pakistan has developed a Cybercrime unit that oversees complex and essential cyber security matters.

One of the most common issues prevalent in Pakistan is the manipulation of photos and videos to blackmail the vulnerable, such as women and children.

It is the threat of punishment by law that deters most deviant behaviour. Pakistan has laws that provide protection against defamation and cybercrime, though they are still evolving.

However, many of these have yet to be further defined. Accordingly, the FIA has also said that there are several penalties for perverse propaganda and defamation under the new cybercrime laws.

As Deep-fake attacks become more common, it is the people affected who will have to face public opinion, which now acts as their judge and jury.

More literature must be written to address this subject, which is now a global cyber security concern.

Despite advances in technological evidence, public perception still overrules it 90% of the time.

How many lives, reputations, and business or social relationships have ended because of such videos, audio clips, or what one would call plain old spy games with mala fide intent?

The problem has never been the technology. Technology is often termed the ‘instrument of the devil’; a popular adage, but one so wrongly worded.

Nothing really is an instrument of the devil except man himself who exploits these instruments for his own nefarious purposes.

—The writer is a contributing columnist based in Islamabad.

 
