Deep fakes: A digital nuclear tool | By Shahbaz Rizwan Raja


YOU might have seen news circulating across media channels about a leaked video of a politician or another influential person.

The immediate stance they take is that the video is doctored, tailor-made to defame them. While this might sound like a baseless claim at first, could it be true?

Has technology advanced to such an extent that a person’s actions and voice can be mimicked without them knowing a thing about it?

In 2018, a video of Donald Trump was released in which he spoke about the Belgian government.

The video portrayed the then US president calling on Belgium to pull out of the Paris Climate Agreement; however, Trump never gave that speech.

In another instance, in early 2022, a video surfaced of Ukrainian President Volodymyr Zelenskyy.

In the video, Zelenskyy could be seen calling on Ukrainian citizens to surrender to Russian forces.

The controversial post took the internet by storm, prompting Zelenskyy to clarify on his Instagram that the video was not his, saying, “We are at home and protect Ukraine.”

While that situation was defused, it spotlighted a tool that had fooled many eyes, left many in disbelief, and could cause further unrest.

In layman’s terms, a deep fake is a process by which a video of a person can be made without their involvement in it.

The term “deep fake” originates from its primary underlying technology, “deep learning”, a form of Artificial Intelligence (AI).

Deep learning algorithms, which teach themselves to solve problems when fed huge amounts of data, are used to swap faces in video and other digital content to make realistic-looking fake media.

Machine learning is the main constituent of these deep fakes, and it has made it possible to produce them faster and at a lower cost.

To make a deep fake video of someone, the creator would first train deep learning algorithms on hours of actual video footage of the person, to give them a realistic “know-how” of what he or she looks like from multiple angles and under different lighting.
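The “training on footage” step described above is, at its core, the same learn-from-examples principle behind all machine learning. As a rough illustration only (a toy linear model, not an actual deepfake system), gradient descent lets a model discover a pattern hidden in data simply by being shown examples:

```python
import numpy as np

# Toy training loop: the model starts knowing nothing and, by repeatedly
# measuring its error on example data and nudging its weights, recovers
# the hidden pattern. Deepfake models apply this same idea, at a vastly
# larger scale, to hours of video frames of a person's face.
rng = np.random.default_rng(42)
X = rng.standard_normal((200, 4))          # 200 training examples
true_w = np.array([1.0, -2.0, 0.5, 3.0])   # the pattern hidden in the data
y = X @ true_w                             # observed outputs

w = np.zeros(4)                            # untrained model
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(X)      # how wrong, and in which direction
    w -= 0.1 * grad                        # adjust weights toward the data

# After training, w has converged to the hidden pattern true_w.
```

The point of the sketch is only that no one tells the model the answer; it extracts the pattern from the examples themselves, which is why quantity of footage matters so much for deep fakes.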

Then, along with those algorithms, computer-graphics techniques are used to superimpose a copy of the person onto a different actor in other digital content.
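One common face-swap architecture pairs a single shared encoder with a separate decoder per person: encode person A’s frame, then reconstruct it with person B’s decoder, so B’s likeness is driven by A’s expression. The sketch below is purely illustrative; the random weights stand in for parameters a real system would learn from footage, and no library mentioned here is specific to any actual deepfake tool:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a flattened 8x8 grayscale "face" and a small latent code.
FACE_DIM, LATENT_DIM = 64, 8

# One shared encoder, two person-specific decoders. Random weights are
# placeholders for parameters learned from hours of real footage.
W_enc = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.1
W_dec_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1
W_dec_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.1

def encode(face):
    """Compress a face into a shared code capturing pose and expression."""
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    """Reconstruct a face in one person's likeness from the shared code."""
    return W_dec @ latent

# The swap: person A's frame goes through the shared encoder, but is
# decoded with person B's decoder, yielding B's face with A's expression.
frame_a = rng.standard_normal(FACE_DIM)
fake_frame = decode(encode(frame_a), W_dec_b)
```

Because the encoder is shared between both people, the latent code carries only pose, expression and lighting, which is what makes decoding it with the other person’s decoder produce a convincing swap.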

Pakistan woke up to the issue of doctored videos later than the rest of the world, with people not sufficiently aware of their potentially devastating effects.

While many foreign audiences are well versed in emerging digital issues and would not easily be fooled by such stunts, in a country like Pakistan, where people are more gullible and less exposed to new digital developments, this opens up a whole new set of problems.

Vitriol, anger and hatred in public discourse are common nowadays on all media forums, especially in Pakistan.

Any late-night talk show, family gathering or social event will have a couple of political enthusiasts engaged in irrelevant, heated and incoherent conversation, each roaring in the conviction that they are right. Toxicity is widespread!

In situations like these, deep fakes are a ticking time bomb, ready to explode on a person and completely demolish their reputation, their relationships and their mental well-being.

As digital technologies grow ever more advanced, the line between truth and hoax is becoming thinner.

A person’s respect and dignity are now more vulnerable than ever before, and one can only wonder whether what you see or hear is actually true, or a fabrication conspired to fool you.

—The writer is an occasional contributing columnist, based in Islamabad.
