
Deepfakes are a threat not only to businesses, but also to individual users


Islamabad: Research has found deepfake creation tools and services available on darknet marketplaces. These services offer generative AI video creation for a variety of purposes, including fraud, blackmail, and stealing confidential data. According to estimates by Kaspersky experts, a deepfake video can be purchased for as little as $300 per minute.

The widespread adoption of artificial intelligence (AI) and machine learning technologies in recent years is providing threat actors with sophisticated new tools for their attacks. One of these is deepfakes, which include generated human-like speech or photo and video replicas of people. Kaspersky warns that companies and consumers must be aware that deepfakes will likely become more of a concern in the future.

According to the recent Kaspersky Business Digitisation Survey, 51% of employees surveyed in the META region said they could tell a deepfake from a real image; however, in a test only 25% could actually distinguish a real image from an AI-generated one. This puts organisations at risk, given that employees are often the primary targets of phishing and other social engineering attacks.

For example, cybercriminals can create a fake video of a CEO requesting a wire transfer or authorising a payment, which can be used to steal corporate funds. Compromising videos or images of individuals can also be created and used to extort money or information from them.

“Despite the technology for creating high-quality deepfakes not being widely available yet, one of the most likely use cases that will come from this is to generate voices in real-time to impersonate someone. It’s important to remember that deepfakes are a threat not only to businesses, but also to individual users – they spread misinformation, are used for scams, or to impersonate someone without consent – and are a growing cyberthreat to be protected from,” says Hafeez Rehman, Technical Group Manager at Kaspersky.

Kaspersky recommends that people and businesses be aware of the key characteristics of deepfake videos. A solution such as Kaspersky Threat Intelligence can help keep information security specialists up to date on the latest deepfake developments. Companies should also strengthen the human firewall by ensuring their employees understand what they see.
