Eye of Riyadh
Technology & IT | Thursday 3 August, 2023 12:00 am |

Don’t believe your ears: Kaspersky experts share insights on how to spot voice deepfakes

Kaspersky experts have shared insights on how to distinguish voice deepfakes from genuine speech. Deepfake ("deep learning" + "fake") artificial intelligence has been advancing rapidly over the past few years: machine learning can be used to create convincing fakes of images, video, or audio content. To determine whether an audio clip is a fake or the speech of a real human, there are several characteristics to consider: the timbre, manner, and intonation of speech. For example, a voice deepfake often gives itself away through unnaturally monotonous speech. Another feature worth checking is sound quality: unintelligible speech and strange noises during an audio message or call should raise suspicion.
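The monotony cue lends itself to a simple illustration. The following is a minimal Python sketch, not Kaspersky's actual method: the autocorrelation pitch tracker and the synthetic test signals are assumptions for demonstration only. It shows how unusually low variation in estimated pitch can flag monotonous audio:

```python
import numpy as np

SR = 16000  # sample rate in Hz (assumed)

def frame_pitch(frame, sr=SR, fmin=75, fmax=300):
    """Estimate a frame's fundamental frequency (Hz) via autocorrelation."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)    # lag range for voice pitch
    lag = lo + int(np.argmax(ac[lo:hi]))       # strongest periodicity
    return sr / lag

def pitch_std(signal, sr=SR, frame_len=1024, hop=512):
    """Standard deviation of per-frame pitch estimates; low = monotone."""
    pitches = [frame_pitch(signal[i:i + frame_len], sr)
               for i in range(0, len(signal) - frame_len, hop)]
    return float(np.std(pitches))

# Synthetic stand-ins: real speech drifts in pitch; a monotone fake does not.
t = np.arange(2 * SR) / SR
f_inst = 150 + 30 * np.sin(2 * np.pi * 1.5 * t)        # drifting pitch contour
natural = np.sin(2 * np.pi * np.cumsum(f_inst) / SR)   # pitch varies over time
monotone = np.sin(2 * np.pi * 150 * t)                 # flat 150 Hz pitch

print(f"natural  pitch std: {pitch_std(natural):.1f} Hz")
print(f"monotone pitch std: {pitch_std(monotone):.1f} Hz")
```

On real recordings, a robust pitch tracker would replace the toy autocorrelation step, and any threshold would need calibrating on genuine speech; the sketch only illustrates the statistic behind the "unnatural monotony" cue.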

Deepfake technology in itself is harmless, but in the hands of scammers it can become a dangerous tool offering ample opportunity for deception, defamation, or disinformation. Fortunately, there have not yet been mass scams involving voice alteration, but there have been several high-profile cases. In 2019, scammers used the technology to defraud a UK-based energy firm: in a telephone conversation, the scammer impersonated the chief executive of the firm's German parent company and requested an urgent transfer of €220,000 to the account of a certain supplier. A year later, in 2020, scammers used deepfakes to steal up to $35,000,000 from a Japanese company.

“Currently, the technology for creating high-quality deepfakes is not available for widespread use. However, in the future it may become freely available, which could lead to a surge in related fraud. Most likely, attackers will try to generate voices in real time – to impersonate someone's relative and lure out money, for example. Such a scenario is not realistic for now: creating high-quality deepfakes requires substantial resources that few possess. Making a low-quality audio fake, however, takes far fewer resources, and fraudsters can exploit this. The signs described above can help you spot such fraud,” comments Dmitry Anikin, Senior Data Scientist at Kaspersky.

To protect yourself from deepfakes, Kaspersky experts recommend:

  • Pay attention to suspicious calls. Poor sound quality, an unnaturally monotonous voice, unintelligible speech, and extraneous noise should all put you on alert;
  • Don't make decisions based on emotion: don't share your details with anyone, and don't transfer money, even if the caller sounds convincing. It is better to end the call and verify the information through several other channels;
  • Use a reliable security solution, such as Kaspersky Premium, to further protect your devices.

 
