
Large-scale scams using voice deepfakes are already a reality: criminals used this technology to steal $35 million from a bank

The director of a bank in the United Arab Emirates receives a call from an important client, the director of a large company. The banker recognizes the voice on the other end of the line and immediately trusts it. The executive tells him that he is about to close the purchase of a company for 35 million dollars and needs him to transfer that amount from his account to the owner of the organization through an intermediary, a lawyer hired to coordinate the entire transaction. After hanging up, the bank manager receives several emails from both his supposed client and the lawyer and, suspecting nothing, transfers the money to the accounts indicated in the emails.

But it was a scam. The Emirati banker had fallen victim to an elaborate deception in which cybercriminals, using voice deepfake technology and email spoofing, pulled off a multimillion-dollar heist without raising suspicion.

The events took place in early 2020 but have only now come to light, after the United Arab Emirates authorities asked the United States to open an investigation into two bank accounts in US territory that may have received about $400,000 of the stolen 35 million, as revealed by Forbes magazine and as Engadget has been able to verify in the case document.

That document indicates that the criminals allegedly distributed the money through bank accounts around the world and that at least 17 people may be involved in the scam, but it does not provide much more information.

A growing threat

So far, few cases of deepfake money scams are known, although cybersecurity experts warn that this is a growing threat, potentially comparable to ransomware, given how quickly the technology is improving and becoming cheaper, as we reported a few weeks ago on Engadget.

Audio deepfakes, like the one that fooled the UAE banker, are the main threat this technology poses in the short and medium term, as it is relatively simple and cheap to reproduce a person's voice if enough voice samples are obtained through social engineering.

The experts consulted by Engadget explained that, so far, deepfakes have been used above all to spread disinformation and, in the corporate world, to damage a company's reputation by impersonating its executives. However, they warned that, given the technology's potential, financial scams would be the next step.

With this in mind, the sources consulted by this outlet recommended, in order to avoid falling for such deceptions, being skeptical of any communication involving sensitive operations, such as large money transfers, and verifying the request by calling the person the caller claims to be at their usual number. Or even visiting them in person if they work at the same company or live in the same city.

Another tip is to question the caller about confidential information that only the real person could know, putting the criminals in a bind from which they will struggle to escape.