
Deepfake scams: increasingly popular

Posted: Sat Dec 28, 2024 6:32 am
by tasnimsanika00
WPP chief executive Mark Read was the target of a sophisticated deepfake scam attempt. The scam involved the artificial reproduction of his voice and the use of his likeness. Read described the plot in a recent email to staff, warning employees about potential calls that might appear to come from senior management.

WPP is a publicly traded company with a market capitalisation of around $11.3 billion, and like other companies, it is exposed to increasingly professional and difficult-to-detect cyberattacks, such as fake websites using its brand name.

This is how the attempted scam was carried out
The scammers created a WhatsApp account using Read's public persona and used it to host a Microsoft Teams meeting that purported to involve him and another senior WPP executive, the email obtained by the Guardian details. During the virtual meeting, the imposters used a replica of the executive's voice and played YouTube videos purporting to represent the two men. They also impersonated Read via the meeting's chat window. While the scam failed to achieve its goal, it did target an "agency head," asking him to set up a new company, all in an attempt to obtain money and personal data.

A WPP spokesperson confirmed in a statement that the phishing attempt was unsuccessful: "Thanks to the vigilance of our staff, including the affected executive, the incident was prevented." WPP did not respond to questions about when the attack took place or which executives, other than Mark Read, were involved.


Aside from voice cloning via what appeared to be generative AI, the scam involved simpler techniques such as the use of public images. Read's image was taken from the internet and used as a contact photo. The attack is representative of the many tools scammers now have at their disposal to mimic legitimate corporate communications and, ultimately, to impersonate executives.

"We have seen increasing sophistication in cyberattacks on our colleagues, and those targeting senior leaders in particular," Read said in the email.

Read's email lists a number of red flags to look out for. These include passport applications, money transfers and any mention of a "secret purchase, transaction or payment that no one else knows about." "Just because the account has my photo on it doesn't mean it's me," Read said in the email.

In recent years, low-cost audio deepfake technology has become widely available and increasingly convincing. Some AI models can generate realistic imitations of a person's voice using just a few minutes of audio. Such audio is easy to obtain for public figures, given their media exposure, allowing scammers to create manipulated recordings of almost anyone.