What is an AI voice cloning scam? How to spot it as 'millions of Brits at risk'

The sophisticated scam uses AI 😨
  • Nearly a third of Brits have been targeted by voice cloning scams in the last year. 
  • Scammers can use just three seconds of audio posted online to clone a voice. 
  • People urged to use ‘safe phrases’ with loved ones to avoid being scammed. 

Picking up your phone and instantly recognising the voice of a friend or loved one on the other end of the line should be a reassuring moment. However, in the age of highly sophisticated scams, you could be walking into a very expensive trap. 

Just because the voice on the phone sounds like your parents, children or best friend does not mean it is; it could actually be a wolf in sheep's clothing. And if your “loved one” is suddenly asking you for money urgently and out of the blue, it could very well be a deeply insidious con. 


This advanced scam - known as voice cloning - sees crooks make a copy of a voice with AI and use it to target unsuspecting victims. It may sound like something out of a science fiction film, but millions of Brits could be at risk, according to Starling Bank. 

But what is the scam, how does it work and how can you protect yourself? Here’s all you need to know: 

What is a voice cloning scam? 

The rise of artificial intelligence (AI) in recent years has opened the door to a whole new world of possibilities. From generating silly pictures using software like DALL-E to highly advanced machine learning tools implemented in your workplace, there are plenty of ways our lives are already changing. 

But while there are many benevolent ways that AI tools are being used, there are also dark sides to the technology (as with any tech advancement), and it gives fraudsters whole new ways to attempt to part you from your money. You may have heard the term “deepfake” floating around on the internet - usually in reference to celebrities or politicians - in which AI tools are used to create fake pictures, videos and even audio. 


The voice cloning scam - which Brits are being warned to watch out for - is an example of this, with the tech being used to replicate a person’s voice, which is then used to target people they know. And in the age of social media, it is easy for potential scammers to get their hands on your voice. 

Millions of Brits could be the targets of voice cloning scams

How would a fraudster clone your voice? 

Starling Bank warns that crooks can replicate a person’s voice from as little as three seconds of audio. If you have ever shared a clip on social media featuring your voice, this could be used without your permission or knowledge by scammers. 

Brits are warned that once a voice has been cloned, fraudsters can then identify that person’s family members and use the cloned voice to stage a phone call, voice message or voicemail to them, asking for money that is needed urgently. In the survey, nearly 1 in 10 (8%) said they would send whatever was needed in this situation, even if they thought the call seemed strange - potentially putting millions at risk.

Cybersecurity experts at F-Secure warn that the AI-generated voice can be eerily reminiscent of the real person, such as a family member. With a sense of urgency mixed in, the victim of an AI scam call may not think to suspect a scam, even if they are familiar with common tricks used by fraudsters.


The New Yorker detailed such an incident in a feature on voice cloning scams earlier in 2024: a couple in Brooklyn, New York, were woken by a call from their “relatives”, who claimed to be held hostage. The couple were tricked into paying hundreds of dollars to free the ‘hostages’ - but the voices they heard did not belong to their loved ones and were in fact AI-generated. 

Who is at risk from voice cloning scams? 

New research carried out by Starling Bank found that over a quarter (28%) of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year. It means that millions of people in Britain could be at risk from this highly sophisticated scam. 

Lord Sir David Hanson, Minister of State at the Home Office with Responsibility for Fraud, said: “AI presents incredible opportunities for industry, society and governments but we must stay alert to the dangers, including AI-enabled fraud. As part of our commitment to working with industry and other partners, we are delighted to support initiatives such as this through the Stop! Think Fraud campaign and provide the public with practical advice about how to stay protected from this appalling crime.”

How can you protect yourself from voice cloning scams? 

Brits are being urged to set up a ‘safe phrase’ that can be used to verify whether the person you are talking to really is who you think they are. Pick something memorable and share it only with your family and loved ones; that way, if you receive an unexpected call and don’t hear the ‘safe phrase’, you can remain vigilant. 


Lisa Grahame, Chief Information Security Officer at Starling Bank, explained: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters. Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a Safe Phrase to thwart them. So it’s more important than ever for people to be aware of these types of scams being perpetuated by fraudsters, and how to protect themselves and their loved ones from falling victim. 

“We hope that through campaigns such as this we can arm the public with the information they need to keep themselves safe. Simply having a Safe Phrase in place with trusted friends and family - which you never share digitally - is a quick and easy way to ensure you can verify who is on the other end of the phone”.

Have you been the target of one of these voice cloning scams? Share your experiences with our tech writer by emailing: [email protected].
