
AI Voice-Cloning Scams Target Millions, Warns UK Bank

An AI (Artificial Intelligence) sign is seen at the World Artificial Intelligence Conference (WAIC) in Shanghai

Artificial intelligence (AI) is being exploited by fraudsters to clone people's voices, potentially putting millions at risk of falling victim to scams, according to a warning from a UK bank.

Starling Bank, an online-only lender, revealed that scammers can use AI technology to replicate a person's voice with just three seconds of audio, such as from a video posted online. By mimicking the individual's voice, scammers can deceive friends and family members into sending money.

The bank emphasized the widespread threat posed by these scams, stating that they have the potential to ensnare millions of individuals. Already, hundreds of people have been affected by such fraudulent activities.

A recent survey conducted by Starling Bank and Mortar Research found that over a quarter of respondents had been targeted by AI voice-cloning scams in the past year. Alarmingly, 46% of participants were unaware that such scams exist, and 8% admitted they would send money if asked, even if the call seemed suspicious.

Scammers deceive friends and family into sending money using voice cloning.
AI technology can replicate voices with just 3 seconds of audio.
Hundreds of people have already fallen victim to AI voice-cloning scams.

Lisa Grahame, Chief Information Security Officer at Starling Bank, highlighted the risk people face when they unknowingly make recordings of their voices available online, leaving them vulnerable to fraudsters.

To combat this threat, the bank recommends establishing a 'safe phrase' with loved ones. This unique phrase, distinct from other passwords, can serve as a verification tool during phone calls. It is advised not to share the safe phrase via text to prevent scammers from obtaining it. If shared through text, the message should be promptly deleted after the recipient has viewed it.

As AI technology advances in replicating human voices, concerns are growing about its potential misuse by criminals to access personal accounts and spread misinformation. OpenAI, the developer of AI chatbot ChatGPT, recently introduced a voice replication tool called Voice Engine but withheld public access due to the risks associated with synthetic voice misuse.
