Artificial intelligence (AI) is being exploited by fraudsters to clone people's voices, potentially putting millions at risk of falling victim to scams, according to a warning from a UK bank.
Starling Bank, an online-only lender, warned that scammers can use AI to replicate a person's voice from just three seconds of audio, such as a clip taken from a video posted online. Having mimicked the voice, scammers can then call the person's friends and family and deceive them into sending money.
The bank said these scams have the potential to catch out millions of people, and that hundreds have already been affected.
A recent survey by Starling Bank and Mortar Research found that more than a quarter of respondents had been targeted by an AI voice-cloning scam in the past year. Alarmingly, 46% of participants did not know such scams existed, and 8% said they would send money requested by a friend or family member even if the call seemed suspicious.
Lisa Grahame, Chief Information Security Officer at Starling Bank, pointed out that people routinely post recordings of their own voice online without realizing that doing so leaves them vulnerable to fraudsters.
To counter the threat, the bank recommends agreeing a 'safe phrase' with loved ones: a simple, memorable phrase, distinct from any password, that can be used to verify a caller's identity over the phone. The bank advises against sharing the phrase by text, where scammers could find it; if it must be sent that way, the message should be deleted as soon as the recipient has seen it.
As AI grows ever more adept at mimicking human voices, concerns are mounting that criminals could use the technology to access bank accounts and spread misinformation. OpenAI, the developer of the AI chatbot ChatGPT, recently unveiled its own voice replication tool, Voice Engine, but withheld it from public release, citing the potential for synthetic voice misuse.