The Street
Ellen Chang

Fraudsters' New Trick Uses AI Voice Cloning to Scam People

Answering calls from unknown numbers is already risky: often it's a salesperson or a recorded voice trying to persuade you to buy a service or product.

Voice cloning is now easily accessible, and cybercriminals are taking advantage of it, adding voices cloned with artificial intelligence to their arsenal of tricks for conning people into giving up their money or personal information.

The Federal Trade Commission is warning consumers not to send money to callers who claim to be a friend or relative in urgent need, especially through payment methods that offer no recourse.

"Scammers ask you to pay or send money in ways that make it hard to get your money back," the FTC wrote in a blog post. "If the caller says to wire money, send cryptocurrency, or buy gift cards and give them the card numbers and PINs, those could be signs of a scam."

Cloning someone's voice, whether a famous musician, politician, or CEO, or even your best friend, has been made much easier by advances in artificial intelligence.

AI is making it possible to clone a person’s voice and produce "authentic-sounding statements and conversations from them," Chris Pierson, CEO of BlackCloak, an Orlando, Fla.-based executive digital protection company, told TheStreet.

Fraudsters create the clones by capturing a sample of a person’s voice, which could be accomplished by pulling a video from YouTube or TikTok, he said.

Even a few seconds of the person’s voice is enough for the AI tools to capture the "essence of that person’s voice and then create entirely original statements and conversations with the same frequency, intensity, harmonic structure, tone and inflection," Pierson said. 

A snippet of someone's voice from a conversation is enough for a criminal to use it to generate "highly realistic conversations that are ultimately fake," he said.

Voice Cloning Will Be Popular

Scammers follow the money, and any tool that scales easily to generate more profit is attractive to them, Pierson said.

The leap in AI capability demonstrated by ChatGPT will prove popular among cybercriminals, since such advances promise larger paydays, he said.

"This technology can be used for malicious purposes and we are starting to see this happening," Pierson said. "Right now it looks more like a luxury attack method, but in the coming months and years it will most likely be applied en masse and really create a cyber cyclone of fraud."

The alarming part of the new technology is that artificial intelligence "needs very little content to be trained" unlike older methods such as the ones used for automated customer service agents, Alex Hamerstone, advisory solutions director at TrustedSec, a Fairlawn, Ohio-based ethical hacking and cyber incident response company, told TheStreet.

AI does not need a recording of every word it will use. It needs only a "handful of words spoken by someone and can create a very real sounding version of just about any word, and is able to put these words together into sentences that sound just like the person that was recorded," he said.

"What’s really important here is that it’s not only the individual words that sound authentic, but the person’s entire speaking style," Hamerstone said. "It not only sounds like the person on a word-for-word basis, it also sounds like the person when they are speaking in longer sentences. It picks up patterns of speech too, such as pauses, mouth noises, etc. and is very convincing."

Voice cloning is already becoming popular and it is "likely to be heavily used by criminal groups, especially the more sophisticated gangs," he said.

Cloned voices are realistic and make it easy to fool someone. 

"As these tools evolve over the coming months and years, it will be extremely difficult to tell the difference between a real person’s voice and their AI clone," Hamerstone said. 

"This will not only help in carrying out direct scams over the phone, but also in combination with other social engineering attacks, such as email phishing and text phishing. Scammers are likely to continue to take full advantage of this technology because of how convincing it is."

AI’s ability to create "believable content via video, audio, and text has upped the malware game," Timothy Morris, chief security advisor at Tanium, a Kirkland, Wash.-based provider of converged endpoint management, told TheStreet.

Using AI voice tools makes attacks and scams more believable, and it is easier to dupe people because the "request sounds like it’s coming from someone you know," he said.

Common Scams from Voice Cloning

Fraudsters will attempt to quickly gain the confidence of the other person on the phone. 

Since people are used to being on calls with poor reception, the cloned voices do not have to be perfect.

The scammers are looking for money, especially in the form of gift cards, since those are difficult to trace. Fraudsters will also try to gain access to a computer, confirm bank information or passwords, or take a bolder step and request access to funds via wire, Zelle, or another instantaneous payment method, Pierson said.

The number of consumer scams will increase: grandparent scams are likely to proliferate, and donation and charity scams are likely to benefit as well, he said.

"Voice cloning, ChatGPT, image creators and deep fakes are all incredibly powerful tools in their own right, but when used in combination, they are likely to overwhelm even the most security-conscious person," Hamerstone said. 

Voice cloning will not be used only in one-off "vishing" (voice phishing) scams. Expect to see it paired with other types of attacks, such as email and text phishing.

Businesses will be major targets because people are more likely to open an email, click a link or download an attachment, especially if they receive a call "urging them to do so soon after the email arrives," he said.

Scammers can use voice cloning to harvest an executive’s voice and use it to target employees or vice versa. 

"This will make spearphishing attacks on corporate entities much more effective, especially when it comes to wire fraud schemes," Pierson said. "When you combine this with the ease with which phone numbers can be spoofed and scripts that can be created by Chat GPT, it can create a perfect storm."

Companies cannot be lax and must train their employees to use only trusted methods of communication and to be "very careful when an unexpected and urgent request arrives," Zane Bond, head of product at Keeper Security, a Chicago-based provider of zero-trust and zero-knowledge cybersecurity software, told TheStreet.

"Artificial intelligence in the hands of adversaries has the potential to amp up social engineering exponentially, which is currently one of the most successful scamming tactics available," he said.
