The Hindu
R. Sivaraman

T.N. Cyber Crime Police issue advisory on new scam involving AI voice cloning

The Cyber Crime Wing of the Tamil Nadu Police has issued an advisory on a new impersonation scam that uses AI (Artificial Intelligence) voice cloning, and has asked the public to be cautious about unsolicited calls received on their mobile phones.

Police said that cyber fraudsters are now employing voice cloning, using advanced AI technology, to mimic the voices of trusted individuals, such as family members, over phone calls.

The calls are made under the pretext of an emergency and, by creating a sense of urgency or distress, deceive victims into transferring money quickly, exploiting their trust.

This tactic highlights the evolving sophistication of cybercrimes and emphasises the importance of awareness and caution to prevent residents from falling victim to such fraudulent schemes, the police said.

The Cyber Crime Wing said the scam begins with a phone call to the victim from a scamster posing as someone the victim knows and trusts, such as a family member or friend. The scamster may claim to be in urgent need of financial assistance due to a fabricated emergency or threat. The scamster uses various tactics to evoke a sense of urgency and emotional distress in the victim. He/she may employ sobbing or pleading tones, claiming to be in a dire situation that requires immediate help.

Behind the scenes, the scamster utilises sophisticated AI software to clone the voice of the person they are impersonating. They obtain a voice sample of the person from social media posts or videos, or by simply talking to the person over the phone using a ‘wrong number’ tactic. This technology allows them to convincingly mimic not just the voice but also the intonation and emotional nuance of the victim’s trusted contact.

“In a nutshell, the scamsters use an AI-generated cloned voice to commit cybercrimes,” said Sanjay Kumar, ADGP, Cyber Crime Wing.

Once they have established a sense of urgency and trust, the scamster requests the victim to transfer money immediately to help resolve the crisis. He/she often suggests using fast and convenient payment methods like the Unified Payments Interface (UPI) system to expedite the transaction. Driven by concern and a desire to help their loved one, the victim may comply with the scamster’s demands without verifying the authenticity of the caller or the legitimacy of the situation.

After the money transfer is completed, the victim may later realise that they have been deceived when they independently contact their family member or friend and discover that the person was never in distress or in need of financial assistance. The victim suffers a financial loss and, additionally, may feel betrayed, violated, and emotionally distressed upon realising they were scammed, the police said.

Mr. Sanjay Kumar advised members of the public to always verify the identity of the person calling, especially if they request urgent financial assistance, and to ask probing questions or contact the friend/relative through a known, verified number to confirm their identity before taking any action.

He also asked the public to stay informed about common scams, including this voice cloning fraud, and learn to recognise warning signs. Be wary of unexpected requests for money, especially if they involve urgent situations or emotional manipulation, he said.

If you suspect that you have been the victim of such a fraud or have come across any suspicious activity, report the incident to the Cyber Crime Toll-Free Helpline 1930 or register a complaint on www.cybercrime.gov.in.
