Chronicle Live
Business
Catherine Furze

Four scams to watch out for as high-tech criminals turn to Artificial Intelligence

Concern is growing that fake letters and texts will become difficult to detect as fraudsters use Artificial Intelligence (AI) software to improve their scams.

AI is intelligence demonstrated by machines as opposed to intelligence displayed by humans. Experts have warned that scammers are thought to be using AI software to create convincing fake bank documents, bills and emails within seconds, which are then used to con people into sending money or sharing their personal details.

Fraudsters can instruct the software to request an action – such as clicking a link or sending a payment – in a style appropriate to the con, such as a legal letter. Tell-tale signs of scams like poor spelling, grammar or language use can be wiped out, making false requests much harder to spot.

This means that previous advice warning people to treat poor spelling and grammar in a suspicious message as a tell-tale sign of fraud could be rendered obsolete by AI.

The true scale of chatbot-related scams is hard to ascertain because of the difficulty of spotting AI-generated content. Microsoft co-founder Bill Gates recently described AI such as GPT-4 as the most significant advance in technology since the creation of modern computers, but Twitter owner Elon Musk and Apple co-founder Steve Wozniak are among a group of tech figures who earlier this year called for a halt on the creation of powerful AI models, warning that their risks are not yet understood.

Consumer group Which? is warning that fraudsters will “undoubtedly” exploit the capabilities of AI to make scams more convincing. Lisa Webb, Which? scams expert, said that “fraudsters tend to use any tools at their disposal to target victims and will undoubtedly be looking to take advantage of AI software like GPT-4 to carry out sophisticated scams”.

Scammers are already using AI to generate fake photos to tug at people's heartstrings in bogus charity appeals, and in an interview with The Washington Post, Canadian Benjamin Perkin told how his elderly parents lost thousands of dollars after being taken in by AI-generated voice technology. His parents received a phone call from a supposed lawyer, who said their son had killed a U.S. diplomat in a car accident, was in jail and needed money for legal fees. The lawyer then put ‘Mr Perkin’ on the phone, and the cloned voice told them he loved them, appreciated them and needed the money. A few hours later, the lawyer called Mr Perkin’s parents again, saying their son needed $21,000 before a court date later that day.

The voice sounded “close enough for my parents to truly believe they did speak with me,” said Mr Perkin. They rushed to several banks to get cash and sent the lawyer the money through a bitcoin terminal. It was only when Mr Perkin called his parents that night for a casual check-in that the scam came to light.

AI voice-generating software analyses what makes a person’s voice unique and searches a vast database of voices to find similar ones and predict patterns. It can then re-create the pitch, timbre and individual sounds of a person’s voice to produce a convincing imitation. It’s unclear where the scammers obtained Mr Perkin's voice, although he has posted YouTube videos talking about his snowmobiling hobby.

Sophie Miller, money-saving expert at Vouchers.co.uk, explains: “Artificial intelligence is making it more difficult for people to spot scams as the software allows scammers to create a realistic persona of who or what they’re pretending to be at the click of a button. Brits should be alarmed by this and be aware it’s not only fake texts and emails that scammers could be creating using AI, but it’s also things including videos, phone calls and dating profiles too.”

Here are four things to look out for when you're online:

1. Videos tricking viewers into sharing private information

Video-generation tools and text-to-speech software intended for promotional clips and training videos are being hijacked by scammers to create realistic videos of ‘people’ offering pirated software to download at the click of a link. In reality, the link installs harmful malware onto your computer, which can give the scammers access to the private information stored on it. This could allow them to steal your identity and gain access to your online banking and shopping accounts, potentially leaving your bank account empty and landing you with a hefty bill for purchases made using your cards.

2. Dating scams

Romance scammers, also known as catfish, are using AI to their advantage to trick people into handing over substantial amounts of money. Chatbots like ChatGPT interact in a conversational way, helping the scammer to create a realistic persona of the person the victim believes they’re talking to. When getting to know someone online, people will often video chat to make sure the other person is really who they say they are, and many catfish are caught out by their unwillingness to put their camera on. Now, however, scammers are using the latest technology to create video clips so targets think they are real.

3. Fake LinkedIn profiles

Scammers are using multiple types of AI software to create entirely fake LinkedIn profiles. These profiles connect with large numbers of people, including job hunters, offering ‘get rich quick’ opportunities which in reality are things like crypto scams and work-from-home job scams that could leave you out of pocket and put your personal information at risk.

4. AI voice cloning scams

Scammers are taking samples of people's voices from social media apps and feeding them into AI voice generator tools to mimic them and trick family members into believing a loved one is in trouble. Using the cloned voice, the scammer will ask for thousands of pounds to guarantee the relative's safe return. Elderly relatives are often targeted by this scam because they are less likely to use the internet and so may be unaware that it exists.
