Daily Mirror
Chiara Fiorillo

'My stomach dropped after I saw internet clip with my face - but it wasn't me'

A 30-year-old woman said her stomach "dropped" when she saw a deepfake porn clip featuring her face.

Kate Isaacs first saw the video while scrolling through Twitter and recalled "feeling hot" when she realised that the woman in the footage looked like her.

She started asking herself many questions - including where it had been filmed, whether someone had filmed her without her consent, and who could have published the clip.

In her anxiety, she did not at first realise that the footage was not actually of her: her face had been superimposed on the body of a porn star using deepfake software, which digitally manipulates images to create convincing visual, audio and video hoaxes.


Ms Isaacs said the deepfake video was "so convincing" that it took her a few minutes to realise it was not her.

She told the Daily Mail: "Anyone who knew me would think the same. It was devastating. I felt violated, and it was out there for everyone to see."

Ms Isaacs said she never discovered who made the video but believes she was targeted because she had previously spoken out about the rise of "non-consensual porn".

The activist, who is the founder of the campaign group Not Your Porn, said the fake porn video of her, which appeared in 2020, was made using innocent footage taken from the internet.

She said: "This is all it takes now. It makes every woman who has an image of herself online a potential victim; that's pretty much all of us, these days."

Deepfakes are created using artificial intelligence software that can take an existing video and replace someone's face with another person's, even mirroring facial expressions.

The technology has been used to create some lighthearted deepfake videos of celebrities, but its most common use is for sexually explicit videos.

As software becomes more sophisticated, experts have warned that deepfakes are set to become a growing problem.

Adam Dodge, the founder of EndTAB, a group that provides training on technology-enabled abuse, told CBS News: "The reality is that the technology will continue to proliferate, will continue to develop and will continue to become sort of as easy as pushing the button.

"And as long as that happens, people will undoubtedly ... continue to misuse that technology to harm others, primarily through online sexual violence, deepfake pornography and fake nude images."

The issue Ms Isaacs experienced is sadly quite common - as the example of Noelle Martin, a woman from Perth, Australia, shows.

The 28-year-old found deepfake porn of herself 10 years ago when she searched her name on Google.

She said she still does not know who created the fake images and believes that someone took a picture posted on her social media.

She contacted several websites in a bid to have the images taken down, but was either ignored or saw the fake pictures removed only to be posted online again.

Ms Martin said: "You cannot win. This is something that is always going to be out there. It's just like it's forever ruined you."

In October 2022, Ms Isaacs spoke to the Sunday People as figures emerged showing that just 3% of revenge porn incidents reported to police have resulted in charges.

Sharing another person's private sexual photos or videos without consent became illegal in 2015 - but of the 23,672 complaints to police on the matter since, just 846 have resulted in a charge.

Ms Isaacs said: "The original law in 2015 was loose and means it's difficult to get a conviction. You have to be able to prove malicious intent. Added to that, no one really knows the law."

A Freedom of Information request to all 43 UK police forces found that in some cases a "lack of public interest" was cited as the reason no charges were brought. Not all forces responded; the Met was among the 12 that declined.

Ms Isaacs said: "I feel sick they'd say it's not in the public interest. It's shocking and a clear demonstration of how little women's rights and image-based sexual abuse are taken seriously."

Mum-of-two Natasha Saunders, 33, a domestic abuse victim, said: "It's horrific. The law is not being enforced and it's not good enough. These are very real crimes.

"When I was with my abuser he took images of me. When he gets out of prison he'll have access to these images. Even after I left I was getting threats. He said he would share the images with everyone on my phone book. There needs to be better guidance for the police."

Ms Saunders' ex-husband could be out of prison by 2025, after he was sentenced to 12 years in 2018 for three counts of rape and one of sexual assault.
