A woman has revealed what it was like to discover deepfake porn of herself on the internet.
Kate Isaacs said it was “violating” to fall victim to “deepfakes” – explicit images or videos which have been manipulated to look like someone without their consent – as she accused her attacker of trying to “silence” and “scare” her.
Ms Isaacs’s comments come as the government announced that distributing pornographic deepfakes, as well as sharing “downblouse” images – non-consensual photos taken down a woman’s top – will be made illegal.
Police and prosecutors will be given more power to hold perpetrators to account under an amendment to the Online Safety Bill, with distributors of deepfakes potentially facing prison time under the proposed measures.
Ms Isaacs, who is 30, told The Independent she thinks she fell victim to a deepfake because she set up a campaign called #NotYourPorn to get non-consensual content removed from the popular adult entertainment platform Pornhub, after footage of a friend ended up on the site without her consent.
In 2020, the campaign helped pressure Pornhub into deleting an estimated 10 million unverified videos on the site in order to strip the platform of non-consensual and child porn videos.
Ms Isaacs said: “That was a great day. It spread a message. Unfortunately, after that incident, I became a target on Twitter of a very small but loud group of men. They felt they were entitled to non-consensual porn.
“They were against the campaign and they were upset I had ridded Pornhub of their porn. After that, they started an attack on me online, they posted my work and home address on Twitter.
“They commented underneath they were going to find me, follow me home, rape me, film it and upload it to Pornhub. That was completely terrifying, I’d never experienced fear like that in my life.”
The campaigner added that part of the attack was to turn her into a deepfake, describing the ordeal as “an act of sexual violence”.
She said: “They took a BBC TV interview and doctored my face onto a porn film of someone having sex. It was so violating. For a few short seconds, I didn’t know it was a deepfake and I was terrified looking at that as I couldn’t remember that moment or the man.”
Upon closer inspection she realised it was a deepfake, added Ms Isaacs, who campaigns against image-based sexual abuse.
She said: “Someone was using my identity, my profile without my consent in a sexual manner. I appreciate some people don’t feel like it is that big of a deal.
“But we have hit blurry lines between perception and reality because perception is reality now. I find it abhorrent that they used my image to silence me, to scare me, or for sexual gratification without my consent.”
Many have sounded alarm bells about how deepfakes can mislead members of the public, and previous research conducted by cybersecurity firm Deeptrace indicated around 96 per cent of all deepfake videos are non-consensual porn, while women are targets in 96 per cent of cases.
The newly unveiled measures will also see ministers introduce laws to address other abusive acts, such as installing hidden cameras to capture photos or footage of a person without their consent.
Nicole Jacobs, the domestic abuse commissioner, said: “I welcome these moves by the government which aim to make victims and survivors safer online, on the streets and in their own homes.”