The Guardian - US
Edward Helmore

Alexandria Ocasio-Cortez recounts horror of seeing herself in ‘deepfake porn’

Ocasio-Cortez supports legislation to make it easier for nonconsensual AI pornography victims to sue publishers, distributors and consumers of X-rated digital forgeries. Photograph: Anadolu Agency/Getty Images

Alexandria Ocasio-Cortez has described her horror at discovering – midway through a discussion with aides during a car journey in February – that she had been transposed into a “deepfake porn” simulation generated by artificial intelligence and posted on X.

In an interview with Rolling Stone, the progressive congresswoman said the experience of coming face to face with a depiction of herself performing a sex act had shown her that being deepfaked was “not as imaginary as people want to make it seem”.

The Democratic US House member added that the mental image of the deepfake performing sexual acts stayed with her for the rest of the day.

“There’s a shock to seeing images of yourself that someone could think are real,” Ocasio-Cortez told Rolling Stone. “As a survivor of physical sexual assault, it adds a level of dysregulation. It resurfaces trauma, while I’m … in the middle of a fucking meeting.”

A recent Guardian podcast investigation – The hunt for ClothOff: The deepfake porn app – attempted to find out who is behind the deepfake porn explosion, which has recently hit dozens of well-known figures, including Taylor Swift.

An analysis of five deepfake porn sites by Britain’s Channel 4 News found 4,000 famous individuals listed.

Ocasio-Cortez told Rolling Stone that “digitizing violent humiliation” is akin to physical rape, and she predicted that “people are going to kill themselves over this”.

“There are certain images that don’t leave a person, they can’t leave a person,” she said, citing research on the difficulty the human brain has in separating real images from fake ones – even when the material is understood to be fake.

“It’s not as imaginary as people want to make it seem,” Ocasio-Cortez continued. “It has real, real effects not just on the people that are victimized by it, but on the people who see it and consume it. And once you’ve seen it, you’ve seen it.”

Ocasio-Cortez, 34, is supporting congressional legislation to make it easier for nonconsensual AI pornography victims to sue publishers, distributors and consumers of X-rated digital forgeries.

Dick Durbin has said the legislation – known as the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024 – would hold accountable those who are responsible for the proliferation of nonconsensual, sexually explicit “deepfake” images and videos.

“Sexually explicit deepfake content is often used to exploit and harass women – particularly public figures, politicians and celebrities,” said the Democratic Illinois senator, who pointed to the sexually explicit, fake images of Swift that had “swept across social media platforms” earlier in the year.

“Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit deepfakes is very real,” he added. “Laws have not kept up with the spread of this abusive content.”

But just as pornography outpaced music and film when online media distribution took off in the 1990s, it is now propelling AI content creation and distribution. A 2023 study by cybersecurity firm Home Security Heroes found that fabricated pornography accounts for 98% of all deepfake videos posted online.

In her interview with Rolling Stone, Ocasio-Cortez said that deepfake pornography “parallels the same exact intention of physical rape and sexual assault, which is about power, domination, and humiliation”.

She added: “Deepfakes are absolutely a way of digitizing violent humiliation against other people.”

  • Information and support for anyone affected by rape or sexual abuse issues is available from the following organisations. In the US, Rainn offers support on 800-656-4673. In the UK, Rape Crisis offers support on 0808 500 2222. In Australia, support is available at 1800Respect (1800 737 732). Other international helplines can be found at ibiblio.org/rcip/internl.html
