CONTENT WARNING: This article discusses image-based abuse.
Taylor Swift is reportedly “furious” over AI-generated explicit images of her that sprang up on a celebrity nudes site and have since gone viral on social media.
An insider told the Daily Mail on Thursday that Swift is considering what legal action she can take against the site that generated the pictures of her, whose watermark appears in the corner of the photos.
“Whether or not legal action will be taken is being decided, but there is one thing that is clear: these fake, AI-generated images are abusive, offensive, exploitative and done without Taylor’s consent and/or knowledge,” the insider said.
“The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with.
“Legislation needs to be passed to prevent this, and laws must be enacted.”
While the account that initially posted the nudes on X, formerly Twitter, has since been taken down, the photos are still spreading like wildfire — and as we all know with the internet, once something is on there it’s really hard to have it scrubbed.
This is especially true for things like this, because for every photo that is taken down, some freak just generates a new one — and if they move their offensive content to an app like Telegram, there’s not much you can do about it.
Legislation on offensive content like this is woefully behind. While services do exist to track down and remove non-consensually shared nudes (both AI-generated and real), these are often the responsibility of the social media platform — not any government authorities — and they have their limitations.
With X’s moderation crippled since Elon Musk bought it and fired most of its staff, situations like these are especially hard to get a handle on.
Fans have made their own attempt at protecting Taylor by flooding X with tweets that use the term “Taylor Swift AI” so the NSFW images are harder to find.
One troll who posted the image after it appeared on the celebrity nudes site also became the target of Swifties’ ire and… good.
“My Taylor post went viral and now everyone is posting it,” he wrote, per Page Six.
He later added, “Bro what have I done… They might pass new laws because of my Taylor Swift post. If Netflix did a documentary about AI pics they’d put me in it as a villain. It’s never been so over.”
His account is now private.
As horrific as this entire ordeal is, if there’s one person who I think has the power to actually force a change in legislation around deepfake porn for the better, it’s Taylor Swift.
Recording, sharing, or threatening to share an intimate image without consent is a criminal offence across much of Australia. If you’d like to report image-based abuse to police or get help removing an intimate image from social media, reach out to the Australian eSafety Commissioner.
- If you’d like to speak to someone about image-based abuse, please call the 1800 Respect hotline on 1800 737 732 or chat online.
- Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.
The post Taylor Swift Is Reportedly Considering Legal Action Over AI Porn Images: ‘Sick And Disgusting’ appeared first on PEDESTRIAN.TV.