Digital Camera World
Leonie Helm

A teenager created an AI nude of his female classmate and circulated it on Snapchat – he was suspended from school for one day

An iPhone, sitting on a laptop keyboard next to a pair of headphones, with the Snapchat messaging app logo on screen.

As members of Parliament in the UK launch a bid to legally ban smartphones in schools to safeguard children from the potential harms of social media, a teenager in the US is sharing her story after AI nude images of her circulated on social media.

Francesca Mani was 14 years old last October when her name was called over the loudspeaker, at Westfield High School in New Jersey, summoning her to the principal's office. She then learned that a picture of her had been taken from Instagram by a fellow male student, and uploaded to a site called Clothoff – a popular “nudify” website.

Mani, who was named one of TIME's 100 most influential people in AI for 2024 for her campaigning, sat down with Anderson Cooper on 60 Minutes and explained why she is now on a mission to prevent this from happening to others.

According to research firm Graphika, as reported by CBS News, last month alone there were more than three million visits to Clothoff – a site where people can upload a photo, easily obtained from another person’s social media account, and use artificial intelligence to remove the subject's clothes, resulting in a very realistic-looking deepfake.

ABOVE: Watch Mani on 60 Minutes

Mani never saw the photo of her that was circulating, but a lawsuit filed by a fellow female student alleged that a boy from their school had taken photos from Instagram, uploaded them to Clothoff, and then circulated the results around the school on Snapchat.

"It's like rapid fire," she said, "It just goes through everyone. And so then when someone hears – hears this, it's like, 'Wait. Like, AI?' No one thinks that could happen to you."

While the school argued that the images had been deleted and that circulation had ceased, Mani’s mother Dorota was not convinced. Speaking to 60 Minutes she said, "Who printed? Who screenshotted? Who downloaded? You can't really wipe it out.”

60 Minutes also found 30 similar cases in schools in the US over the last 20 months, in addition to other cases globally. In at least three of those cases, Snapchat was reportedly used to circulate AI nudes.

Yiota Souras, chief legal officer at the National Center for Missing and Exploited Children, works with tech companies to flag inappropriate content on their sites. While some may argue that the images are fake, Souras argues that the damage they cause to victims is real.

She told CBS, "They'll suffer, you know, mental health distress and reputational harm. In a school setting it's really amplified, because one of their peers has created this imagery. So there's a loss of confidence. A loss of trust."

In Mani’s case, she was informed that the boy who created and circulated the image received a one-day suspension from the school as his only punishment.

Artificial intelligence technology has advanced far faster than lawmakers can regulate it. There is confusion, there are loopholes, and responsibility for these misuses of new technology is often passed around.

One parent told 60 Minutes that it took Snapchat eight months to take down the accounts that had shared the images of their daughter. The Department of Justice says AI nudes of minors are illegal under federal child pornography laws if they depict what’s defined as “sexually explicit conduct.” Souras’ concern, however, is that some of the nude images created by these websites may not meet that definition.

Now 15, Mani and her mother have spent the last year encouraging schools to implement serious policies around AI, and have been working with members of Congress to try to pass multiple federal bills to protect people. One of these, the Take It Down Act, is co-sponsored by Republican senator Ted Cruz and Democratic senator Amy Klobuchar, who next year will become chair of the Senate Democratic Steering and Policy Committee.

The bill would create criminal penalties for sharing AI nudes and would require social media companies to take down such photos within 48 hours of receiving a request. It passed the Senate earlier this month and is now awaiting a vote in the House.
