Chicago Sun-Times
Barbara Ortutay | AP

‘Take It Down’: a tool for teenagers and anyone else to remove explicit online images


“Once you send that photo, you can’t take it back,” goes the warning to teenagers, often ignoring the reality that many teens send explicit images of themselves under duress or without understanding the consequences.

A new online tool aims to give some control back to teenagers and anyone else, allowing them to take down explicit images and videos of themselves from the internet.

Called Take It Down, the tool is operated by the National Center for Missing and Exploited Children and funded in part by Meta Platforms, the owner of Instagram and Facebook.

The site lets anyone anonymously — and without uploading any actual images — create what is essentially a digital fingerprint of the image. This fingerprint — a unique set of numbers called a “hash” — then goes into a database, and the tech companies that have agreed to participate in the project remove the images from their services.
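To illustrate how such a fingerprint can be created without uploading the picture itself, here is a minimal sketch in Python. It assumes a plain cryptographic hash (SHA-256) stands in for whatever hashing scheme the service actually uses, which may well be a perceptual hash, and the file name is purely hypothetical.

```python
# Minimal sketch of the "digital fingerprint" described above.
# Assumption: a plain SHA-256 file hash stands in for whatever hashing
# scheme the service actually uses (possibly a perceptual hash instead).
import hashlib
from pathlib import Path

def fingerprint(image_path: str) -> str:
    """Hash the image locally; only this digest, never the image, would be submitted."""
    data = Path(image_path).read_bytes()
    return hashlib.sha256(data).hexdigest()

if __name__ == "__main__":
    # "my_photo.jpg" is a hypothetical file name for illustration only.
    print(fingerprint("my_photo.jpg"))
```

Because only the resulting string of characters is shared, the image itself never has to leave the person's device.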

The participating platforms so far include Instagram, Facebook, Yubo, OnlyFans and Pornhub, owned by MindGeek.

If the image is on another site, though, or if it was sent on an encrypted platform such as WhatsApp, it won’t be taken down.

Also, if someone alters the original image — say by cropping it, adding an emoji or turning it into a meme — it becomes a new image and would need a new hash.
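The same sketch shows why an edited copy needs a new hash: with an exact hash of this kind (an assumption here; the service's actual matching may be more tolerant), changing even a single byte produces a completely different fingerprint.

```python
# Why an altered image needs a new hash: with an exact hash, changing even
# one byte (a crop, an added emoji, a meme caption) yields a completely
# different digest, so the original fingerprint no longer matches.
import hashlib

original = b"...raw bytes of the original image..."  # placeholder bytes
altered = original + b"\x00"                          # stands in for any edit

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())  # entirely different value
```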

“Take It Down is made specifically for people who have an image that they have reason to believe is already out on the Web somewhere or that it could be,” said Gavin Portnoy, a spokesman for the National Center for Missing and Exploited Children.

Portnoy said teenagers might feel more comfortable going to a site than involving law enforcement, which wouldn’t be anonymous.

Meta, then still called Facebook, tried to create a similar tool, but for adults, in 2017. It didn’t go over well because the site essentially asked people to send their nude images, though encrypted, to Facebook. The company briefly tested the service in Australia but didn’t expand it to other countries.

In 2021, it helped launch a tool for adults called StopNCII — for nonconsensual intimate images, aka revenge porn. That site is run by a British nonprofit, the UK Revenge Porn Helpline, but anyone anywhere can use it.

Many tech companies already use this hash system to share, take down and report to law enforcement images of child sexual abuse.

Portnoy said the goal is to have more companies sign up. “We never had anyone say no,” he said.

Antigone Davis, Meta’s global head of safety, said the site works with real images as well as artificial intelligence-generated images and deepfakes.
