In news that will come as a relief to Aussie teenagers frantically trying to get their leaked nudes off the internet, Meta is partnering with a service designed to do exactly that.
By now, there’s no shortage of horror stories about the image-based abuse young Aussies face, be it nudes being leaked without consent, explicit pics accidentally shared with the wrong person, or non-consensual deepfake porn. The issue is growing by the day, and few countries have laws that are keeping up.
Here’s where Take It Down comes in.
Take It Down is a new service developed by the National Center for Missing & Exploited Children (NCMEC) in the US that aims to help people under the age of 18 get sexually explicit content of themselves taken down from the internet. It can also be used by adults worried that pictures or videos of themselves from when they were under 18 are circulating online.
The platform is American and was built for people in the US, but by partnering with it, Meta is opening the service up to millions of people, including Aussies.
How it works is kinda strange though.
Basically, Take It Down has an algorithm that matches your images against ones found online, kind of like reverse image searching.
You can run your nudes (hmmm) through its algorithm, which creates a code (a hash, essentially a digital fingerprint of the image), sends that code to websites like Meta’s Facebook and Instagram, as well as Pornhub and other sites, and those sites then remove content that matches yours.
While this is certainly useful and way better than anything we’ve currently got, it raises questions about security, and about the fact that it still requires users to report child sexual abuse material to Meta or the service, rather than, you know, the platforms having systems and algorithms in place that detect and take that shit down in the first place.
However, the code can be created without you actually uploading anything: the hash is generated on your own device, so the image itself never leaves it, which quells my concerns on security somewhat.
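For the technically curious, that kind of code is typically a perceptual hash: a short fingerprint computed from an image’s visual features that survives small edits like resizing or re-compression. Take It Down hasn’t published its exact algorithm, so the sketch below is only a minimal illustration of the general idea, using the open-source imagehash Python library as a stand-in; the file names and match threshold are made-up placeholders.

```python
# Minimal sketch of hash-based image matching, assuming a perceptual-hash
# approach in the spirit of what Take It Down describes. The open-source
# `imagehash` library (pip install imagehash pillow) stands in for the
# service's actual, unpublished algorithm.
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Create a 'code' (perceptual hash) from an image, entirely on-device.
    Only this short fingerprint is ever shared, never the image itself."""
    return imagehash.phash(Image.open(path))

# The user generates the code locally from their own copy...
user_hash = fingerprint("my_photo.jpg")  # hypothetical local file

# ...and a participating platform compares it against hashes of content on
# its servers. Perceptual hashes tolerate minor edits, so near-duplicates
# still land within a small Hamming distance of each other.
candidate_hash = fingerprint("uploaded_copy.jpg")  # hypothetical platform file

MATCH_THRESHOLD = 8  # max differing bits; an illustrative, assumed cutoff
if user_hash - candidate_hash <= MATCH_THRESHOLD:  # '-' gives Hamming distance
    print("Match found: flag this content for removal")
```

The key privacy property is that hashing is one-way: a platform can check whether two images match, but it can’t recover your original photo from the fingerprint.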
And yes, the code can also work for deepfakes, which is good to know given non-consensual deepfake porn is on the rise.
The move is a huge leap in the right direction from Meta, but Australia’s eSafety Commissioner Julie Inman Grant reckons more can still be done.
“The service relies on user-reporting, rather than the companies proactively detecting image-based abuse or known child sexual exploitation and abuse material,” she told the ABC.
“We maintain the view that companies need to be doing much more in this area.”
It’s also perhaps a little disappointing for some that this service can only be used for child exploitation material, because plenty of adults, particularly women, are also victims of abuse material, especially when it comes to deepfakes and “revenge porn”.
Let’s hope that this service will lead to a rise in services for all people who experience image-based abuse.
Recording, sharing, or threatening to share an intimate image without consent is a criminal offence across much of Australia. If you’d like to report image-based abuse to police or get help removing an intimate image from social media, reach out to the Australian eSafety Commissioner here. If you’d like to speak to someone about image-based abuse, please call the 1800 Respect hotline on 1800 737 732 or chat online. Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.