Crikey
National
Cam Wilson

Facebook won’t block ads for deepfake nude apps but it did block my article about them

Facebook blocked a post sharing a Crikey article about ads for deepfake nude apps, even as it continues to allow those advertisements to appear on its site.

Yesterday, Crikey reported that Meta — which owns Facebook, Instagram, Threads and WhatsApp — was allowing Australians to be targeted with dozens of paid advertisements that promote apps and websites that generate explicit intimate imagery using AI technology, often without the consent of the individuals featured in the images. 

This advertising content, and the apps themselves, are against Meta's own rules. Despite this, ads continue to run across the company's social media platforms, suggesting that the US$1.3 trillion company is unable or unwilling to enforce its own rules even after being repeatedly told about the problem.

While Meta has had limited success stopping these ads, the company did successfully delete a post sharing the Crikey article from this reporter’s Facebook page. 

A link to yesterday’s article with the caption “Facebook and Instagram are still running ads explicitly promoting apps for creating ‘deepfake’ naked/sexual images of people, in some cases targeted at 18yos even after previous reporting about the ads AND even after I pointed it out this time” was immediately deleted after being flagged as spam, according to a Facebook notification.

A Facebook page explaining the decision said that "we don't allow people to share or send anything that contains misleading links or content." The page offered an option to appeal the decision, but no verdict had been delivered at the time of writing.

The same post was shared on Meta's Threads without issue, as well as on other social media platforms such as X, LinkedIn, Bluesky and Mastodon.

The decision to block the post was almost certainly an error made by Meta's automated moderation tools, and not the result of any human intervention. Nevertheless, it shows the problems with the company's approach to keeping its platforms safe for users.

Whatever is happening behind the scenes, the end result is that Meta is accepting money to help promote technology that is predominantly used to abuse women, while blocking public interest journalism reporting on the problem. 

Meta didn’t comment on the matter.
