Fortune
Alexandra Sternlicht

Faceswap deepfakes are a growing problem that’s costing creators millions—but social media services are doing little to stop it

paper collage of woman (Credit: Tara Moore—Getty Images)

Last year, Ashley Suarez learned that someone had reposted images of her showing off dresses and different hair styles on Instagram without permission, but replaced her face. This imposter, who went by the name Elizabeth Reid, used the images to build a big social media following—and to make money by accepting gifts from fans, and from pornography. 

“I was getting sent a lot of these videos, because people would recognize the background as being my own, because I made most of my TikToks in my parents’ home,” says Suarez, who earns a living from modeling and promoting products online thanks to her nearly 500,000 followers on both Instagram and TikTok. “They were like, ‘This looks just like your video, but it’s not your face.’”

It was just one of many instances Suarez knows of in which her content was stolen, altered, and then reposted online. The rising availability and sophistication of so-called “faceswap” deepfake technology has made it easier than ever for people to pull it off.

By slightly altering the stolen images, those responsible make them difficult to detect with automated tools. Meanwhile, copyright law gives the practice some cover under an exception for works that are altered from the original.

Many social media companies appear to have been unprepared for the onslaught of faceswap images and have created major hurdles for image owners trying to get them taken down. Only recently have they become more aggressive in tackling a problem that is only expected to get worse.

Suarez first fell victim to faceswapping three years ago, when she saw content she had shot of herself at her family home in Naples, Fla., repurposed on Instagram with a face that was not her own. At the time, she told her parents, called the police, and reported the content to Instagram. When none of her efforts worked, Suarez deleted her social media accounts and began seeing a therapist. A few weeks later, she came to a turning point: She hired Takedowns.AI, which scans the internet for stolen images of its clients, or client images that have been tweaked using AI, and asks social platforms to remove the imposter’s content.

“I realized I didn’t want to live in fear,” recalls Suarez, now 21.

Takedowns.AI’s CEO Kunal Anand says that instances of faceswap deepfakes have increased 500% in recent months. Such images are found across TikTok, Instagram, Facebook, and sites like Patreon and Fanvue, where pornography is allowed.

Takedowns.AI contacted Instagram, but was told in an email, “It’s not clear you are the rights owner or are otherwise authorized to submit this report on the rights owner’s behalf.” Instagram asked Suarez to submit a new report from her email address or respond with documentation showing she was authorized to do so. But after following the instructions, Takedowns.AI received the same message from Instagram.

Frustrated, Takedowns.AI took matters into its own hands and sent messages directly to Suarez’s imposter account threatening legal action unless the account was deleted. A few days later, the “Elizabeth Reid” account was gone.

The services of Takedowns.AI don’t come cheap. Suarez and her family shelled out between $2,000 and $3,000 a month for its services, going into debt before hiring the company on a two-year retainer in 2023 at a discounted rate.

Suarez said the money spent is worth it because platforms like Instagram refuse to remove content based on creator complaints alone. She thinks the platforms listen more to companies like Takedowns.AI, which sometimes uses attorneys to send complaints.

Though generating faceswap images may seem complicated, AI tools like Stable Diffusion and open-source code shared on sites like GitHub make it relatively easy for people without much technical skill. One faceswap project on GitHub, which hosts software for using AI to alter photos, has garnered nearly 50,000 stars.

The earnings can easily justify the effort for faceswap scammers. One user on an anonymous developer forum says they have made over $100,000 from it since starting in January.

Initially, social media platforms like TikTok and Facebook-parent Meta generally refused to do anything in response to complaints from Takedowns.AI about faceswapping, Anand says. The reason, he says, is that they considered the images in question “transformative works,” a legal exception that lets people use otherwise copyrighted work without permission as long as it is presented in a manner different from, or altered from, the original. Many creators are “losing millions even though their own content is being used, without them even getting to monetize it,” Anand laments.

Faceswap deepfaked profiles

Faceswaps may not be legal

Kristin Grant, a managing partner at Grant Attorneys at Law who specializes in intellectual property, acknowledges the legal hurdle of transformative works when it comes to stolen images. But she says creators like Suarez could have a strong legal claim under a different area of law: the right of publicity, which protects against misappropriation of a person’s likeness, among other things.

Rights of publicity laws vary state-by-state, so the strength of a case involving faceswap fakes depends on where the creator lives. “Using someone else’s image without their authorization for commercial purposes—that is illegal,” Grant says. 

Social platforms struggle to keep up with AI advances

Recently, Anand says, Facebook, Instagram, and TikTok have started doing a better job of removing faceswap images flagged in Takedowns.AI’s reports, though he notes they still have room for improvement. In his experience, when Instagram reacts to a complaint these days, it generally removes only the offending post, not the entire account.

Some social media services have also started cracking down more broadly on AI-generated content.

Meta says it will use automation and complaints by content creators to decide when to apply “made with AI” labels to AI-generated videos, images, and audio posted on its platforms so that users know they’re created by computers. The automation, however, is far from perfect, and Takedowns.AI’s Anand says it has failed to detect faceswaps in many cases that his company ended up having to flag to Meta.

Under its policies, Meta doesn’t proactively ban content that infringes on others’ copyright or trademark rights and only takes down content after someone reports a violation—while sometimes preserving the account. It has no policy specific to faceswap images like those involving Suarez.  

Meta declined to comment for this article.

Meanwhile, until early May, TikTok required creators to label their own AI-generated videos and had no policies to address faceswaps. The honor system did little to stop faceswappers and others from passing off AI-generated content as authentic to the platform’s 1 billion active users. 

In May, TikTok changed its policies and said it would join rivals Meta and Google’s YouTube in automatically applying an “AI-generated” label to AI videos posted on its service. It is unclear how effective this tool is for TikTok.

TikTok declined to comment for this article.

How porn sites fit in

Faceswapping may not be a major concern for average social media users, but it can be a huge problem for creators, especially women who post sexually suggestive or explicit content. Loose verification requirements on many creator-focused adult sites and the lure of easy money for people who steal images make it an attractive business. 

“I think women who are sexualized at a young age are the biggest victims,” says Suarez. “I was very sexualized when I was very young…it really has been going on for years.”

OnlyFans, known mostly for pornography, may be the strictest platform when it comes to policing AI-generated content. It bars creators who use AI from making money on its site and requires creators to verify their accounts by uploading photos of their driver’s licenses that correspond with their likenesses in the content. Still, pornography on OnlyFans continues to be stolen and then manipulated by faceswappers, who post it to other platforms, says Anand, who also founded ChatPersona, a company that lets OnlyFans creators use AI bots for racy chats with customers.

A representative for OnlyFans declined to comment for this article.

Fanvue also requires that creators upload their driver’s licenses before letting them open accounts on the platform. But hackers on a digital forum said the images on those licenses aren’t checked against the creators’ likenesses.

Fanvue did not respond to Fortune’s multiple requests for comment.

Tools for detecting faceswaps may be as controversial as the faceswaps themselves 

Takedowns.AI offers creators a system for tracking down faceswaps of their likenesses, but it raises privacy concerns. Users upload photos of their faces to Takedowns.AI, and the technology, powered by third-party service PimEyes, crawls the internet to find identical and modified versions.

“You can just scan your face, and whatever it finds, you can just take it down,” Anand said.

It’s similar to Google’s reverse image search, which lets users submit an image, rather than text, as a search query and find matches. But unlike Google, Takedowns.AI integrates individuals’ social media profiles into the results. That feature is essential for creators trying to find and report faceswapped content, but some privacy advocates worry the tool can scoop up images of people who have no social media profiles of their own yet whose faces have been shared online by friends and family.

"Love excuses everything"

Another creator Takedowns.AI says it represents is former high school weightlifter Carolina Samani, who posts sexy photos and videos on Instagram and makes money through OnlyFans. Samani’s faceswap deepfaker goes by Julia Rossi, and despite multiple complaints from Takedowns.AI over a number of months, Rossi’s Instagram account remained live until early June, when Instagram finally removed it.

With the help of content faceswapped from Samani’s originals, Rossi had attracted 210,000 Instagram followers, about half as many as Samani has built up over years of posting to the platform.

One exchange with a smitten fan on Instagram provided a window into the worldview of Rossi, or whoever was controlling her account, about right and wrong. “Is it worth committing an [sic] arson for you? 😘😍🤎,” the fan asked.

Rossi gave a cavalier and emoji-filled response: “Love excuses everything 🫢😂.”
