Facebook parent Meta says Chinese law enforcement is behind the largest covert online influence operation the company has ever disrupted.
The operation spread pro-China messages and attacked critics of Beijing's policies, using a sprawling network of fake accounts across more than 50 websites, from Facebook and Instagram to YouTube, Twitter (now known as X), TikTok, Reddit, and dozens of smaller platforms and forums.
The fake accounts posted links to articles praising China and denigrating U.S. and European foreign policy, as well as seemingly personal comments apparently copied and pasted from a numbered list, resulting in hundreds of identical posts.
"They were really trying to spread their way across the internet and to try to spread their message across the internet," said Ben Nimmo, Meta's global threat intelligence lead.
Despite the network's scale, Meta said the operation had failed to attract a large following of real people on the company's platforms or elsewhere online, and few authentic accounts interacted with its content.
Still, the confirmation of the scope of its activities and Meta's uncovering of "links to individuals associated with Chinese law enforcement" are a breakthrough for researchers and underscore how China has emerged as a significant player in covert influence operations alongside Russia and Iran.
"China is investing an enormous amount of money in the full spectrum of state propaganda, of which this is an important part," said Graham Brookie, senior director of the Atlantic Council's Digital Forensic Research Lab. Social media is "an important layer because it creates a façade of engagement on their chosen narratives...that are either beneficial to the [Chinese Communist Party] or harmful to its perceived competitors."
Meta did not give further details about how it linked the network to the Chinese government. The company bases its attributions on technical signals and behavioral indicators, such as posting times and language patterns.
Since 2019, tech companies and independent organizations have been tracking seemingly disparate clusters of fake accounts posting pro-China content across social media.
This activity was dubbed "Spamouflage" for the accounts' tendency to intersperse political posts with random videos and pictures — effectively using spam as camouflage. Associated clusters were behind online attacks on pro-democracy protesters in Hong Kong and the Trump administration, praise for China's COVID-19 response, and AI-generated fake news anchors promoting the Chinese Communist Party.
But while researchers long suspected the clusters were linked, they had not been able to prove it, Nimmo said. (Nimmo was among the researchers at investigation firm Graphika who first identified Spamouflage activity in 2019, before he joined Meta.)
"It really is all one operation," Nimmo said. "And for the first time, not only have we been able to tie all this activity together, but we've been able to link it to individuals who are associated with Chinese law enforcement."
Spamouflage campaign mainly reached fake followers
The repetitiveness of the fake accounts' posts, along with quirky headlines, small mistakes such as misspellings and apparently automated translations, and similar patterns in how and when they posted, allowed Meta to connect a new cluster of fake accounts, found in 2022 targeting a human rights group, to other sets of accounts that had been taken down over several years.
Meta said the Spamouflage operation is spread across different parts of China but shares internet infrastructure and what Nimmo called "centralized directions" about what content to post.
Meta has removed more than 8,600 Facebook accounts, pages, groups and Instagram accounts with a collective following of about 561,000 accounts. However, the company said many of those followers were themselves fake, because the Chinese operation appeared to have bought pages from spam operators in Vietnam and Bangladesh that create pages and populate them with bot followers.
Some of those pages abruptly switched from posting ads to sharing political content, leading to "highly idiosyncratic behaviors where, for example, a Page that had been posting lingerie ads in Chinese abruptly switched to English and posted organic content about riots in Kazakhstan," Meta said in its report.
"There's nothing to suggest that anywhere across the internet this operation has really been systematically landing any kind of audience yet," Nimmo said. When real people do encounter Spamouflage messaging online, he said, "you quite often get people replying to it and saying, 'This is not real, this is a fake account' and calling it out."
Russian anti-Ukraine network spoofed Washington Post and Fox News
Separately, Meta said on Tuesday that it was continuing to take down fake accounts and ban fraudulent websites connected to a Russian influence operation aimed at eroding support for Ukraine.
The "Doppelganger" operation, which Meta first identified last year, created elaborately detailed websites spoofing news organizations and then posted links to articles criticizing Ukraine and Western countries across social media. Meta has attributed the effort to a Russian PR firm and a Russian IT company, which have both been sanctioned by the European Union.
In recent months, the campaign has expanded from faking websites of European outlets including the U.K.'s The Guardian and Germany's Der Spiegel to impersonating The Washington Post and Fox News as well as two Israeli news sites. It has also created fake websites purporting to represent NATO, the Polish and Ukrainian governments, the German police and the French Foreign Ministry.
"This is a single-minded influence operation," Nimmo said. "It really has one mission, and that is to undermine international support for Ukraine."
Meta said the operation was the "largest and most aggressively persistent" it had seen from Russia since 2017. The company says it has taken down thousands of fake accounts and pages connected to the network and blocked more than 2,000 domains from being shared on Facebook and Instagram in the past year.
Like the Chinese operation, however, this Russian operation has not gained much traction among real social media users — in part because the fake accounts it uses to promote links to its spoofed websites are so obviously fake as to be quickly detected.
"There is a very strange disconnect between the intricate websites and the smash and grab approach on social media," Nimmo said.
Influence campaigns pivot to smaller social networks
Both operations have targeted not just the big social networks, but also smaller sites that may be less able or willing to take on a pervasive influence operation. That's become more pronounced as companies like Meta have taken more aggressive action, Nimmo said.
For example, in 2019, the Chinese Spamouflage operation was mainly focused on YouTube, Twitter and Facebook. But as the larger companies caught on, it shifted to alternative sites.
"So rather than starting off on Facebook, it will post an article on a small forum in Nigeria or in Australia, and then it will start trying to share the link to that article on the larger platforms," Nimmo said. "It's putting itself into these smaller and smaller buckets and then trying to reach out of there."
Influence operations are also increasingly setting up off-platform websites and using social media to drive traffic to them. In a separate set of takedowns, Meta removed three networks of pages and accounts targeting audiences in Turkey. Two were based in Turkey and one in Turkey and Iran. The networks created websites posing as independent news outlets and fictitious brands, and posted about news and politics in Turkey and the Middle East.
The Russian operation has adjusted its tactics in other ways to try to evade crackdowns by social media platforms, such as by using alternate websites to redirect its links. But some of those domains have become increasingly bizarre.
"We've seen this operation posting its 'stop supporting Ukraine' content on websites with names like CoedNakedFootball.xyz, or EarlyGonorrhoeaSigns.com," Nimmo said. "As soon as you look at this, you just think, 'What?'"
With both China's Spamouflage operation and Russia's Doppelganger campaign, he said, the disconnect between the scale of the operations and their limited impact makes it unclear what the people actually doing the work are trying to accomplish.
"You start asking yourself the question: Are they really trying to reach an audience in the country they're targeting? Or are they just trying to show the people who are paying them that they're posting lots of content?" he said.