TikTok has removed 284 accounts associated with a Chinese disinformation campaign after Guardian Australia raised questions about several accounts uncovered by the company’s rival Meta.
On Wednesday, the parent company of Facebook and Instagram reported it had shut down close to 9,000 Facebook and Instagram accounts, groups and pages associated with a Chinese political spam network that had targeted users in Australia and other parts of the world.
During its investigation Meta uncovered the influence operation on more than 50 online platforms and forums including YouTube, TikTok, Reddit, Pinterest, Medium, Blogspot, Livejournal and X, formerly known as Twitter, in addition to Instagram and Facebook.
The network typically posted positive commentary about China and the Xinjiang region and negative commentary about the US, western foreign policies and critics of the Chinese government, including journalists and researchers.
The videos on TikTok, seen by Guardian Australia, focused heavily on responding to reports of forced labour in Xinjiang and on Taiwan. While most videos drew only double-digit view counts, some had tens of thousands of views.
The report included links to four TikTok accounts associated with the campaign, which were still active as of Wednesday. After Guardian Australia reported these accounts to TikTok, the company confirmed on Thursday that 284 accounts linked to the operation had been banned for violating its policy against covert influence operations.
A spokesperson for TikTok did not directly answer why the company had failed to pick up the accounts, but its policy on countering influence operations states the company focuses on “behaviour and assessing linkages between accounts and techniques to determine if actors are engaging in a coordinated effort to mislead TikTok’s systems or our community”.
It is understood the takedowns will be detailed in TikTok’s next quarterly community guidelines enforcement report.
Fergus Ryan, a senior analyst in cyber technology and security at the Australian Strategic Policy Institute, said it was positive news but that TikTok should have acted proactively.
“For the past year, we’ve been actively monitoring Spamouflage accounts on TikTok,” he said.
“It’s worth noting that while many major social media platforms have acknowledged the presence of covert operations linked to the Chinese state, Chinese-owned TikTok hasn’t made any similar disclosures.”
He said Aspi was willing to work with TikTok to analyse and identify covert influence operations.
Albert Zhang, an Aspi analyst who co-authored a report on the Spamouflage operation earlier this year, said some of the accounts they had identified were still up on the platform.
“Greater transparency is needed from TikTok to allow independent organisations to assess their actions in this case,” he said.
“TikTok should follow a similar format to Meta’s latest report as the industry’s leading standard for disclosing state-backed influence operations on their platforms.”
TikTok was banned from Australian government devices earlier this year amid concern about the company’s links to the Chinese Communist party, after similar bans in other parts of the world including the UK and the US. The company has consistently denied claims of ties to the CCP or that the CCP influences content on the platform.
Earlier this month an Australian parliamentary committee chaired by the shadow home affairs and cybersecurity minister, James Paterson, recommended extending the ban to government contractors. It also recommended Australia follow the US’s lead should the US government force TikTok’s parent company, ByteDance, to divest its ownership of the social media platform.