Tribune News Service
Alex Barinka, Margi Murphy, Jillian Deutsch

TikTok reveals Russian disinformation network targeting European users

TikTok Inc. identified a Russian disinformation network spreading war propaganda about Ukraine to more than a hundred thousand European users over the summer, the company disclosed on Thursday.

The network operated in Russia but targeted mostly people in Germany, Italy and the U.K. in their respective languages. About 1,700 accounts belonging to the network spread anti-Ukrainian messages by masquerading as accounts created by local users. Between July and September 2022, the accounts attracted 133,564 followers, the company said.

It is unclear whether the network was linked to the Russian government. The revelations were disclosed in a report submitted under the E.U.’s Code of Practice on Disinformation, a voluntary commitment from online platforms to combat misinformation online.

“We don’t allow activities that may undermine the integrity of our platform or the authenticity of our users,” a TikTok spokesperson said. “The fact that these networks and related accounts were quickly identified and removed is a testament to the considerable resources we have invested in protecting our community from being misled.”

Disinformation networks have several digital tools at their disposal to expand their influence by masking where photos, memes and videos really come from. They typically use virtual private networks to make it appear as if they operate from different countries, and they buy personalized advertising in the markets they wish to target, all to make their accounts look as if they were created by local citizens. They can also buy IP addresses so accounts appear to be created from different residences, even when all of them are generated under one roof.

In this particular network, the accounts on TikTok used speech synthesis and spread pro-Russian claims in languages native to the countries the network was targeting, the company said in its report.

“We’ll see this more in these types of campaigns especially with the rise of artificial intelligence,” said Katie Harbath, the chief executive officer and founder of Anchor Change and former public policy director at Facebook. “My guess is that this makes it harder to detect these types of campaigns and makes things more believable to those they were targeting.”

In response, the company worked with local fact-checkers to create guidelines for its content moderation team to identify and remove videos with misinformation and the accounts that share them. The company also added a label to accounts known to be owned by state-controlled media and blocked livestreams from users in Russia and Ukraine from being shown to users in E.U. countries.

TikTok and other social media platforms were asked to file voluntary reports on their disinformation mitigation practices in Europe by the end of January. The code is tied to the E.U.'s new content moderation law, the Digital Services Act.

Those reports were published Thursday for the first time. Large companies will publish new reports every six months.

Coordinated misinformation networks from Russia, China and Iran have used the internet to sway public opinion in places like the U.S., Latin America and North Africa. They often use social media sites like Twitter or Meta Platforms Inc.’s Facebook and Instagram to disseminate misinformation, with spikes in activity ahead of elections.

While these types of misinformation campaigns are not new to social platforms, TikTok may face heightened scrutiny because of its ownership by a Chinese company, ByteDance Ltd. TikTok's parentage has already drawn a slew of criticism and pushes for related legislation from governments across the globe.

Unlike the E.U., the U.S. has no mechanism to force companies to disclose coordinated influence efforts on their platforms. They either choose to disclose them publicly, as Facebook did regarding Russian and Iranian efforts in 2019, or the news is broken by journalists.

TikTok’s future in the U.S. is in question. The Biden administration, through the Committee on Foreign Investment in the U.S., has spent years weighing a proposal that would allow TikTok to operate in the country under ByteDance's ownership. The arrangement would include routing all U.S. user data through servers maintained by U.S. tech company Oracle Corp., which would also audit the app's algorithms.

TikTok implemented the proposed arrangement with Oracle last year. “We are not waiting for an agreement to be in place,” a spokesperson said.

Three bills that could ban TikTok have been introduced in the U.S. Congress; they are scheduled to be taken up by the House Foreign Affairs Committee this month.

On Thursday, TikTok also laid out additional policies it will introduce over the next six months. The company said that it’s “aiming to work with our fact-checking partners to identify specific disinformation trends in countries and develop tailored, localized media literacy campaigns to tackle those trends.” Portugal, Denmark, Greece and Belgium, as well as other E.U. countries, are the focus over the next year.

It’s also working on technology that would group content with certain themes based on maturity to allow users to avoid some types of videos.
