The Guardian - UK
Technology
Dan Milmo, Global technology editor

Ofcom investigates TikTok over parental control information

TikTok is one of three video-sharing platforms examined in the regulator's report published on 14 December. Photograph: Bloomberg/Getty

The UK communications regulator has opened an investigation into whether TikTok provided the watchdog with “inaccurate” information about its parental controls.

Ofcom said it had asked the social video platform for information about its Family Pairing system and that it has “reason to believe that the information it provided was inaccurate”.

The watchdog said it had, therefore, opened an investigation into whether the Chinese-owned platform has breached the Communications Act 2003.

TikTok said the problem was due to a technical issue that may have led to some of the information provided to the watchdog being inaccurate. It said it had identified the problem several weeks ago and raised it with Ofcom.

The regulator said it had also asked for information as part of a report into how video-sharing platforms (VSPs) were protecting users from harmful content.

“The available evidence suggests that the information provided by TikTok in response to the notice may not have been complete and accurate,” said Ofcom, adding that it would provide an update to the investigation in February.

Ofcom made the announcement as it published a report on Thursday into how three leading VSPs (TikTok, Snap and the video game streaming service Twitch) were protecting children from encountering harmful videos.

The report cited research showing that more than a fifth of children aged eight to 17 have an online profile stating their age as 18 or over. A third of children aged eight to 15 have an account with a user age of 16 or over, Ofcom added.

The regulator said the statistics called into question whether a policy of self-declaring a user’s age when they sign up was sufficient, and called on the platforms to step up attempts to find out the age of users.

“We therefore expect platforms to explore additional methods to gain a better understanding of the age of their users to be able to tailor their experiences in ways that are more appropriate to their age and that protect them from harm,” said Ofcom.

The regulator said TikTok used undisclosed technologies to detect keywords that flagged a potentially underage account, while Twitch used several measures including language analysis tools and Snap relied on people reporting underage users.

TikTok told Ofcom that the number of underage accounts it had removed in the 12 months to March 2023 represented just over 1% of its monthly active user base. Twitch said over the same period it had removed 0.03% of its total UK user base and Snap had removed up to 1,000 accounts.
