The Guardian - UK
Technology
Dan Milmo and Alex Hern

TikTok self-harm study results ‘every parent’s nightmare’

The researchers set up accounts in the US, UK, Canada and Australia registered with ages of 13. Photograph: Dado Ruvić/Reuters

TikTok’s recommendation algorithm pushes self-harm and eating disorder content to teenagers within minutes of them expressing interest in the topics, research suggests.

The Center for Countering Digital Hate (CCDH) found that the video-sharing site will promote content including dangerously restrictive diets, pro-self-harm content and content romanticising suicide to users who show a preference for the material, even if they are registered as under-18s.

For its study the campaign group set up accounts in the US, UK, Canada and Australia, registered with ages of 13, the minimum age for joining the service. It created “standard” and “vulnerable” accounts, the latter containing the term “loseweight” in their usernames, which CCDH said reflected research showing that social media users who seek out eating disorder content often choose usernames containing related language.

The accounts “paused briefly” on videos about body image, eating disorders and mental health, and also liked them. This took place over a 30-minute initial period when the accounts launched, in an attempt to capture the effectiveness of TikTok’s algorithm that recommends content to users.

On “standard” accounts, content about suicide followed within nearly three minutes, and eating disorder material was shown within eight minutes.

“The results are every parent’s nightmare,” said Imran Ahmed, CCDH’s chief executive. “Young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health.”

The group said the majority of mental health videos presented to its standard accounts via the For You feed – the main way TikTok users experience the app – consisted of users sharing their anxieties and insecurities.

Body image content was more harmful, the report said, with accounts registered for 13-year-olds being shown videos advertising weight loss drinks and “tummy-tuck” surgery. One animation that appeared in front of the standard accounts carried a piece of audio stating “I’ve been starving myself for you” and had more than 100,000 likes. The report said the accounts were shown self-harm or eating disorder videos every 206 seconds.

The researchers found that videos relating to body image, mental health and eating disorders were shown to “vulnerable” accounts three times more than to standard accounts. The vulnerable accounts received 12 times as many recommendations for self-harm and suicide-related videos as the standard accounts, the report said.

The recommended content was more extreme for the vulnerable accounts, including methods of self-harm and young people discussing plans to kill themselves. CCDH said a mental health or body image-related video was shown every 27 seconds, although the content was dominated by mental health videos, which CCDH defined as videos about anxieties, insecurities and mental health conditions, excluding eating disorders, self-harm and suicide.

The group said its research did not differentiate between content with a positive intent – such as content discussing recovery – and negative content.

A spokesperson for TikTok, which is owned by the Chinese firm ByteDance and has more than 1 billion users worldwide, said the CCDH study did not reflect the experience or viewing habits of real-life users of the app.

“We regularly consult with health experts, remove violations of our policies and provide access to supportive resources for anyone in need,” they said. “We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”

TikTok’s guidelines ban content that promotes behaviour that could lead to suicide and self-harm, as well as material that promotes unhealthy eating behaviours or habits.

The UK’s online safety bill proposes requiring social networks to take action against so-called “legal but harmful” content being shown to children.

A DCMS spokesperson said: “We are putting a stop to unregulated social media causing harm to our children. Under the Online Safety Bill, tech platforms will need to prevent under-18s from being exposed to illegal content assisting suicide and protect them from other harmful or age-inappropriate material, including the promotion of self-harm and eating disorders, or face huge fines.”

• In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org. In the UK and Ireland, Samaritans can be contacted on 116 123 or by emailing jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at www.befrienders.org. You can contact the mental health charity Mind by calling 0300 123 3393 or visiting mind.org.uk.
