In the few years since its launch, TikTok has already reshaped the social media landscape, attracting more than 1 billion users and prompting competitors to replicate its most distinctive features.
But the impact of that explosive growth – and of the ‘TikTok-ification’ of the internet at large – on users remains little understood, experts warn, deepening concerns about social media’s effects on our habits and mental health.
“It’s embarrassing that we know so little about TikTok and its effects,” said Philipp Lorenz-Spreen, a research scientist at the Max Planck Institute for Human Development in Berlin. “Research often lags behind industry, and this is an example of an instance where that could become a big problem.”
The lack of understanding of how TikTok affects its users is particularly concerning given the app’s massive popularity among young people, experts say. Gen Z, increasingly called “the TikTok generation”, prefers the platform to other social media: 67% of US teens say they have ever used the app, nearly six in 10 count themselves as daily users, and 16% say they use it “almost constantly”.
“We owe it to ourselves and to the users of these platforms to understand how we are changed by the screens we use and how we use them,” said Michael Rich, a pediatrician who studies the impact of technology on children at Boston Children’s Hospital.
“We need more information to make informed decisions on how we’re going to help younger people understand how to use them thoughtfully and mindfully – or not use them at all.”
What makes TikTok different
Concerns about the mental health impacts of social media are longstanding, and have only intensified in recent years. In 2021, for example, internal research at Instagram made public by the whistleblower Frances Haugen showed the drastic mental health toll of the photo app on teen users – including increased rates of eating disorders among teen girls – and sparked widespread calls for stronger regulation.
TikTok hosts similar harmful content, and experts warn that several of the platform’s innovative features raise concerns unique to the app.
TikTok largely optimizes content for minutes and hours of view time, internal documents leaked in 2021 showed, rather than for the clicks and engagement metrics favored by earlier social media platforms. To do so, the company deployed a novel algorithm and a landing page that together mark the most extreme departure yet from a chronological feed to an algorithmic one.
“What that does to the brain, we don’t know,” said Lorenz-Spreen.
Studies show that when chronological feeds are discarded in favor of suggested content, recommendation algorithms frequently amplify more extreme views: one 2021 report found that more than 70% of extremist content encountered on YouTube had been recommended to users by the algorithm. Such feeds also incentivize users to share attention-grabbing content in the hope it gets picked up.
In recent years, TikTok has faced intense scrutiny over dangerous challenges the algorithm has helped spread. The “Benadryl challenge”, in which participants took large doses of antihistamines in an attempt to produce hallucinogenic effects, led to at least one death. A new lawsuit claims the “blackout challenge” led to the deaths of several young girls.
“Compared to other social media sites, TikTok is uniquely performative,” said Rich, the pediatrician. “This leads to both interesting content and some edgy ways of seeking attention that are less healthy.”
TikTok also appears to be “faster than any other platform at detecting interest”, said Marc Faddoul, co-director of Tracking Exposed, a digital rights organization investigating TikTok’s algorithm. The app’s For You Page seems to know its users’ desires and interests so well it has sparked memes and articles such as “The TikTok Algorithm Knew My Sexuality Better Than I Did” and “‘Why is My TikTok For You Page All Lesbians?’ Asks Woman Who is About to Realize Why”.
Researchers are still parsing what that uncanny tailoring means for users, particularly as it relates to targeted content around mental illness and other sensitive issues.
“The app provides an endless stream of emotional nudges, which can be hard to recognize and really impact users in the long run,” Faddoul said. “It’s not going to make anyone depressed overnight, but hours of consumption every day can have a serious impact on your mental health.”
These concerns are particularly pronounced in the realm of ADHD content, where users have reported being diagnosed by medical professionals after seeing videos about their symptoms. And while the prevalence of the #ADHD hashtag has brought increased awareness of the condition, experts have warned of unintended negative effects, including medical misinformation, especially as the platform accepts advertising money from for-profit mental health startups such as Cerebral.
TikTok declined to comment on criticisms relating to health misinformation and users self-diagnosing based on content seen on the app. It also declined to comment on its partnership with mental health startup Cerebral or its policies on medical information used in advertisements.
The algorithm may also replicate existing inequalities, heightening mental health concerns for minority groups, researchers say. Black content creators on TikTok have long complained about their content being “shadowbanned”, or demoted by the algorithm, and in 2019 TikTok admitted to censoring videos from users it identified as disabled, overweight or LGBTQ+ in a misguided attempt to crack down on bullying.
“People of color on TikTok are constantly having to think about the ways in which the algorithm is surveilling them,” said Chelsea Peterson-Salahuddin, an internet researcher at the University of Michigan School of Information. “Putting the onus on marginalized people to constantly monitor themselves is very mentally and emotionally taxing.”
‘It creates a replacement for social interaction’
Researchers say the Covid-19 pandemic has illustrated the platform’s impact on users’ lives, especially young ones. When the virus hit and the world went into lockdown, TikTok’s use exploded.
The app was flooded with young people posting about the ways the pandemic was upending their lives. The result is a very young user base turning to the app to connect with one another during a vulnerable time, said Yim Register, a researcher who studies mental health and social media.
“The largest effect of the pandemic is being faced with large uncertainty, and under uncertainty our brains want to reduce uncertainty and make sense of the world,” Register said. “We want to be able to accurately predict what’s going to happen and we turn to social media to sense-make collectively.”
Register said that ethos had contributed to TikTok’s unique “platform spirit”, a term coined by the researcher Michael Ann DeVito to characterize the nature of content and communication on a given app.
“The platform spirit of TikTok seems to be about posting very loudly about very intimate and intense things,” Register said. “And people are encouraged to be vulnerable to fit that spirit.”
This has given rise to viral videos using a wry, ironic tone to share often devastating personal stories. “Things people on the internet have said to me since my sister passed away from addiction,” says one video, with 3.5m views, featuring a user dancing to upbeat music and lights. “Things my ex boyfriend said to me as I held my lifeless babies,” the caption on another video using the same music and dancing reads.
Backlash has already emerged on the platform itself over the increasingly personal nature of its content. “I truly believe years from now people will deeply regret trauma dumping on TikTok,” a user says in one viral video, adding that such content is less likely to be shared on Facebook and YouTube. “What is it about TikTok that drives people to reveal their deepest, dirtiest secrets?”
Experts agree, saying that while these kinds of videos can offer support and a creative way to deal with grief, they can also lead to additional trauma.
“For many people, disclosing abuse or mental health issues can be traumatic and harmful,” said Rich, the children’s mental health expert. “In clinical work, we have systems in place for if a disclosure occurs – there is a safety net to catch them. And that does not exist in a social media environment.”
The dangers are heightened by the anonymous nature of TikTok, whose feed differs from those of earlier social media platforms, researchers say. While apps such as Facebook historically offered a feed of personal content primarily from friends and family, on TikTok the majority of people who see a user’s videos are strangers.
“With TikTok in particular, because of its large user base and the way its algorithm works, videos have the potential to get very big very fast, and not everyone is prepared for that,” Register said. “There are serious consequences to going viral.”
Commenters will often demand more engagement on viral TikToks, with a common refrain of “story time?” encouraging the original poster to elaborate on the traumatic share. Register said issues like these have led more researchers to call for better protections for users.
“Most computing is not trauma informed, and when social media is not trauma informed it can exacerbate trauma,” Register said. “When I look at social media, the question is not how it affects your mental health, but how do mental health issues you already have get exacerbated by its design?”
In March 2021, TikTok introduced new tools “to promote kindness” on the app, allowing users to more easily filter spam and offensive comments. It also added an automatic pop-up prompt asking users who are about to leave potentially violating comments “to reconsider”.
“Our goal is to promote a positive environment where people support and lift each other up,” said Tara Wadhwa, director of US policy at TikTok.
Meanwhile, TikTok’s opaque algorithm is slowly being cracked open. In August, Chinese regulators required TikTok to open up its algorithms for review, and around the same time the company began allowing Oracle to audit its content moderation models. Rich said this was just the beginning, and that more transparency was needed.
“Legislators and these companies need to invest more in really understanding this interface between human nature and these platforms,” he said.