Reason
Joe Lancaster

Judge Dismisses Lawsuit Against TikTok Over Child's Death in Blackout Challenge

In December 2021, Tawainna Anderson experienced every parent's worst nightmare: She found her 10-year-old daughter, Nylah, hanging by a purse strap in a bedroom closet. Anderson called 911 and attempted CPR until the ambulance arrived, but after several days in intensive care, her daughter was pronounced dead.

Based on a search of her devices, police later determined the girl was attempting a viral challenge from TikTok, the social media video platform with over a billion monthly active users, 28 percent of whom are under the age of 20. TikTok challenges involve users recording themselves performing a particular act or feat and calling on other users to do it too. The videos typically feature a common hashtag to make them easy to search for. TikTok users also have a personalized "For You Page" (FYP), where the app curates suggested videos based on each user's viewing habits.

The so-called "blackout challenge" involves participants asphyxiating themselves with household objects until they pass out; according to the Centers for Disease Control and Prevention (CDC), the challenge has existed in some form since at least 1995. Nylah watched blackout challenge videos before apparently attempting it herself. Since 2021, at least eight children have died after allegedly attempting the challenge.

Earlier this year, Anderson sued TikTok and its parent company, ByteDance, for wrongful death, negligence, and strict products liability. The suit claims the "predatory and manipulative app and algorithm…pushed exceedingly and unacceptably dangerous challenges and videos to Nylah's FYP, thus encouraging her to engage and participate."

Ultimately, it contends, TikTok's "algorithm determined that the deadly Blackout Challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson, and she died as a result."

In July, Reason reported on a similar lawsuit:

Even if TikTok were directing dangerous content to people's FYPs, it's not clear whether the platform could be found legally liable at all. Section 230 of the Communications Decency Act protects online services from legal liability for content posted by their users. While TikTok may host the videos, the law states that it cannot "be treated as the publisher or speaker" of any content provided by a user.

Just as with that suit, Anderson's lawsuit tried to circumvent Section 230's liability protections by stipulating that it "does not seek to hold [defendants] liable as the speaker or publisher of third-party content and instead intends to hold [them] responsible for their own independent conduct as the designers, programmers, manufacturers, sellers, and/or distributors."

This week, Judge Paul Diamond of the U.S. District Court for the Eastern District of Pennsylvania dismissed the case, citing Section 230 as the justification. Diamond decided that Anderson "cannot defeat Section 230 immunity by creatively labeling her claims." He determined, "Although Anderson recasts her content claims by attacking Defendants' 'deliberate action' taken through their algorithm…courts have repeatedly held that such algorithms are 'not content in and of themselves.'"

Citing previous case law, Diamond contended that "Congress conferred this immunity 'to maintain the robust nature of Internet communication and, accordingly, to keep government interference in the medium to a minimum'…. It recognized that because of the 'staggering' amount of information communicated through interactive computer services, providers cannot prescreen each message they republish…. Accordingly, Congress conferred immunity on providers to encourage them not to restrict unduly the number and nature of their postings."

Indeed, given the sheer number of users, it would be impossible for TikTok to screen every single video for objectionable content. (The platform encourages content creators to post 1–4 times per day for maximum reach.) A study published this year concluded that while "TikTok removes challenges reported as dangerous and has increased safety controls," the volume of content posted every day makes the task difficult.

Nylah Anderson's death is a tragedy, and her mother's pain is immeasurable. But it's far from evident that TikTok is uniquely liable for what happened, nor is it clear that it should bear any legal responsibility.

