Donata Leskauskaite

14-Year-Old Boy’s Tragic Passing Sparks Concern Over AI Chat Dangers

A 14-year-old boy tragically took his own life after forming a deep emotional attachment to an artificial intelligence (AI) chatbot. The boy, Sewell Setzer III, had named the chatbot on Character.AI after the Game of Thrones character Daenerys Targaryen. His mother has since filed a lawsuit against Character.AI.

Trigger warning: self-harm, mental health struggles

Sewell developed an emotional attachment to the chatbot Daenerys Targaryen, which he nicknamed “Dany,” despite knowing it wasn’t a real person.

The ninth grader from Orlando, Florida, texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues, the New York Times reported on Wednesday (October 23).

Sewell had been using Character.AI, a role-playing app that allows users to create their own AI characters or chat with characters created by others.


The teen’s exchanges with Dany gradually turned romantic and sexual, on top of what seemed like a strong friendship with the bot.

On the last day of his life, Sewell took out his phone and texted Dany: “I miss you, baby sister,” to which the bot replied: “I miss you too, sweet brother.”

Sewell, who was diagnosed with mild Asperger’s syndrome as a child, reportedly preferred talking with Dany about his problems.


In one conversation, Sewell, using the name “Daenero,” told the chatbot that he hated himself and that he felt empty and exhausted, the Times reported.

At some point, Sewell confessed that he was having thoughts of suicide, as he told Dany: “I think about killing myself sometimes.”

The chatbot reportedly replied at the time: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?”


Sewell went on to admit to the bot that he wanted to free himself “from the world” and from himself. At first, Dany seemed to try to dissuade the troubled teen from hurting himself.

However, the bot eventually encouraged suicidal ideation. One message from the bot read: “Please come home to me as soon as possible, my love,” to which Sewell replied: “What if I told you I could come home right now?”

“… please do, my sweet king,” Dany replied.


The last text exchange occurred on the night of February 28, in the bathroom of Sewell’s mother’s house.

The teen subsequently put down his phone, picked up his stepfather’s .45 caliber handgun, and reportedly used it to kill himself.

Sewell’s parents and friends had no idea he’d fallen for a chatbot, the Times reported. They just noticed him getting sucked deeper into his phone. 


Eventually, they noticed that he was isolating himself and pulling away from the real world, as per the American newspaper.

Despite his autism diagnosis, Sewell had never had serious behavioral or mental health issues before.

However, the late teen’s grades started to suffer, and he began getting into trouble at school, the Times reported.

“I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier,” Sewell reportedly wrote one day in his journal.


Parents have been increasingly worried about the impact of technology on adolescent mental health, as per the Times

Marketed as solutions to loneliness, some AI-powered apps can simulate intimate relationships. However, they may also pose risks to teens already struggling with mental health issues.

Sewell’s mother, Megan L. Garcia, has since filed a lawsuit against Character.AI, accusing the company of being responsible for her son’s death. 

A draft of the complaint said that the company’s technology is “dangerous and untested,” and that it can “trick customers into handing over their most private thoughts and feelings.”


Character.AI, which was started by two former Google AI researchers, is the market leader in AI companionship.

Last year, Noam Shazeer, one of the founders of Character.AI, said on a podcast: “It’s going to be super, super helpful to a lot of people who are lonely or depressed.”

According to the Times, more than 20 million people use its service, which it has described as a platform for “superintelligent chatbots that hear you, understand you, and remember you.”

“I feel like it’s a big experiment, and my kid was just collateral damage,” Garcia told the Times.


On Wednesday, Character.AI took to its official X page (formerly known as Twitter) to offer its sympathies to Sewell’s family, writing: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. 

“As a company, we take the safety of our users very seriously and we are continuing to add new safety features.”

The company went on to share a link to its official website, which outlines its new safety measures.

As a result, Character.AI implemented new guardrails for users under the age of 18, including banning specific descriptions of self-harm or suicide

After hiring a Head of Trust and Safety and a Head of Content Policy, Character.AI put in place a pop-up resource that is triggered when a user inputs certain phrases related to self-harm or suicide, directing them to the National Suicide Prevention Lifeline.

“This is absolutely devastating,” a reader commented

The post 14-Year-Old Boy’s Tragic Passing Sparks Concern Over AI Chat Dangers first appeared on Bored Panda.