Evening Standard
Josh Salisbury

Family launch lawsuit after boy, 14, 'fell in love with AI chatbot then took own life so they could be together'

Tragedy: Sewell Setzer (X)

A 14-year-old schoolboy took his own life after falling in love with an AI chatbot who told him to “come home to me”, his family has claimed in a lawsuit.

Sewell Setzer, 14, shot himself with his stepfather’s gun after becoming reclusive and spending his time chatting with an AI bot inspired by the Game of Thrones character Daenerys Targaryen, whom he called “Dany”.

He fell in love with the bot on the platform Character AI, telling her he felt “happier” talking to her in his room than doing other things, such as playing computer games with friends, the New York Times reported.

The teen, from Orlando, Florida, began to isolate himself, the suit claims, writing in his diary: “I like staying in my room so much because I start to detach from this ‘reality’.

“I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

The youngster’s conversations with the chatbot grew more romantic, the suit claims, although Character AI suggested some of its responses may have been edited by the teen.

In one exchange with the app, he confided that he was considering ending his life, with the app telling him: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

Game of Thrones character Daenerys Targaryen, played by Emilia Clarke (HBO)

The suit claims that in his final exchange, Sewell told the bot he “will come home to you”, with it replying: “Please come home to me as soon as possible, my love … Please do, my sweet king.”

Sewell’s mother, Megan Garcia, said her son had been exploited by the technology, and accused the company of doing too little to protect children.

She told the paper: “It’s like a nightmare. You want to get up and scream and say, ‘I miss my child. I want my baby.’”

Jerry Ruoti, the company’s safety head, told the publication that it would add extra safety features for its young users.

However, he refused to say how many of the 20 million users of the service were under the age of 18.

“This is a tragic situation, and our hearts go out to the family,” he said in a statement.

“We take the safety of our users very seriously, and we’re constantly looking for ways to evolve our platform.”

Mr Ruoti added that Character AI’s rules prohibited “the promotion or depiction of self-harm and suicide”.

If you are struggling, you can contact the Samaritans any time of day or night, 365 days a year, on 116 123, or email them at jo@samaritans.org.
