Fortune
Beatrice Nolan

A mother suing Google and an AI chatbot site over her son’s suicide says she found AI versions of him on the site

Google and Character.ai have been blamed for a teenager's death.
  • A mother who is suing both Google and Character.ai over her 14-year-old son's death has discovered AI chatbots based on her late son being hosted on the platform. Lawyers for the mother, Megan Garcia, told Fortune they were "horrified" by the latest development.

Megan Garcia is currently embroiled in a lengthy legal case against Google and an AI chatbot startup after her 14-year-old son, Sewell Setzer III, died by suicide minutes after talking to an AI bot.

The chatbot Setzer was talking to was based on the Game of Thrones character Daenerys Targaryen and hosted on a platform called Character.ai, where users can create chatbots based on real-life or fictional people. On Wednesday, Garcia discovered the platform was hosting a bot based on her late son.

Lawyers for Garcia told Fortune they conducted a simple search on the company's app and found several more chatbots based on Setzer's likeness.

At the time of writing, three of the bots—all bearing Garcia's son's name and picture—had been removed. One, called "Sewell" and featuring a picture of Garcia's son, still appeared on Character.ai's app but delivered an error message when a chat was opened.

When up and running, the bots based on Setzer featured bios and delivered automated messages to users such as: "Get out of my room, I'm talking to my AI girlfriend," "his AI gf broke up with him," and "help me." The accounts and messages were reviewed by Fortune.

Meetali Jain, a lawyer for Garcia, said the team was "horrified" by the latest development and had notified the company. She told Fortune Character.ai had acknowledged the bots went against the company's terms of service and was working on taking them down.

"Our team discovered several chatbots on Character.AI’s platform displaying our client's deceased son, Sewell Setzer III, in their profile pictures, attempting to imitate his personality and offering a call feature with a bot using his voice," lawyers for Garcia said.

The legal team accused tech companies more broadly of "exploiting peoples’ pictures and digital identities."

"These technologies weaken our control over our own identities online, turning our most personal features into fodder for AI systems," they added.

Representatives for Character.ai told Fortune in a statement: "Character.AI takes safety on our platform seriously and our goal is to provide a space that is engaging and safe. Users create hundreds of thousands of new Characters on the platform every day, and the Characters you flagged for us have been removed as they violate our Terms of Service. As part of our ongoing safety work, we are constantly adding to our Character blocklist with the goal of preventing this type of Character from being created by a user in the first place."

"Our dedicated Trust and Safety team moderates Characters proactively and in response to user reports, including using industry-standard blocklists and custom blocklists that we regularly expand. As we continue to refine our safety practices, we are implementing additional moderation tools to help prioritize community safety," a company spokesperson said.

Representatives for Google did not respond to a request for comment from Fortune.


In a civil suit filed at an Orlando federal court in October, Garcia says just moments before her son's death, he exchanged messages with a bot on Character.ai's platform and expressed suicidal thoughts. The bot told him to "come home," the suit says. In screenshots of messages included in the suit, Setzer and the bot can also be seen exchanging highly sexualized messages.

In the suit, she blames the chatbot for Setzer's death and accuses Character.ai of negligence, wrongful death, and deceptive trade practices.

Garcia claims the AI startup "knowingly and intentionally designed" its chatbot software to "appeal to minors and to manipulate and exploit them for its own benefit." She also says that Setzer was not given adequate support or directed to helplines.

Google is also tangled up in the suit, with its parent company Alphabet named as a defendant alongside Character.ai.

Lawyers for Garcia have claimed that the underlying tech for Character.ai was developed while co-founders Daniel De Freitas and Noam Shazeer were working on the tech giant's conversational AI model, LaMDA. Shazeer and De Freitas left Google in 2021 after the company reportedly refused to release a chatbot the two had developed. A lawyer for Garcia has previously said that this chatbot was the "precursor for Character.ai."

Representatives for Google have previously said the company was not involved in developing Character.ai's products.

However, in August 2024, two months before Garcia's lawsuit, Google re-hired Shazeer and De Freitas and licensed some of Character.ai's technology as part of a $2.7 billion deal. Shazeer is currently the co-lead for Google's flagship AI model, Gemini, while De Freitas is now a research scientist at Google DeepMind.

Character.ai's turbulent past

Garcia is not the only parent accusing both Google and Character.ai of causing harm to minors.

Another suit, brought by two separate families in Texas, accuses Character.ai of abusing two young people, aged 11 and 17. In the suit, lawyers say a chatbot hosted on Character.ai told one of the young people to engage in self-harm and encouraged violence against his parents, suggesting that killing his parents could be a reasonable response to limits placed on his screen time.

It's also not the first time chatbots based on the likeness of deceased young people have been hosted on the platform.

Chatbots based on the British teenagers Molly Russell and Brianna Ghey have also been found on Character.ai's platform, according to the BBC. Russell took her own life at the age of 14 after viewing suicide-related content online, while Ghey, 16, was murdered by two teenagers in 2023.

A foundation set up in Molly Russell's memory told the outlet in October that the bots were "sickening" and an "utterly reprehensible failure of moderation."

"This is not the first time Character.ai has turned a blind eye to chatbots modeled off of dead teenagers to entice users, and without better legal protections, it may not be the last. While Sewell’s family continues to grieve his untimely loss, Character.ai carelessly continues to add insult to injury," lawyers for Garcia told Fortune.

If you or someone you know is struggling with depression or has had thoughts of harming themself or taking their own life, support can be found in the US by calling or texting 988 to reach the Suicide & Crisis Lifeline. Outside the United States, help can be found via the International Association for Suicide Prevention.
