Fortune
Chris Morris

National Eating Disorder Association yanks chatbot that replaced human helpline staff after users said it gave harmful advice

(Credit: mapo—Getty Images)

Less than a week after it announced plans to replace its human helpline staff with an A.I. chatbot named Tessa, the National Eating Disorder Association (NEDA) has taken the technology offline.

“It came to our attention [Monday] night that the current version of the Tessa Chatbot…may have given information that was harmful,” NEDA said in an Instagram post. “We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

The chatbot was set to completely replace human associates on the organization’s hotline on June 1. It’s unclear how the organization plans to staff that helpline at this point.

The problems with Tessa were made public by an activist named Sharon Maxwell, who said: “Every single thing Tessa suggested were things that led to the development of my eating disorder.” NEDA officials initially called those claims a lie in a social media post, but deleted it after Maxwell sent screenshots of the interaction, she said.

Alexis Conason, a psychologist who specializes in treating eating disorders, was able to re-create the issues, posting screenshots of a conversation with the chatbot on Instagram.

“Imagine vulnerable people with eating disorders reaching out to a robot for support because that’s all they have available and receiving responses that further promote the eating disorder,” she wrote.

NEDA introduced Tessa after its hotline staff decided to unionize, a move that followed a slew of pandemic-era calls that led to mass staff burnout. The six paid employees oversaw a volunteer staff of roughly 200 people, who handled calls (sometimes multiple ones) from nearly 70,000 people last year.

NEDA officials told NPR the decision had nothing to do with the unionization. Instead, said Vice President Lauren Smolar, the increasing call volume and the largely volunteer staff were creating more legal liability for the organization, and wait times for people who needed help were growing. Former workers, however, called the move blatantly anti-union.

The creator of Tessa says the chatbot, which was specifically designed for NEDA, isn’t as advanced as ChatGPT. Instead, it’s programmed with a limited number of responses meant to help people learn strategies to avoid eating disorders.

“It’s not an open-ended tool for you to talk to and feel like you’re just going to have access to kind of a listening ear, maybe like the helpline was,” Dr. Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University’s medical school who helped design Tessa, told NPR.
