Eating Disorder Helpline Replaces Workers With An AI Chatbot To Help Struggling Callers

AI might be a step forward for technology, but it's a move in the wrong direction when people need human empathy and support.


An eating disorder helpline designed to offer crucial support to people in crisis has replaced its workers with an AI chatbot, sparking concern about the future of the platform.

AI chatbots are increasingly being integrated into workplace operations, yet members of certain professions believe their use could cause more harm than good. For people in mental health work, the implications of using AI as a replacement for person-to-person care are complex, varied, and possibly detrimental.


The National Eating Disorder Association (NEDA) replaced its Helpline workers with an AI chatbot called Tessa to support struggling callers.

In an effort to improve their working conditions and increase training options, workers staffing the NEDA Helpline won a vote to unionize. Two weeks later, they were hit with devastating news: the Helpline workers were being fired and replaced with an AI chatbot named Tessa.

By June 1, 2023, the four full-time Helpline employees, along with hundreds of volunteers, were told their services would no longer be needed. Instead, NEDA offered them the so-called opportunity to act as "testers" for Tessa.


In a post on the blog Labor Notes, Helpline worker Abbie Harper stated, "While we can think of many instances where technology could benefit us in our work on the Helpline, we’re not going to let our bosses use a chatbot to get rid of our union and our jobs. The support that comes from empathy and understanding can only come from people."

RELATED: I Overcame My Eating Disorder By Treating It As A 'Friend'

The AI chatbot will replace all humans working at the Helpline.

Tessa, described as a "wellness chatbot," was developed by a team at Washington University's medical school led by Dr. Ellen Fitzsimmons-Craft, who acknowledged the inherent differences between Tessa's capabilities and those of actual humans.


“I do think that we wrote her to attempt to be empathetic, but it is not, again, a human,” Fitzsimmons-Craft told NPR. “It's not an open-ended tool for you to talk to and feel like you're just going to have access to kind of a listening ear, maybe like the helpline was.”

That sentiment was echoed by one person in recovery from an eating disorder, who spoke anonymously with YourTango about their own experience and the implications of using AI as treatment.

“Having an eating disorder is super isolating and it’s something that you hold in because it’s taboo and there’s so much stigma around it, so reaching out is a huge, early step in the recovery process,” they said. “Having a connection with somebody else who has struggled is invaluable. An AI bot can’t offer empathy or any meaningful connection. Because it’s not ChatGPT, it can’t even meet you where you are; while ChatGPT is dynamic and can have a conversation with you, Tessa can’t.”

RELATED: What I Want My Friends And Family To Know About My Eating Disorder


They continued, “Eating disorder treatment centers are cost-prohibitive; I wanted to go to one, but couldn’t afford it, and they didn’t take insurance. The Helpline is a tool that creates access to meaningful recovery resources and community. Having a person on the other end of the phone creates community, and then that creates belonging, and when people feel like someone understands where they’re coming from, that’s when healing starts.”

As the NEDA Helpline Associates Union tweeted, “A chatbot is no substitute for human empathy.” 

To pretend otherwise causes harm to those in need of human support and denies them genuine connection to the people who can help them.


RELATED: I Asked ChatGPT To Name My Baby

Alexandra Blogier is a writer on YourTango's news and entertainment team. She covers celebrity gossip, pop culture analysis, and all things to do with the entertainment industry.