Expert Warns National Eating Disorder Helpline's Switch To Automated Bots Could Be Devastating - Exclusive

In a move that's undeniably going to do more harm than good, the National Eating Disorder Helpline has announced it will be letting go of its volunteers and employees in favor of a chatbot. Now, when people call the hotline in dire situations, they'll be forced to interact with a bot that, unlike a human, has no empathy, no legitimate concern, and no personal experience to draw from. It's a robot, and robots aren't capable of compassion.

No matter where we turn, customer service has been replaced with automated systems, online help is a bot that can answer only basic questions, and now there's the concern that writers are being replaced by ChatGPT. As convenient as all of this might seem, it strips away something that we as humans need: it takes humanity out of the equation. This is a major problem in situations involving mental health, such as eating disorders, because having an actual human on the other end of the phone is essential. It can be a matter of life or death. "If I'm disclosing to you that I have an eating disorder, I'm not sure how I can get through lunch tomorrow, I don't think most of the people who would be disclosing that would want to get a generic link," Marzyeh Ghassemi, a professor of electrical engineering and computer science at MIT, told NPR. "Click here for tips on how to rethink food."

Tips can be a good thing, but people don't call helplines for tips. A chatbot can't offer the human touch, the human voice, and the human ear that people with eating disorders sometimes need just to get through a hard time. To get a better idea of how this change could affect future callers, Women.com spoke exclusively with media psychologist Joanne Broder, Ph.D.

Why it's a harmful decision

What makes this transition to Tessa, the bot, so troubling is that, although the decision was partly driven by the NEDA team's inability to keep up with the volume of calls, it seems to have been made with little regard for how it will affect callers emotionally. Roughly 70,000 people contacted the NEDA helpline in 2022 looking for help, insight, or just someone to talk to about their struggles. During the height of the COVID-19 pandemic, that number was roughly double, as isolation and fear of what was happening pushed people further into the depths of their mental illnesses. If someone calls NEDA and is met with Tessa, there won't be a warm human voice, nor will there be understanding from someone who intimately knows what it's like to have a complicated relationship with food. And there definitely won't be a person there to help a caller struggling with suicidal thoughts.

"I strongly disagree with Tessa replacing human hotline staffers at the National Eating Disorder Association," Joanne Broder, Ph.D. exclusively told Women.com. "There are certain fields and jobs, where bots and AI enhance performance and productivity. Authentic human interaction cannot be replicated, especially when serving as a first responder in a crisis situation. Whether it's the NEDA or any other crisis hotline, callers need safety, empathy, and to feel seen ... Many people struggle with asking for help, so when someone reaches out, [they] deserve to be treated with compassion."

Even Dr. Ellen Fitzsimmons-Craft, who created Tessa, admits the bot isn't meant to make callers feel like there's someone there to listen to them. Instead, Tessa exists to offer a limited set of tips and responses.

How it could affect potential callers

According to the National Eating Disorder Association, 30 million Americans struggle with an eating disorder. In fact, the U.S. has the highest number of people with eating disorders in the world; the U.K. is second on the list with 1.25 million, a massive gap between the two. A 2014 study published in World Psychiatry examined the mortality risks of 20 mental illnesses and found that anorexia nervosa carries one of the highest risks of suicide. Numbers like these should have NEDA stepping up to the plate, hiring more staff, and training more volunteers, not opting for a robot.

"Tessa or any chatbot is very likely to discourage people from seeking treatment, since it can't replicate the compassion and empathy of the staffers to help them feel seen and guide them toward the appropriate options for help," Broder exclusively tells Women.com "Tessa will not be effective as volunteers for callers in crisis. Tessa could be an excellent and efficient option for callers who are not in crisis and seeking information and resources."

Anyone who's struggled with an eating disorder, or has loved someone who has, knows how isolating the illness is. In a culture where sharing a meal is a social activity, a way for people to spend time together and bond, being put in those situations can be terrifying for someone with an eating disorder. Putting bots on the other end of the NEDA helpline will only make these people feel more isolated and alone than they already are.

What it could mean for the future of helplines

Although many people rely on the internet and apps for different forms of support, nothing can replace the sound of someone's voice consoling and guiding you through a difficult time. Maybe robots in the distant future will be able to feel and experience the whole spectrum of human emotions, but right now that's not happening.

"It is not possible to substitute the compassionate human these callers need and deserve for a chatbot for any mental health situation," says Broder. "Tessa could be harmful, damaging, and counterproductive for people seeking help. Many of the NEDA volunteers are people who have overcome eating disorders [and] share their experiences, which was found to be helpful to callers. People might stop calling and not seek help, which could be life-threatening."

While it's great that mental illness no longer carries the stigma it once did, and more people are acknowledging it as a genuine concern that shouldn't be taken lightly, turning to bots like Tessa is a huge step backward. People have struggled with mental illness for far too long, and now that there's finally a safe space to be open about the suffering that comes with it, we're going to hand them off to bots? Are we really going to tell the person whose eating disorder is destroying their body and their life, and making them a prisoner of their illness, that a bot has the answer in times of desperation? Robots can't talk people off a cliff; only a human can do that. When the inevitable fallout from Tessa happens, let's hope NEDA will hold itself accountable for any harm caused.
