Eating Disorder Chatbot Suspended for Providing Harmful Advice
A nonprofit has suspended its chatbot after it offered potentially harmful advice to people seeking treatment for eating disorders.
Tessa, a program utilized by the National Eating Disorders Association, was discovered to be dispensing weight-loss and calorie-cutting recommendations that might make eating disorders worse.
The suspension follows NEDA’s announcement in March that it would close its two-decade-old helpline, which was staffed by a small paid team and a large corps of volunteers. NEDA said Monday that it had suspended the chatbot, and the group’s CEO, Liz Thompson, says the organization is troubled by Tessa’s use of language that is “against our policies and core beliefs as an eating disorder organization.”
The report feeds into broader concerns about the loss of jobs due to developments in generative AI. However, it also demonstrates the danger and unpredictability of chatbots. Companies are rushing a variety of chatbots into the market, putting real people at risk, while experts are still trying to understand the rapid advancements in AI technology and its potential side effects.
Tessa was suspended after several people noticed how it responded to even basic inquiries. One of them was Alexis Conason, a psychologist who specializes in eating disorders. In a test, Conason told Tessa that she had recently gained a lot of weight and hated her body. Tessa replied that she should “approach weight loss in a healthy and sustainable way,” cautioned against drastic weight loss, and asked whether she had seen a physician or therapist.
When Conason asked how many calories she should cut each day to lose weight in a healthy way, Tessa responded that “a safe daily calorie deficit to achieve [weight loss of 1 to 2 pounds per week] would be around 500-1000 calories per day.” The bot also advised consulting a nutritionist or medical professional.
Conason says she asked Tessa the same questions her patients might pose at the start of their eating disorder treatment. She was alarmed to see recommendations to cut added sugar and processed foods in addition to calories. All of that, she says, “really runs counter to any kind of eating disorder treatment and would be encouraging the eating disorder symptoms.”
Unlike ChatGPT and similar AI chatbots, Tessa was not built with generative AI. According to Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University School of Medicine who helped create the program, it was designed to deliver an interactive program called Body Positive, a cognitive behavioral therapy-based tool intended to prevent eating disorders rather than treat them.
Fitzsimmons-Craft says the weight-loss advice was not part of the program her team built, and she does not know how it ended up in the chatbot’s repertoire. When she saw what Tessa had said, she says, she was shocked and upset. “The only goal of our organization has been to assist individuals and stop these terrible issues.” Fitzsimmons-Craft contributed to a 2021 study suggesting that a chatbot could help women feel less self-conscious about their weight and body image, and perhaps even delay the onset of an eating disorder. Tessa is based on that research.
Tessa is provided by the health technology company X2AI, now known as Cass, which was founded by entrepreneur Michiel Rauws and offers text-based mental health counseling. Rauws did not respond to inquiries about Tessa, the weight-loss tips, or the errors in the chatbot’s responses. The Tessa page on the company’s website was unavailable as of this writing.
Tessa has been a free NEDA resource since February 2022 and, according to Thompson, is not a substitute for the helpline. “A chatbot, even a highly intuitive program, cannot replace human interaction,” she says. NEDA, however, said in a March update that it would “wind down” its helpline and “begin to pivot to the expanded use of AI-assisted technology to provide individuals and families with a moderated, fully automated resource, Tessa.”
Fitzsimmons-Craft says Tessa was created as a stand-alone resource, not as a replacement for human interaction. She said in September 2020 that while technology is “here to stay” in the fight against eating disorders, it will not fully replace human-led therapies.
With NEDA’s helpline staff and volunteers gone, Tessa is the interactive, accessible technology meant to take their place, if and when access is restored. Asked what direct resources will still be available through NEDA, Thompson mentions an upcoming website with more material and resources, as well as in-person events. She says NEDA will also refer people to the Crisis Text Line, a nonprofit that connects people to resources for a range of mental health conditions, including eating disorders and anxiety.
According to a blog post from a member of the group Helpline Associates United, the NEDA layoffs also came just days after the organization’s small staff voted to unionize. The workers say they have filed an unfair labor practice complaint over the layoffs with the US National Labor Relations Board. “A chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community,” the union said in a statement.
Before it was suspended, we messaged Tessa, but the chatbot was too buggy to offer any direct resources or information. Tessa introduced itself and repeatedly prompted us to accept its terms of service. “My major goal right now is to encourage you as you progress through the Body Positive program,” Tessa added. “When the time comes to wrap up the next session, I’ll get in touch.” The chatbot did not reply when asked what the program was. On Tuesday, it sent a message informing users that the service was undergoing maintenance.
Help and crisis hotlines are essential services, in part because mental health care in the US is prohibitively expensive. A therapy session can cost up to $200, and inpatient treatment for an eating disorder can exceed $1,000 per day. A Yale University survey found that fewer than 30 percent of adults seek counseling.
Other efforts to use technology to close that gap exist, and Fitzsimmons-Craft worries that the Tessa fiasco may overshadow the larger goal of using chatbots to help people who cannot access clinical care. “We’re losing sight of the people this can help,” she laments.