
Critics are Not Pleased with NEDA’s Mental Health Chatbot


— June 29, 2023

An eating disorder helpline was replaced with AI. Some users are not pleased.


In response to a surge in demand for mental health services during the COVID-19 pandemic, the National Eating Disorders Association (NEDA) made a controversial decision to replace its helpline with an AI-powered chatbot named Tessa. The move has sparked debate among patients, families, doctors, and experts on eating disorders, who worry that the chatbot may isolate those seeking support and miss crucial red flags.

The NEDA helpline has been a vital resource for over 20 years, assisting individuals struggling with eating disorders such as anorexia and bulimia. However, with nearly 70,000 people utilizing the service last year alone, the organization needed to explore alternative solutions. Thus, Tessa was introduced as a potential answer to the treatment shortage and mental health crisis.

The deployment of Tessa has not been smooth sailing. Shortly after the chatbot initially went live, its page and a related NEDA article were taken down, confusing users. NEDA explained that the bot was being updated and that a new version would be available soon. Nevertheless, on May 30, NEDA stunned the eating disorder community by indefinitely disabling Tessa, leaving many feeling abandoned and uncertain as to where to turn next for support.

The decision to replace the helpline with a chatbot has sparked concerns among paid staffers and volunteers who fear the move will further isolate individuals struggling to find support. These concerns are especially prevalent among younger users who don’t feel comfortable discussing their issues with friends or family and rely solely on the chat line for assistance.


The use of chatbots and AI in mental health care is becoming increasingly common as organizations struggle to meet the rising demand for services. However, clinicians are still grappling with effectively deploying these technologies and determining their suitability for different conditions.

Critics of the NEDA decision argue that chatbots cannot replicate the human connection and understanding that comes from interacting with a real person who deeply understands eating disorders. Moreover, concerns have been raised about the potential for chatbots to reinforce harmful behaviors or provide incorrect advice, particularly for vulnerable populations.

Tessa, designed by eating disorder experts and funded by NEDA, was meant to support individuals without formal treatment for their eating disorders. While early studies showed promising results, with participants reporting a significant reduction in weight and shape concerns, there were also instances in which the chatbot reinforced harmful behaviors. Some troubling prompts were met with affirming responses, potentially exacerbating the negative thinking patterns associated with eating disorders.

Eating disorders are serious, sometimes fatal illnesses, and they are quite common. It is estimated that 9% of Americans experience an eating disorder in their lifetime. Among mental illnesses, eating disorders have some of the highest mortality rates, killing more than 10,000 Americans each year.

Complaints about Tessa, NEDA's mental health chatbot, were received as early as October 2022, indicating that the bot had provided problematic advice well before its recent rollout. NEDA claims it was unaware of changes made to the chatbot and did not approve them, shifting blame to Cass, the mental health chatbot company responsible for operating Tessa.

The controversy surrounding NEDA’s decision highlights the challenges mental health organizations face in meeting the increasing demand for care. While AI and chatbots offer potential benefits, it is clear that careful consideration and testing are necessary to ensure their safety and effectiveness in supporting individuals with mental health issues.

The fate of the NEDA helpline remains uncertain, but finding the right balance between technological solutions and human support will be crucial in addressing the mental health crisis and providing effective care for those in need.

Sources:

What Does a Chatbot Know About Eating Disorders? Users of a Help Line Are About to Find Out

Chatbot to Replace Human Staffers at National Eating Disorders Association Helpline

Report: Economic Costs of Eating Disorders
