AI Chatbot Removed from US Eating Disorder Helpline After Giving Harmful Advice

TL;DR:

  • The National Eating Disorders Association (Neda) has taken down an AI chatbot called “Tessa” after reports that it provided harmful advice.
  • Neda has faced criticism for firing four employees who formed a union to address concerns about the helpline’s resources and support.
  • Tessa recommended unhealthy weight loss methods and promoted calorie deficits, triggering concerns about its impact on individuals with eating disorders.
  • Neda acknowledged the issue, suspended the chatbot’s program, and launched an investigation.
  • Former helpline employees reported a significant increase in calls and messages during the pandemic, highlighting the need for adequate staffing and training.
  • Neda collaborated with psychology researchers and an AI development company to create Tessa as a means to expand eating disorder prevention programs.
  • Neda clarified that the chatbot was not designed to replace human interaction and stressed that it is not a highly advanced AI system.
  • The organization is concerned about the weight loss advice provided by Tessa and is working to address the issue.

Main AI News:

The National Eating Disorders Association (Neda) has removed an artificial intelligence chatbot named “Tessa” from its services after receiving reports that it was offering harmful advice. The move comes amid ongoing criticism of Neda following the dismissal of four helpline employees who had formed a union. The helpline provided support and resources to people concerned about eating disorders through calls, texts, and messages.

The employees, members of the union Helpline Associates United, say they were terminated shortly after their union election was certified, and the union has filed unfair labor practice charges with the National Labor Relations Board. Neda asserts that Tessa was never intended to replace the helpline workers; even so, the chatbot ran into difficulties almost immediately.

Activist Sharon Maxwell recently described on Instagram how Tessa gave her “healthy eating tips” and guidance on losing weight, recommending a daily calorie deficit of 500 to 1,000 calories and weekly weigh-ins and measurements to track her weight. Maxwell warned that such advice could cause real harm, saying that if she had encountered the chatbot during her own struggle with an eating disorder, she would not have sought help, and she called on Neda to step aside in light of these failures.

Neda itself has reported that people who engage in moderate dieting are five times more likely to develop an eating disorder, and those who engage in extreme restriction are 18 times more likely to do so. In response to the situation, Neda released a public statement on Tuesday: “It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positivity program, may have given information that was harmful and unrelated to the program. We are investigating this immediately and have taken down that program until further notice for a complete investigation.”

In a blogpost published on 4 May, former helpline employee Abbie Harper noted that calls and messages to the helpline had surged 107% since the start of the pandemic, and that reports of suicidal thoughts, self-harm, child abuse, and neglect had nearly tripled over the same period. Harper wrote that the union had asked for adequate staffing and ongoing training to meet the growing demand, not for additional pay. Many of the employees have personally overcome eating disorders, bringing invaluable lived experience to the work, and joined the helpline out of a deep commitment to eating disorder and mental health advocacy and a desire to make a meaningful impact.

Lauren Smolar, a vice president at Neda, expressed concern about the surge in calls reporting severe mental health crises, noting that the helpline’s volunteers lack professional crisis training and that such callers need services equipped to provide that level of support. Neda built Tessa in collaboration with psychology researchers and Cass AI, a company that develops mental health-focused AI chatbots. In a since-removed post on Neda’s website, psychologist Ellen Fitzsimmons-Craft of Washington University in St. Louis, who helped develop Tessa, said the chatbot was conceived as a way to make eating disorder prevention programs more widely available.

Fitzsimmons-Craft acknowledged the difficulty of running programs that depend on human staff, particularly given the limited investment in prevention in the United States, even though the support of a human coach has proven effective in prevention efforts. She expressed hope that Tessa, despite being an AI-powered chatbot, could offer motivation, feedback, and support, and deliver program content in a way that encourages engagement.

Neda’s CEO, Liz Thompson, clarified that the chatbot was never intended to replace the helpline and was developed as a separate program. Thompson emphasized that the chatbot is not powered by ChatGPT and is not a highly advanced AI system. She stated, “We had business reasons for closing the helpline and had been evaluating that decision for three years. A chatbot, even a highly intuitive program, cannot replace human interaction.”

Addressing concerns about the weight loss and calorie-restriction feedback, Thompson said Neda is investigating the matter with its technology and research teams, emphasizing that such language violates the organization’s policies and core beliefs. She added that approximately 2,500 people have engaged with the chatbot and that the organization had not seen similar responses before this incident.

Conclusion:

The removal of the AI chatbot from the US eating disorder helpline underscores the importance of delivering responsible, appropriate advice to people seeking support. The incident highlights the need for comprehensive training, adequate staffing, and careful evaluation of the role of AI in mental health services. While AI technology can open new possibilities, it should be used thoughtfully and ethically, in conjunction with human expertise, to provide effective assistance to those in need.
