AI and eating disorders: How to find help on the internet

Can AI help you recover from an eating disorder?

If you follow the National Eating Disorders Association (NEDA) on any level, you’re likely aware of the recent controversy surrounding their website’s now-defunct chatbot, “Tessa.”

In May 2023, Tessa gave disturbing advice to users who sought help with disordered thoughts and behaviors, and the chatbot has since been removed from the site altogether.

What caused this scandal, and how can we, as humans, prevent robots from dishing out inappropriate and sometimes harmful guidance in the future? Keep reading to learn more.

What happened with NEDA’s chatbot?

Generative artificial intelligence powers services offered by companies like Cass, a wide-reaching mental health app that connects users with counselors in a matter of minutes. NEDA used Cass to launch a chatbot on its website so that people could find care and get advice at any time of day, from anywhere. Given the ongoing scarcity of affordable, available eating disorder specialists, the intention made perfect sense.

Tessa advised people who asked questions about eating disorders to do things that run counter to recovery, such as restricting calorie intake, performing daily weigh-ins, and choosing “healthier” snacks.

NEDA released a statement apologizing to the community:

"We recognize and regret that certain decisions taken by NEDA have disappointed members of the eating disorders community. Like all other organizations focused on eating disorders, NEDA's resources are limited and this requires us to make difficult choices. We always wish we could do more and we remain dedicated to doing better." —Liz Thompson, CEO of NEDA

But with (some) damage already done, what does this mean for AI’s role in the deeply complex world of mental health?

AI’s current limitations with mental health

Despite growing awareness and movements to reduce stigma, mental health remains something of a taboo subject in the broader health conversation. And, as the adage goes, you have to live it to understand it: the complexities of mental illness are extremely difficult to convey to someone who has never experienced it.

And on the tech front, the delicacy with which AI tools should approach mental health might not be at the forefront of the developers’ minds.

Even an organization like NEDA, which one would assume is well-versed in appropriate language, has a responsibility to act as gatekeeper of its communications and ensure that what reaches those seeking help is sensitive, safe, and free of triggers. That calls for careful vetting and ongoing monitoring of tech-forward tools to make sure they aren’t causing more harm than good.

AI’s potential for the future of eating disorders and mental health care

While it’s clear that AI is not ready to jump into the mental health space just yet, especially when it comes to eating disorder care, it’s hard not to wonder about the technology’s future, or at least its more innocuous applications. For example, it might one day serve as a triage tool that directs us to the appropriate resources, not unlike an intake nurse in an emergency room. But any use beyond pointing people toward qualified care will need to be monitored carefully and applied with sensitivity.

This also doesn’t mean the internet is, or ever will be, completely free of harmful messaging. Pro-anorexia websites have been online since the days of MySpace and Napster, but users have to seek them out deliberately, and those sites aren’t disguised as “helpful” chatbots by one of the best-known eating disorders associations in the United States.

Still, there is potential for AI to be a stepping stone toward customized care options for people suffering from eating disorders and a host of other mental illnesses. But before we get there, AI has some work to do.


To find eating disorder care, visit the Rules & Resources page.