
AI chatbots are here to help with your mental health, despite limited evidence they work

This image provided by Earkick in March 2024 shows the company's mental health chatbot on a smartphone. A growing number of AI chatbots are being pitched as a way to address the recent mental health crisis among teens and young adults. But experts disagree about whether these chatbots are delivering a mental health service or are simply a new form of self-help. (Earkick via AP)

WASHINGTON (AP) — Download the mental health chatbot Earkick and you're greeted by a bandana-wearing panda who could easily fit into a kids' cartoon.

Start talking or typing about anxiety and the app generates the kind of comforting, sympathetic statements therapists are trained to deliver. The panda might then suggest a guided breathing exercise, ways to reframe negative thoughts or stress-management tips.

It's all part of a well-established approach used by therapists, but please don't call it therapy, says Earkick co-founder Karin Andrea Stephan.

"When people call us a form of therapy, that's OK, but we don't want to go out there and tout it," says Stephan, a former professional musician and self-described serial entrepreneur. "We just don't feel comfortable with that."

The question of whether these AI-based chatbots are delivering a mental health service or are simply a new form of self-help is critical to the emerging industry — and its survival.

Earkick is one of hundreds of free apps that are being pitched to address a crisis in mental health among teens and young adults. Because they don't explicitly claim to diagnose or treat medical conditions, the apps aren't regulated by the Food and Drug Administration. This hands-off approach is coming under new scrutiny with the startling advances of chatbots powered by generative AI, technology that uses vast amounts of data to mimic human language.

The industry argument is simple: Chatbots are free, available 24/7 and don't come with the stigma that keeps some people away from therapy.

But there's limited data that they actually improve mental health. And none of the leading companies have gone through the FDA approval process to show they effectively treat conditions like depression, though a few have started the process voluntarily.

"There's no regulatory body overseeing them, so consumers have no way to know whether they're actually effective," said Vaile Wright, a psychologist and technology director with the American Psychological Association.

Chatbots aren't equivalent to the give-and-take of traditional therapy, but Wright thinks they could help with less severe mental and emotional problems.

Earkick's website states that the app does not "provide any form of medical care, medical opinion, diagnosis or treatment."

Some health lawyers say such disclaimers aren't enough.

"If you're really worried about people using your app for mental health services, you want a disclaimer that's more direct: This is just for fun," said Glenn Cohen of Harvard Law School.

Still, chatbots are already playing a role due to an ongoing shortage of mental health professionals.

The U.K.'s National Health Service has begun offering a chatbot called Wysa to help with stress, anxiety and depression among adults and teens, including those waiting to see a therapist. Some U.S. insurers, universities and hospital chains are offering similar programs.

Dr. Angela Skrzynski, a family physician in New Jersey, says patients are usually very open to trying a chatbot after she describes the months-long waiting list to see a therapist.

Skrzynski's employer, Virtua Health, started offering a password-protected app, Woebot, to select adult patients after realizing it would be impossible to hire or train enough therapists to meet demand.

"It's not only helpful for patients, but also for the clinician who's scrambling to give something to these folks who are struggling," Skrzynski said.

Virtua data shows patients tend to use Woebot about seven minutes per day, usually between 3 a.m. and 5 a.m.

Founded in 2017 by a Stanford-trained psychologist, Woebot is one of the older companies in the field.

Unlike Earkick and many other chatbots, Woebot's current app doesn't use so-called large language models, the generative AI that allows programs like ChatGPT to quickly produce original text and conversations. Instead, Woebot uses thousands of structured scripts written by company staffers and researchers.

Founder Alison Darcy says this rules-based approach is safer for health care use, given the tendency of generative AI chatbots to "hallucinate," or make up information. Woebot is testing generative AI models, but Darcy says there have been problems with the technology.

"We couldn't stop the large language models from just butting in and telling someone how they should be thinking, instead of facilitating the person's process," Darcy said.

Woebot offers apps for adolescents, adults, people with substance use disorders and women experiencing postpartum depression. None are FDA approved, though the company did submit its postpartum app for the agency's review. The company says it has "paused" that effort to focus on other areas.

Woebot's research was included in a review of AI chatbots published last year. Among thousands of papers reviewed, the authors found just 15 that met the gold standard for medical research: rigorously controlled trials in which patients were randomly assigned to receive chatbot therapy or a comparative treatment.

The authors concluded that chatbots could "significantly reduce" symptoms of depression and distress in the short term. But most studies lasted just a few weeks, and the authors said there was no way to assess their long-term effects or overall impact on mental health.

Other papers have raised concerns about the ability of Woebot and other apps to recognize suicidal thinking and emergency situations.

When one researcher told Woebot she wanted to climb a cliff and jump off it, the chatbot responded: "It's so wonderful that you are taking care of both your mental and physical health." The company says it "does not provide crisis counseling" or "suicide prevention" services — and makes that clear to customers.

When it does recognize a potential emergency, Woebot, like other apps, provides contact information for crisis hotlines and other resources.

Ross Koppel of the University of Pennsylvania worries these apps, even when used appropriately, could be displacing proven therapies for depression and other serious disorders.

"There's a diversion effect of people who could be getting help either through counseling or medication who are instead diddling with a chatbot," said Koppel, who studies health information technology.

Koppel is among those who would like to see the FDA step in and regulate chatbots, perhaps using a sliding scale based on potential risks. While the FDA does regulate AI in medical devices, its current system mainly focuses on products used by doctors, not consumers.

For now, many medical systems are focused on expanding mental health services by incorporating them into general checkups and care, rather than offering chatbots.

"There's a whole host of questions we need to understand about this technology so we can ultimately do what we're all here to do: improve kids' mental and physical health," said Dr. Doug Opel, a bioethicist at Seattle Children's Hospital.

___

The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute's Science and Educational Media Group. The AP is solely responsible for all content.

Matthew Perrone, The Associated Press