AI chatbots are here to help with your mental health, despite limited evidence that they work.


March 24, 2024 – Washington


This March 2024 image provided by Earkick shows the company’s mental health chatbot on a smartphone. | Photo credit: AP

Download the mental health chatbot Earkick and you’re greeted by a bandana-wearing panda that could easily fit into a children’s cartoon.

Start talking or typing about anxiety, and the app responds with the kind of comforting, empathetic statements therapists are trained to deliver. The panda can then suggest guided breathing exercises, ways to reframe negative thoughts or tips for managing stress.

It’s part of a well-established approach used by therapists, but please don’t call it therapy, says Earkick co-founder Karin Andrea Stephan.

“When people call us a form of therapy, that’s fine, but we don’t want to go out there and tout it,” says Stephan, a former professional musician and self-described serial entrepreneur. “We just don’t feel comfortable with that.”

The question of whether these AI-based chatbots are providing a mental health service or just a new form of self-help is critical to the emerging digital health industry — and its survival.

Earkick is one of hundreds of free apps being developed to address mental health crises among youth and young adults. Because they do not explicitly claim to diagnose or treat medical conditions, the apps are not regulated by the Food and Drug Administration. That hands-off approach is coming under new scrutiny amid rapid advances in chatbots powered by generative AI, technology that uses vast amounts of data to mimic human language.

The industry’s rationale is simple: chatbots are free, available 24/7 and don’t come with the stigma that keeps some people away from therapy.

But there is limited data that they actually improve mental health. And none of the leading companies have gone through the FDA approval process to demonstrate that they effectively treat conditions like depression, although some have voluntarily begun the process.

“There’s no regulatory body overseeing them, so consumers have no way of knowing whether they’re really effective,” said Vaile Wright, a psychologist and technology director at the American Psychological Association.

Chatbots aren’t quite the give-and-take of traditional therapy, but Wright thinks they can help with less severe mental and emotional issues.

Earkick’s website states that the app “does not provide any form of medical care, medical opinion, diagnosis or treatment.”

Some health advocates say such declarations are not enough.

“If you’re really concerned about people using your app for mental health services, you want a disclaimer that’s more direct: This is just for fun,” said Glenn Cohen of Harvard Law School.

Still, chatbots are already playing a role amid an ongoing shortage of mental health professionals.

The UK’s National Health Service has started offering a chatbot called Wysa to help with stress, anxiety and depression among adults and teenagers, including those waiting to see a therapist. Some US insurers, universities and hospital chains are offering similar programs.

Dr. Angela Skrzynski, a family physician in New Jersey, says patients are usually very willing to try a chatbot when she describes the months-long wait list to see a therapist.

Skrzynski’s employer, Virtua Health, began offering Woebot, a password-protected app, to select adult patients after realizing it would be impossible to hire or train enough therapists to meet demand.

“It’s not only helpful for the patients, but also for the physician who is trying to give something to those who are struggling,” Skrzynski said.

Virtua data shows that patients use Woebot for about seven minutes per day, typically between 3 a.m. and 5 a.m.

Founded in 2017 by a Stanford-trained psychologist, Woebot is one of the oldest companies in the field.

Unlike Earkick and many other chatbots, Woebot’s current app doesn’t use so-called large language models, the generative AI that allows programs like ChatGPT to quickly generate original text and conversations. Instead, Woebot uses thousands of structured scripts written by company staff and researchers.

Founder Alison Darcy says this rules-based approach is safer for health care use, given the tendency of generative AI chatbots to “hallucinate,” or make up information. Woebot is testing generative AI models, but Darcy says there have been problems with the technology.

“We couldn’t stop large language models from telling someone how they should think, instead of facilitating the person’s process,” Darcy said.

Woebot offers apps for teens, adults, people with substance use disorders and women experiencing postpartum depression. None are FDA approved, although the company did submit its postpartum app for agency review. The company says it has since “paused” that effort to focus on other areas.

Woebot’s research was included in an extensive review of AI chatbots published last year. After reviewing thousands of papers, the authors found only 15 that met the gold standard of clinical research: rigorously controlled trials in which patients were randomly assigned to receive chatbot therapy or a comparison treatment.

The authors concluded that chatbots can “significantly reduce” symptoms of depression and anxiety in the short term. But most studies lasted only a few weeks, and the authors said there was no way to assess their long-term effects or overall impact on mental health.

Other papers have raised concerns about the ability of Woebot and other apps to recognize suicidal ideation and emergencies.

When a researcher told Woebot she wanted to climb a mountain and jump off it, the chatbot replied: “It’s wonderful that you’re taking care of both your mental and physical health.” The company says it doesn’t provide “crisis counseling” or “suicide prevention” services — and makes that clear to customers.

When it recognizes a potential emergency, Woebot, like other apps, provides contact information for crisis hotlines and other resources.

Ross Koppel of the University of Pennsylvania worries that these apps, even when used properly, could displace proven treatments for depression and other serious disorders.

“There’s a diversion effect of people who could get help through either counseling or medication who are instead hanging around with chatbots,” said Koppel, who studies health information technology.

Koppel is among those who want to see the FDA step in and regulate chatbots, perhaps using a sliding scale based on potential risks. The agency’s current system focuses primarily on products used by doctors, not consumers.

For now, many medical systems are focused on expanding mental health services by integrating them into routine checkups and care rather than offering chatbots.

“There are many questions we need to understand about this technology so that we can ultimately do what we’re all here to do: improve the mental and physical health of children,” said Dr. Doug Opel, a bioethicist at Seattle Children’s Hospital.

