VOA Learning English

AI Chatbots to Help with Mental Health Struggles

April 1, 2024

From VOA Learning English, this is the Health & Lifestyle report.

The mental health chatbot Earkick greets users with a friendly-looking panda that could fit easily in a children’s program.

When users talk about anxiety, the panda gives the kind of comforting statements that a trained mental health professional, called a therapist, might make. Then it might suggest breathing exercises or give advice on how to deal with stress.

Earkick is one of hundreds of free chatbots aimed at dealing with a mental health crisis among young people. But the co-founder of Earkick, Karin Andrea Stephan, says she and the other creators do not “feel comfortable” calling their chatbots a therapy tool.

Whether these chatbots, or apps, provide a simple self-help tool or mental health treatment is important to the growing digital health industry. Since the apps do not claim to diagnose or treat medical conditions, they do not need approval from the Food and Drug Administration (or FDA).

The use of AI chatbots

The industry’s position is now coming under more careful examination with recent developments of chatbots powered by artificial intelligence (AI). The technology uses a large amount of data to copy human language.

The upsides are clear: the chatbots are free; they are available 24 hours a day; and people can use them in private.

Now for the downsides: there is limited data that the chatbots improve mental health; and they have not received FDA approval to treat conditions like depression.

Vaile Wright is a psychologist and technology director with the American Psychological Association. She said users of these chatbots “have no way to know whether they’re actually effective.”

Wright added that the chatbots are not the same as traditional mental health treatment. But, she said, they could help some people with less severe mental and emotional problems.

Earkick’s website states that the app does not “provide any form of medical care, medical opinion, diagnosis or treatment.” Some health lawyers say such claims are not enough.

Glenn Cohen of Harvard Law School said, “If you’re really worried about people using your app for mental health services, you want a disclaimer that’s more direct…” He suggested, “This is just for fun.”

Still, chatbots are already playing a role due to an ongoing shortage of mental health professionals.

Shortage of mental health professionals

Britain’s National Health Service has begun offering a chatbot called Wysa to help with stress, anxiety and depression among young people.

This includes those people waiting to see a therapist. Some health insurers, universities, and hospitals in the United States are offering similar programs.

Dr. Angela Skrzynski is a family doctor in the American state of New Jersey. When she tells her patients how long it will take to see a therapist, she says they are usually very open to trying a chatbot. Her employer, Virtua Health, offers Woebot to some adult patients.

Founded in 2017 by a Stanford-trained psychologist, Woebot does not use generative AI programs. Instead, the chatbot uses thousands of structured responses written by its staff and researchers.

Woebot founder Alison Darcy says this rules-based model is safer for health care use. The company is testing generative AI models, but Darcy says there have been problems with the technology.

She said, “We couldn’t stop the large language models from… telling someone how they should be thinking, instead of facilitating the person’s process.”

Woebot’s finding was included in a research paper on AI chatbots published last year in Digital Medicine.

The writers concluded that chatbots could help with depression in a short time. But there was no way to study their long-term effect on mental health.

Ross Koppel of the University of Pennsylvania studies health information technology. He worries these chatbots could be used in place of treatment and medications. Koppel and others would like to see the FDA review and possibly regulate these chatbots.

Dr. Doug Opel works at Seattle Children’s Hospital. He said, “There’s a whole host of questions we need to understand about this technology so we can ultimately do what we’re all here to do: improve kids’ mental and physical health.”

And that’s the Health & Lifestyle report. I’m Anna Matteo.

Matthew Perrone reported this story for the Associated Press from Washington, D.C. Anna Matteo adapted it for VOA Learning English.


Words in This Story

chatbot – n. a computer program designed to mimic the actions of a person and converse with human beings

anxiety – n. an abnormal and overwhelming sense of apprehension and fear often marked by physical signs

diagnose – v. to recognize (something, such as a disease) by signs and symptoms

artificial intelligence – n. the capability of computer systems or algorithms to imitate intelligent human behavior

psychologist – n. a person who specializes in the study of mind and behavior or in the treatment of mental, emotional, and behavioral disorders

diagnosis – n. the art or act of identifying a disease from its signs and symptoms

facilitate – v. to help bring (something) about

We want to hear from you. In the Comments section, you can practice using any of the words from the story.


This article uses material from the VOA Learning English article, and is in public domain. Images and videos are available under their respective licenses.
