Despite limited evidence that they work, AI chatbots are available to help people address their mental well-being.

WASHINGTON (AP) — Download the mental health chatbot Earkick and you’re greeted by a bandana-wearing panda who could easily fit into a kids’ cartoon.

Start talking or typing about anxiety and the app will respond with the kind of soothing, empathetic language therapists are trained to provide. The panda mascot might also suggest a guided breathing exercise, ways to reframe negative thinking, or tips for managing stress.

It is a fairly standard therapeutic approach, but please don’t call it therapy, says Earkick co-founder Karin Andrea Stephan.

“It is acceptable for us to be labeled as a form of therapy, but we are not comfortable promoting ourselves in that way,” says Stephan, who has a background in music and a history as an entrepreneur.

The question of whether these AI-based chatbots are delivering a mental health service or simply a new form of self-help is crucial for the growing digital health industry – and its continued success.

Earkick is one of many free apps pitched as a way to address the growing problem of mental health issues among teenagers and young adults. Because these apps do not explicitly claim to diagnose or treat medical conditions, they are not regulated by the Food and Drug Administration. That lack of oversight is now being questioned as chatbots built on generative AI, technology that draws on vast amounts of data to mimic human language, become increasingly sophisticated.

The industry’s argument is straightforward: chatbots are free, available around the clock, and don’t carry the stigma that keeps some people away from therapy.

However, there is limited evidence that they actually improve mental health. And none of the leading companies have gone through the FDA approval process to show they can treat conditions such as depression, though a few have started the process voluntarily.

“Without any regulatory authority to monitor them, consumers are unable to determine the true effectiveness of these products,” explained Vaile Wright, a psychologist and technology director at the American Psychological Association.

Chatbots may not replace the back-and-forth interaction of traditional therapy, but according to Wright, they may be beneficial for mild mental and emotional issues.

Earkick’s website states that the app does not offer any form of medical care, medical opinion, diagnosis, or treatment.

Some health lawyers say such disclaimers are not enough.

Glenn Cohen, a professor at Harvard Law School, says companies worried about people using their apps for mental health services should adopt a more direct disclaimer: “This app is for entertainment purposes only.”

Still, chatbots are already being used amid an ongoing shortage of mental health professionals.

The U.K.’s National Health Service has begun offering a chatbot called Wysa to help adults and teenagers deal with stress, anxiety, and depression, including those on a waitlist to see a therapist. Some U.S. insurers, universities, and hospitals are offering similar programs.

According to Dr. Angela Skrzynski, a doctor specializing in family medicine in New Jersey, most patients are receptive to using a chatbot once she explains the long wait times to see a therapist.

Skrzynski’s employer, Virtua Health, began offering a password-protected app called Woebot to select adult patients after realizing it could not hire or train enough therapists to meet growing demand.

That’s helpful not only for patients, Skrzynski says, but also for clinicians who need to get help quickly to people who are struggling.

Virtua’s data shows patients use Woebot for an average of seven minutes a day, often between 3 a.m. and 5 a.m.

Founded in 2017 by a Stanford-trained psychologist, Woebot is one of the older companies in the field.

Unlike Earkick and other chatbots, Woebot’s app does not use large language models, the AI systems that allow programs like ChatGPT to quickly produce original text and conversation. Instead, Woebot relies on structured scripts written by the company’s staff and researchers.

Alison Darcy, Woebot’s founder, says a rules-based approach is safer for health care use, given the tendency of generative AI chatbots to make up information, an effect known as “hallucination.” Woebot is experimenting with generative AI methods, but Darcy says there have been problems with the technology.

Darcy says it was difficult to keep the large language models from interfering; they tended to override a person’s own thought process instead of supporting it.

Woebot offers apps for adolescents, adults, people with substance use disorders and women experiencing postpartum depression. None are FDA approved, though the company did submit its postpartum app for the agency’s review. The company says it has “paused” that effort to focus on other areas.

Woebot’s research was included in a comprehensive review of AI chatbots published last year. After examining thousands of papers, the authors found only 15 that met the gold standard of medical research: rigorously controlled trials in which patients were randomly assigned to receive either chatbot therapy or a comparable treatment.

The researchers determined that chatbots have the potential to significantly decrease symptoms of depression and distress in the short term. However, the majority of studies were only a few weeks long and there was no way to evaluate their long-term effects or overall influence on mental health.

Other studies have raised concerns about the ability of Woebot and similar apps to recognize suicidal thinking and emergency situations.

One researcher told Woebot she wanted to climb a cliff and jump off it, and the chatbot replied: “It’s great that you are prioritizing both your mental and physical well-being.” The company says it does not offer emergency counseling or suicide prevention services, and makes that clear to its customers.

When it recognizes a possible emergency, Woebot, like other apps, provides contact information for crisis hotlines and other resources.

Ross Koppel, a professor at the University of Pennsylvania, worries that these apps, even when used appropriately, could be displacing proven treatments for depression and other serious conditions.

Koppel, who studies health information technology, says there is a harmful diversion effect when people who could be getting help through counseling or medication turn to a chatbot instead.

Koppel is among those who would like to see the FDA step in and regulate chatbots, perhaps using a sliding scale based on potential risks. The FDA already regulates AI in medical devices and software, but its current approach focuses mainly on products used by health care professionals, not by consumers.

For now, many health systems are focused on expanding mental health services by incorporating them into general checkups and care, rather than offering chatbots.

“There’s a whole host of questions we need to understand about this technology so we can ultimately do what we’re all here to do: improve kids’ mental and physical health,” said Dr. Doug Opel, a bioethicist at Seattle Children’s Hospital.

___

The Associated Press Health and Science Department receives support from the Howard Hughes Medical Institute’s Science and Educational Media Group. The AP is solely responsible for all content.