
Agencies Warn of Mental Health Bots Flooding the Market


June 15, 2023

The FDA is preparing to crack down on newly launched AI therapy apps.


Attention to mental health care has grown sharply in recent years, with an estimated 10,000 to 20,000 new apps entering the market. One major trigger for the surge was the stress brought on by the pandemic. Millions of people sought help coping with mental health issues during that time, but there was already a well-documented shortage of professionals to address those concerns. That shortage cleared the way for a boom in virtual mental health startups and the thousands of apps they produced, each allegedly equipped with AI bots that can diagnose and treat a wide range of conditions. Concerns have been raised, however, that these bots are unreliable and that traditional therapy may be disrupted as a result.

When it comes to quality mental health care, this glut of new tech services remains largely untested and unaccredited. Research is essential to validate their effectiveness, and there simply has not been time to conduct it, despite how many are already on the market.

In fact, many of these new apps carry fine-print warnings that they are not intended to be considered a “medical, behavioral health or other healthcare service.” They also often state that they are not FDA-cleared.

Photo by Sanket Mishra from Pexels

That raises the big question: Can a chatbot programmed with appropriate responses perform the way a trained human therapist would? Experts such as Bon Ku, director of the Health Design Lab at Thomas Jefferson University, are skeptical. Ku has said, “The core tenet of medicine is that it’s a relationship between human and human — and AI bots can’t love.”

Instead, these experts suggest there may be a place for AI in mental health care, but not as a replacement for therapists. Rather, artificial intelligence can serve as a supporting organizational tool, freeing up time so human therapists can treat more patients. Although some apps have been praised as helpful, the vast majority may in fact do more harm than good.

Anyone who has used a chatbot to get help with a product issue knows how obviously inhuman, and how frustrating, the conversation can be. Chatbots cannot deviate from their programmed responses, so the quality of care they can provide is limited. With that in mind, it is almost frightening to consider that AI is being used to conduct therapy sessions.

The U.S. Department of Labor appears to have taken note of the growing trend in AI mental healthcare products, as it is taking steps to ensure that insurers comply with mental health parity requirements. These requirements state that health insurers must cover not just physical health conditions but emotional and mental healthcare treatment as well.

Increased enforcement is also expected from the Food and Drug Administration, which has said it will step up its monitoring of new mental health products, not only to curb potentially harmful offerings but also to flag any that show promise as effective mental health tools. In this way, companies using AI to replace traditional therapy altogether will, hopefully, receive enough oversight to ensure their business models can deliver quality care.

Sources:

Analysis: Chatbots for mental health care are booming, but there’s little proof that they help

AI therapists might not actually help your mental health
