We need to talk more about the use of AI in mental health treatment

Applications of AI


Today's column addresses the question of whether there is enough discussion about the use of AI in mental health treatment. This analysis was partly inspired by a related question: Are we talking too much, or not enough, about mental health overall in public spaces?

On the AI side, my view is that we need more discussion, not less, about the use and impact of AI in mental health, particularly the rapidly expanding role of generative AI and large language models (LLMs). Moreover, what is said should be thoughtful and insightful, not confusing or misleading.

Let's talk about it.

This analysis of AI breakthroughs is part of my ongoing Forbes column on emerging AI, which includes identifying and explaining the various complexities of influential AI (see link here).

AI and mental health therapy

As quick background, I have extensively covered and analyzed myriad aspects of using modern AI to generate mental health advice and deliver AI-driven therapy. This rising use of AI is primarily driven by the evolution and proliferation of generative AI. For a quick overview of some of my columns on this evolving topic, see the link here, which summarizes about 40 of the 100+ columns I've posted on the subject.

There is little doubt that this is a rapidly evolving field with significant benefits, but at the same time, unfortunately, these efforts also carry hidden risks and outright pitfalls. I speak out frequently about these pressing issues, including an appearance last year on an episode of CBS's 60 Minutes (see the link here).

If you are new to the topic of AI for mental health, consider reading my recent analysis of the field, which also describes a highly innovative initiative from the Stanford University Department of Psychiatry and Behavioral Sciences called AI4MH. See the link here.

Talking about mental health in general

A recent psychiatry editorial published online tackled the nagging question of whether talking about mental health is dangerous.

Why would discussing society's mental health be a problem?

Some argue that open discussion leads more people to believe they have a mental health problem than would be the case if the topic were left alone. By that calculus, if discussing the subject risks triggering mental illness, we should stay quiet, or at least restrained, lest a few pebbles set off a thundering slide down the mountainside.

An opinion piece confronting this controversial topic, titled "Are We Talking Too Much About Mental Illness?" and written by Daniel Morehead, published online on March 28, 2025, made the following key points (excerpts):

  • “While mental health awareness campaigns can help educate the public, a growing number of researchers suggest that people with mild symptoms may be overinterpreting stress and self-diagnosing, which can lead to further distress, fear, and reluctance to live a normal life,” according to an article in the New York Times.
  • “Mental illness is shockingly common, devastating, and woefully undertreated. Is it really possible to talk about it too much?”
  • “We do not talk about mental health too much.”
  • “We talk about mental health with far too little knowledge.”
  • “For the first time in human history, there is widespread recognition that mental health is important. This is a tremendous achievement for psychiatry and all those who support mental health. Today, the majority of the population supports mental health, even if they do not really know what they are supporting. We must not waste this historic opportunity.”

We can clearly see that the editorial emphasizes that there should be an ongoing discussion about mental health, albeit with an important caveat. One thing to keep in mind is that such chatter needs to be properly informative and useful. Perhaps the real danger is spreading stories that misrepresent the nature of mental health and lead the public down a false and perhaps harmful path.

AI for mental health

Shifting gears, a related question is: Are we talking too much about the use of AI for mental health?

Why?

If you look at the latest news headlines, you'll find plenty of articles about people falling in love with chatbots and using generative AI around the clock as an AI-driven therapist. There is a great deal of online and offline chatter about AI in mental health.

One aspect of this constant chatter is that more and more people are turning to AI for their mental health needs. It is a cascading phenomenon that feeds on itself. The more common the topic becomes, the more people are tempted to try it themselves. Despite the warnings and forebodings noted in the press, people are essentially being encouraged to explore the use of AI for their mental health.

Is that a good or bad outcome?

The idea is that if this expanding use of AI is bad, then we should stop talking about it. Starve it of attention. People would gradually forget that the topic was popular and growing more so. As with many fads, usage would then inevitably decline.

Taking the bull by the horns

I tend to agree with that editorial: we do not talk too much about mental health, and we need to ensure that what is said provides useful, practical insight. The same goes for discussions about AI and mental health.

We certainly need to have a conversation about AI for mental health, and likewise, we need to make sure that what is said hits the mark. Let's get straight to the topic of AI for mental health. Here are some highlights.

First, we need to recognize why so many people are turning to AI as a mental health advisor. The reality is that mental health professionals are in short supply. Societal demand for psychological guidance quickly outstrips the supply of available therapists.

In contrast, AI is an option that scales readily and never runs out of availability, since additional computer servers can easily be racked and brought online right away.

Second, the cost of human-to-human therapy is often out of reach for those who cannot afford specialized care. Affordability is a troubling consideration: billable hours and other charges add up. Therapists are, of course, entitled to fair compensation for demanding work. But costs keep rising, surcharges of all kinds seem to be mushrooming, and therapy can continue almost indefinitely. The lifetime outlay by clients and patients is potentially staggering.

Today's generative AI and LLMs are typically available for free or at very low cost. AI-to-human therapy is very affordable.

Third, interacting with a human therapist can be a logistical challenge. You need to schedule a visit, and there may be a wait to obtain a timeslot. And the session is likely to occur at a time of the therapist's choosing rather than the client's or patient's, typically on weekdays during traditional work hours.

Access to AI is available anytime, anywhere. You can log in at 2am on a Saturday without having to make a reservation or worry about waking anyone up.

The other side of the coin

While these clear benefits of leveraging AI for mental health advice are valuable, there are also considerable costs to be paid. Let's consider some concerns about using AI for mental health.

First, existing general-purpose generative AI is not tailored to provide mental health treatment. People don't seem to realize that the generative AI they use for everyday tasks isn't equipped to provide mental health advice. AI makers are a bit wink-wink about this, often burying subtle warnings, such as license-agreement clauses specifying that the AI may not be used for that purpose. This is a classic and somewhat sneaky way of covering their butts.

In reality, very few people notice the wink-wink warning, and fewer still spot the language embedded somewhere in the AI maker's online licensing terms. For more information on this troubling aspect, see my discussion at the link here.

Second, customized generative AI suitable for mental health treatment is still under development. Researchers and practitioners are working hard to build AI for mental health that is as fluent as generative AI yet predictable and reliable. Earlier, such AI was primarily built with rules-based or expert-system techniques (see my analysis at the link here). Those systems are deterministic and can be thoroughly tested. Generative AI and LLMs are non-deterministic by nature, using statistics and randomness to appear creative and human-like.
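To make that contrast concrete, here is a minimal, purely illustrative sketch in Python. The rules and replies are hypothetical, invented for this example and not drawn from any real product: a deterministic rules-based responder always maps the same input to the same output and can be tested exhaustively, while an LLM-style responder that samples from candidate replies is non-deterministic.

```python
import random

# Deterministic, rules-based responder: the same input always yields the
# same output, so every rule can be exhaustively tested in advance.
RULES = {
    "i feel anxious": "Try a brief grounding exercise and note what triggered the feeling.",
    "i can't sleep": "Consider a consistent wind-down routine before bed.",
}

def rules_based_reply(message: str) -> str:
    # Fall back to an explicit "out of scope" answer for unknown inputs.
    return RULES.get(message.lower().strip(), "I can only help with topics I have rules for.")

# Non-deterministic, LLM-style responder (mocked): sampling from candidate
# replies means repeated runs can produce different outputs.
def generative_style_reply(message: str, rng: random.Random) -> str:
    candidates = [
        "It sounds like a lot is weighing on you; tell me more.",
        "That's understandable. What usually helps when you feel this way?",
        "Let's unpack that together, one step at a time.",
    ]
    return rng.choice(candidates)

# The rules-based path is fully reproducible.
assert rules_based_reply("I feel anxious") == rules_based_reply("I feel anxious")
```

The testing implication is the point: every rule in the deterministic table can be checked line by line before deployment, whereas the sampled responder can only be validated statistically, over many runs.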

There are generative AI add-ons that try to address the fundamental problem of non-determinism. Some companies instead build AI for mental health from scratch, creating foundation models shaped from the ground up for mental health. See my discussion at the link here.

The triad is nearly here

We need to find a middle ground between the challenges of providing human-to-human therapy services and the uneven benefits of existing AI-to-human approaches. One such approach is what I call the therapist-AI-client triad. This is an enhanced version of the traditional duo, or dyad, of the therapist-client relationship.

The bottom line is that therapists can intentionally and wisely choose to incorporate AI into their therapy practices. That way, clients and patients get the best of both worlds. Meeting with the therapist remains a human-to-human experience. Meanwhile, for assistance outside the limited time with the therapist, the client uses AI that the therapist has carefully vetted and recommended.

For more information on the emerging triad of therapist, AI, and client, please see my detailed discussion at the link here.

The caveat is that, if poorly managed, the therapist-AI-client triad can end in disaster. It would look like this: the therapist doesn't really engage with the AI component and simply waves it through whenever a client wants to use AI. The client leans on the AI indiscriminately for mental health needs. When therapist and client do meet, most of the human-to-human time is spent reconciling what the AI said to do with what the therapist says to do. Ultimately, the human-to-human advice and the AI-to-human advice become entangled to the detriment of everyone involved.

Not good.

Importantly, the therapist-AI-client triad only works if the therapist takes a determined and mindful approach. Being lazy or aloof won't cut the mustard.

Keep talking, and talk straight

The more eyes and ears we have on the AI-and-mental-health challenge, the better.

Few people realize that when they use general-purpose generative AI for their mental health, they are voluntarily joining a large-scale, unplanned, open-ended experiment, as if they were guinea pigs. We do not know the long-term effects of using a generic LLM as a mental health advisor. What will the effects be at a population scale?

Unfortunately, only time will tell.

One passionate argument is that something is better than nothing. In other words, for people who have no reasonable chance of receiving treatment through traditional means, using AI is an opportunity to obtain something resembling therapy, however imperfect and shallow it may be at the moment. Related concerns include thorny issues such as so-called AI hallucinations, in which the AI confabulates its responses, and the potential for general-purpose AI to create new problems even as it solves others.

Final thoughts for now.

Glenn Close famously made this sharp statement about mental health: “What mental health needs is more sunlight, more candor, and more unashamed conversation.” I tip my hat to that spirited remark.

In a similar vein, let me suggest that what AI for mental health needs is bright sunlight, abundant candor, and more frank and truthful conversation about the difficult and increasingly important directions this exciting technology should take.


