Why you should ask your therapist about using AI

Since ChatGPT was released in late 2022, there has been much discussion about how artificial intelligence could change nearly every aspect of human life. From academia to the arts, from the media to medicine, the implications are far-reaching and profound, whether inspiring, frightening, or both.

Psychotherapy, on the other hand, has long been considered one of the fields least likely to be affected. After all, isn't therapy built on human relationships? It turns out that's not the case.

Evidence shows that young people in particular are seeking help from therapy bots as well as general-purpose AI such as ChatGPT. One study (McBain et al., 2025) found that one in eight young people had used an AI chatbot specifically for mental health advice. This raises serious concerns, including the potential for AI to assist vulnerable people in ending their lives, and the potential for AI to worsen mental illness through sycophantic, flattering responses to people who are becoming increasingly disconnected from reality.

What has received less attention, however, is how AI is creeping into in-person therapy, sometimes without even the therapist's full awareness.

Therapists and AI

Automated note-taking software, transcription services, recording platforms, and scheduling tools for therapists all promise to increase efficiency and streamline administrative tasks. While these promises sound entirely positive, the contracts therapists sign may not clearly spell out these companies' privacy risks or long-term plans. How do these platforms handle client data? How is confidentiality maintained? Who owns the data? Will it ultimately be used to train bots to replace therapists?

As in other fields that require expertise, therapists themselves may use AI for consultation, such as getting help with difficult cases. This, too, raises confidentiality concerns. Even conscientious therapists may not fully understand the risks. And how many are asking the troubling questions, ranging from (at best) how easily client privacy can be violated to (at worst) whether client data is being used to train the therapy bots that may eventually replace therapists themselves?

Questions to ask your therapist about using AI

With so much uncertainty, seeking clarity is paramount. As a consumer, you have the power to start this conversation and get answers about exactly what's going on with your personal story.

The first step is to carefully read the informed consent document you signed at the beginning of your treatment. If you no longer have a copy, request a new one. Then ask your therapist the following questions:

  1. Have there been any changes in how your practice is managed since I signed the informed consent and practice policy documents?
  2. How are you using AI in your practice?
  3. Are you using AI for note-taking, transcription, or case summarization?
  4. If I do teletherapy, will my sessions be recorded? Are they monitored in real time by any platform?
  5. Are you using AI to make treatment decisions?
  6. How is my privacy ensured in each of these steps?
  7. Will I be notified if you start using AI-related software?

Seeking therapy can be scary, and clients often don't want to make waves by asking about things they assume aren't relevant to their treatment. But these questions are, in fact, central to it. All of us, therapists, clients, and the public alike, deserve clarity about how the most private details of our lives are handled and used.

To find a therapist, visit: Psychology Today's Therapy Directory.
