As Illinois limits the use of AI in therapeutic practices, experts weigh the benefits and risks

At a time when the number of mental health professionals is decreasing and patient demand is increasing, AI may be what people turn to when they need professional mental health help.

On August 4, Illinois enacted the Wellness and Oversight for Psychological Resources Act, which limits the use of artificial intelligence (AI) in the provision of therapy and psychotherapy services.

The purpose of this law is to protect individuals seeking therapy or psychotherapy services by ensuring that services are provided by certified or licensed professionals rather than unregulated AI systems.

Health experts say there are growing concerns about people using AI in place of human connection, as it can create a false sense of familiarity and, in some cases, direct users toward harmful behavior.

Experts said that while using AI for mental health advice and connection may be helpful in the short term, it could have long-term psychological downsides.

Short-term benefits

Illinois State University (ISU) psychology professor Dan Lannin said he surveyed university students to see whether they would be open to using AI therapy chatbots.

“Most of the time they don’t, at least that’s what they say,” Lannin said. “But anecdotally, I know that many students talk to ChatGPT like it’s a therapist.”

Lannin said the short-term benefit is that people can get a little connection in the absence of other humans.


Shengtian Wu, an assistant professor of psychology and a therapist at the University of Nevada, Las Vegas, said she imagines that people who have never felt validated in their lives now have access to AI chatbots.

“Some children and adults have never had their parents, or even the people around them, [validate] their experience. There are now AIs that speak like humans. They know they are not human, but they speak like humans, [and it] truly validates them. It means a lot to them,” Wu said.

Wu, who is Asian, said there is a general stigma against seeking therapy in the Asian community. By talking to an AI chatbot, people affected by that stigma could begin addressing an issue before sharing it with friends and family.

“Now I’m thinking,” Wu said. “What would happen if we had better AI, [where someone,] instead of going to therapy, could partially address their concerns through AI whenever they wanted?”

Wu said AI could be a good tool in that case, but asked how we can guarantee the tool is safe.

All certified therapists in the United States must comply with the Health Insurance Portability and Accountability Act (HIPAA), which ensures that a patient’s personal information is not shared with anyone. Wu said that if an AI platform were well trained on HIPAA compliance, it could be safely tried by those in need.

“But I don’t think we’re there yet,” Wu said.

Risks

Aside from chatbots not being bound by HIPAA, a big risk for AI users is that chatbots may be used in place of real connection, which could make someone feel more alone in the long run, Lannin said.

Lannin said talking to an AI chatbot instead of a human when you need connection is like drinking a Diet Coke instead of orange juice when you need energy. Although both drinks are sweet, orange juice has real nutritional value.

“I think AI is one of those things that looks appealing but is never really the same,” Lannin said.

Researching AI is also difficult, according to Lannin.

“We don’t really understand every detail of how it processes, understands, and assembles responses,” Lannin said.

It’s hard to know whether AI will benefit people because the research is still young. After all, a large language model is only modeling the language it has learned.

And while there is research showing that AI can replicate the appearance of human empathy, it lacks genuine emotion.

In therapy, Lannin said, the strongest predictor of whether therapy will help someone’s mental health is the therapeutic alliance: the trust, empathy, and collaboration between therapist and client.

“It’s hard to apply that concept when talking about AI, because what does it mean?” Lannin said. “What does it mean to say we have a real relationship? If the other person isn’t a real person, what does it mean that we have an alliance?”

This could be harmful if people with mental health issues rely on AI for advice that should be provided by mental health professionals.

“There have been some red-flag stories that have made headlines, and scary things are happening, whether it’s AI directing someone toward self-harm or suicide,” Lannin said.

Lannin said the provision of AI therapy could ultimately widen disparities between socioeconomic classes.

“The poor, those without access, those pushed to the margins will get AI chatbots, [and] the rich and privileged will have access to real people,” Lannin said.

The future of AI in therapy

Wu said a lot of groundwork and collaboration is needed to create great AI therapists in the future.

“I think it will happen,” Wu said. “And I imagine that when it does, people will be harmed in certain ways, there will be lawsuits, [and companies] will apologize and pay.”

Wu said it’s hard to imagine a future where AI therapy isn’t an option.

“But I don’t think it will replace human therapists,” Wu said.

Although some AI chatbot therapists are already available, licensed professionals cannot use AI chatbots in their practices in Illinois because chatbots are not HIPAA compliant.

Lannin uses AI chatbots to train therapy students as a way to “use technology to our advantage.”

“Use it as a tutor,” Lannin said. “Use it as a way to learn skills, not as a way to heal emotional wounds.”
