AI chatbot systems such as ChatGPT, Claude and Copilot are increasingly being used as trusted confidants, but relying on them for companionship and emotional support is a cause for concern, especially for young people, say experts in the BMJ's Christmas issue.
They warn that “we may be witnessing a generation learning to form emotional bonds with beings who lack human-like capacities for empathy, compassion, and interpersonal harmony,” and say that evidence-based strategies to reduce social isolation and loneliness are paramount.
In 2023, the U.S. Surgeon General declared that the United States is experiencing an epidemic of loneliness that constitutes a public health concern on par with smoking and obesity, write Susan Shelmerdin and Matthew Noor.
In the UK, almost half of adults (25.9 million people) report feeling lonely occasionally, sometimes, often or always. Almost 1 in 10 experience chronic loneliness (defined as feeling lonely “often or all the time”), and young people (aged 16-24) are also affected.
Given these trends, the authors say it's no wonder so many people are turning to other sources for companionship and emotional support. For example, ChatGPT has approximately 810 million weekly active users worldwide, with therapy and peer interaction being the top reasons for use, according to some reports.
Among young people, one study found that one-third of teens use an AI companion for social interactions, one in 10 report that AI conversations are more satisfying than human conversations, and one in three say they would choose an AI companion over a human for a serious conversation.
Given this evidence, researchers say it seems prudent to consider problematic chatbot use as a new environmental risk factor when evaluating patients with mental disorders.
In these cases, they suggest that clinicians begin by gently asking about problematic chatbot use, especially during the holiday period when vulnerable populations are most at risk, followed by more direct questions to assess compulsive usage patterns, dependence, and emotional attachment, as appropriate.
They acknowledge that AI has the potential to offer benefits in improving accessibility and supporting individuals experiencing loneliness, and say that empirical research is needed to “characterize the prevalence and nature of risks in human-chatbot interactions, develop clinical capacity to assess patient use of AI, implement evidence-based interventions for problematic addictions, and advocate for regulatory frameworks that prioritize long-term well-being over superficial and short-sighted engagement metrics.”
In the meantime, they conclude, the priority is to focus on and build on evidence-based strategies to reduce social isolation and loneliness.
