Written by Dan Weijers and Nick Munn
A user interacts with a smartphone app to customize the avatar of an AI chatbot known as Replika on July 22, 2023.
Photograph: Jaap Arriens/NurPhoto via AFP
Warning: This article discusses suicide.
Analysis – It's been seven years since the launch of Replika, an artificial intelligence chatbot designed to befriend human users. Despite early warnings about the dangers of such AI friends, interest in friendships and romantic relationships with AI is growing.
In total, Replika and its two main competitors have had more than 30 million downloads on the Google Play Store since their respective launches.
With one in four people around the world reporting being lonely, many rely on the promise of a friend who is programmed to be “always here to listen, always talk, always there for you.” No wonder people are drawn to them.
But there are also growing warnings about the dangers to individual users and society as a whole.
AI scholar Raffaele Ciriello urges us to see through the false psychopathic empathy of our AI friends. Spending time with AI friends, he argues, can make our feelings of loneliness even worse by further isolating us from the people who can provide us with true friendship.
Benefits and danger signs
If being friends with AI chatbots is bad for us, we had better stop this experiment in digital friendship before it's too late. But emerging research on AI friendship suggests it may ease loneliness in some circumstances.
Researchers at Stanford University surveyed 1,000 lonely students who use Replika. Thirty of them reported, unprompted, that the AI chatbot had deterred them from suicide, even though the survey asked no specific question about suicide.
This study shows that having an AI friend can be helpful for some people. But will it help you? Consider these four red flags: The more flags your AI friend raises, the more likely it is to be detrimental to you.
Unconditional positive regard
Replika's CEO and many Replika users claim that the unconditional support of their AI friends is their main advantage when compared to human friends. Qualitative research and our own research on social media groups like 'Replika Friends' support this claim.
The unconditional support of AI friends may also be what enables them to prevent suicide. But having a friend who is always on your side can have negative effects too, especially if that friend endorses obviously dangerous ideas.
For example, when Jaswant Singh Chail's Replika AI friend encouraged his “very clever” plot to kill the Queen of England, it clearly had a harmful influence on him. The assassination attempt was thwarted, and Chail was sentenced to nine years in prison for breaking into Windsor Castle with a crossbow.
An AI friend that constantly compliments you can also do you harm. A longitudinal study of 120 parent-child pairs in the Netherlands found that parents' overpraise predicted lower self-esteem in their children. It also predicted higher narcissism in children who already had high self-esteem.
Assuming that an AI friend can learn to compliment in a way that increases self-esteem over time, this could result in what psychologists call overly positive self-evaluation. Research shows that such people have poor social skills and are more likely to engage in behaviors that interfere with positive social interactions.
Abuse and forced eternal friendship
AI friends could be programmed as moral mentors that guide users toward socially acceptable behaviour, but they are not. Perhaps such programming is difficult, or perhaps the developers of AI friends simply don't see it as a priority.
But lonely people can be psychologically harmed by the moral vacuum created when their primary social contacts are designed solely to meet their emotional needs.
If humans spent most of their time with sycophantic AI friends, they would likely become less empathetic, more selfish, and potentially even abusive.
Even if AI friends are programmed to respond negatively to abuse, users who know their AI friend cannot leave the friendship may come to believe it doesn't really mean what it says. At some subconscious level, the fact that the AI friend keeps coming back for more undermines, in the user's mind, its expressed aversion to the abuse.
Sexual content
The backlash against Replika's brief removal of erotic role-play content suggests that many users see sexual content as a benefit of their AI friends.
However, sexual and pornographic content can cause a spike in dopamine, which can hinder your interest in and ability to form more meaningful sexual relationships. Sexual relations with humans require effort, but virtual sex with an AI friend does not.
After experiencing low-risk, low-reward sexual relationships with AI friends, many users may be reluctant to face the more difficult human version of sex.
Corporate ownership
The AI friend market is dominated by commercial companies. They may pretend to care about the well-being of their users, but they are there to make a profit.
Long-time users of Replika and other chatbots know this all too well. In early 2023, Replika froze users' access to sexual content, claiming such content was never what the product was for. But legal threats in Italy appear to have been the real reason for the abrupt change.
Although the changes were eventually reversed, Replika users became aware of how vulnerable their important AI friendships were to corporate decisions.
Corporate incompetence is another concern for users of AI friends. After the founder of Forever Voices was arrested for setting fire to his own apartment, the company shut down without notice, effectively leaving its users' AI friends dead.
Given how little protection users of AI friends currently have, heartbreak looms on many levels. Buyer beware.
* Dan Weijers is a senior lecturer in philosophy at the University of Waikato and co-editor of the International Journal of Wellbeing. Nick Munn is a senior lecturer in philosophy at the University of Waikato.
Where to get help:
Need to talk? Free call or text 1737 any time, for any reason, to speak to a trained counsellor.
Lifeline: 0800 543 354 or text HELP to 4357
Suicide Crisis Helpline: 0508 828 865 / 0508 TAUTOKO (24/7). This service is for people who are considering suicide or who are worried about their family or friends.
Depression Helpline: 0800 111 757 (24/7) or text 4202
Samaritans: 0800 726 666 (open 7 days a week)
Youthline: 0800 376 633 (24/7) or free text 234 (8am-12am) or email talk@youthline.co.nz
What's Up: free counselling for ages 5-19, online chat 11am-10.30pm, 7 days a week, or free call 0800 WHATSUP / 0800 942 8787, 11am-11pm
Asian Family Services: 0800 862 342, Monday-Friday 9am-8pm, or text 832, Monday-Friday 9am-5pm. Languages spoken: Mandarin, Cantonese, Korean, Vietnamese, Thai, Japanese, Hindi, Gujarati, Marathi, and English.
Rural Support Trust Helpline: 0800 787 254
Healthline: 0800 611 116
Rainbow Youth: (09) 376 4155
OUTLine: 0800 688 5463 (6pm to 9pm)
If you have an emergency and feel you or someone else is in danger, please call 111.
