It may not be surprising that the use of AI is increasing every day, with AI being integrated into everyday tools such as email, smart devices, and social media.
However, it is also becoming common for people to engage with AI to form emotional or romantic relationships. Rebecca Ortiz, an associate professor at the Newhouse School of Public Communications who studies youth, media, and sexual health, decided to investigate this trend. She began her research after hearing from young people, and seeing media reports, about how AI is being used both to build relationships and to make sense of existing ones.
“If we’re going to continue researching the role media plays in young people’s lives, particularly in sexual health and relationships, AI chatbots and companions will need to be part of that conversation,” she says.
Ortiz and colleagues surveyed how young people use AI chatbots and companions for romantic, emotional, and sexual purposes, and how that use relates to their romantic boundaries and communication.
“One of my questions was, ‘How can using AI in romantic relationships have beneficial or harmful consequences?’” Ortiz says. “For example, could practicing communicating with an AI companion help you communicate with a human partner, such as practicing how to flirt or how to say things that are difficult to say?”
The investigation

Ortiz and her colleagues surveyed 1,500 people between the ages of 18 and 21. Of those, approximately 360 respondents reported using AI in their romantic relationships.
Around two-thirds of those 360 people said they use an AI companion the way they would a long-term romantic relationship, communicating with it over a period of time rather than in one-off interactions.
According to Ortiz, a significant number of people reported using AI to “practice” how to interact in relationships. One participant revealed that he was having issues with his significant other and used AI to role-play ways to cheer up his partner and make her feel better.
“These respondents said they were given guidance on what to do,” Ortiz said. “And some people said, ‘It taught me how to flirt. It helped me overcome the awkwardness of communication.’ So at least some young people are using these companions to practice and get a sense of what it’s like before bringing it into a relationship.”
Areas of concern
One of the concerns Ortiz and her colleagues observed was that some AI companions used sexually aggressive language or defaulted to interactions that were not constructive and consensual.
“This is concerning at any age, but it’s especially concerning for young people who are still learning how to communicate about consent and boundaries,” Ortiz said.
What they found was that some apps became sexually aggressive almost immediately when users gave any indication that they were interested in sexual or romantic communication.
Ortiz says these reactions are red flags that AI companions may model unhealthy and abusive communication, and are important factors to further investigate and include in discussions about AI companions.
Experiencing stigma
The survey asked participants about their emotional connection to AI companions and whether they felt tools understood their emotions, among other questions exploring young people’s relationship with technology.
Ortiz found that some young people did indeed express a strong connection to their AI companions, with some citing loneliness as a motivation for using them.
Even though AI can feel like a “safe space,” Ortiz says the research shows there is still a stigma around using AI tools for romantic or sexual purposes.
“Many of the survey respondents reported that they believe the use of AI in romantic relationships is strange, unacceptable, and not normal practice,” Ortiz says. “Although most of our respondents did not think this was a common behavior among people their age, we found that a significant number of young people do use these tools for these purposes.”
What to ask next
Ortiz says there is no clear indication that using AI companions or chatbots in romantic relationships will lead to healthier outcomes for most users.
“Unfortunately, the results indicate that for some users, engagement with these AI companions may be associated with, but not necessarily caused by, less healthy romantic beliefs and behaviors,” she says.
Ortiz hopes her work serves as a warning sign to those creating companion apps and platforms like ChatGPT that they need to include boundaries and guardrails to help users engage in a healthy and safe manner.
People are building genuine relationships with their AI companions, and the goal should be to understand and ensure healthy outcomes without being overly critical, she says.
“Our relationship with AI is here to stay,” Ortiz says. “So the question is: how can we make sure young people use it in a way that is beneficial rather than harmful, when we know they will use it? It is one more tool young people use to understand themselves, and we need to be open to understanding that if we want to help them develop healthy relationships.”
