What does our relationship with AI mean for the future? – Wellesley News

While scrolling through my Instagram feed, one post in particular caught my eye: a New York Times Magazine piece titled "They fell in love with an AI chatbot and found something real," featuring photos of three middle-aged people, each with a snippet about how a love affair with an AI chatbot changed their life. Incredulous, I opened my laptop to read the article and learned about the site one of the interviewees was using: Replika. The "Our Story" section of Replika's website includes a video detailing founder Eugenia Kuyda's story and how she came to create this AI chatbot companion. In the video, Kuyda explains that her best friend passed away a few years ago, and that in her grief she realized that by feeding his old text messages and emails into an AI, she could preserve his memory and create a digitized version of him that she could continue to talk to and confide in.

These relationships with AI are on the rise, as evidenced by a Reddit community of over 27,000 members built around AI relationships. Members defend these relationships, citing an increased sense of emotional fulfillment and reduced feelings of loneliness. Beyond Replika, people also use ChatGPT to form online companions. Often these relationships don't start out as romantic or intimate; rather, people begin using AI as a tool to complete tasks and projects. Over time, as they interact more with these models, they disclose more personal information, especially feelings and emotions, and develop deeper relationships with the chatbots.

On the one hand, it is remarkable how the seemingly real emotional intelligence of these AI systems has the power to transform the human-machine relationship from one that is purely practical into one that is emotional. On the other hand, a question arises: what is the difference between humans and machines, now that both can meet the need for social and "human" interaction? When I think about AI models like Replika, and especially their origin stories, I worry about what this steady dissolution of boundaries between humans and machines means for the future. Consider the fact that Replika was created because its founder, Kuyda, could not face the grief of losing a close friend. Grief is perhaps a core human experience, and one that each of us must eventually face. Experiencing moderate amounts of loneliness and sadness is likewise an integral part of what it means to be human, as is the relief you feel when you confide in someone else and gain the certainty that you are not alone. Forming relationships with AI chatbots forecloses real connection, which comes largely from finding common ground in the many complex emotions of being human.

All in all, it was shocking to watch Replika's origin story video and see its founder and employees assure viewers that replacing real human interaction with chatbots is perfectly normal, and in some ways even better. That assurance seems poised to deepen the very problems that AI chatbots like Replika claim to solve. When people replace living human interactions with inorganic ones to counter feelings of loneliness and hopelessness, they can increase the social distance and alienation that gave rise to those feelings in the first place.

As AI models continue to evolve, it is important for AI developers and designers to keep in mind what the role of AI should be. But if technology companies will not develop models with ethics at the forefront, then perhaps it is all the more important that users of AI do their part. That means limiting AI to more basic technical functions rather than using it to replace human connection. The turn to AI as a way to manage difficult human emotions perhaps points to a larger problem: a lack of both community and a sense of responsibility to be there for those around us. However small an impact we believe we have on others, it is our collective responsibility to be that community for one another.

Contact the editors responsible for this article: Caitlin Donovan, Avery Finley
