AI companion apps like Replika need more effective safety management, experts say



The idea of having an emotional bond with digital characters was once a foreign concept.

Today, artificial intelligence (AI) "companions" are increasingly functioning as friends, romantic partners, or confidants for millions of people.

Fuzzy definitions of "companion" and "use" (for example, some people treat ChatGPT as their partner) make it difficult to say exactly how widespread this phenomenon is.


However, the AI companion apps Replika, Chai and Character.AI have each been downloaded 10 million times from the Google Play store alone, and in 2018 Microsoft's China-based chatbot Xiaoice had 660 million users.

These apps let users construct characters with names and avatars, then chat with them by text or hold audio and video calls.

But do these apps fight loneliness, or do they deepen isolation? And is there a way to tilt the balance in the right direction?

Replika: connector or isolator?

Romance and sexuality are major drawcards for the AI companion market, but people set up chatbots for a variety of other reasons too.

They may be seeking non-judgmental listening, personalized tutoring (particularly for language skills), advice, or therapy.

Stanford University researcher Bethanie Drake-Maples studies AI companions, and says some people use the apps as a reflection of themselves.

"Some people just create digital twins and interact with an externalized version of themselves," she tells ABC Radio National's series Brain Rot.

Drake-Maples published a study based on interviews with more than 1,000 students who use the AI companion app Replika.

She and her colleagues found significant benefits for some users. Most strikingly, 30 interviewees said using the app had stopped them from attempting suicide.

Many participants also reported that the app helped them build connections with other people, for example by offering advice on relationships and by teaching them empathy.

However, other users reported no benefits, or even negative experiences. And outside of Drake-Maples' research, AI companions have been implicated in deaths.

Drake-Maples notes that the study used a self-selected cohort, so it does not necessarily represent all Replika users. Her team is conducting longitudinal research to gather further insights.

However, she believes that these apps, as a whole, could be beneficial to users.

"We particularly wanted to understand whether Replika was driving relationships away or stimulating them," she says.

Significantly more people said Replika stimulated their relationships than said it drove them away.

However, this social benefit is not something that can be taken for granted.

Ms Drake-Maples is concerned that companion apps can replace people's interactions with other humans.

Participants in her study were far lonelier than the general population, although this is not necessarily unusual for young college students.

She believes that the government should regulate AI companion technology to prevent this isolation.

“You can definitely make money by isolating people,” she says.

"We absolutely need some ethical or policy guidelines around these agents being programmed to promote social connection, and not programmed to try to isolate people."

Replika has introduced a number of controls into the app, including a "get help" button that directs people to professional helplines or to scripts based on cognitive behavioral therapy, and a message coding system that flags "unsafe" messages and responds with a set type of reply.

Drake-Maples thinks this is a good example for other apps to follow.

“These things need to be fully mandated,” she says.

Is Replika really doing the right thing?

Raffaele Ciriello, a researcher at the University of Sydney, is skeptical of Replika's safety controls, calling them "superficial cosmetic modifications."

He points out that the controls were only introduced months after the Italian government, citing concerns including a lack of age verification, ruled in early 2023 that Replika had to stop using Italian citizens' data.

"They were afraid of a regulatory backlash," he says.

Dr. Ciriello also interviews AI companion users in his research, and while some say they have found benefits, he argues the apps are primarily designed to foster emotional dependence.

"If you're looking at the way [Replika is] making money, they have all the incentive to make users hooked and reliant on their products," he says.

Replika works on a "freemium" model: the base app is free, with extra features (including romantic partner options) available via a paid subscription. Other companion apps follow the same model.

"Replika and its relatives have Silicon Valley values embedded in them, and we know what those look like: data, data, profits, profits, profits," says Dr. Ciriello.

Nevertheless, he believes it is possible to build AI companion technology more safely and ethically.

Companies that consult vulnerable stakeholders, embed crisis response protocols, and promote their products responsibly are likely to produce safer AI companions.

Dr. Ciriello says Replika falls short on some of these fronts. For example, he calls its advertising "deceptive."

The company bills its product as "the AI companion who cares."

"[But] it's not conscious, it's not empathetic, it's not actually compassionate," says Dr. Ciriello.

The app offers "romantic" wardrobe choices for your avatar.

Replika's business model relies on users paying to upgrade their companions, for example to access romantic and sexually explicit modes. (Supplied: Luka/Replika)

A spokesperson for Replika said that "the AI companion who cares" was "not a claim of sentience or consciousness."

“The phrase reflects the emotionally supportive experiences that many users report, and speaks to our commitment to designing thoughtfully and respectfully,” they said.

"In this regard, we are also working with institutions such as Harvard's Human Flourishing Program and Stanford University to better understand how Replika affects wellbeing, and to help shape responsible AI development."

Dr. Ciriello says the women-focused Australian app Jaimee is an example of a more ethically designed AI companion, though it faces "the same commercial pressures" as the larger apps on the market.

The California Senate passed a bill last week regulating AI chatbots. If the bill goes on to become law, it would, among other things, require companions to regularly remind users that they are not human, and enforce transparency around suicide and crisis data.

The bill is promising, says Dr. Ciriello.

"If the history of social media has taught us anything, I would like to see a national strategy in Australia that has some control over how these technologies are designed, what their incentives are, and how their algorithms work."

However, he adds that research on these apps is still in its early stages, and it will take years to understand their full impact.

"It will take some time for that research to come out and inform sensible laws."

Listen to the full episode on the rise and risks of AI companions, and subscribe to the podcast for more.


