Even the Pope is concerned about how we talk to chatbots.
In a written address for Saturday’s World Day of Social Communications, Pope Leo XIV warned against personalized chatbots that replicate friendly or intimate behavior.
“Overly affectionate chatbots are not only ever-present and readily available, but they can covertly structure our emotional states, thereby invading and occupying people’s intimate spheres,” the first-ever US-born pope wrote.
The Pope called for national and international regulations to protect users from forming deceptive or manipulative emotional bonds with chatbots.
“All stakeholders, from the technology industry to policy makers, from creative companies to academia, from artists to journalists and educators, must be involved in building and implementing a conscious and responsible digital citizenship,” the Pope wrote.
The leader of the Holy See has spoken about his concerns over AI several times since being elected in May.
In his first address as pope, he said he wanted to make AI a focus of his papacy, arguing that the technology poses new challenges to “human dignity, justice and work.” In November, he sent a letter to X’s AI leaders, calling on them to “exercise moral discernment” when building AI tools.
Late last year, the Pope met Megan Garcia, whose 14-year-old son, Sewell Setzer III, died by suicide after interacting with a Character.AI chatbot.
Garcia, who lives in Florida, had filed a lawsuit against the chatbot startup Character.AI, claiming that its service, which lets people hold detailed, personal conversations with AI chatbots, was responsible for her son’s death.
Earlier this month, Google and Character.AI agreed to settle multiple lawsuits from families, including Garcia’s, whose teenagers died by suicide or were otherwise harmed after interacting with Character.AI’s bots. These are among the first settlements of lawsuits accusing AI tools of contributing to mental health crises and suicides among teenagers.
