- The rise of generative AI in China has prompted people to use the technology to recreate their loved ones.
- They use old photos, recordings, and messages to train chat programs to imitate the dead.
- The technology has existed for some time, but experts told Insider it could raise serious ethical questions.
In 2020, a young Chinese software engineer in Hangzhou stumbled across an essay on lip-sync technology. The premise is relatively simple: use a computer program to match lip movements to voice recordings.
It was his grandfather, who had died nearly ten years earlier, who came to mind.
“Could I see my grandfather again using this technology?” the engineer, Yu Jialin, asked himself.
A chronicle of his journey to recreate his grandfather, published in April by journalist Tan Yucheng in the state-run magazine Sixth Tone, is one of several accounts now surfacing in China of people using artificial intelligence to reanimate the dead.
Combining a variety of emerging AI technologies, people in this country are building chat programs (known as griefbots) that embody the personalities and memories of the deceased in hopes of having the opportunity to talk again with their loved ones.
For Yu, the technology offered the chance to say final words to the man who helped raise him.
The software engineer, now 29, told Tan that he was 17 when his grandfather died.
He still regrets two occasions when he was harsh with his grandfather: Yu once yelled at the old man for interrupting a gaming session, and another time told his grandfather to stop picking him up from school, Tan reported.
Yu’s family stopped mentioning his grandfather after his death, Tan reported. “The whole family was trying their best to forget him instead of remembering him,” Yu said.
Griefbots jump on the ChatGPT bandwagon
The griefbot concept, an AI-powered program that learns to imitate a person from mementos such as photos and recordings, has been experimented with for years. But rapid advances in generative AI this year have pushed the power and accessibility of griefbots to a whole new level.
Older models required huge datasets. Now even laymen and lone engineers like Yu can input tidbits of a person’s past into a language model to recreate almost exactly how a person looks, speaks, and thinks.
“With today’s technology, AI doesn’t need that many samples to learn a person’s style,” Haibin Lu, a professor of information and analytics at Santa Clara University, told Insider.
Systems like ChatGPT, a popular text-based program that faithfully mimics human speech, have already learned how most people naturally speak and write, said Lu, who conducts AI-focused research.
“To get to roughly 99% similarity with a specific person, you just need to tweak the system a bit, and the obvious differences will be minimal,” Lu said.
To teach the AI model what his grandfather was like, Yu pulled out a stack of old letters his grandmother had exchanged with his grandfather when they were young, Tan reported. The letters revealed a side of the man that Yu had never seen as a child.
Tan said the software engineer unearthed photos and videos taken more than a decade ago and found text messages his grandfather had sent him.
But even after weeks of testing and training, the technology has a long way to go before it can deliver anything akin to the robotic replicas of “Black Mirror.” Yu’s bot had obvious limitations, and it took 10 minutes to respond to each prompt, Tan reported.
“Hey Grandpa. Who do you think I am?”
Grandpa replied with a general answer.
Tan said the bot replied, “Who you are doesn’t matter at all. Life is a beautiful miracle.”
But as Yu fed the AI more information about his grandfather, it was able to more accurately reflect the man’s habits and preferences. For example, Yu told Tan, it learned about his grandfather’s favorite show.
“Happy Tea House is no longer on the air,” Yu told the chatbot.
“I’m sorry. My favorite show is no longer available. I wish I could have watched a few more episodes,” Grandfather’s bot replied.
That was the moment he felt he was getting somewhere, he told Tan. Eventually the program was refined enough that Yu felt confident showing his work to his grandmother. She silently watched her late husband answer her questions, thanked her grandson, stood up, and left the room.
Yu told Tan that his grandmother needed the chatbot to process her emotions and mourn. “Otherwise why would she thank me?” he said.
As for himself, he refused to share intimate conversations with his grandfather’s bot.
“But I think my grandfather forgave me in the end,” he told Tan.
Mourning with the times
Sue Morris, director of bereavement services at the Dana-Farber Cancer Institute in Boston, told Insider that as technology advances, it’s natural for humans to grieve differently.
In the 1980s, people wrote down stories to remember their loved ones, said Morris, a psychology professor at Harvard Medical School. Now, in the digital age, preserving photos and videos of deceased people is much more common, she said.
Psychologists often help grieving clients by having them speak to an empty chair as if their loved one were sitting there, then asking them to imagine the person’s response.
“These griefbots feel like a technological step up from that,” Morris said.
Many people deal with grief by controlling when and how they process their emotions, she added, but griefbots can take that control away from users.
“You choose when you look at photos and videos and how long you look at them,” she said.
Unexpected triggers, such as an insensitively timed message from a chatbot, can leave people grief-stricken, Morris said. “Perhaps 98% of the time, the program will say the right thing. But what if a small percentage of the time it doesn’t? Is it possible that someone spirals further downward?” she said.
Still, while griefbots may sound offensive to some, history shows that social norms around the dead are constantly changing, Mary Frances O’Connor, director of the University of Arizona’s Grief, Loss and Social Stress Lab, told Insider.
When photography became available to the public in the 19th century, people took pictures with the corpses of their loved ones and hung them in their living rooms, O’Connor said.
“Nowadays you might think this living room display is morbid, but back then it was the norm,” O’Connor said.
GPT-powered griefbots spread in China
As generative AI gains momentum in China, so too do stories of new griefbots. Another Chinese man who used AI to “resurrect” a loved one, a 24-year-old Shanghai blogger who goes by the name Wu Wu Liu, became a hot topic on social media after he trained a chatbot to imitate his grandmother, who died in March.
Like Yu’s grandfather bot, Wu’s bot gave limited responses. “But it feels good to see Grandma and be able to talk more,” he said.
Wu said he used ChatGPT, but access to the platform has been restricted in China since February 24.
“I wish I had seen this video sooner,” read the top comment on Wu’s page. “My grandmother passed away last winter. I was caught off guard. There are no audio recordings or high-resolution photos of my grandmother.”
And during this year’s annual Tomb Sweeping Festival, a Chinese cemetery used GPT software and voice-cloning AI to recreate people buried at the facility, the outlet YiCai reported. The cemetery, whose platform has thousands of users, charges about $7,300 to recreate a deceased person, YiCai said.
Seeking human connection through virtual bots has become commonplace in China. Xiaoice, a Chinese chatbot assistant launched in 2018 that presents as a teenage girl, has more than 660 million users. According to Microsoft, which runs the flagship bot, Xiaoice can act as a confidant and friend, and even accepts gifts from fans.
Earlier griefbots have found footholds elsewhere in the world. Several US companies and research projects offer griefbot-like services, such as Replika, which is now marketed as a social AI app.
In Canada, a man named Joshua Barbeau digitally recreated his girlfriend in 2021 using Project December, a program built on a predecessor of today’s GPT software. Barbeau told the San Francisco Chronicle that talking with a chatbot version of his girlfriend, who had died eight years earlier of a rare liver disease, helped him process his sense of loss.
And then there’s the Korean documentary “Meeting You,” which features a young mother tearfully reuniting with her dead 7-year-old daughter in virtual reality. Viewers worried that the show was emotionally manipulative, but the mother in the episode thanked the producers for the experience, saying she “had a sweet dream.”
Griefbots remain controversial
But Lu, an information analytics professor, said griefbots and their byproducts could pose a serious ethical dilemma.
Fraudsters could easily assume the identities of the dead, he said. According to Lu, they could feed a person’s data into an AI and pose as mediums communicating with the person’s soul.
“And there’s no scientific evidence that psychic abilities work, right? But no one can disprove it either,” he said.
There is also the challenge of obtaining consent from the dead, Lu said.
“In a future where everyone knows about this technology, you may be able to sign a document saying that your descendants can use your information, or barring them from doing so,” Lu said.
US-based company HereAfter.AI offers an opt-in experience that allows people to upload their personalities online. AI learns about each person through submitted photos, audio logs, and surveys, creating a digital avatar that can talk to friends and family after death.
Its founder, James Vlahos, spent months recording his terminally ill father recounting memories and reminiscing about his life, then fed the recordings into a “Dadbot” so that his father could, in a sense, live on after he was gone.
But Lu said it is highly unlikely that a typical person who has died today would have granted such permission. And without it, even children and grandchildren using the person’s information could be problematic, he added.
“If a person dies, even if it is a close relative, it does not mean that others have the right to disclose their personal privacy,” Lu said.
As for software engineer Yu, his grandfather’s bot is gone. Yu told Sixth Tone that he was afraid of relying too heavily on the AI for emotional support, so he decided to delete the grandfather bot.
“Maybe these feelings would have become so overwhelming that I couldn’t work or live,” he told Tan.
