An Arizona mother is warning other parents after she nearly fell victim to a kidnapping hoax that sounded all too real.
Jennifer DeStefano’s experience began when she got a call from an unknown number. She normally wouldn’t have picked it up, but she knew her 15-year-old daughter, Brianna, was on a ski trip, and she worried there might have been an emergency.
“It’s the voice of a daughter crying and sobbing, ‘Mom,’ and I’m like, ‘OK, what happened?'” DeStefano recalls.
“She was like, ‘Mom, these bad guys have me. Help me, help me.'”
DeStefano said a man then demanded a ransom in exchange for Brianna’s safe release.
Luckily, DeStefano was able to confirm within minutes that her daughter was safe, but she said the scheme used artificial intelligence, or AI, to recreate her daughter’s voice.
“For these people to try to make this happen, they obviously needed some information to track down me and some of my family members and actually pull this together,” she said. “So it definitely scared me.”
Sinead Bovell, founder of tech education company Waye, told GMA that obtaining a recording of someone’s voice may not be difficult given the prevalence of social media.
“Most people today have some form of online identity and, especially if they are under 25, they have probably spoken in some way, in some aspect, that has been recorded,” Bovell said. “So in terms of authentication and verification, this becomes very difficult as we move into the future with these AI generators or synthetic audio.”
Experts say voice cloning can be done in seconds with the right software, making it easy for criminals and other bad actors to access and misuse. That is a pressing concern given the lack of oversight of AI technology today.
“There are many positive and exciting aspects to these technologies,” Bovell said. “But of course they also carry a lot of risk and harm.”
To guard against similar AI voice-cloning scams, experts recommend keeping social media profiles private and watching for red flags such as calls from unknown numbers or international area codes.