AI scam call: This mother believes fake kidnappers cloned her daughter’s voice

(CNN) Jennifer DeStefano’s phone rang one afternoon as she stepped out of her car outside the dance studio where her younger daughter Aubrey was rehearsing. The caller ID showed an unknown number, and she briefly considered not answering.

But her eldest daughter, 15-year-old Brianna, was training for a ski race, and DeStefano feared it could be a medical emergency.

“Hello?” she answered over speakerphone as she locked her car and took her purse and laptop bag to the studio.

She was greeted with screams and sobs.

“Mom! Fuck you!” said a girl’s voice.

“What happened!? What happened!?” DeStefano asked.

“The voice sounded just like Bree’s. The inflection, everything,” she recently told CNN. “As can happen with skiing, I panicked, thinking she was being dragged off the mountain.”

A deep male voice began barking orders as the cries for help continued in the background. “I’m going to have my way with her and drop her off in Mexico, and you’ll never see her again,” he said.

DeStefano froze, then ran into the dance studio and, trembling, called for help. She was quickly surrounded by other parents.

After a chaotic, rapid-fire sequence of events, including a million-dollar ransom demand, a 911 call, and frantic efforts to reach Brianna, the “kidnapping” was exposed as a hoax. A bewildered Brianna called her mother to say she had no idea what the fuss was about and that everything was fine.

But DeStefano, who lives in Arizona, will never forget those four minutes of terror and confusion.

“Mothers know their children,” she later said. “You can hear your child cry across a building, and you know it’s yours.”

Artificial intelligence has made kidnapping scams more believable

The call came just after she had stopped outside the dance studio in Scottsdale, near Phoenix, at about 4:55 p.m. on January 20.

DeStefano now believes she was the victim of a virtual kidnapping scam, a scheme that has targeted people across the country by mimicking the voices of loved ones to make threats and demand money. In the United States, families lose an average of $11,000 per fake kidnapping scam, according to Siobhan Johnson, an FBI special agent and spokesperson in Chicago.

Americans lost $2.6 billion to imposter scams last year, according to Federal Trade Commission data.

In audio from the 911 call, provided to CNN by Scottsdale police, another mother at the dance studio tries to explain to the dispatcher what is going on.

“A mom just came in. She got a call from someone who has her daughter…it sounds like the kidnapper on the phone says he wants $1 million,” the woman says. “He won’t let her talk to her daughter.”



Jennifer DeStefano (right) was at a dance rehearsal with her younger daughter Aubrey (center) when she received a call claiming her oldest daughter, Brianna (left), had been kidnapped.

In the background, DeStefano can be heard yelling, “I want to talk to my daughter!”

The dispatcher quickly identified the call as a hoax.

“So it’s a very popular scam,” she said. “Are they asking her to pick up gift cards or something?”

Imposter kidnapping scams have been around for years. Sometimes the caller contacts grandparents, claiming their grandchild has been in an accident and needs money. Fake kidnappers have also used generic recordings of people screaming.

But federal officials warn that such schemes are growing more sophisticated, and that several recent ones share a common thread: the rise of cheap, accessible artificial intelligence (AI) programs that let scammers clone voices and fabricate clips of loved ones who are supposedly in danger.

“A pretty good clone can be created with under a minute of audio, and some claim that a few seconds is enough,” said Hany Farid, a professor of computer science at the University of California, Berkeley, and a member of the Berkeley Artificial Intelligence Lab. “The trend over the past few years has been that less and less data is needed to make a convincing fake.”

Farid said AI voice-cloning software costs as little as $5 a month, making it easily accessible to almost anyone.

The Federal Trade Commission warned last month that scammers could obtain audio clips from victims’ social media posts.

“Scammers may use AI to clone the voice of your loved one,” the agency said in an alert. “All he needs is a short audio clip of your family member’s voice (available from content posted online) and a voice-cloning program. … (He’ll) sound just like your loved one.”

DeStefano: “It was… the sound of her voice.”

Until that day, DeStefano had never heard of virtual kidnapping scams. Law enforcement has not confirmed whether AI was used in her case, but DeStefano believes the crooks cloned her daughter’s voice.

She doesn’t know how they could have gotten it.

Brianna has a small social media presence: a private TikTok account and a public Instagram account with photos and videos of her ski races and events. But most of her followers are close friends and family, DeStefano said.

“It was clearly the sound of her voice,” she said. “It was the crying, the sobbing. What really got me is that she’s not a wailer. She’s not a screamer. She’s not one to freak out. But the crying matched her voice.”



Jennifer DeStefano (right) with daughter Brianna: “Mothers know their kids,” she said.

That day at the dance studio, DeStefano worked on persuading the caller to lower the ransom. Meanwhile, she asked her daughter Aubrey to call Brianna or her father, who were together at a ski resort in northern Arizona, 110 miles away.

Aubrey, 13, trembled and cried, believing the screams were her sister’s.

“Aubrey…heard all the vulgar things they said they were going to do to her sister. Lots of swearing, threats,” DeStefano said.

Another mother took Aubrey’s phone and tried to reach DeStefano’s husband and Brianna. At that point, the threat still seemed real.

Many such scams are happening in Mexico, FBI says

There is no data on how many people are targeted each year by virtual kidnappers.

Most of the calls originate from Mexico and target the southwestern United States, which has a large Latino community, said FBI special agent Johnson.

In the midst of a distressing call, distraught parents and relatives often fail to question whether the voice they hear really belongs to their loved one.

The FBI has not seen an increase in virtual kidnappings in the age of AI, but Johnson said the bureau issues regular reminders to educate people about the scam.

“I don’t want people to panic. I want them to be prepared. This is an easy crime to stop if you know what to do ahead of time,” Johnson said.

Technology has certainly made deepfakes a major concern, she added, but AI itself is not the problem; the criminals using it are.

AI expert Farid says that, to the best of his knowledge, current versions of AI software cannot replicate a voice to express a wide range of emotions, like that of a frightened child.

But he said he could not completely rule out the possibility that the screams and sobs were AI-generated.

“It’s very possible that a clone capable of screaming is just around the corner, or that the technology already exists and I simply don’t know about it,” he said.

Caller eventually reduced ransom to $50,000

A tense few minutes passed as no one at the dance studio could reach Brianna, her father, or her brother. On the phone, the ransom had dropped to $50,000, and the discussion turned to instructions for handing over the money.

One of the mothers called 911 and tried to convince DeStefano that the call was a scam, but she was too distraught to believe it.

“My response to her was, ‘That’s Bree crying. It sounded just like her,’” DeStefano said. “I didn’t believe her because… [my daughter’s] voice was so real. The crying, everything was so real.”

The caller told her he would pick her up in a white van and put a bag over her head, then drive her somewhere to hand over the money, because a $50,000 wire transfer could be traced. “If you don’t take all the cash with you, you and your daughter will die,” he said.



Jennifer DeStefano exchanged this text message with her son Alex while on the phone with the alleged kidnapper.

DeStefano tried to buy enough time for the police to arrive.

“I had him on mute while I was having conversations with the others. Then I’d unmute him and say, ‘Hi, I’m sorry, I’m trying to find a way to get you the money. I’m trying to see where I can draw it from.’”

As they discussed how to hand over the money, someone passed DeStefano a phone. On the line was Brianna.

“She was like, ‘Mom, I’m in bed. I don’t know what’s going on. I’m fine. I’m fine.'”

An enraged DeStefano burst into tears and lashed out at the caller for lying. She hung up on him and called the police.

Tips to stop fake kidnappers

DeStefano said that day changed the way she answers the phone. She is wary of unknown numbers and rarely says a word until the caller speaks first, fearing her voice could be captured and cloned for a future virtual kidnapping. She has also tried to work out how the scammers obtained her daughter’s voice, running through several scenarios.

“They could have called her,” she said.

But not every virtual kidnapping attempt falls apart the way DeStefano’s did. Johnson, the FBI agent, shared tips on how to avoid being scammed:

  • Do not post information about upcoming trips on social media. It gives scammers a window to target your family. “If you’re on a plane, your mom can’t call to make sure you’re fine,” Johnson said.
  • Create a family password. If someone calls claiming to have kidnapped your child, ask them to have the child give the password.
  • If you receive such a call, slow things down and notify law enforcement. “Write a note to someone else in the house to let them know what’s going on. Call someone,” Johnson said.
  • If you are in the middle of a virtual abduction and there is someone else in the house, ask that person to call 911 and have the dispatcher contact the FBI.
  • Be careful about giving financial information to strangers over the phone. Virtual kidnappers often demand ransoms via wire transfer services, cryptocurrencies, or gift cards.
  • Most importantly, don’t trust the voice you hear on the phone. Try to reach your loved one directly, or ask a family member, friend, or someone else in the room to do so.

CNN’s John Surlin contributed to this article.


