Scammers use AI to clone teenager’s voice in kidnapping scam, demand $1 million from mother



Divyanshi Sharma: The popularity of ChatGPT has sparked widespread interest in artificial intelligence (AI), with many companies working on the technology. Google, Microsoft and Meta are all investing heavily in AI and have made it a focus for the coming years. Even Elon Musk is gearing up to launch a new AI company called X.AI, according to a recent report. The concept of artificial intelligence has been around for decades, but with the advent of OpenAI tools such as ChatGPT and DALL·E, its popularity has reached new heights.

However, experts often warn us about the dark side of AI and how the technology can be abused. OpenAI CEO Sam Altman has previously expressed concern about the potential misuse of AI. In light of these warnings, a new case of a scammer using AI to clone a teenager’s voice and demand a ransom from her mother has shocked the world.

Scammer uses AI to clone teenager’s voice

WKYT, a US-based news channel affiliated with CBS News, reports that Arizona woman Jennifer DeStefano received a call from an unknown number one day that turned her world upside down. DeStefano told the news channel her 15-year-old daughter was on a ski trip when she got the call. The moment she picked up the phone, she heard her daughter say “Mom” followed by sobs. Then came a man’s voice: “Listen, I have your daughter. You call the police, you call anyone, and I’m going to pump her so full of drugs. I’m going to have my way with her, and I’m going to drop her off in Mexico.”

The woman added that she heard her daughter’s voice calling for help in the background. The man then demanded US$1 million to let the girl go. But when DeStefano said she didn’t have that much money, the “kidnapper” lowered the demand to US$50,000.

DeStefano added that she was at her daughter’s dance studio, surrounded by other mothers, when she got the call. Within minutes, they confirmed that her teenage daughter was safe on the ski trip. However, the voice on the phone had sounded exactly like her daughter’s.

“It wasn’t a question of who this was. It was completely her voice. It was her inflection. It was the way she would have cried,” she told local news media, adding, “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”

Victim’s mother’s Facebook post

The woman spoke about the incident in a recent Facebook post. She shared the news story on the platform and warned others to protect themselves from such scams by coming up with “family emergency words or questions that only you know,” so that you can be sure an AI isn’t fooling you.

Her post reads, “Everyone should see this!! There’s a way a free AI app can use your loved one’s voice to scam you! The number that called me was an unfamiliar one; it showed up as ‘Unknown’ rather than ‘No Caller ID’, which is common with doctors and hospitals. I’ve heard stories where the call even shows up with the parents’ own phone numbers and photos. Bree doesn’t have any public social media accounts; she has a few public interviews for sports/school that contain good samples of her voice.

“However, this is of particular concern for our kids, including our youngest, who has official accounts because of his contract sponsors; that is also why his account has remained silent for months. It is not certain whether the scammer was connected to another kidnapping, of a real friend, whose captor was just sentenced to life in prison; in that case only a voice recording was provided to his wife, and they were not successful in getting him back. In addition, I was not instructed to wire money; the demand was to physically meet the kidnapper. The police were heading to the intersection during the call as we tried to navigate the situation.

“My biggest fear is that this will be used to lure someone into a physical meeting or an actual kidnapping, as was demanded of me. If it happens to you, please report it!!! The only way to stop this is public awareness!! Have emergency words and questions ready with your family members! Stay safe.”

In 2023, amid all the hype around AI, incidents like this certainly sound alarm bells, and steps need to be taken to protect ourselves from the darker uses of the technology.



