A deepfake scam in China has raised concerns that artificial intelligence could make people's financial information easy prey for hackers. Using advanced AI software, hackers reportedly persuaded a man in northern China to send money to what he believed was a friend's account, when it was in fact a fraudulent one.
The victim is based in Baotou, Inner Mongolia, and local police say the hackers used AI to recreate a friend's likeness, including his face and voice, during a video call. The deepfake tricked the victim into believing he was sending 4.3 million yuan (approximately $622,000) to his friend as a bid deposit. After the friend told him he knew nothing of the transaction, the man reported the fraud to authorities, who were able to recover most of the stolen funds and are still working to recover the rest, according to a Reuters report.
Chinese microblogging site Weibo has become a forum for discussing the ongoing threat of deepfake scams, with the hashtag "#AI scams exploding across the country" going viral on Monday and drawing over 120 million views. "This shows that photos, audio, and video can all be used by scammers," one user wrote. "Can information security rules keep up with these people's technology?"
This recent incident comes amid a significant increase in AI scams around the world, including reports of scammers using AI technology to reproduce voices and extract money over the phone. A report released by the US Department of Homeland Security warns against deepfake scams, stating: "It is apparent that the severity and urgency of the current threat from synthetic media depends on the exposure, perspective, and position of the questioner," with assessments varying widely.
In the UK, the CEO of a local energy company transferred €220,000 (about $243,000) to a Hungarian supplier's bank account after receiving a call from someone who appeared to be his boss. The voice actually belonged to a fraudster who used AI voice technology to recreate the boss's voice, the CEO told The Wall Street Journal, saying he recognized the subtle German accent and the "melody" of his boss's voice.
Meanwhile, in the United States, a police department in southern Wisconsin warned residents earlier this month of a similar scam after receiving a report of a call from someone who "sounded like a relative," NBC Chicago reported. Police said they could not officially confirm it was an AI-generated voice, but wrote on Facebook: "We want the community to know that this technology is out there."
The Department of Homeland Security said, "These scenarios will undoubtedly increase as the costs and other resources required to create usable deepfakes simultaneously decrease." To avoid becoming a victim, DHS encourages people to call back the person who supposedly asked for the money, and to be aware of common fraudulent requests, such as asking the victim "to make a wire transfer, send cryptocurrency, or purchase gift cards and give them the card numbers and PINs."
