The AI voice scam is here

Earlier this month, a local television news station in Arizona aired a disturbing report. A mother named Jennifer DeStefano said she picked up the phone to the sound of her 15-year-old daughter crying, and was told to pay a $1 million ransom to get her back. In reality, the teenager had not been kidnapped and was safe; DeStefano believes someone used AI to create a replica of her daughter’s voice and deployed it against her family. “It was completely her voice,” she said in one interview. “It was her inflection. It was the way she would have cried.” DeStefano’s story has since been picked up by other outlets, and similar accounts of AI voice scams have surfaced on TikTok and in The Washington Post. In late March, the Federal Trade Commission warned consumers that bad actors are using the technology to supercharge “family-emergency schemes,” scams that fake an emergency to trick a worried loved one into forking over cash or personal information.

Applications like these have been around for a while: in 2018, my colleague Charlie Warzel used a rudimentary voice-cloning program to trick his mother. But the technology has improved rapidly, and now anyone with an internet connection can synthesize a stranger’s voice. At stake is the ability of ordinary people to trust that the voices they interact with from afar are legitimate. We could quickly enter a society in which you never quite know whether a call is really from your mother or your boss. Some experts say it’s time for families to set up systems to guard against voice synthesis, such as a code word, a kind of human two-factor authentication.

One easy way to combat this kind of trickery is to establish a code word that can be used to verify identity. For example, you might agree that any urgent request for money or sensitive information must include the term lobster bisque. The Washington Post columnist Megan McArdle has endorsed the idea, and Hany Farid, a professor at the University of California, Berkeley, who studies digital forensics, told me he’s a fan of it too. “It’s very low-tech,” he said. “You’ve got this super-high-tech voice clone, and you’re like, ‘What’s the code word, you bastard?’”

But beware of slipping into paranoia too quickly. A wholesale loss of trust in audio and video carries its own danger: experts worry about the “liar’s dividend,” whereby greater public awareness of counterfeits makes it easier for malicious actors to dismiss legitimate media as fake. And it’s not as if America suffers from an excess of trust these days. Confidence in the media and in institutions, including organized religion and public schools, is polling miserably, while AI amplifies our ability to spread false information online. “We want people to know what is possible,” Henry Ajder, an expert who has spent five years researching synthetic-speech technology, told me. “But we also definitely don’t want to scare people.” If you get an unusual call, it’s always a good idea to stay calm and ask common-sense questions that your loved one should know how to answer.

Beyond the anecdotes, virtually no data exist on AI voice scams. Juliana Gruenwald, an FTC spokesperson, told me the agency doesn’t track how often AI voice clones are used in fraud, and its fraud-report statistics for the first three months of the year show no increase in reports of scammers impersonating family and friends. The FBI, which also keeps data on phone fraud, did not respond to a request for comment.

Still, there is clearly a real risk here. Last month, for a story about a surge of voice clones on TikTok, I replicated Taylor Swift’s voice using just one minute of her speaking in an old YouTube interview. With ElevenLabs’ online Instant Voice Cloning tool, the process took five minutes and cost $1. (The company did not respond to a request for comment about how its software might be used to defraud people.) All the program needs is a short audio clip of a person speaking: upload it, and the AI does the rest. And you don’t have to be famous to be vulnerable. A single clip of your voice from a TikTok, an Instagram post, or a YouTube vlog is enough to create an AI model of your voice that anyone can use however they like. The extensive digital histories we have built up over years online could be used against us.

The technology feels lifted from a Philip K. Dick novel, but in some ways it’s a classic American story about the uncertainty of a new frontier. As one historian of the period put it, when Americans began moving from the countryside to the cities in the mid-19th century, the nation developed a “genuine cultural fascination” with con artists, born of an “anxiety about being in these new big spaces with all kinds of strangers I’m going to interact with, and I don’t always know who to trust.” In response, for better or worse, we developed technologies such as the credit score. The AI-powered internet seems likely to revive those earlier fears, and perhaps to inspire new responses of its own.

We are in a moment of flux, still trying to figure out the benefits and costs of these tools. “I think this is one of those cases where we built it because we could and/or because we could make money off of it,” Farid said, “and maybe nobody stopped to think about whether or not we should.” There are legitimate use cases for voice replication: it can, for example, empower people who have impaired or lost the ability to speak. In 2021, AI helped the actor Val Kilmer use his voice again after throat cancer took away his ability to speak naturally. But beneficial uses don’t necessarily require unregulated, free-for-all access, Farid noted.

Many AI critics argue that we need to slow down and think harder about what this technology could unleash if left unchecked. Voice replication seems like an area where we really should. Perhaps humans will adapt alongside AI and develop new verification habits that help restore trust. But once we start doubting whether the person on the other end of the line is really who they claim to be, we are in an entirely new world. Maybe we are already there.


