AI voice cloning is on the rise. What you need to know

A few months ago, a family in Arizona was horrified when what they believed to be a kidnapping and ransom demand turned out to be a scam created with artificial intelligence. As reports grow of fraudulent calls that sound exactly like loved ones, many people fear that AI could be weaponized against them with easily accessible technology that requires only a small fee, a few minutes, and a stable internet connection.

Jennifer DeStefano received an anonymous call one afternoon in January while her 15-year-old daughter was out of town for a ski race. When DeStefano picked up, she heard what sounded like her daughter screaming and crying. A man’s voice soon followed, threatening to drug and kidnap DeStefano’s daughter unless she paid him $1 million, she reported.

DeStefano was able to reach her daughter a few minutes later. She had not been kidnapped and had no involvement in the ransom demand, leaving the family baffled about what had happened. Emergency responders helped them identify the call as a hoax that used AI to imitate her voice.

“It was clearly the sound of her voice,” DeStefano told CNN.

Data on the prevalence of AI-assisted fraudulent calls is limited, but stories of similar incidents have continued to surface on TikTok and other social platforms this year, raising fears about the potential for harm from AI.

AI voice cloning

These AI scam calls rely on voice cloning. Once a scammer finds an audio clip of someone’s voice online, they can easily upload it to an online program that replicates the voice. Such applications emerged several years ago, but with the generative AI boom they have improved, become more accessible, and are relatively cheap to use.

Murf, Resemble, and Speechify are among the popular providers of these services. Most offer a free trial, and monthly subscription fees range from under $15 for basic plans to over $100 for premium options.

The Federal Trade Commission recommends that if you receive a worrying call from a loved one claiming to be in trouble, you call the person who supposedly contacted you at their regular phone number to verify the story. If a caller asks for money through questionable, hard-to-trace channels such as wire transfers, cryptocurrency, or gift cards, that can be a sign of fraud. Security experts also recommend establishing a safe word with loved ones that can be used to confirm a real emergency and distinguish it from a scam.

AI voice cloning in the music industry

AI voice cloning has also spread to music, where people are using the technology to create songs with vocals that sound like popular artists. A song imitating Drake and The Weeknd, created without the involvement of either artist, went viral online this month. The management company representing both artists was able to get the song removed from streaming services only because it contained an illegally sampled audio clip, not because of the AI voices. An AI-generated track of Drake rapping Ice Spice’s “Munch” also went viral this month, prompting Drake to comment, “This is the last straw AI.”

Other artists, such as Canadian musician Grimes, are embracing a future in which technology like this continues to grow and potentially changes how the music industry operates. “If a song that uses my voice is successful, we’ll split the royalties 50%,” Grimes tweeted last week. “Feel free to use my voice without penalty.”

The technology lets people write their own songs but record them in the voices of famous singers to attract attention. So far there are no legal penalties for music deepfakes, but The New York Times reports that they can damage artists’ reputations, deprive vocalists of income, and pose a risk of cultural appropriation of BIPOC artists.
