
Apple and Google have removed apps promising deepfake nudes from the App Store and Google Play, according to a report from 404 Media.
Earlier this week, 404 Media revealed that Instagram was hosting ads for an app that uses AI to generate deepfake nudes. One such ad used a photo of Kim Kardashian with the slogan, "Undress any girl for free. Try it." That service, and others like it, appears to let users upload real photos of women and generate fake nude images of them. The 404 Media report came just a few weeks after Meta moved to make it harder to send and receive nudes via Instagram messages, an initiative the company said is aimed at protecting teenage users from predators.
Deepfake nudes have been a persistent problem since the early days of generative AI. AI-generated nudes of Taylor Swift flooded X in January; one such image garnered tens of millions of views and 24,000 reposts. Deepfake nudes of teen girls at schools in New Jersey, Washington, and California prompted investigations beginning in late 2023. So far, roughly two dozen US states have introduced legislation to crack down on sexually explicit AI-generated content. And last month, two teenagers were arrested in Miami, Florida, and charged with third-degree felonies for allegedly using a so-called "undressing" app to create nudes of their classmates.
There is no federal law governing AI-generated nudes, but US senators introduced a bill in January that would allow victims to sue perpetrators.
"While the images may be fake, the harm to victims from the distribution of sexually explicit deepfakes is very real," US Senators Richard Durbin and Lindsey Graham said in the announcement of their bill, called the DEFIANCE Act. "Victims have lost their jobs, and they may suffer ongoing depression and anxiety."
