AI-powered 'deep nude' app faces ban in UK

Applications of AI


The UK government has introduced plans to ban harmful 'nude' apps to combat the rise in online violence against women and girls (VAWG).

Ministers have pledged to make it impossible for children in the UK to take, share or view nude images using mobile phones, and the government will work with technology companies to find innovative solutions to better protect young people from digital abuse.

The creation and provision of nudification apps and tools that use generative AI to transform images of real people into deepfake nudes without consent will be completely prohibited, building on existing offenses covering the creation and sharing of these harmful images.

This new law is part of a wider package of measures designed to combat VAWG across the board, allowing police to target the companies and individuals who design and provide these tools.

The government said this would enable it to crack down on the rise in digitally-driven, financially-motivated sexual extortion and the problematic use of AI by pedophile rings.

Jess Phillips, Minister for Safeguarding and Violence Against Women and Girls, said: “Nudity apps will not be used for harmless pranks.”

“They destroy the lives of young people, and we will ensure those who create or provide them face real consequences. Every child has the right to grow up safely, and we will do whatever it takes to make that a reality.”

Although AI is still in its infancy, numerous studies have shown that the frenetic pace of the technology is fueling the epidemic of deepfake exploitation.

Figures referenced by the government show that 24 million people visited "nudification" services in September 2023, less than a year after the first incarnation of ChatGPT was launched.

Most recently, Report Remove, run by the Internet Watch Foundation and Childline, reported that in 19% of the incidents reported to it between January and September of this year, some or all of the images had been manipulated using AI.

This problem has become so endemic that more than half of teens now believe that creating and sharing deepfake nudes is worse than doing so with real images.

But despite the government's plans to step up enforcement, some activists say it is not enough. NSPCC strategy director Dr Maria Neofitou told the BBC the charity was "disappointed" that device-level protections were not introduced, and called on Labour to force tech companies to find ways to prevent the spread of abusive content on their platforms.
