Charity warns AI-generated child sexual abuse videos are now “indistinguishable” from real footage



AI-generated child sexual abuse videos are now “indistinguishable” from real footage, a major charity has warned.

The Internet Watch Foundation (IWF) said it has helped to identify and remove abusive imagery online, with criminals creating increasingly realistic and extreme content, and warned that the technology could soon make the creation and distribution of feature-length films of such material possible.

Highly realistic videos are no longer confined to the short, glitch-filled clips once typical of the technology; perpetrators are now using AI to create videos on a vast scale, often featuring the likenesses of real children.

New IWF data released on Friday revealed that 1,286 AI-generated child sexual abuse videos were discovered in the first half of this year.

Only two such videos were discovered in the same period last year.

A government minister described the figure as “utterly horrifying”, saying the criminals behind the videos “are as disgusting as those who pose a threat to children in real life”.

Every video confirmed so far in 2025 has been so convincing that it must be treated under UK law as if it were genuine footage, the IWF said.

More than 1,000 of the videos were assessed as Category A imagery, which includes depictions of rape, sexual torture and bestiality.

AI-generated child sexual abuse imagery was found on 210 separate webpages (Alamy/PA)

The data also showed that AI-generated child sexual abuse imagery was found on 210 separate webpages in the first half of this year, compared with 42 webpages in the same period of 2024, while the number of images reported to the charity rose by 400%.

Each web page can contain multiple images or videos.

The figures follow the IWF's announcement that it received 291,273 reports of child sexual abuse imagery last year.

The charity is urging governments to ensure AI models are developed and used safely by introducing binding regulations so that the technology cannot be abused by design.

Derek Ray-Hill, interim chief executive of the IWF, said:

“It is disheartening to see this technology continue to develop at pace and be abused in new and unsettling ways.

“As we saw with still images, AI videos of child sexual abuse have now reached the point where they are indistinguishable from genuine footage.

“The children depicted are often real and recognisable; the harm this material does is real, and the threat it poses risks escalating even further.”

Ray-Hill said it is “far too easy” for criminals to produce such videos and that the government “has to get a grip” on the issue, as feature-length AI-generated child sexual abuse films depicting real children are now all but inevitable.

“The Prime Minister recently pledged to ensure technology creates a better future for children. Any delay only sets back the government's pledges to protect children and halve violence against girls.

“Our analysts say almost all of this AI abuse imagery features girls.

Safeguarding Minister Jess Phillips (PA)

Safeguarding Minister Jess Phillips said: “These statistics are utterly horrifying, and the people who commit these crimes are as disgusting as those who pose a threat to children in real life.

“Because AI-generated child sexual abuse material is a serious crime, we have introduced two new laws to crack down on this vile material.

“Soon, perpetrators who possess tools that generate this material, or manuals teaching them how to manipulate legitimate AI tools, will face longer prison sentences, and we will continue working with regulators to protect more children.”

An anonymous senior analyst at the IWF said the video quality achieved by creators of AI child sexual abuse imagery is “leaps and bounds” ahead of what was available last year.

“The first AI child sexual abuse videos we saw were deepfakes, with the faces of known victims placed onto actors in existing adult pornography videos. They weren't sophisticated, but they could still be quite convincing,” he said.

“The first fully synthetic child sexual abuse video I saw, at the beginning of last year, was a series of jerky images stitched together and was not convincing.

“But now they have really turned a corner. The quality is alarmingly high, and the categories of abuse depicted are becoming more extreme as the tools improve their ability to generate videos showing two or more people.

“The videos also include sets showing known victims in new scenarios.”

The IWF encourages the public to report images and videos of child sexual abuse to the charity anonymously, including the exact URL where the content is located.




