AI-generated child sexual abuse videos now “indistinguishable from real footage”, Internet Watch Foundation warns



The number of AI-generated videos of child sexual abuse has skyrocketed, and the charity warns they are now “indistinguishable” from real footage.

The Internet Watch Foundation (IWF), which works to find and remove abuse imagery online, said criminals are creating ever more realistic and extreme sexual abuse content, and can now quickly produce and share the material at scale.

Highly realistic abuse videos are no longer confined to the short, glitch-filled clips common with earlier technology; perpetrators are now using AI at scale to create videos that often feature the likenesses of real children.

New IWF data released on Friday shows that 1,286 AI-generated child sexual abuse videos were discovered in the first half of this year.

Only two such videos were discovered in the same period last year.

All of the videos confirmed so far in 2025 have been so convincing that they must be treated under UK law as if they were real footage, the IWF said.

More than 1,000 of the videos were assessed as Category A imagery, which includes depictions of rape, sexual torture and bestiality.

The data also showed that 210 separate web pages containing AI-generated child sexual abuse imagery were found in the first half of this year, up from 42 web pages in the same period of 2024, an increase of 400% in the material reported to the charity.

Each web page can contain multiple images or videos.

The figures come after the IWF said it received 291,273 reports of child sexual abuse imagery last year.

The charity is urging governments to ensure the safe development and use of AI models by introducing binding regulations requiring the technology to be designed so that it cannot be abused.

Derek Ray-Hill, interim chief executive of the IWF, said:

“It is disheartening to see this technology continue to develop at pace and be abused in new and unsettling ways.

“Just as we saw with still images, AI-generated videos of child sexual abuse have now reached the point where they can be indistinguishable from genuine films.

“The children depicted are often real and recognisable; the harm this material does is real, and the threat it poses risks escalating even further.”

Ray-Hill said it is “too easy” for criminals to produce such videos and that the government “has to get a grip” on the issue, warning that full-length AI-generated child sexual abuse films involving real children are now all but inevitable.

He added: “The Prime Minister recently pledged to ensure technology creates a better future for children. Any delay only sets back efforts to protect children and the government’s pledge to halve violence against girls.

“Our analysts tell us that almost all of this AI abuse imagery features girls.”

A senior analyst at the IWF, who asked not to be named, said the quality of video produced by creators of AI child sexual abuse imagery has improved “leaps and bounds” beyond what was available last year.

“The first AI child sexual abuse videos we saw were deepfakes, with known victims’ faces placed onto actors in existing adult pornographic videos. They weren’t sophisticated, but could still be pretty convincing,” he said.

“The first fully synthetic child sexual abuse video I saw, at the beginning of last year, was a series of jerky images and was not convincing.

“But now they have really turned a corner. The quality is alarmingly high, and the category of offences depicted is becoming more extreme as the tools improve their ability to generate videos showing two or more people.

“The videos also include sets showing known victims in new scenarios.”

The IWF advises reporting images and videos of child sexual abuse to the charity anonymously, including the exact URL where the content is located.

Safeguarding minister Jess Phillips said: “These statistics are utterly horrifying. Those who commit these crimes are just as disgusting as those who pose a threat to children in real life.

“AI-generated child sexual abuse material is a serious crime, which is why we have introduced two new laws to crack down on this vile material.

“Soon, perpetrators who possess tools that generate this material, or manuals that teach them to manipulate legitimate AI tools, will face longer prison sentences, and we will continue to work with regulators to protect more children.”



