Damning report uncovers first AI-generated child sexual abuse video


A report by the Internet Watch Foundation (IWF) has found that generative AI models are being used to create deepfakes of real child sexual abuse victims.

A shocking investigation by the UK-based IWF has found that advances in AI video generation are fueling a surge in synthetic child sexual abuse videos.

The IWF, which describes itself as “the frontline against online child sexual abuse,” says it has identified AI models customized for more than 100 child sexual abuse victims.

The report gave the example of a real abuse victim whose abuser uploaded photos of her taken between the ages of 3 and 8.

The nonprofit reports that Olivia (not her real name) was rescued by police in 2013, but years later, dark web users were using AI tools to generate images of her in new abusive situations.

Criminals collect images of victims like Olivia, who is now in her 20s, and use them to fine-tune their AI models and create new material – some of these models are available for free download online, the report said.

Misuse of AI video technology

AI video technology has made great strides this year, and unfortunately, that's reflected in the report.

The snapshot investigation was conducted in March and April of this year, during which the IWF identified nine deepfake videos on one dark web forum hosting child sexual abuse material (CSAM). When IWF analysts had previously investigated the same forum in October, no deepfake videos were found.

Deepfake videos include adult pornographic videos that have been altered to show a child's face, as well as existing child sexual abuse videos with the face of a child superimposed onto them.

Deepfakes are particularly convincing because the original sexual abuse videos feature real children, IWF analysts say.

Many of the deepfake videos the IWF has seen appear to have been made with free, open-source AI software, and the techniques shared by criminals on the dark web are similar to those used to generate deepfake adult pornography.

The IWF is concerned that AI-generated CSAM will become more photorealistic as AI video technology improves, and it has already seen a steady increase in reports of illicit AI imagery.
