The watchdog group said the three websites used to create the abusive images received 100,000 visits each month from Australians.
Published November 27, 2025
Australia’s internet regulator has announced that Australian internet users have been blocked from accessing several websites that use artificial intelligence to create child sexual exploitation material.
eSafety Commissioner Julie Inman Grant said on Thursday that three “nudify” sites had withdrawn from the Australian market after receiving official warnings.
Ms Inman Grant’s office said the sites received about 100,000 visits a month from Australians and were the subject of a high-profile case involving AI-generated child sexual abuse images of Australian school students.
Ms Inman Grant said such “nudification” services, which use AI to make images of real people appear naked, were having a “devastating” impact on Australian schools.
“We took enforcement action in September because this provider did not have safeguards in place to prevent its service from being used to create child sexual exploitation material, and even sold options for features such as the ability to undress ‘any girl’ and ‘schoolgirl’ image generation and ‘sex mode,’” Inman Grant said in a statement.
The move comes after Ms Inman Grant’s office issued a formal warning in September to the UK-based company running the sites, threatening civil penalties of up to A$49.5 million ($32.2 million) if it did not put in place safeguards to prevent image-based abuse.
Ms Inman Grant said AI model hosting platform Hugging Face had also taken additional steps to comply with Australian law, including changes to its terms of service that require account holders to take steps to minimize the risk of abuse associated with the platform.
Australia is at the forefront of global efforts to protect children from online harm, banning social media for under-16s and cracking down on apps used for stalking and for creating deepfake images.
Concerns are growing over the use of AI to create non-consensual, sexually explicit images as platforms that can generate photorealistic material with the click of a mouse proliferate.
A survey conducted last year by US-based advocacy group Thorn found that 10% of respondents aged 13 to 20 said they knew someone who had been depicted in a deepfake nude image, and 6% said they had been a direct victim of such abuse.

