Australia clamps down on ‘nudify’ sites used for AI-generated child abuse

Three websites used to create abuse imagery had received 100,000 monthly visits from Australians, watchdog says.

Published On 27 Nov 2025

Internet users in Australia have been blocked from accessing several websites that used artificial intelligence to create child sexual exploitation material, the country’s internet regulator has announced.

The three “nudify” sites withdrew from Australia following an official warning, eSafety Commissioner Julie Inman Grant said on Thursday.

Grant’s office said the sites had been receiving approximately 100,000 visits a month from Australians and had featured in high-profile cases of AI-generated child sexual abuse imagery involving Australian school students.

Grant said such “nudify” services, which allow users to make images of real people appear naked using AI, have had a “devastating” effect in Australian schools.

“We took enforcement action in September because this provider failed to put in safeguards to prevent its services being used to create child sexual exploitation material and were even marketing features like undressing ‘any girl,’ and with options for ‘schoolgirl’ image generation and features such as ‘sex mode,’” Grant said in a statement.

The development comes after Grant’s office issued a formal warning to the United Kingdom-based company behind the sites in September, threatening civil penalties of up to 49.5 million Australian dollars ($32.2m) if it did not introduce safeguards to prevent image-based abuse.

Grant said Hugging Face, a hosting platform for AI models, had separately also taken steps to comply with Australian law, including changing its terms of service to require account holders to take steps to minimise the risks of misuse involving their platforms.

Australia has been at the forefront of global efforts to protect children from online harm, banning social media for under-16s and cracking down on apps used for stalking and for creating deepfake images.

The use of AI to create non-consensual sexually explicit images has been a growing concern amid the rapid proliferation of platforms capable of creating photo-realistic material at the click of a mouse.

In a survey carried out by the US-based advocacy group Thorn last year, 10 percent of respondents aged 13-20 reported knowing someone who had deepfake nude imagery created of them, while 6 percent said they had been a direct victim of such abuse.
