‘Nudify’ apps that use AI to undress women are soaring in use

Australian Financial Review

Margi Murphy

Apps and websites that use artificial intelligence to undress women in photos are soaring in popularity, according to researchers.

In September alone, 24 million people visited undressing websites, the social network analysis company Graphika found.

Many people, such as participants in artist Spencer Tunick’s mass nude photographs, consent to being seen naked. Getty

Many of these undressing, or “nudify”, services use popular social networks for marketing, according to Graphika. Since the beginning of this year, the number of links advertising undressing apps on social media, including X and Reddit, has increased more than 2400 per cent, the researchers say. The services use AI to alter an existing image so that the subject appears nude. Many of them work only on images of women.

These apps are part of a worrying trend of non-consensual pornography being developed and distributed because of advances in artificial intelligence – a type of fabricated media known as deepfake pornography. Its proliferation raises serious legal and ethical problems, as the images are often taken from social media and distributed without the consent, control or knowledge of the subject.

The rise in popularity corresponds to the release of several open-source diffusion models: artificial intelligence systems that can create images far superior to those produced just a few years ago, Graphika says. Because the models are open source, app developers can use them free of charge.

“You can create something that actually looks realistic,” says Santiago Lakatos, an analyst at Graphika, noting previous deepfakes were often blurry.

One image posted to X advertising an undressing app used language suggesting that customers could create nude images and then send them to the person who had been digitally undressed, inciting harassment. One of the apps, meanwhile, has paid for sponsored content on Google’s YouTube and appears first in search results for the word “nudify”.

A Google representative says the company doesn’t allow ads “that contain sexually explicit content”.

“We’ve reviewed the ads in question and are removing those that violate our policies,” the company says.