Planned AI App ‘DeepNude’ Shuts Down, Could Show Any Woman Naked

An app developer in Estonia came up with an algorithm that was designed to convert any photo of a fully clothed woman into a nude of that same woman, using artificial intelligence. But after a wave of understandably negative press coverage, including an exposé by the tech site Motherboard that deemed the app “horrifying,” the maker of “DeepNude” says he has discontinued work on the app, and taken it off the market.

For the past couple of years, the “deepfake” phenomenon has caused growing public concern. Deepfakes are videos created using AI that can place a person in almost any situation, using only a photo or series of photos that are then animated by the algorithm to appear real. As AVN.com has reported, the first reported uses of deepfake AI involved placing celebrities into hardcore porn videos, and the technology has developed to the point where even a single photo found on social media can be used to create deepfake nonconsensual porn.

The DeepNude app was reportedly based around a similar idea: “an application that uses neural networks to remove clothing from the images of women, making them look realistically nude,” according to Motherboard. Using a machine-learning algorithm, the app would reconstruct a photo of a clothed woman by “swap[ping] clothes for naked breasts and a vulva.”

Though the use of deepfake AI to create political disinformation has caused alarm in the media, “the most devastating use of deepfakes has always been in how they're used against women,” Motherboard wrote. The DeepNude app “dispenses with” the idea that the technology has any purpose other than “claiming ownership over women’s bodies,” the site wrote.

According to a report by Vox.com, the app was remarkably easy to use, taking just 30 seconds to create a realistic nude image from a non-nude photo. The app worked only on photos of women.

But on Friday, the DeepNude developer took to Twitter to announce that the app, which charged a $50 fee to download, was no more.

“The probability that people will misuse it is too high,” the developer wrote—leaving unclear what he felt would be a proper use of the app. “We don’t want to make money this way.”

To see an example of how the app worked, click on this link, from the tech site The Verge.

Photo By Grant Robertson / Wikimedia Commons