Controversial deepfake app DeepNude shuts down hours after being exposed

Less than a day after receiving widespread attention, the deepfake app that used AI to create fake nude photos of women is shutting down. In a tweet, the team behind DeepNude said they “greatly underestimated” interest in the project and that “the probability that people will misuse it is too high.”

DeepNude will no longer be offered for sale, and further versions won’t be released. The team also warned against sharing the software online, saying doing so would violate the app’s terms of service, though they acknowledged that “surely some copies” will get out.

Motherboard first drew attention to DeepNude yesterday afternoon. The app, available for Windows and Linux, used AI to alter photos to make a person appear nude and was only designed…


from The Verge – All Posts