GitHub Bans Copies Of DeepNude, The App Used To Undress Women

Harin


GitHub has banned the posting, storing, and sharing of the source code of DeepNude, the app used to create “deepfake” porn images from photos of women.

GitHub has banned copies of DeepNude, the app used to create “deepfake” porn images from photos of women, saying the source code violated its rules against sexually obscene content. The app has been removed from multiple repositories, including one run by its creator.


Originally, DeepNude was a paid application that used deepfake-like technology to turn ordinary photos into fake nude images of women, and its downloads kept climbing. After Motherboard exposed the app, the development team shut it down. However, copies were still floating around online on many sites, including GitHub.

Things got worse when the team behind the AI decided to upload the core algorithm to the platform.

GitHub’s guidelines state that “obscene” or “pornographic” content is not allowed on the platform.

DeepNude didn’t invent the fake-nude-photo concept. However, compared with other methods, it is simpler and can be used by anyone, which makes it all the more dangerous.

Politicians and commentators have raised warnings about the potential political impact of deepfakes, but the technology was first used to create fake porn of women. Images from DeepNude threaten women whose lives could be ruined by fake nudes.

It’s hard to stop copies of DeepNude from spreading online, but GitHub’s decision may make the app and its algorithm harder to find.
