Controversial deepfake app DeepNude shuts down hours after exposure

After less than a day of widespread attention, DeepNude, a deepfake application that used artificial intelligence to create fake nude pictures of women, has been shut down.

In a tweet, the team behind DeepNude said they "greatly underestimated" the project and that "the potential for people to abuse it is very high."

DeepNude will no longer be offered for sale, and no further versions will be released. The team also warned against sharing the software online, saying it would violate the app's terms of service, though they admit that "some copies definitely" will get out.

DeepNude first drew widespread attention yesterday afternoon. The app, available for Windows and Linux, uses artificial intelligence to alter images so that the person in them appears nude, and it is designed to work only on pictures of women.

It had been on sale for months, and DeepNude itself concedes that "Frankly, the app is not great" at what it does.

But it still works well enough to raise widespread concern about how it could be used. People have long been able to manipulate images digitally, but DeepNude makes that ability instant and available to anyone.

These images can then be used to harass women: deepfake technology has already been used to insert women into pornographic videos without their consent, and there is little they can do to protect themselves as those videos spread.

The app's designer, who goes only by Alberto, told The Verge earlier today that he believed someone else would soon create an application like DeepNude if he did not do it first. "The technology is ready (affordable)," he said. Alberto added that DeepNude would "definitely end" if he saw the app being misused.


The DeepNude team ends its shutdown announcement by saying "The world is not yet ready for DeepNude," as if there will be some point in the future when the program can be used appropriately.

But deepfakes are only going to get easier to make and harder to detect. In the end, knowing whether something is real or fake is not the real problem.

Applications like this allow people to quickly and easily misuse other people's images, and at the moment there is little protection, if any, to prevent it from happening.
