Aug. 16, 2021

The Naked Truth About Disturbing New AI That Undresses Any Woman

Technology is great for so many reasons, but for every beneficial innovation, there is often something more insidious to be found. The deepfake phenomenon has been part of pop culture for a few years. Using technology that one-ups Photoshop, a form of artificial intelligence called deep learning can hilariously recast classic sitcoms and put actors in roles we would have loved to see them play on the big screen.

Unfortunately, while these deepfake videos and images can be used to elicit a laugh, some users are taking advantage of this AI to create fake videos that show women in a state of undress – and worse – without their consent.

This situation is not entirely new, as deepfake pornography surfaced on the internet, primarily on Reddit, back in 2017. A report published in October 2019 by Dutch cybersecurity startup Deeptrace found that as of September 2019, there were nearly 15,000 deepfake videos online, a near doubling over nine months. A whopping 96% of them were NSFW, and almost all of those, 99% to be exact, featured the faces of female celebrities mapped onto the bodies of adult entertainers.

As programs and apps that can digitally create deepfakes become easier to obtain, the technology could move from the world of celebrity into the darker realm of revenge porn. As Danielle Citron, a professor of law at Boston University, told The Guardian, “Deepfake technology is being weaponized against women.”

People looking for these videos don’t have to search hard, as new sites continue to pop up. In June 2019, an app was released that used AI to digitally remove clothing from images of women. The app, which had a free and a paid version, stayed online for only a few weeks before it was removed from app stores. The creators refunded their subscribers and posted a tweet stating, “Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don’t want to make money this way.”

In the wake of this, a new site emerged that promises to “nudify” women’s bodies with eerily realistic-looking results. Since the beginning of this year, the site has received more than 38 million hits – despite being kicked off its original hosting service. This site isn’t relegated to the depths of the dark web – a quick Google search can pull it up. In addition, the site encourages users to spread the word via social media, even offering referral rewards. Users can “nudify” one image for free every few hours, but if they earn referral rewards, they can get up to 100 free images. The site also accepts cryptocurrency payments to skip wait times for additional photos.

These altered images have made their way to platforms including Twitter, Facebook, and Reddit, and can be found in both private and public channels.

The rise of these “nudified” images has empowered anonymous internet users to use the technology for nefarious purposes. U.K.-based deepfake expert Henry Ajder told the Huffington Post, “This is a really, really bleak situation.” He added, “The realism has improved massively,” and asserted, “The vast majority of people using these [tools] want to target people they know.”

If you need to “nudify” someone, why not do it the safe and SFW way with an app like the one Cubby mentioned, Nüdifier? The developers of the app, which produces a faux pixelated “nude” image, say the app is fun and easy and “makes the mysterious and previously high-cost process of Nüdification™ available to everyone.”