From deepfakes to fake art

We have seen AIs able to generate almost anything, from deepfakes and cloned voices to art. DALL-E and Midjourney are the most famous art generators spreading across the internet, amazing ever more people with their ability to create works in many different styles starting from a text prompt.

However, these systems are not free to use. DALL-E is open for testing only to a restricted number of users, while Midjourney requires a subscription plan. To this list, though, we can add another art generator: Stable Diffusion. This new tool was created by a startup, Stability AI, that aims to democratize this new way of generating images. That’s why, unlike the other two, it is offered for free, including as a desktop app.

Many users are exploring the potential of these AI tools: while some are trying to combine their own art with these new means, others are experimenting with how this AI can generate porn images.

The results are more imperfect than ordinary AI art: subjects may appear with extra limbs or distorted anatomy, for example. But we know that this technology is continuously improving, as we have already seen in other fields.

The idea of using image-generating systems to create porn seems to have originated on discussion boards like 4chan and Reddit, after a user leaked the open-source Stable Diffusion system.

The trend of AI-generated porn images seems to have taken off with the launch of Porn Pen, a website where users can customize the appearance of nude, AI-generated models using a set of tags.

A person claiming to be Porn Pen’s creator presented the application on Y Combinator’s Hacker News forum as an experiment that makes use of cutting-edge text-to-image models.

“I explicitly removed the ability to specify the custom text to avoid harmful imagery from being generated”, they wrote. “New tags will be added once the prompt-engineering algorithm is fine-tuned further”.

However, Porn Pen raises a number of ethical issues, such as the biases in image-generating systems and the data sources that gave rise to them. Beyond the technological ramifications, one wonders whether new technology for producing tailored porn could harm those who earn a living by producing sexual content.

As reported here, Ashley, a sex worker and peer organizer who works on issues related to content moderation, believes that Porn Pen’s current creations don’t pose a threat to sex workers.

“There are endless media out there”, said Ashley, who did not want her last name to be published for fear of being harassed for her job. “But people differentiate themselves not by just making the best media, but also by being an accessible, interesting person. It’s going to be a long time before AI can replace that”.

Adult creators are required to verify their identity and age on monetizable porn sites like OnlyFans and ManyVids so that the companies know they are consenting adults. Naturally, because AI-generated porn models are artificial, no such verification is possible.

Ashley is concerned that if porn sites focus more on AI porn, it could result in stricter regulation of sex workers, who already face heightened restrictions as a result of laws like SESTA/FOSTA. To investigate the consequences of this legislation, which makes working as an online sex worker more challenging, Congress introduced the Safe Sex Workers Study Act in 2019.

The study revealed that, after sex workers lost the economic stability provided by access to internet platforms, community organizations documented increased homelessness among them.

“SESTA was sold as fighting child sex trafficking, but it created a new criminal law about prostitution that had nothing about age”, Ashley said.

In addition, few rules anywhere in the world currently govern deepfaked porn. In the United States, only Virginia and California have laws restricting certain uses of faked and deepfaked pornographic media.

Early tests by Redditors and 4chan users suggest that Stable Diffusion, one of the systems most likely powering Porn Pen, is particularly adept at producing sexual deepfakes of celebrities, even though it has few NSFW images in its training dataset; perhaps not coincidentally, Porn Pen offers a “celebrity” option. Furthermore, since the code is open source, there is nothing to stop Porn Pen’s developer from training the system on more nude pictures.

“It’s definitely not great to generate [porn] of an existing person”, Ashley said. “It can be used to harass them”.

Indeed, deepfake porn is frequently produced as a form of harassment and intimidation. These images are almost always made without the subject’s permission and with malicious intent. In 2019, the research company Sensity AI found that non-consensual porn accounted for 96% of deepfake videos online.

According to Mike Cook, a member of the Knives and Paintbrushes collective, it is possible that the training dataset includes images of people, such as sex workers, who never consented to their use.

“Many of [the people in the nudes in the training data] may derive their income from producing pornography or pornography-adjacent content”, Cook said. “Just like fine artists, musicians, or journalists, the works these people have produced are being used to create systems that also undercut their ability to earn a living in the future”.

In theory, a porn performer could take legal action against the maker of a deepfaked image under copyright protections, defamation laws, and even human rights laws. However, as the MIT Technology Review points out, gathering evidence to support such a claim can be a significant burden.

When more basic AI tools first made deepfaked porn popular a few years ago, a Wired investigation found that nonconsensual deepfake videos were racking up millions of views on major porn sites like Pornhub. Other deepfaked works found a home on websites similar to Porn Pen. According to Sensity data, the top four deepfake porn websites received more than 134 million views in 2018.

“AI image synthesis is now a widespread and accessible technology, and I don’t think anyone is really prepared for the implications of this ubiquity”, Cook continued. “In my opinion, we have rushed very, very far into the unknown in the last few years with little regard for the impact of this technology”.

To illustrate Cook’s point, one of the most well-known websites for AI-generated porn expanded in the latter part of last year thanks to referrals, partner deals, and an API. This allowed the website, which hosts hundreds of nonconsensual deepfakes, to continue operating despite restrictions on its payment system. Researchers also found a Telegram bot in 2020 that produced degrading deepfake photos of over 100,000 women, including young girls.

“I think we’ll see a lot more people testing the limits of both the technology and society’s boundaries in the coming decade”, Cook said. “We must accept some responsibility for this and work to educate people about the ramifications of what they are doing”.

The problem with AI image generators is the same as with deepfakes: when our identity is used for other purposes, and it is easy to make people believe we did something we never did, we are put at risk. Our pictures and videos are easily accessible to algorithms and bad actors, since we live on social networks and exchange photos from our smartphones. And because bad actors are harder to counter, we should at least regulate how algorithms can access and use our data.