We may earn a commission from links on this page.

It Was Only a Matter of Time Before Internet Trolls Made More Sophisticated Fake Porn Videos

Fake porn involves manipulating a video or photo by putting someone else’s face on a porn star’s body. In recent months, a growing group of Reddit users have used machine learning algorithms to swap celebrities’ faces into porn scenes. And now it seems to have entered the grossest, and most personal, phase yet.

Today, Motherboard reported that users on Discord and Reddit are asking for advice on how to make fake porn videos of crushes and exes. One person even boasted about how they had already created one of a former classmate. “i made a pretty good vid of a girl i went to high school with using only ~380 pics scraped from insta & FB,” one user wrote on Discord, according to a screenshot obtained by Motherboard.

While the latest development in fake porn videos is particularly disturbing, it’s unsurprising in hindsight, a predictable culmination of just about every problem that currently exists online. It’s an amalgamation of toxic online communities, gross invasions of privacy, the abuse of online manipulation tools, and revenge porn.

Fake porn isn’t new: People have been exploiting exes and celebrities online for years using photo-editing software like Photoshop. But now more powerful (and free) tools using machine learning allow users to create more realistic fake footage so long as they have a few hours of spare time and enough images of their victim. And it’s evident that this isn’t a niche problem: The Deepfakes subreddit, a community dedicated to creating fake porn videos and other manipulations, has over 30,000 subscribers.

The process isn’t as simple as downloading an app or web tool and uploading a single profile photo ripped from Facebook or Instagram: a user needs a large set of photos of the targeted individual. To collect enough images, they can use an open-source photo-scraping tool, which grabs photos of a person that are publicly available online. They can then use free web-based tools to find a porn star who resembles their victim.

While this tech might seem groundbreaking, all the other pieces of the deeply fucked-up puzzle were already there. Celebgate and Gamergate featured similar harassment campaigns and gross invasions of privacy, and the facial data collection capabilities of sites like Facebook have long been known. And while media outlets raised alarms last year about similar technology being used for fake news, which is now certainly possible, it’s important to also keep an eye on how this technology can be unleashed as a means of targeted abuse and an unspeakably disconcerting form of revenge porn.

What we are seeing isn’t a completely new practice, but rather the predictable consequence of shitty people with access to more sophisticated technology. It’s emblematic of a continued practice that, thanks to AI, can continue to thrive in the way it was always destined to.