
The Purge of AI-Assisted Fake Porn Has Begun

Photo: Getty

It was only a matter of time before more sophisticated fake porn videos surfaced online. But a crackdown on this super-realistic fake porn is already beginning.


Reddit and Gfycat, two popular platforms where users have been uploading the fake porn, have begun to eradicate the manipulated smut, which is often so convincing that it blurs the contours of reality itself.

This type of fake porn, also referred to as deepfakes, involves mapping someone else’s face onto a porn star’s body. While fake porn has existed for years, free and more powerful tools using artificial intelligence now afford trolls a way to create more realistic videos, provided they have enough images of their victim to recreate a scene. The Deepfakes subreddit dedicated to this type of content has thousands of users who often use celebrity faces, though the practice has also evolved to include classmates and former partners. These videos are often made without the person’s permission.


As the Next Web pointed out on Wednesday, some of these videos and GIFs are being removed from some of the more popular sites hosting them after news outlets began reporting on the trend. “I just noticed that […] my upload yesterday was deleted, could be a copyright issue,” a Reddit user wrote, according to the Next Web. “I don’t think it’s copyright since random Japanese idols with little to minimal presence in the West have been removed,” another redditor reportedly wrote, “[e]ven the botched ones that have no resemblance to any human being are gone. That suggests someone from [G]fycat proactively removed all the gifs linked here.”

Another Redditor posted in Deepfakes last night, noting that older Gfycat links had been removed from the site. “Seems like a targeted purge, so people should avoid using the website,” they wrote. “Anyone who posted a link before should try and rehost as well.”

In a statement emailed to Gizmodo, Gfycat confirmed that it is proactively purging its platform of deepfakes. Gfycat’s terms of service don’t include an explicit policy on revenge porn, but they do prohibit any content that is “unlawful, harmful, threatening, abusive, harassing, tortious, excessively violent, defamatory, vulgar, obscene, libelous, invasive of another’s privacy, hateful, racially, ethnically or otherwise objectionable.” The company’s spokesperson said deepfakes clearly violate the site’s rules.

“Our terms of service allow us to remove content we find objectionable,” the Gfycat spokesperson said. “We find this content objectionable and are actively removing it from our platform.”


Reddit hasn’t yet responded to our request for comment, but deepfakes are likely a violation of the site’s terms of service as well. Reddit clearly states that unwelcome content includes “the posting of photographs, videos, or digital images of any person in a state of nudity or engaged in any act of sexual conduct, taken or posted without their permission.” Of course, the whole matter is complicated by the fact that the naked bodies in deepfake porn do not belong to the people whose faces are attached to them.

Fake porn videos aren’t just harassment and a gross invasion of privacy; they likely violate copyright laws as well. To create deepfakes, someone needs hundreds of images of their victim. To collect these, they can use an open-source photo-scraping tool that grabs photos of their victim that are available online. But the victim can request the removal of those images if they were posted without their permission, citing copyright infringement. Both Reddit and Gfycat have copyright infringement policies that state they’ll remove offending content.


Just as advanced fake porn was a predictable outcome of our trollish reality, so was the inevitable backlash and policing. And trolls are already trying to figure out which platforms they can flock to next, posting links to alternative sites, like Russian social networks, for those worried about their videos being deleted. It’s a vicious cycle, and entirely unsurprising.



I don’t know where I fall on this issue. On one hand it seems like a gross misuse of a person’s identity; on the other hand... what is identity? A face? What’s the difference between making and sharing a faked image or video and writing a fanfic sex story about a celebrity? Or even just closing your eyes and imagining it? Should that be illegal too? You’re using their likeness for your own pleasure, and you’re not profiting off these people. I also think it’s up to the artist to prevent this. My argument to them is: make better art, and people won’t try to subvert your existence. In China they share engineering blueprints; the value isn’t in the idea, it’s the execution. In art, nobody becomes famous because they copy-pasted the script of Apocalypse Now. The things that have staying power are not easy to copy: nobody copies Paul Thomas Anderson, but there are 20 Michael Bays. I think if you are part of the celebrity culture, then you signed up for this shit, and your actions, not your words, can change the narrative. Do you own your identity if it’s not relatively different from all the other celebrities?

But I don’t know how I’d feel if my face was put onto someone else’s body. We get used to our faces as our own, so I can imagine it’s pretty damaging to experience exploitation like that.