It was only a matter of time before more sophisticated fake porn videos surfaced online. But a crackdown on this super-realistic fake porn is already beginning.
Reddit and Gfycat, two popular platforms where users have been uploading the fake porn, have begun to eradicate the manipulated smut, which is often so convincing that it blurs the contours of reality itself.
This type of fake porn, also referred to as deepfakes, involves mapping someone else’s face onto a porn performer’s body. While fake porn has existed for years, free and increasingly powerful artificial-intelligence tools now give trolls a way to create far more realistic videos, provided they have enough images of their victim to recreate a scene. The deepfakes subreddit dedicated to this type of content has thousands of users, who typically use celebrity faces, though the practice has also expanded to include classmates and former partners. The videos are often made without the person’s permission.
As the Next Web pointed out on Wednesday, some of these videos and GIFs are being removed from some of the more popular sites hosting them after news outlets began reporting on the trend. “I just noticed that […] my upload yesterday was deleted, could be a copyright issue,” a Reddit user wrote, according to the Next Web. “I don’t think it’s copyright since random Japanese idols with little to minimal presence in the West have been removed,” another redditor reportedly wrote, “[e]ven the botched ones that have no resemblance to any human being are gone. That suggests someone from [G]fycat proactively removed all the gifs linked here.”
Another redditor posted in the deepfakes subreddit last night, noting that older Gfycat links had been removed from the site. “Seems like a targeted purge, so people should avoid using the website,” they wrote. “Anyone who posted a link before should try and rehost as well.”
In a statement emailed to Gizmodo, Gfycat confirmed that it is proactively purging its platform of deepfakes. Gfycat’s terms of service don’t include an explicit policy on revenge porn, but they do prohibit any content that is “unlawful, harmful, threatening, abusive, harassing, tortious, excessively violent, defamatory, vulgar, obscene, libelous, invasive of another’s privacy, hateful racially, ethnically or otherwise objectionable.” The company’s spokesperson said deepfakes clearly violate the site’s rules.
“Our terms of service allow us to remove content we find objectionable,” the Gfycat spokesperson said. “We find this content objectionable and are actively removing it from our platform.”
Reddit hasn’t yet responded to our request for comment, but deepfakes are likely a violation of the site’s terms of service as well. Reddit clearly states that unwelcome content includes “the posting of photographs, videos, or digital images of any person in a state of nudity or engaged in any act of sexual conduct, taken or posted without their permission.” Of course, the whole matter is complicated by the fact that the naked bodies in deepfake porn do not belong to the people whose faces are attached to them.
Fake porn videos aren’t just a form of harassment and a gross invasion of privacy; they likely violate copyright law, too. To create a deepfake, someone needs hundreds of images of their victim, which they can collect using an open-source photo-scraping tool that grabs any photos of the victim available online. But the victim can request the removal of images posted without their permission by citing copyright infringement. Both Reddit and Gfycat have copyright infringement policies stating that they’ll remove offending content.
Just as advanced fake porn was a predictable outcome of our trollish reality, so was the inevitable backlash and policing. And trolls are already trying to figure out which platforms they can flock to next, suggesting alternatives like the Russian social networks vk.com and ok.ru for those worried about their videos being deleted. It’s a vicious cycle—and an entirely unsurprising one.