
It was only a matter of time before more sophisticated fake porn videos surfaced online. But a crackdown on this super-realistic fake porn is already beginning.
Reddit and Gfycat, two popular platforms where users have been uploading the fake porn, have begun to eradicate the manipulated smut, which is often so convincing that it blurs the contours of reality itself.
This type of fake porn, also referred to as deepfakes, involves mapping someone else's face onto a porn star's body. While fake porn has existed for years, free and more powerful tools using artificial intelligence now afford trolls a way to create more realistic videos, provided they have enough images of their victim to recreate a scene. The deepfakes subreddit dedicated to this type of content has thousands of users who often use celebrity faces, though the practice has also evolved to include classmates and former partners. These videos are often made without the person's permission.
As the Next Web pointed out on Wednesday, some of these videos and GIFs are being removed from some of the more popular sites hosting them after news outlets began reporting on the trend. "I just noticed that [...] my upload yesterday was deleted, could be a copyright issue," a Reddit user wrote, according to the Next Web. "I don't think it's copyright since random Japanese idols with little to minimal presence in the West have been removed," another redditor reportedly wrote, "[e]ven the botched ones that have no resemblance to any human being are gone. That suggests someone from [G]fycat proactively removed all the gifs linked here."
Another redditor posted in the deepfakes subreddit last night, noting that older Gfycat links had been removed from the site. "Seems like a targeted purge, so people should avoid using the website," they wrote. "Anyone who posted a link before should try and rehost as well."
In a statement emailed to Gizmodo, Gfycat confirmed that it is proactively purging its platform of deepfakes. Gfycat's terms of service doesn't have an explicit policy on revenge porn, but it does prohibit any content that is "unlawful, harmful, threatening, abusive, harassing, tortious, excessively violent, defamatory, vulgar, obscene, libelous, invasive of another's privacy, hateful racially, ethnically or otherwise objectionable." The company's spokesperson said deepfakes clearly violate the site's rules.
"Our terms of service allow us to remove content we find objectionable," the Gfycat spokesperson said. "We find this content objectionable and are actively removing it from our platform."
Reddit hasn't yet responded to our request for comment, but deepfakes are likely a violation of the site's terms of service as well. Reddit clearly states that unwelcome content includes "the posting of photographs, videos, or digital images of any person in a state of nudity or engaged in any act of sexual conduct, taken or posted without their permission." Of course, the whole matter is complicated by the fact that the naked bodies in deepfake porn do not belong to the people whose faces are attached to them.
Fake porn videos aren't just harassing and a gross invasion of privacy; they likely violate copyright laws as well. To create deepfakes, someone needs hundreds of images of their victim. To collect these, they can use an open-source photo-scraping tool that grabs photos of the victim that are available online. But the victim can request the removal of those images if they were posted without permission, citing copyright infringement. Both Reddit and Gfycat have copyright infringement policies that state they'll remove offending content.
Just as advanced fake porn was a predictable outcome of our trollish reality, so was the inevitable backlash and policing. And trolls are already trying to figure out which platforms they can flock to next, posting alternative sites like the Russian social networks vk.com and ok.ru for those worried about their videos being deleted. It's a vicious cycle, and an entirely unsurprising one.