Creeps Are Using a Neural Network to Dox Porn Actresses

Image: Arhivach.org / Gizmodo

Earlier this April, a Russian photographer named Egor Tsvetkov used photos and an app called FindFace, a neural network that can link photos with social media profiles using facial recognition, to show how much information we willfully give up online. Unfortunately, the nightmare he was trying to warn against swiftly came true.


According to Global Voices, three days after the Russian press covered Tsvetkov’s project, an online hive of scum and villainy on a 4Chan-esque Russian forum called Dvach began using the FindFace app to analyze photos of porn actresses. After finding a match, the hordes descended, attacking the women on the popular Russian social media network Vkontakte and posting photos to their friends and family members. The photos don’t appear to be limited to images of women from Russia, either. The woman in the above image appears to be an American college student.

Screenshot of Vkontakte dox, translated by Global Voices.

The doxxers’ motivation for this thinly veiled misogyny was “moral outrage,” Global Voices reports: they claim that women who work in the sex industry are “corrupt and deceptive.” You know, your normal infantile bullshit masquerading as a righteous cause.

Vkontakte quickly shut down forums harassing these women. However, in an interview with Tjournal, FindFace admitted that there’s nothing it can do to stop the abuse, but said it will “provide any information needed to find the users responsible for this harassment.”

Screenshot of Dvach archive, translated via Google Translate

Russian security firm Kaspersky, which wrote a lengthy blog post detailing how FindFace works, says the app often makes mistakes, so some people are likely being misidentified as well.

The whole thing is disturbing. Image recognition and machine learning are only going to get better. Apps like Google Photos, while obviously less stalkerish than FindFace, can identify most objects in your camera roll with creepy accuracy, and facial recognition is nothing new in security and law enforcement. But like most technology, all it takes is an angry mob of digital mouth breathers (which US online communities have been known to breed) to forge an all-new layer of hell for women on the internet.


[Global Voices via Kaspersky]

DISCUSSION

Splatworthy

This is a tough one. People get into porn to make money, and one of the reasons it pays so well is that it comes with associated risks and stigma. There is a lot of self-promotion required to succeed, and they are constantly selling themselves and trying to expose themselves as much as possible to make as much money as possible. That’s the whole game. There is very little of their lives they can expect to remain private. That doesn’t make the way some people treat them at all right, but they are definitely playing with a double-edged sword. They chose the money and the risk. As progressive and sex positive as I try to be, a career in porn still seems like a poor life choice.