How often do you see a truly horrifying image on a mainstream site like Facebook? Almost never, right? Well, that's because some unfortunate individuals look at thousands of images like that per day so you don't have to.
Yes, there are entire companies dedicated to reviewing content flagged as offensive for websites that accept user-uploaded media, from Facebook to YouTube. Think child porn, mangled corpses, and animal abuse: the worst of the worst. Since an algorithm that seeks out naked people can only go so far, these sites bring in human beings. And it sounds pretty rough.
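For the technically curious, here's a minimal sketch of the kind of filter being alluded to, built on a classic RGB skin-tone heuristic. Everything in it (the function names, the 40-percent threshold) is made up for illustration; it's not how Facebook or YouTube actually screen images, just a toy that shows why pixel-level rules can only get you so far.

    # Toy "find the naked people" filter: a classic fixed-threshold RGB
    # skin-tone rule. Names and threshold are hypothetical, for illustration.

    def looks_like_skin(r: int, g: int, b: int) -> bool:
        """Crude per-pixel skin test using fixed RGB thresholds."""
        return (r > 95 and g > 40 and b > 20
                and r > g and r > b
                and max(r, g, b) - min(r, g, b) > 15)

    def flag_for_human_review(pixels: list[tuple[int, int, int]],
                              threshold: float = 0.4) -> bool:
        """Flag an image whose skin-colored pixel ratio exceeds a threshold."""
        if not pixels:
            return False
        skin = sum(looks_like_skin(r, g, b) for r, g, b in pixels)
        return skin / len(pixels) > threshold

    # Why this "can only go so far": a beach photo or a close-up portrait
    # trips the filter, while gore, animal abuse, and plenty of other
    # horrors contain little skin and sail right past it. Hence the queue
    # of flagged-but-ambiguous images that humans have to look at.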
The surge in Internet screening services has brought a growing awareness that the jobs can have mental health consequences for the reviewers, some of whom are drawn to the low-paying work by the simple prospect of making money while looking at pornography.
"You have 20-year-old kids who get hired to do content review, and who get excited because they think they are going to see adult porn," said Hemanshu Nigam, the former chief security officer at MySpace. "They have no idea that some of the despicable and illegal images they will see can haunt them for the rest of their lives."
Yikes! To be fair, if you take a job just because you think you'll get to look at porn all day, maybe you deserve what you get. But still, these companies wind up paying for their workers' therapy and dealing with employees who throw up or cry when they see the worst images.
Really, these people should be paid like kings for what they do. Without them, you'd have a much bigger chance of stumbling on an image so upsetting you could never get it out of your head. They're like sponges that soak up the awfulness so you don't have to. And it sucks that their jobs even need to exist, but anyone who has hung out on the internet for any amount of time knows that they really, really do. Sigh. [NY Times]