Isn't Facebook great? (It's not.) But isn't it nice and clean and kid friendly? This is true for a very specific reason: the social media giant outsources the gnarly task of finding and deleting inappropriate content. In the November issue of Wired, Adrian Chen offers a peek into the darkest corners of the industry. It's only slightly horrifying.
It's not just Facebook, of course. Pretty much any social media site you can think of uses some sort of moderation to keep abusive content off its pages. Chen specifically visited the offices of a company in the Philippines that handles moderation for Whisper, the not-so-anonymous secret-sharing app. There, contracted workers likely make less in a day than you do in an hour by looking at pictures of everything from child abuse and bestiality to brutal violence. This sort of thing takes a toll on content-moderating workers—of whom there are an estimated 100,000 worldwide.
Eight years after the fact, Jake Swearingen can still recall the video that made him quit. He was 24 years old and between jobs in the Bay Area when he got a gig as a moderator for a then-new startup called VideoEgg. Three days in, a video of an apparent beheading came across his queue.
"Oh fuck! I've got a beheading!" he blurted out. A slightly older colleague in a black hoodie casually turned around in his chair. "Oh," he said, "which one?" At that moment Swearingen decided he did not want to become a connoisseur of beheading videos. "I didn't want to look back and say I became so blasé to watching people have these really horrible things happen to them that I'm ironic or jokey about it," says Swearingen, now the social media editor at Atlantic Media.
The average length of employment for content moderators is between three and six months. It's sort of incredible that some workers even last that long, especially considering that one of Chen's sources earned just about $300 a month.