Meanwhile, a woman who penned a popular post about attractive braces for those with Ehlers-Danlos syndrome found the post flagged as explicit, presumably because the algorithm can't tell the difference between a finger and a penis.

And this post of a woman in Victorian dress was also flagged for being too salacious, when it wasn't even salacious by 19th-century British standards.

Great British Bake Off? Presumably just loaded with tits and ass.

Even Jesus was not spared.

Some of these posts have since been deemed acceptable, either because the tool is slowly improving or because creators are systematically appealing every post of theirs that has been flagged. Either way, it is a mess and a clear example of why relying on “automated tools to identify adult content and humans to help train and keep our systems in check” is inherently flawed.

The blanket ban on adult content doesn't save kids from porn, but its indiscriminate hand does harm artists and marginalized communities seeking to express themselves. There is no easy way to moderate the internet; cleanly carving out one specific form of expression simply can't be done, and certainly not with an algorithm that people have programmed to approximate human judgment.

We've seen this failure of automated online moderation before. Facebook relied on algorithms and became prey to foreign actors. London's Metropolitan Police tried to build an algorithm that would detect illegal nude images, and it kept flagging photos of sand dunes instead.

Tumblr, meanwhile, has never been particularly gifted at coding its own website. It has so many flaws that regular users rely on browser add-ons like XKit, which fixes a wide range of annoyances, from broken communication features to the inability to whitelist or blacklist controversial subjects via Tumblr's tag system. Its iOS app, before being pulled from Apple's App Store over child pornography concerns, was notable for being incapable of actually loading the images users share, which is, you know, the entire point of the platform. It has repeatedly struggled to use algorithms to deal with that pernicious child porn problem, as well as a Nazi problem. It has regularly banned popular (and crucial) tags like “lesbian,” “trans,” and “LGBTQ” in an effort to stop harmful content while leaving up “necrophilia.” How on Earth could it be expected to properly flag a “female-presenting nipple” just by looking at an image? Fools.

It's also unlikely to get better anytime soon. Networks like Tumblr and Facebook use computer algorithms as moderators because it's a lot cheaper than paying actual humans, and Tumblr, in particular, has had a cash flow problem for some time. The network has failed to monetize its sizeable audience the way Facebook, or even Reddit, has, and its parent company, Oath, has never really understood what Tumblr is, why it's popular, or what it could be. Clearly, that is still the case.