Following a months-long staffing spree, Facebook’s Independent Oversight Board announced the first six cases it’s taking on to deliberate Facebook’s sprawling policies surrounding content moderation.
Just to recap, the self-declared purpose of this Oversight Board is to act essentially as an unbiased panel meant to review some of Facebook’s more contentious content moderation decisions. As we’ve previously noted, each of the hires made to the Board thus far has some sort of track record championing human rights. At the end of the day, a Board that was shaped by Facebook’s own hand still hews toward Facebook’s principles, as evidenced by its charter, which states it will “pay particular attention to the impact of removing content in light of human rights norms protecting free expression.” More speech solves bad speech has been the unofficial motto of Facebook, and many other online platforms, to what could be described as limited success. Still, the Board’s ultimate decisions are ones that are (supposedly) independent, and supersede those of Facebook itself.
The full list of cases (which you can read here) involves posts pulled from Instagram or Facebook proper for violating one of the platform’s community standards. These six run the gamut of controversial posts removed under Facebook’s policies surrounding hate speech, nudity, and “dangerous organizations.” Here are the first five in a nutshell:
- In the first case, a Facebooker posted screenshots of two tweets by the ex-Malaysian Prime Minister Mahathir Mohamad, stating—among other things—that “Muslims have a right to be angry and kill millions of French people for the massacres of the past.” While Facebook took the post down for violating its policies surrounding hate speech, the user appealed on the grounds that the post wasn’t one of support, but was meant to “raise awareness” of Mohamad’s “horrible words.”
- Case number two involved a post from a Burmese Facebooker asking why (again, among other things) there has been “no retaliation against China” over its horrific treatment of Uyghur Muslims. Facebook also cracked down on this post for violating its hate speech policies. According to the Board, the poster in question claimed that the post wasn’t meant to be taken purely at face value, and was instead meant “to emphasize that human lives matter more than religious ideologies.”
- The third appeal also involves a post found in violation of Facebook’s hate speech policies, this time over a post implying that “Azerbaijani aggression” was behind the destruction of some Armenian-built churches in the country. Per the appeal, this wasn’t meant to call out this population in particular, but was instead meant to “demonstrate the destruction of cultural and religious monuments.”
- The fourth appeal moves out of hate speech territory and into something a bit more salacious. Specifically, a Brazilian Instagram user appealed to the Board over a post that was meant to raise awareness of the signs that could signify breast cancer. The post—which included five photos of female breasts on full display—was removed for violating the platform’s rules surrounding “adult nudity.”
- Appeal five was from a US-based user whose post quoting World War II-era German politician Joseph Goebbels was taken down for violating Facebook’s policies surrounding dangerous individuals and organizations; per the appeal, the quote was taken out of context. “The user indicated in their appeal to the Oversight Board that the quote is important as the user considers the current US presidency to be following a fascist model,” the Board wrote.
The sixth case—and the only one that was brought forward by Facebook itself—involves a video posted by a French user criticizing the country’s coronavirus response and implying that local officials should instead turn to the ineffective albeit very-hyped drug hydroxychloroquine if they want to save lives. Facebook only pulled the video for violating its policies surrounding “violence and incitement” after it had been viewed around 50,000 times.
According to Facebook, the case exemplifies “the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.” Aside from the company’s submission, the Board said in a statement that Facebookers from across the globe have poured in over 20,000 cases since user appeals were opened this past October. As for why these six were chosen, it explained that it’s “prioritizing cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”
While six cases is certainly better than none at all, it’s still a bit disappointing to see this board only grappling with a fraction of a fraction of the 20,000-plus posts that were reported its way. Facebook has spent the better part of the past two years crowing over the creation of this oversight board, while at the same time ignoring the myriad critics pointing out that its very existence seems to be little more than an exercise in “corporate whitewashing.”
If you want to make your feelings heard, each of these cases has a week-long open comment period starting Tuesday and ending the morning of December 8th. Per the post, the Board will reach some sort of ruling and require action on Facebook’s part within 90 days.