Facebook Shares More Details About the Content Oversight Board You'll Be Getting Pissed Off About in the Future

Photo: Getty

Facebook, which turns 15 years old next week, has yet to figure out how to moderate its increasingly destructive platform. In November, the company announced plans to establish an independent board to help oversee some of its more contentious and complex content issues, and on Monday it released more details about this future body of decision-makers.


The draft, titled "An Oversight Board for Content Decisions," goes into greater detail about how many people will make up the board, how Facebook will ensure that the review process remains unbiased, and how members will be chosen, among other things.

“The board will be made of experts with experience in content, privacy, free expression, human rights, journalism, civil rights, safety and other relevant disciplines,” Facebook wrote in a proposal. The company stated that there will be a “diverse set of up to 40 global experts” on the board, who will each serve part-time for a three-year term, which can be renewed once. They will receive fixed compensation. The identities of the board members will be public.

Facebook writes that the board will review the company’s “most challenging content decisions—focusing on important and disputed cases.” These include issues that both Facebook and Facebook users raise to the board, whether it’s someone looking for an appeal on a decision they disagree with, or a decision by the company that’s “especially difficult to resolve,” among other possible cases.

Each case will be assigned a panel, a random grouping of board members rather than the entire board, which will have two weeks to issue an explanation of how it reached its decision. While the explanations will be made public, the identities of the individual members who decided the case will not be.

The first group of board members will be chosen by Facebook, with succeeding members chosen by the existing board. While Facebook can float possible names to the board, its members still have to approve new additions, and Facebook doesn't have the power to take a board member's seat away unless they violate their agreement, according to the proposal.

And while Facebook does get to choose the original members and propose future ones, current or former employees of the social network, including contingent workers, can't join the board; neither can government officials. The draft proposal also notes that board members must disclose any conflicts of interest.


This doesn't mean that the board is wholly free of Facebook's influence: the proposal states that board members will be assisted by a full-time staff. These staff members aren't technically part of the board, and they don't have the board's decision-making powers, but they are expected to work alongside it in some capacity. The proposal states that their role is to "serve the board and ensure that its decisions are implemented."

The proposal gives some helpful insight into how this board will shape up and the types of content it will review, but as Facebook points out, it's just "a starting point" and doesn't address the full breadth of questions and concerns raised about this new external body. Facebook wrote in the proposal that it will publish a final charter that includes the values guiding the future board, arguably a crucial factor in understanding the influence these individuals will have on the platform.


It’s a good system to have in place—to ensure desperately needed oversight and accountability—but it still doesn’t tackle some of the most egregious day-to-day issues plaguing the platform, like harassment, misinformation, and targeted violence.

What's more, it's hard not to view the creation of this independent board as a vehicle for Facebook to evade PR disasters by offloading responsibility to an outside team. Again, this isn't inherently bad, but Facebook has yet to indicate that it prioritizes cleaning up its most harmful messes over protecting its public image. For example, as Motherboard revealed on Monday, the company reportedly directs its moderators to flag posts that might make the company look bad ahead of elections (or what the company internally refers to as a "PRFireRisk," according to Motherboard). It's this sort of lapse in judgment that serves as a model example of why independent experts should have the final say on controversial content. But it also puts on display the daily messes that surely won't rise to the level of this board.

I can see this going the way of ESPN's ombudsman, which started out as a sincere effort to offer criticism of the network but by the end turned into a way for the network to respond to criticism without directly responding, then went away altogether.