Meta Oversight Board Says Facebook and Instagram Skirt Moderation Rules for Famous People

The watchdog said that Meta’s "cross-check" systems on Facebook and Instagram gave celebrities and businesses more leeway in posting unmoderated content.

Meta’s Oversight Board has been analyzing the company’s cross-check system for over a year, and has now largely concluded what a 2021 Wall Street Journal report found: the program gave celebrities the power to post harmful content with little chance of a full review.
Photo: Dan Kitwood (Getty Images)

On Tuesday, Meta’s Oversight Board released a more-than-50-page report detailing how the company needs to overhaul the systems that have given major influencers and celebrities leeway to post disingenuous or harmful content that would otherwise be moderated.

It all has to do with Meta’s so-called “cross-check” system, which was detailed last year in a bombshell report by The Wall Street Journal. Cross-check, also referred to as XCheck, was designed to shield a cultivated list of a few million celebrities, influencers, politicians, businesses and more from the content moderation that its other 3.7 billion users were subjected to. The cross-check system was supposed to have real humans personally look at these special accounts to see if they were violating the platforms’ rules, but sometimes these reviews fell by the wayside for days at a time. Some cross-check notifications never got reviewed at all.

The semi-independent Oversight Board largely agreed with the Journal’s article, writing that “we found that the program appears more directly structured to satisfy business concerns” by essentially giving “certain users” extra protection from content moderation. The company had even failed to track whether cross-check was more accurate than its automated systems. Further, the board noted the company had repeatedly misled it about cross-check, which often gave celebrities a free pass on content, as Facebook whistleblower Frances Haugen revealed in what’s become known as the Facebook Papers.

Essentially, anyone with a strong online presence ended up “whitelisted,” according to the 2021 WSJ article. Anyone on the list was given a full 24 hours to personally take down or change offending content so they could avoid any penalties. Most of those who were on the list didn’t even know it. This system had reportedly included former President Donald Trump before he was eventually banned in 2021. The company has not yet decided if Trump will be let back on Facebook come 2023.

Citing thousands of pages of internal documents and several briefings with company execs, the board said it sometimes took the company “more than five days” before Facebook staff got to review posts flagged under XCheck. All the while, the offending content remained up on the platform. The system gave some accounts far more room to violate Facebook’s policies: the Journal noted it blocked moderators from removing nude photos of a woman posted by prominent Brazilian soccer star Neymar da Silva Santos Jr., content that would have gotten any other account penalized under the company’s policies.

The board’s conclusions come more than a year after it accepted Meta’s request to look into its internal systems. The board told Meta it needs to restructure its moderation systems, chiefly by being far more transparent about who is eligible for extra review when the moderation system makes mistakes. The board also said “high-severity” content needs to be removed or hidden while it’s under review. Meta has 90 days to consider the Oversight Board’s opinions and respond.

This report is especially notable because the board usually sides against overt moderation of individual posts. The board recently told the company to reinstate a post comparing the Russian army fighting in Ukraine to Nazis in WWII, and it will also issue recommendations on whether Meta should rescind its covid-19 misinformation policies. Come 2023, Facebook and Instagram’s hidden moderation policies could look very different.