
Meta Asks Oversight Board if It Can Finally Stop Caring About Covid Misinformation

The company is asking its self-selected Oversight Board to weigh in on whether expanded content moderation efforts are still needed or "appropriate."

Photo: Andrew Caballero-Reynolds (Getty Images)

Meta, the company routinely railed against by experts and lawmakers for turbocharging covid-19 lies, wants its hand-picked Oversight Board to weigh in on whether the whole misinformation thing is still that big of a deal.

In a blog post published Tuesday, Meta President of Global Affairs Nick Clegg said the company is asking its Oversight Board whether the measures it put in place to address covid-19 misinformation are still necessary and “appropriate.” Clegg said the company broadened its misinformation policies during the pandemic to give itself flexibility to remove factually incorrect information about masks, social distancing, and vaccines. With some countries moving quickly to reduce restrictions, Clegg wants to know if those pandemic-era modifications to Meta’s moderation strategy are still worthwhile.


“As the pandemic has evolved, the time is right for us to seek input from the Oversight Board about our measures to address COVID-19 misinformation, including whether those introduced in the early days of an extraordinary global crisis remain the right approach for the months and years ahead,” Clegg wrote. “The world has changed considerably since 2020.”


As a quick recap, Meta (then Facebook) officially launched the Oversight Board back in 2020 to serve as a type of unbiased, semi-independent panel that reviews the company’s most difficult content moderation decisions. The Oversight Board makes “binding” decisions that Meta has agreed to abide by, though it’s unclear how, or if, the board could enforce a decision Meta vehemently disagreed with. Meta revealed a potential weakness in the Oversight Board’s design earlier this year when it withdrew a request for policy guidance from the board related to content moderation involving Russia’s war in Ukraine. Though panel members on Meta’s so-called “Supreme Court” do have a track record of championing human rights, they were also hand-selected by the company and reportedly receive somewhere around $240,000 in compensation.

Still, the Oversight Board has already waded into plenty of murky water and emerged with policy guidelines. The board ultimately had the final say on whether Meta would keep former President Donald Trump off the platform. Since then, the board has issued decisions in 25 other cases, and Meta has struggled to keep pace with its recommendations. In November, Meta revealed it had “fully implemented” just 12 of the 69 recommendations issued by the board over the previous two quarters.

“The policies in our Community Standards seek to protect free expression while preventing this dangerous content,” Clegg said in this week’s blog post. “But resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic. That’s why we are seeking the advice of the Oversight Board in this case.”

Even though covid-19 definitely isn’t over in the real world, it’s clear Meta would prefer to turn the page on one of its thorniest content moderation dilemmas of recent years. Since the early days of the pandemic, activists, digital information experts, and medical professionals have slammed Meta for its response to covid-19 misinformation across its family of apps. Many of those concerns haven’t let up. Just two months ago, a group of more than 500 doctors, nurses, and other healthcare professionals, outraged over the company’s covid-19 misinformation response, published an open letter to Meta shareholders pushing for a proposal that would mandate an independent assessment of the performance of Meta’s Audit and Risk Oversight Committee. Shareholders ultimately voted that proposal down. President Joe Biden himself famously blamed Facebook last year for “killing people,” before backtracking and clarifying that he meant misinformation was killing people.


To be sure, Facebook has put in place a number of new policies and procedures ostensibly aimed at addressing misinformation on its platforms. Even so, various studies suggest anti-vax content is still influencing many of its users. Case in point: a study published last summer by public health researchers from numerous universities found that 25% of people who said they got their news exclusively from Facebook said they wouldn’t get vaccinated. The only group surveyed that was more likely to identify as anti-vax was people who said they got their covid-related news only from Newsmax. In other words, according to this survey, people who relied on Facebook alone for news were more likely to be anti-vaxxers than people who relied exclusively on Fox News. Yeah, that Fox News.

I guess it’s worth giving Meta some credit here. Managing “misinformation” of any kind is a difficult, complicated, thankless job that’s almost sure to leave one group or another feeling burned. Eventually, it might be reasonable to ease up on covid-19 related content enforcement. But even then, Meta could own that policy change and announce it loudly, rather than cower behind its expensive, self-selected Oversight Board. Once again, as with the Trump decision before it, Meta’s trying to use its Oversight Board as a corporate punching bag.