Facebook on Saturday pushed back against recent reports from the Wall Street Journal that cited a trove of leaked company documents to outline how Facebook executives have been slow to respond to known problems across its platforms that harm users. In a company blog post, Facebook’s vice president of global affairs, Nick Clegg, said the articles contained “deliberate mischaracterizations” and “conferred egregiously false motives to Facebook’s leadership and employees.”
The Journal, referencing internal documents that included research reports, online employee discussions, and drafts of presentations to senior management, said Facebook’s researchers sounded alarms about “the platform’s ill effects” time and time again, but those warnings were ignored by higher-ups. The documents revealed company research showing how detrimental Instagram can be to teen mental health, that Facebook’s executives failed to address employee concerns about the platform being co-opted by human traffickers in developing countries, and that Facebook gives preferential treatment to certain high-profile users who flout its rules.
Clegg said that while it was “absolutely legitimate” for Facebook to be held accountable for how it tackles harmful issues on its platforms, the Journal’s reports used cherry-picked quotes from leaked material to create “a deliberately lop-sided view of the wider facts.”
“At the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company,” Clegg said.
He added: “With any research, there will be ideas for improvement that are effective to pursue and ideas where the tradeoffs against other important considerations are worse than the proposed fix. The fact that not every idea that a researcher raises is acted upon doesn’t mean Facebook teams are not continually considering a range of different improvements.”
Clegg defended Facebook’s handling of posts about coronavirus vaccine information, another issue flagged in the Journal’s reporting. He said the “intersection between social media and well-being” is still an evolving issue within the research community, and that social media is changing rapidly in response to an “ever-growing body of multi-method research and expert input.”
According to an article published Friday, Facebook’s researchers warned the company that anti-vaxxers were teaming up to flood the comment sections of vaccine-related content with propaganda and other false claims. An internal report in early 2021 estimated that more than 40% of comments on vaccine-related content appeared to discourage people from getting the coronavirus vaccine. Global health organizations such as the World Health Organization and Unicef, whose posts were among those getting bombarded, had also expressed concerns to Facebook about the problem.
These targeted anti-vax misinformation campaigns ramped up in the months after Facebook CEO Mark Zuckerberg signaled that the platform would not act against anti-vaccination misinformation as aggressively as it had against misinformation about the coronavirus pandemic itself.
“If someone is pointing out a case where a vaccine caused harm or that they’re worried about it — you know, that’s a difficult thing to say from my perspective that you shouldn’t be allowed to express at all,” he told Axios in September 2020.