According to a juicy new report from the Wall Street Journal, several Facebook employees threatened to quit over CEO Mark Zuckerberg’s decision to allow posts by Donald Trump to remain on the social network.
Citing “people familiar with the matter,” the report says that some workers took issue with “certain posts about banning Muslims from entering the US,” arguing that they violated Facebook’s stated policy on hate speech. Zuckerberg, however, apparently elected to keep the posts up over concerns about censorship. Following that decision, some employees working on a team that reviewed content threatened to quit.
According to the report, the issue first came to a head in December, after users reported Trump’s Facebook content as hate speech. While a few employees apparently said internally that the posts did indeed violate the policy, content reviewers were ordered to leave the posts up. Monika Bickert, Facebook’s head of global policy management, later said they were left up because the company wanted to remain fair in the middle of an election season. In January, a Muslim employee asked Zuckerberg during a town hall meeting how he could excuse Trump’s comments. Zuckerberg apparently responded that while the Republican nominee’s remarks did count as hate speech, the consequences of removing them would be too great.
Some employees found this reasonable, according to the Journal, but others did not, including some Muslim employees. Some raised the policy with their managers; some formed internal Facebook groups objecting to it; others said they would leave the company.
In an email, a spokesperson for Facebook had this to say:
“When we review reports of content that may violate our policies, we take context into consideration. That context can include the value of political discourse. Many people are voicing opinions about this particular content and it has become an important part of the conversation around who the next U.S. president will be. For those reasons, we are carefully reviewing each report and surrounding context relating to this content on a case by case basis.”
The spokesperson declined to comment on the specifics of the Journal report.
Interestingly, all of this comes on the same day that the company announced it would alter its approach toward what kind of content it allows—a contentious issue that has come up repeatedly in the past.
“In the weeks ahead, we’re going to begin allowing more items that people find newsworthy, significant, or important to the public interest—even if they might otherwise violate our standards,” a blog post detailing the change said.
It’s unclear how closely today’s announcement ties in with the issues reported in the Journal, but it’s worth noting that the specifics of the changes remain vague—there are no details about what form these changes will take, or when and how they will be implemented. (Facebook’s spokesperson declined to comment on the record about specific questions regarding the policy changes.)
The 2016 election has proven to be a veritable minefield for Facebook. Gizmodo reported in May that curators on the platform’s trending news team, who were later ousted, had been suppressing conservative news. This week, free speech issues popped up yet again after news broke that Peter Thiel, a Silicon Valley billionaire and Facebook board member, was planning to donate $1.25 million to Donald Trump’s campaign.
In that case, too, Facebook and Zuckerberg came to Thiel’s defense, arguing that keeping him on the board was important for preserving “diversity.” Zuckerberg’s explanation started out by noting he wanted “to quickly address the questions and concerns about Peter Thiel as a board member and Trump supporter.” Given that the memo was not a public statement but rather a leaked internal one, it seems eminently possible, particularly now, in light of the Journal’s report, that those concerns came from within the organization.