Facebook Experiments With Being Less Awful, Says Not to Get Used to It or Anything

Photo: Bill Clark-Pool (Getty Images)

Has Facebook learned jack shit from the past few nightmare years? Not really, per a report in the New York Times on Tuesday. Facebook only started giving more weight to reputable publishers in the News Feed days after the 2020 election and doesn’t plan on making that a long-term thing. Executives on its policy team also blocked or sought to water down changes that would limit content the company defined as “bad for the world” or “hate bait,” and shot down a feature that would warn users if they fell for hoaxes.


According to the Times, CEO Mark Zuckerberg agreed days after the election to tweak the Facebook news feed to emphasize “news ecosystem quality” (NEQ), a “secret internal ranking it assigns to news publishers based on signals about the quality of their journalism,” because of rampant misinformation spread by Trump and his conservative allies over the election’s results. The Times wrote:

Typically, N.E.Q. scores play a minor role in determining what appears on users’ feeds. But several days after the election, Mr. Zuckerberg agreed to increase the weight that Facebook’s algorithm gave to N.E.Q. scores to make sure authoritative news appeared more prominently, said three people with knowledge of the decision, who were not authorized to discuss internal deliberations.

The change was part of the “break glass” plans Facebook had spent months developing for the aftermath of a contested election. It resulted in a spike in visibility for big, mainstream publishers like CNN, The New York Times and NPR, while posts from highly engaged hyperpartisan pages, such as Breitbart and Occupy Democrats, became less visible, the employees said.

Facebook had allegedly been weighing similar options to slow down the flow of misinformation in the event of a contested election—such as a pilot program to test something resembling a “virality circuit breaker,” which automatically stops promoting posts that go explosively viral until fact-checkers can look at them.

Report after report emphasized that Facebook remained a massive vector for the spread of right-wing disinformation efforts going into the elections, in part because it was fearful of upsetting Republicans convinced social media firms are secretly censoring them. Pro-Trump conspiracy theories alleging Democrats were preparing to win the election by fraud flourished with little intervention. So it’s rather convenient that Facebook only decided to weight NEQ more heavily in the news feed when it became clear Trump had lost.

The break-the-glass strategy wasn’t activated in the weeks or months prior to Nov. 3, when conservative media was promoting wild predictions of a rigged election. The platform’s useless warning labels failed to prevent post-election claims of mass voter fraud from the president and GOP-aligned media personalities from going viral. Nor did Facebook ever have a “plan to make these [NEQ changes] permanent,” Facebook integrity division chief Guy Rosen told the Times. That’s despite employees reportedly asking at company meetings whether the company could just leave the NEQ weights in place to improve the news feed somewhat.

According to the Times, Facebook internally released the results of a test this month called “P(Bad for the World),” in which it tested reducing the reach of posts users dubbed “bad for the world.” After it found a stricter approach decreased total user sessions as well as time spent on the site, it rolled out a less aggressive version that didn’t impact those metrics as much. To put it another way: Facebook knows being “bad for the world” in moderation is good for business.


Sources told the paper that before the election, executives on its policy team vetoed a “correct the record” feature that would direct users who engaged with or shared hoaxes to a fact-checking page and prevented an anti-“hate bait” feature from being enabled on Facebook Pages—instead limiting it to Groups. In both cases, the executives claimed that the changes might anger conservative publishers and politicians. (Rosen denied to the Times that the decisions were made on political grounds.)

Trump and the GOP’s threats to punish social media sites for liberal bias are dead in the water, and Facebook is likely to shift with the political winds in the coming months. But if its history is any indication, Facebook will continue playing a shell game of promising to rein in toxicity while actively encouraging it.


“The question is, what have they learned from this election that should inform their policies in the future,” Vanita Gupta, CEO of the Leadership Conference on Civil and Human Rights, told the Times. “My worry is that they’ll revert all of these changes despite the fact that the conditions that brought them forward are still with us.”

It’s not clear how much increasing NEQ’s clout in News Feed rankings has affected the number of times users log in or how long they spend on the site once they get there. Facebook’s News Feed lead, John Hegeman, told the paper the company would study any potential impact, though, like Rosen, he indicated the changes are temporary.


