
Facebook’s so-called “Supreme Court” is reportedly interested in seeking the company’s permission to review the machine-learning models used to determine which Facebook posts are given the most prominence in users’ feeds.
Alan Rusbridger, former editor of Britain’s Guardian newspaper and one of 20 people Facebook handpicked to sit on its Oversight Board, said on Tuesday that, after a mere five months in operation, some members of the board are already feeling the constraints of having to review controversial Facebook decisions on a case-by-case basis. In response, he said, the board may try to shift some of its scrutiny to how Facebook’s product is itself engineered to influence users.
“We’re already a bit frustrated by just saying ‘take it down’ or ‘leave it up’,” Rusbridger told members of the House of Lords, Britain’s upper house of Parliament.
He continued: “What happens if you want to make something less viral? What happens if you want to put up an interstitial? What happens if, without commenting on any high-profile current cases, you didn’t want to ban someone for life but wanted to put them in a ‘sin bin’ so that if they misbehave again you can chuck them off?”
Rusbridger, whose remarks were first reported by the Guardian, went on to suggest the Oversight Board may seek direct access to “the algorithm” employed by Facebook to curate individual users’ feeds.
The Guardian quotes Rusbridger, who stepped down as editor in 2015 following the paper’s explosive coverage of the Edward Snowden leaks, as saying: “At some point, we’re going to ask to see the algorithm, I feel sure, whatever that means. Whether we’ll understand when we see it is a different matter.”
Facebook did not respond when asked if it would consider granting the Oversight Board access to the algorithm or whether it would allow the board to select its own experts for such a review.
The board, which only began hearing cases last fall, is already facing intense pressure to hold the multibillion-dollar company accountable for what experts in online extremism call a veritable deluge of hate speech, disinformation, and conspiracy theories. U.S. civil rights leaders have accused executives of largely ignoring the problem, despite being repeatedly presented with evidence of violence and other real-world consequences disproportionately affecting religious minorities and communities of color.
In October, Democratic Reps. Anna Eshoo and Tom Malinowski accused Facebook of directly facilitating extremist violence across the country, saying the company’s inaction has resulted in U.S. citizens being deprived of their constitutional rights.
The lawmakers pointed specifically at the algorithm, which many researchers—and one of Facebook’s own internal studies—say is geared toward inflaming societal divisions over ideological and hot-button political issues. This is done purposefully, the same researchers say, to drive up engagement, which increases profits. (Asked for comment at the time, Facebook did not respond.)
Rusbridger on Tuesday sought to portray the Oversight Board, which is working to select another 20 members without Facebook’s help, as being fully independent from Facebook’s corporate structure, saying the board does not exist “to please” the company. The board has even ejected Facebook staff in the past, he said, when they’ve attempted to observe deliberations.