One of the enduring criticisms of Facebook is that the News Feed is a black box. Starting this week, however, the box will suddenly take on a shade of dark gray.
No one outside the company fully understands why a given post appears in a user's feed. We know that Facebook's algorithms decide what shows up in the News Feed, and that they are regularly adjusted and tweaked, but not how. Even the company itself has said the opaque system is breeding mistrust. Now, Facebook is taking small steps to shed at least some light on why the News Feed does what it does.
Facebook announced on Sunday that it's adding a “Why am I seeing this post?” feature to the News Feed as a way to add transparency to the central piece of the social network. The tool, which goes live this week, will provide users with some context about why posts appear, as well as, importantly, links to controls that let them tweak what shows up in their News Feed and change their privacy settings.
The greater availability of content controls is just as important as the tidbits users will learn about what pops up in their News Feeds, since most Facebook users still don't know how the company uses their information, according to a survey by Pew.
There are two key points here. First, users should have easier access to control over their own News Feed. Second, it’s not clear just how much Facebook users will now know about why posts appear and what the company knows about them even with this tool. We’ll begin to answer that question later this week as the new features roll out.
Facebook, like Google and other tech giants, is regularly criticized for a lack of algorithmic transparency. Manipulation of the News Feed was at the center of Facebook’s role in foreign interference in the 2016 United States elections. The News Feed is where misinformation on vaccines is delivered, it’s how misinformation on the platform spreads rapidly, and it’s ground zero for strange news stories that sow fear and go unfathomably viral without any real explanation.
At Silicon Valley companies, it’s often the algorithms—mathematical formulas and procedures designed to process certain information and complete tasks—that shape what we see on the world’s biggest sites. Google’s search, Twitter’s (non-chronological) timeline, and Facebook’s News Feed are three of many examples of opaque batches of code deciding what we see and hear without letting us in on the decision process.
The new News Feed post feature is similar to the “Why am I seeing this ad?” feature that has appeared on Facebook ads since 2014. The same criticisms apply to Facebook’s ad-delivery system, which wields the same power to decide what gets put in front of you on the social network.
That ads tool is also getting an update, Facebook announced: users will soon be able to see whether advertisers worked with marketing partners on specific ads, and when an advertiser uploaded the targeting information that caused an ad to be shown in the first place.
Neither the News Feed tool nor the improved ads tool has launched yet. But more importantly, even when they do, they’ll only give some measure of context rather than full transparency. It’s not clear exactly what users will know about what they see versus what will remain hidden or when the information they do see will lose its relevance, a significant point given News Feed’s history of drastic and opaque changes.
Facebook’s attempt at News Feed transparency comes as the company makes a show of its efforts to change amid a growing chorus of critics. Following co-founder Mark Zuckerberg’s declaration that the famously know-it-all and track-them-all company would focus on privacy, Zuckerberg called for more regulation, and the company announced it would finally ban white nationalism and fight misinformation on topics like anti-vaccination conspiracies.
Meanwhile, the company is regularly announcing the removal of accounts used by political operators around the world who use Facebook to drive an agenda in manipulative ways. On Monday, for example, Facebook announced the removal of hundreds of inauthentic accounts from India and Pakistan being used to post propaganda in the wake of historic tensions between the two nuclear-armed neighbors.
It’s clear that Facebook sees its global scale and continued ambition for growth as something beyond its own control. In Silicon Valley, the word “scale” is both an executive’s favorite excuse and their most coveted goal.
The question for each of Facebook’s actions now is whether they actually solve these problems or are merely band-aids on stab wounds.