Facebook's Rose-Colored News Feed

Pope Francis. Haim. An inspirational Little League coach. You've likely seen these stories on Facebook today, along with ice buckets and your friends' babies. I know that because they're sitting atop the social network's "trending" list. What's not there at all? Ferguson. Welcome to Facebookworld, where everything's wonderful all of the time.

There's been no small amount of handwringing about Facebook's role in news, which is another way of saying Facebook's role in how we shape our own personal versions of the universe. With so much pain in the world—much of it intertwined with a singular moment in Missouri—it's rightly troubling that Facebook would rather show us reheated Instagram photos and Viral Nova manipulations than events unfolding in Ferguson. Or Iraq. Or anywhere that might force critical thought. Rome is burning, and Facebook wants to know if we Like the fiddler.

So how did we get here? And maybe more importantly... should we even care?

Your Filtered Feed

While there are certainly ways of viewing your full unfiltered News Feed, the vast majority of Facebook users only ever witness a small fraction of what their friends are sharing at any given moment. That's both because the full glut would be sensory overload, and because it wouldn't leave much room for the brand advertisements that make Facebook money.

So what does the algorithm pluck out of the ether? Facebook keeps that largely secret, since it's the engine that drives a multi-billion-dollar business. But there are a few clear parameters. Ars dives a bit deeper into this, but in short Facebook will demonstrably show you:

  • What you've implicitly asked it to, through your history of Liking and sharing and commenting and clicking.
  • The items your friends have Liked and shared and commented and clicked on.
  • Ads.
  • Things that make you happy.
  • Any combination of the above that includes video.
  • Probably an Upworthy post?
  • More ads.

As it turns out, the things we are Liking and sharing and commenting and clicking on aren't breaking news updates. They're not politically charged analyses. They're not things we disagree with. They're the easily digestible lists, the things only 90s kids would know. All Facebook does is provide the world's biggest chamber; we're the ones who fill it with echoes.

And you know what? That's probably fine.

Lust For Like

So no, Facebook is not the place to go to have your convictions challenged. It is not good at news. Facebook is a series of mirrors that shows you cascading versions of what you already know to be true in your heart, even if it's false.

That's not ideal. But it's also not apocalyptic. In fact, it's odd that we would have expected anything else.

Facebook's objective is to connect you to people; the more connected you feel, the longer you stay on the site. The longer you stay on the site, the more ads you see. The more ads you see, the more likely it is that you'll accidentally click one. That's why we don't see incendiary topics—and yes, many of your friends disagree with you about Ferguson—in our feeds. If we wanted to pick a fight, we'd be on Twitter.

To expect your News Feed to make you feel anything more than warm and fuzzy is like expecting Cheetos to build muscle mass. In an ideal world, sure. But junk food, like Facebook's algorithm, exists without agency; it simply wants to be consumed.

And even if Facebook did suddenly decide it had a social obligation to surface stories it deemed important, you'd run into the opposite issue. Important to whom? Whose take do you get to see? And does arguing about it—or agreeing with it—on Facebook ever do anything but more firmly entrench you in your point of view?

Meanwhile, there are other places to get news. Some of them have the same problems of self-reinforcement; I largely agree with The Nation, and would never reach for The Weekly Standard. But there's no shortage of options that exist outside of your News Feed to read about things that matter. That we've come to depend on Facebook as a news ticker is on us, not the algorithm.

Facebook shows us the world through rose-colored glasses, sure. But the real problem is that we decided to wear them in the first place.