As the FBI’s investigation into Russian election interference reached a fever pitch, Facebook rolled out a new News Feed alert Monday night. The bulletin told users who had followed pages created by Russian trolls that those pages had been removed. And some of the affected users did not like this.
A brief search revealed that numerous people believe the removals amount to censorship by Facebook. Some users argued that they should be allowed to decide for themselves what’s “true, fake, or otherwise,” an argument that quickly becomes a slippery slope in this era of algorithm-driven confirmation bias.
Others took on a more conspiratorial tone, claiming that Facebook failed to reveal which pages were removed (despite the alert containing a link listing the pages in question).
Facebook first released the information in December, creating a help page that showed users whether they had liked or followed pages and accounts associated with the Internet Research Agency, Russia’s notorious troll farm. Monday’s alert, however, seems to have inspired newfound alarm. The fact that Facebook explicitly stated which pages were deleted has done little to reduce the anger over the allegedly clandestine silencing.
It’s worth pointing out that the Russian troll pages spanned the political spectrum, targeting people on both the left and the right. All sides, however, seemed certain that the Russian-linked accounts they followed were legitimate. “No FB that’s not how it’s done,” wrote one user, convinced that the page Black Matters, which the House Intelligence Committee says paid thousands of dollars in rubles to promote its content, was being silenced. “The point was to remove fake pages & possible Russian run pages.”
There are plenty of reasons not to trust Facebook, especially when it comes to fake news. The social network removing pages that it’s identified as linked to a known Russian propaganda outfit, however, is not the most suspicious thing happening here. Not only has Facebook testified before Congress about these accounts, but other companies like Twitter have identified similar content on their networks and are also committing to removing it.
One could argue that Facebook is doing too little too late, especially as we learn more about Russian interference in the 2016 election. But the idea that Facebook is at war with the First Amendment by removing these pages is naive at best.
Here’s the thing. Any robot or troll anywhere in the world could create a page called, say, “Veterans Come First.” The machine or the human could fill that page with lots of real or real-looking content that celebrates veterans and the military. The page could look just as legitimate as something set up by USAA or your local American Legion post. In fact, Facebook is designed to make all pages look the same, which is part of why fake pages or fake news articles can be so hard to spot. Just because something is on Facebook—or the internet as a whole—doesn’t mean it’s legitimate.
At the end of the day, trusting an anonymous collection of pixels is just as dubious as it sounds. Facebook is actually doing users a service by removing the Russian troll content in the same way that it does users a service by removing abusive or inappropriate content, but many people clearly do not see it that way.
Asked for comment, a Facebook spokesperson said the company is “undertaking significant efforts to let people on Facebook know that this tool is available,” linking to an earlier blog post on the topic. In the meantime, calm down, folks. We’re all figuring this out together.