Photo: Getty

Facebook wants to prove to lawmakers and the public that it’s working diligently to combat political disinformation and foreign election meddling, and on Tuesday, the company revealed it recently shut down a network that appears to be mirroring the kind of activity Russia is accused of orchestrating in the run-up to the 2016 election.

In a blog post, Facebook disclosed that it has removed 32 pages and accounts from its namesake social network and its subsidiary Instagram. It claims they were removed for the offense of engaging in “coordinated inauthentic behavior.” Does this mean the beleaguered company just took down another Russian troll farm ahead of the midterms and saved democracy? Not even close.


Facebook says it’s still in the early stages of its investigation and that it’s not even sure who’s behind the network of accounts. The company says it first identified the suspicious activity “about two weeks ago” and has been in contact with political leaders on Capitol Hill and the FBI in the meantime. We’re just going to be blunt and say up front that what Facebook claims to have found is not particularly impressive.

First of all, the seven Instagram accounts that were deactivated apparently had zero followers. Of the eight Facebook pages, four had between zero and ten followers. The other four pages, “Aztlan Warriors,” “Black Elevation,” “Mindful Being,” and “Resisters,” accounted for the network’s combined following of “more than 290,000 accounts.”

In case the names of those four accounts don’t make it clear, Facebook indicates that they were set up to spread typical left-wing talking points around racial equality, immigrant rights, and general opposition to the Trump administration. The various pages tried to stage 30 events since May 2017, but they don’t appear to have had much success. Around half of the events had fewer than 100 users express interest in attending, but the most popular event drew 4,700 users who were interested and 1,400 who said they were going. Anyone who’s ever held a Facebook event knows those numbers would likely be lower as far as actual turnout goes, but it’s still significant. Facebook said it has begun notifying users that they were duped by a bad actor.

Screenshot of a message Facebook is sending to users who planned to attend a counter-protest to the next Unite the Right event.
Image: Facebook

One event that Facebook identified by name was “No Unite the Right 2 - DC,” a counter-protest to a planned demonstration from the same Nazis who marched in Charlottesville, Virginia last year—an event that resulted in multiple assaults and the murder of one counter-protester, Heather Heyer. Facebook’s Head of Cybersecurity Policy, Nathaniel Gleicher, wrote that approximately 2,600 users were interested in the event, and more than 600 users said they would attend.

The company took pains to emphasize that it doesn’t have the tools to determine what the network’s political motivations might be. It did say that “it’s clear that whoever set up these accounts went to much greater lengths to obscure their true identities than the Russia-based Internet Research Agency has in the past.” Make of that what you will.


Facebook says the “bad actors [...] paid third parties to run ads on their behalf.” We’ve reached out to the company for more information on those third parties and to find out if they’ve been contacted by Facebook.

Since Facebook is only claiming the pages were set up by “fake accounts” and were taken down for violating its policies against “coordinated inauthentic behavior,” it’s entirely possible that Americans could be behind the activity. It seems unlikely, however, that Facebook would make a big deal of this if it suspected that were the case. It also seems unlikely that Americans would spend approximately $11,000 on 150 ads to promote accounts that weren’t doing much aside from pushing familiar political talking points like “Abolish ICE.”


Americans are more than capable of being anti-fascist or racist all on their own. When Facebook released the 3,500 Russia-linked ads it had uncovered from the 2016 election, there really wasn’t much in there that you couldn’t find a red-blooded American spewing online. As has been repeated ad nauseam, it seems like the people behind the campaign simply intended to amplify divisions in the U.S. It seems reasonable to say that if that campaign had any effect whatsoever, it was through various attempts at voter suppression.


It’s easy to get lost in the debates over the intelligence community’s assessment that Russia used hacking and disinformation to help Trump win the election, especially when the president contradicts himself in the course of a single sentence about his own beliefs on the subject. And there’s a fine argument to be made that the point is to whittle away at Americans’ belief in any kind of objective truth.

Whatever the case, there’s no rational observer out there who believes Russia hasn’t tried to hack the U.S. and manipulate our political process. It’s an activity that you should assume every nation with an intelligence agency engages in one way or another. What matters is that we do something to mitigate any negative consequences that could come from it.


Facebook is publicly treating this case as just the beginning of its efforts to fight disinformation, saying, “We believe this could be partly due to changes we’ve made over the last year to make this kind of abuse much harder.” But it acknowledges, “security is not something that’s ever done.”

The fact is, it’s kind of surprising we haven’t seen more disclosures like this since Facebook found itself embroiled in a scandal around the election. It may be holding up a pelt today to show off what it’s doing before the midterm elections, but with all its resources and how much emphasis it’s placed on its efforts to combat this problem, one has to ask: That’s it?


This may be the extent of Facebook’s public disclosures, but there are certainly others out there, and what we’re seeing is more than likely just adversaries testing what can be done with the new cyber tools at their disposal. We’ve seen disinformation have far more explicit and deadly consequences in other countries; it may only be a matter of time before someone gets it right here. (Or maybe they already did, I suppose.)

Facebook deserves no pats on the back. It should’ve been aware of the potential consequences its platform could enable, and it should’ve worked harder to fight them. The fact that doing so doesn’t have direct profit potential is all you need to know about why it’s been a blind spot.


But the way the current political climate has evolved raises other questions about how the public will respond to further disclosures. Just as we’ve seen with Trump supporters who were tricked into organizing events in 2016, people don’t care if they were unwittingly manipulated into doing something they would have done anyway. I’m sure those finding out today that a fake account promoted a rally against Nazis would still go protest Nazis tomorrow. If Facebook keeps notifying us about fake accounts we follow, will we become numb to the potential problems, or maybe suspect that everyone on the internet is a bot? Does it matter if we do?

The answer is yes, it matters. It matters that we all become more literate when it comes to cybersecurity and the media. It matters that we retain skepticism while admitting that a global internet invites bad faith arguments and deliberate sabotage of our relationships. It matters that we also acknowledge we’re totally capable of sabotaging our own relationships and blaming it on a scapegoat. It matters that Facebook takes responsibility for the tools that reap billions of dollars in profits. It matters that we have some amount of faith in our own government to give us accurate information regarding cyber threats.


Facebook is a garbage company that has an opportunity to be better. If today’s disclosures are a PR stunt while it applies band-aids to a gaping wound, it’ll matter to the regulators of the world. And as time goes on, it’s going to matter if the public can trust these kinds of reports.

[Facebook via New York Times]

