Google's Top Suggestion for the Sutherland Springs Killer Is a Debunked Antifa Theory


Was America’s latest mass shooting committed by a member of antifa, the leaderless anti-fascist collective? No. But Google’s autocomplete results are still suggesting that he might be.

Americans watched social media in horror yesterday in the aftermath of a familiar sight: A gunman in Sutherland Springs, Texas, entered a church and killed 26 people, wounding 20 others, ten of whom are still in critical condition. People were curious about the motive for a shooting that left even an 18-month-old baby dead, and some on the far right claimed the gunman was a leftist. Google has amplified that debunked theory.

Multiple news organizations, including the New York Times, are now reporting that the motive for the shooting was some kind of family dispute. The gunman’s mother-in-law attended the church, though she wasn’t present at the time of the shooting.


“This was not racially motivated, it wasn’t over religious beliefs, it was a domestic situation going on,” Freeman Martin, a spokesman for the Texas Department of Public Safety, said in a press conference earlier today.

We don’t know the specifics of the dispute yet, but there are no signs that whatever drove the shooter was some kind of anti-Republican bias. Far from it.


But if you search the gunman’s name, Devin Kelley, on Google, the first autosuggestion is for the term antifa. Far-right conspiracy theorists like Mike Cernovich, a man best known for pushing the debunked Pizzagate conspiracy, didn’t waste any time in claiming that Kelley was a leftist.


In the immediate aftermath of the shooting, Google results were floating plenty of unhinged conspiracy theories, including those from InfoWars and Cernovich. And while Google acknowledges that it has to do better to present reliable news coverage, the company says that it’s difficult to monitor these things in real time.

In response to questions yesterday, Google told Gizmodo:

The search results appearing from Twitter, which surface based on our ranking algorithms, are changing second by second and represent a dynamic conversation that is going on in near real-time. For the queries in question, they are not the first results we show on the page. Instead, they appear after news sources, including our Top Stories carousel which we have been constantly updating. We’ll continue to look at ways to improve how we rank tweets that appear in search.


But that doesn’t address the Google autosuggestions, which are still present. Not to mention the countless videos still populating YouTube (which Google owns) that claim the gunman was a leftist bent on killing Christians.

As mass killings continue unabated, it can be incredibly depressing to see conspiracy theorists exploit these tragedies for their own gain. Because make no mistake, the business model of men like Cernovich is to create as much fear and resentment as possible so that they can sell their books and their supplements.


Whether it’s deliberately placed fake news after the Las Vegas shooting or this most recent tragedy, the internet has provided a depressingly efficient method for spreading disinformation. And as long as companies like Google allow these outrage-peddling charlatans to profit from it, the country will be worse off.

Update 5:00pm: Google sent Gizmodo a statement about the issue that more or less says they’re working on it.

Autocomplete predictions are algorithmically generated based on users’ search activity and interests. Because of this, terms that appear in Autocomplete may be unexpected or unpleasant. In this case, there is great interest in the topic which is being reflected in the tool. We try to be careful with autocompletions on names, and in this case, our system did not work as intended. We’re currently working on our system for name detection to improve this process moving forward.