Google’s search algorithm is a powerful thing—the tiniest tweaks can tank businesses, subtly influence your purchasing decisions, and shape what information you see. Google has long stated that it “do[es] not use human curation to collect or arrange the results on a page,” but a Wall Street Journal investigation has found that the company does interfere with search results more than it cares to admit.
Fair warning: The WSJ report is as detailed as it is lengthy. It spans over 100 interviews, as well as an independent test of Google’s algorithm versus rivals Bing and DuckDuckGo. (You can read the methodology of that test here.) The general gist is that in recent years, Google has shifted from a hands-off, the-algorithm-knows-best culture to one that takes a more active role in deciding how information appears to users. Overall, the report found that Google made 3,200 changes to its algorithms in 2018; for context, it made 2,400 changes in 2017 and 500 in 2010, according to the report.
Among the more damning-yet-thoroughly-unsurprising findings is that Google changes its algorithm to favor bigger businesses like Amazon and Facebook over smaller independent sellers. In its reporting, the WSJ found that in at least one case, Google also made changes on behalf of eBay after the e-commerce site’s traffic plummeted by roughly a third in 2014.
The WSJ also found that Google keeps blacklists to either remove or prevent sites from showing up in certain search results, despite the company publicly denying the existence of such blacklists. Notably, these blacklists are separate from those required by law, such as lists for child abuse or copyright infringement. Along those lines, Google also reportedly has its engineers create algorithms and blacklists to filter out “controversial” results for topics like immigration or abortion.
On top of that, the report found that Google engineers regularly tweak what you see in features like auto-complete suggestions, “knowledge panels” (those boxes you see on the right side of a search page), featured snippets, and news results. Because these curated features aren’t considered organic search results, the report found, they aren’t held to the same standard and are thus easier for Google to edit.
And of course, Google is paying thousands of contractors—not particularly well, mind you—to assess the quality of algorithm rankings. The contractors are basically informed as to what Google considers “correct” result rankings, and told to adjust their rankings accordingly. One interviewed contractor said they were paid $13.50 per hour for about 20 hours per week and were told to rank the National Suicide Prevention Lifeline as the number one result across all searches related to suicide.
All of this is a lot to parse—each individual finding itself carries a lot of implications! Altogether, though, the controversy boils down to the fact that Google regularly defends itself against regulators by saying humans do not meddle with its sacred algorithm. At the same time, it’s under increasing pressure to do something about misinformation proliferated via search. In one cited example, Google searches in 2017 for addiction centers often surfaced facilities with dubious records. After industry leaders lobbied Google, “rehab” searches changed to prominently surface a hotline run by the Substance Abuse and Mental Health Services Administration, an agency of the U.S. Department of Health and Human Services.
Arguably, while Google meddling in search results can be wildly dangerous—think caving to business or political pressures to alter what results you get, and thus how you are informed about the world—there are some cases where it’s understandable, as when it tweaked lesbian-related searches to be less pornographic or when it removed Holocaust-denying results. Part of the problem is that Google is frustratingly tight-lipped about how frequently, and under what circumstances, it will interfere with search results—usually on the grounds that the more transparent it is about the process, the more vulnerable the algorithm is to bad actors who would seek to “game the system.”
Even when it does acquiesce, it’s not like Google then notifies the parties involved of its reasoning. (See: this seminal article in which Google’s algorithm lied about how long it takes to caramelize onions, and the quiet fix that quickly followed.)
For its part, Google takes serious issue with the WSJ’s findings. With regard to the blacklists, the company told Gizmodo its public denial is in reference to a specific question regarding blacklists for political reasons—which it says it does not do. As for favoring big companies over smaller ones, it says the search algorithms are designed to boost “authoritative” and relevant results. It also says contractors are provided specific guidelines, which are publicly available, to ensure consistent data.
“We have been very public and transparent around the topics covered in this article, such as our Search rater guidelines, our policies for special features in Search like Autocomplete and valid legal removals, our work to combat misinformation through Project Owl, and the fact that the changes we make to Search are aimed at benefitting users, not commercial relationships,” a Google spokesperson told Gizmodo over email. “This article contains a number of old, incomplete anecdotes, many of which not only predated our current processes and policies but also give a very inaccurate impression of how we approach building and improving Search. We take a responsible and principled approach to making changes, including a rigorous evaluation process before launching any change—something we started implementing more than a decade ago. Listening to feedback from the public is a critical part of making Search better, and we continue to welcome feedback.”