How Upvote/Downvote Sites like Reddit Breed Irrational Herd Behavior

Are you a think-for-yourselfer? Do you weigh positive and negative Yelp reviews with a cold, dispassionate sagacity? Do you fancy yourself immune to the influence of others when you browse Reddit? That's cute. Newly published research says you're wrong.

A study that appears in the latest issue of Science reveals that users of social news aggregator sites like Digg, Reddit, and Hacker News are heavily influenced by the opinions of other users, viewing comments differently depending on how those comments have already been rated. Sound obvious? It's not. Here's some background.

Who knows — maybe you have a Vulcan-like capacity for reason, and you really can resist the powerful effect of others' opinions (what is sometimes referred to as social influence bias). But even if you can, can we at least agree that you're kind of swimming upstream?

The last decade or more has been marked by an explosion of interest in aggregated opinion, and in how we can use it to inform our own beliefs, ideas, and decisions. Need a new Blu-ray player? Search for it on Amazon and sort your results by average customer review. Not sure whether "Blurred Lines" is the catchiest song you've heard in months, deplorably sexist, or just "subtly ridiculing"? Here's a Reddit thread full of ready-made opinions (rated by popularity) primed to create, inform, or supplant your own.

When you're through reading this post (yes, this one), scroll to the bottom of the page and eat your fill of insights, ideas, suggestions, criticisms and other twopennies. Hell, maybe just stop reading right here and scroll to the comments right now for the quick rehash and some witty banter. We've all done it. That's half the reason we read things online anyway, right? For the comments?

(Image: We're all highly suggestible sheep. Photo by Bertoz, via Flickr.)

Here's the thing. In an ideal world, the online rating systems we use to score Blu-ray players, music, restaurants, and even opinions would give rise to massive quantities of useful information that ultimately tell us something about the quality of whatever is being rated. (Call it the wisdom of the massively connected crowd.)

Only it doesn't always work that way. Sometimes a famous sushi bar is famous because people have parroted for years that its fish is the freshest in town, even though it's really only the eighth- or maybe ninth-freshest, and its preparation isn't even overseen by a trained itamae. Sometimes a popular movie is beloved not because it's a good movie, but because everyone says it's a good movie.

Researchers call the tendency to make decisions based on prior ratings "irrational herding." There's evidence suggesting this herd-like behavior could be partly responsible for everything from terrible music on America's Top 40 to the rich-get-richer dynamics of economic inequality.

And yet, our understanding of how social influence affects collective judgment is limited, because it's basically impossible to distinguish irrational herding from genuine agreement on true quality — are "Blurred Lines" and "Get Lucky" the songs of the summer because they're good songs, or because Stephen Colbert says they are? It's probably some mix of both, but to really find out, we need a way to measure the extent to which social influence begets irrational herding.
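To see why vote counts alone can't settle the question, here's a minimal sketch in Python (the voter model and every parameter are invented for illustration; nothing here comes from the study). If voters are even mildly nudged by a comment's running score, comments of identical quality split into hits and flops on early luck alone, so a final score by itself can't tell you whether you're looking at quality or at a herd:

```python
import random

# Toy model (invented for illustration, not from the study): each voter
# upvotes with a probability driven by the comment's intrinsic quality,
# nudged up or down by the sign of the comment's running score.
def final_score(quality=0.5, herding=0.15, n_voters=500):
    score = 0
    for _ in range(n_voters):
        nudge = herding * (1 if score > 0 else -1 if score < 0 else 0)
        p_up = min(max(quality + nudge, 0.0), 1.0)  # clamp to [0, 1]
        score += 1 if random.random() < p_up else -1
    return score

random.seed(7)
scores = [final_score() for _ in range(2000)]
hits = sum(s > 50 for s in scores)
flops = sum(s < -50 for s in scores)
print(f"Identical quality, yet {hits} hits and {flops} flops out of 2000")
```

Randomly assigning the very first vote is what breaks that ambiguity, and it's exactly the lever the experiment below pulls.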

To that end, Lev Muchnik of the Hebrew University of Jerusalem, Sinan Aral of MIT, and Sean J. Taylor of NYU collaborated with a major social news site to conduct a massive randomized controlled experiment, one that would examine how a comment's "upvotes" and "downvotes" distort its public reception. Their observations, published in the latest issue of Science, demonstrate that, regardless of its quality, a comment's very first vote can have a huge impact on individual rating behavior and give rise to herding effects.

Here's how it went down. On an unnamed social news aggregator, described by the researchers as "similar to Digg.com and Reddit.com," the team monitored the status of 101,281 comments left by users in the site's comment threads. The comments were made over the course of five months, during which time they were viewed more than 10 million times and rated by other users a total of 308,515 times.

But here's where things get interesting: the site was rigged so that every time a user left a comment, it was automatically given an upvote (positive treatment), a downvote (negative treatment), or no vote at all (control). At the end of the experiment, the researchers found that comments that automatically received an upvote – just one upvote – were 32% more likely to receive another upvote from the first user to see them, relative to the control group. Those comments were also more likely to snowball in popularity; by the end of the study, the upvote group had received, on average, a 25% higher rating than the control group. Interestingly, comments that automatically received a downvote were actually more likely than the control group to receive an upvote from the second voter, reflecting what the researchers call a "correction effect."
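For intuition about how a single seeded vote can snowball, here's a minimal sketch of the experimental logic, again under an invented voter model (the arm names, probabilities, and herd_bias parameter are ours, not the study's):

```python
import random

ARMS = ("upvote", "downvote", "control")

def run_comment(arm, n_later_voters=20, base_p_up=0.55, herd_bias=0.08):
    """Simulate one comment whose first vote is randomly assigned."""
    score = {"upvote": 1, "downvote": -1, "control": 0}[arm]
    for _ in range(n_later_voters):
        # Hypothetical herding: the visible score nudges each later
        # voter's probability of upvoting up or down.
        p_up = base_p_up + herd_bias * (1 if score > 0 else -1 if score < 0 else 0)
        score += 1 if random.random() < p_up else -1
    return score

random.seed(42)
for arm in ARMS:
    trials = [run_comment(arm) for _ in range(5000)]
    print(f"{arm:8s} mean final score: {sum(trials) / len(trials):+.2f}")
```

Because intrinsic quality (base_p_up) is identical across the three arms, any gap between their mean final scores is attributable to the randomly assigned first vote alone; that is the leverage randomization buys. Note that this symmetric toy can't reproduce the correction effect the researchers observed, since its simulated voters treat an early downvote exactly as credulously as an early upvote.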

"People are more skeptical of negative social influence," said Aral in a statement. "They're more likely to 'correct' a negative vote and give it a positive vote." The teams results suggest that positive social influences tend to accumulate, giving rise to herding effects, while negative social influences wind up being neutralized.

Another interesting observation: comments that appeared in threads relating to business, culture and society, and politics exhibited far greater herding effects than those pertaining to economics, general news, and IT.

"If perceptions of quality are biased by social influence," the authors write, "attempts to aggregate collective judgment and socialize choice could be easily manipulated, with dramatic consequences for our markets, our politics, and our health."

This study obviously lends itself to some pretty cynical conclusions (although the observed "correction effect" suggests we internet voters may be at least halfway decent people), but the authors stress that the better we understand how social biases color public opinion, the more we can do to keep those biases from being exploited.

"Our message is not that we should do away with crowd-based opinion aggregation," Aral says. "Our point is that you need solid science under the hood trying to understand exactly how these mechanisms work in a broad population, what that means for the diffusion of opinion, and how can we design the systems to be fair, to have less incentives for manipulation and fraud, and be safe in aggregating opinions."

The researchers' findings are published in the latest issue of Science.