
The Backfire Effect shows why you can't use facts to win an argument


"Never let the facts get in the way of a good story" isn't just a maxim for shady politicians and journalists. It's also the way people often live their lives. One study indicates that there may even be a "backfire effect," which happens when you show people facts that contradict their opinions.

The study tested this effect by having people read a dummy news story about a political issue of the time. One such story concerned the presence of weapons of mass destruction in Iraq. Participants rated how politically conservative they were and answered some preliminary questions. (Some of the questions were a bit extreme. To test whether thoughts of mortality made people more fearful and nationalistic, the researchers asked participants to "jot down" what they thought the process of dying was like. This writing exercise turned out to have no effect on people's later responses.)


After their little essay on death, the participants were given one of two versions of a news story. Both included a quote from then-President Bush, taken out of context, that seemed to indicate Iraq had weapons of mass destruction. One version also included a corrective quote from the Duelfer Report, showing that there was no evidence of stockpiles of such weapons and no programs to create them. The other version included no such correction.

After reading one version or the other, the participants were asked whether they agreed or disagreed with a statement claiming that Iraq had weapons of mass destruction prior to the US invasion. People who rated themselves as liberal, left of center, or centrist did not agree, and whether they had read the correction had little effect on their views. People who rated themselves as conservative did agree. And they agreed even more strongly when they read the article with the correction than when they read the article without it.


The researchers ran a few other experiments along the same lines, all with the same theme, and found conditions that amplified this "backfire effect." Participants were more likely to show it when they sensed that the contradictory information came from a source hostile to their political views. But under many conditions, the mere existence of contradictory facts made people more sure of themselves, or at least made them claim to be.

Everyone has experienced the frustration of bringing up pertinent facts in the middle of an argument and having those facts disregarded. Perhaps the big mistake was not arguing, but bringing up facts in the first place.

Via When Corrections Fail: The Persistence of Political Misperceptions