backfire effect

The "backfire effect" is a term coined by Brendan Nyhan and Jason Reifler to describe how some individuals when confronted with evidence that conflicts with their beliefs come to hold their original position even more strongly:

For instance, in a dynamic process tracing experiment, Redlawsk (2002) finds that subjects who were not given a memory-based processing prime came to view their preferred candidate in a mock election more positively after being exposed to negative information about the candidate. Similarly, Republicans who were provided with a frame that attributed the prevalence of Type 2 diabetes to neighborhood conditions were less likely to support public health measures targeting social determinants of health than their counterparts in a control condition (Gollust, Lantz, and Ubel 2009).

Another example of the backfire effect is given by Yale political scientist John Bullock. He found that a group of Democratic volunteers who did not favor the appointment of John G. Roberts Jr. to the U.S. Supreme Court became even more negative in their views about Roberts when told that he had been accused in an ad by an abortion-rights group of "supporting violent fringe groups and a convicted clinic bomber." The increase from 56 percent disapproval to 80 percent disapproval after being provided with data confirming their opinion is understandable. We know that political misinformation works by feeding into people's pre-existing beliefs: we're likely to accept uncritically any information, true or false, that fits with what we already believe. What isn't so understandable is why, after being shown a refutation of the ad by abortion-rights supporters and being told that the advocacy group had withdrawn the ad, Democratic disapproval of Roberts dropped only to 72 percent. For many, the facts didn't move belief in the direction of the evidence: despite strong evidence against the accusation, their disapproval remained far stronger than before they saw the ad.

Nyhan and Reifler found a backfire effect in a study of conservatives. The Bush administration claimed that tax cuts would increase federal revenue (the cuts didn't have the promised effect). One group was offered a refutation of this claim by prominent economists, including current and former Bush administration officials. About 35 percent of conservatives told only about the Bush claim believed it. The percentage of believers jumped to 67 percent when the conservatives were also provided with the refutation of the idea that tax cuts increase revenue.

A final example:

A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong.*

Some think the backfire effect is due to a cognitive deficit: people view unfavorable information as being in agreement with their beliefs (Lebo and Cassino 2007). Nyhan and Reifler, however, interpret backfire effects "as a possible result of the process by which people counterargue preference-incongruent information and bolster their preexisting views." That seems like a roundabout way of saying that people dig in when confronted with evidence contrary to their beliefs, but it doesn't explain why they do so. Another explanation involves communal reinforcement and the assumption that there is more information you don't have that supports your belief. If you know that there is a community of believers who share your beliefs, and you believe that there is probably information you don't yet have that would outweigh the contrary information provided, rationalization becomes easier. The rationalization process may also lead you to give more weight to reinforcement from the community of believers. How much play your belief gets in the media, compared with the play given to contrary information, may also contribute to the backfire effect. If messages supporting your belief are presented far more frequently in the media than messages contrary to your belief, or are presented repeatedly by people you admire, the tendency might be to give those supportive messages even more weight than before.

Whatever the cause, the backfire effect is very curious. The more ideological and the more emotion-based a belief is, the more likely it is that contrary evidence will be ineffective. There is some evidence that lack of self-confidence and insecurity correlate with the backfire effect. More research is needed to explain what additional factors lead some people to respond to contrary evidence by treating it as if it were additional support for their belief. Further research is also needed to see whether some groups are more susceptible to the backfire effect than others (liberals and conservatives, theists and atheists, skeptics and true believers) and, if so, why.

I have seen the backfire effect in action many times, but I have no idea why it happens. One experience in particular stands out in my memory. A young woman in my introductory philosophy class told me, after we had gone through some very strong arguments against theism and Christianity from philosophers like Bertrand Russell and some very weak arguments from the likes of C. S. Lewis, that she was more convinced than ever of her Christian beliefs. Furthermore, she told me that my arguments had strengthened her resolve to transfer to a Bible college and major in theology. For other examples of the backfire effect, see my discussion of homeopathy with Jacob Mirman, M.D., and my discussion with Valerie the astrologer, who continues to send me "proofs" of her beloved astrology despite my strongest arguments that prediction from a theory is not the same as retrofitting events to fit a theory.

See also continued influence effect, motivated reasoning, nasty effect, Belief Armor, Evaluating Personal Experience, Why Do People Believe in the Palpably Untrue?, and Defending Falsehoods.

further reading

books

Carroll, Robert Todd. 2011. Unnatural Acts: Critical Thinking, Skepticism, and Science Exposed! James Randi Educational Foundation.

Kahneman, Daniel. 2011. Thinking, Fast and Slow. Farrar, Straus and Giroux. See my review of this book.

McRaney, David. 2011. You Are Not So Smart. Gotham.

articles

Carretta, T. R. and R. L. Moreland. 1983. The direct and indirect effects of inadmissible evidence. Journal of Applied Social Psychology, 13, 291-309. (abstract)

Cook, John and Stephan Lewandowsky. 2011. The Debunking Handbook. "Debunking myths is problematic. Unless great care is taken, any effort to debunk misinformation can inadvertently reinforce the very myths one seeks to correct. To avoid these 'backfire effects,' an effective debunking requires three major elements. First, the refutation must focus on core facts rather than the myth to avoid the misinformation becoming more familiar. Second, any mention of a myth should be preceded by explicit warnings to notify the reader that the upcoming information is false. Finally, the refutation should include an alternative explanation that accounts for important qualities in the original misinformation."

Drum, Kevin. 2008. The Backfire Effect. Mother Jones.

Gollust, Sarah E., Paula M. Lantz, and Peter A. Ubel. 2009. "The Polarizing Effect of News Media Messages About the Social Determinants of Health." American Journal of Public Health 99(12): 2160-2167.

Johnson, Hollyn M. and Colleen M. Seifert. 1994. Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences. Journal of Experimental Psychology. Vol. 20, No. 6, 1420-1436. "The findings suggest that misinformation can still influence inferences one generates after a correction has occurred; however, providing an alternative that replaces the causal structure it affords can reduce the effects of misinformation."

Lebo, Matthew and Daniel Cassino. 2007. "The Aggregated Consequences of Motivated Reasoning." Political Psychology 28(6).

Nyhan, Brendan and Jason Reifler. 2010. When Corrections Fail: The persistence of political misperceptions. (prepublication version) Political Behavior 32(2): 303-330.

Redlawsk, David. 2002. "Hot Cognition or Cool Consideration? Testing the Effects of Motivated Reasoning on Political Decision Making." Journal of Politics 64(4): 1021-1044.

news

I Don't Want to Be Right, by Maria Konnikova, The New Yorker, May 16, 2014. "The longer the narrative remains co-opted by prominent figures with little to no actual medical expertise—the Jenny McCarthys of the world—the more difficult it becomes to find a unified, non-ideological theme. The message can't change unless the perceived consensus among figures we see as opinion and thought leaders changes first."

According to a New Study, Nothing Can Change an Anti-Vaxxer's Mind. Slate. "...researchers...tested four different pro-vaccination messages on a group of parents with children under 18 and with a variety of attitudes about vaccination to see which one was most persuasive in persuading them to vaccinate....the results are utterly demoralizing: Nothing made anti-vaccination parents more amenable to vaccinating their kids. At best, the messages didn't move the needle one way or another, but it seems the harder you try to persuade a vaccination denialist to see the light, the more stubborn they get about not vaccinating their kids....but there's also some preliminary research from the James Randi Educational Foundation and Women Thinking Inc. that shows that reframing the argument in positive terms can help. When parents were prompted to think of vaccination as one of the steps you take to protect a child, like buckling a seat belt, they were more invested in doing it than if they were reminded that vaccine denialists are spouting misinformation. Hopefully, future research into pro-vaccination messaging, as opposed to just anti-anti-vaccination messaging, will provide further insight." [Here's a link to the article in Pediatrics that the Slate article is referring to.]

The Backfire Effect: More on the press's inability to debunk bad information, by Craig Silverman. "...the backfire effect makes it difficult for the press to effectively debunk misinformation. We present facts and evidence, and it often does nothing to change people's minds. In fact, it can make people dig in even more. Humans also engage in motivated reasoning, a tendency to let emotions set us on a course of thinking that's highly biased, especially on topics we care a great deal about."

The Power of Political Misinformation by Shankar Vedantam, The Washington Post. 15 September 2008. "...a series of new experiments show that misinformation can exercise a ghostly influence on people's minds after it has been debunked, even among people who recognize it as misinformation. In some cases, correcting misinformation serves to increase the power of bad information."

Keohane, Joe. 2010. How facts backfire: Researchers discover a surprising threat to democracy: our brains. Boston.com. "The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn't backfire, but the readers did still ignore the inconvenient fact that the Bush administration's restrictions weren't total."

blogs

The Backfire Effect - You Are Not So Smart - A Celebration of Delusion, by David McRaney. An excellent overview of the backfire effect, confirmation bias, selective skepticism, and biased assimilation. "What should be evident from the studies on the backfire effect is you can never win an argument online. When you start to pull out facts and figures, hyperlinks and quotes, you are actually making the opponent feel as though they are even more sure of their position than before you started the debate. As they match your fervor, the same thing happens in your skull. The backfire effect pushes both of you deeper into your original beliefs."

The Frontal Cortex, Cognitive Dissonance and Politics, 16 September 2008.

Last updated 14-Mar-2015

© Copyright 1994-2016 Robert T. Carroll