A Collection of Strange Beliefs, Amusing Deceptions, and Dangerous Delusions

confirmation bias

"It is the peculiar and perpetual error of the human understanding to be more moved and excited by affirmatives than by negatives." --Francis Bacon

Confirmation bias refers to a type of selective thinking whereby one tends to notice and to look for what confirms one's beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one's beliefs. For example, if you believe that during a full moon there is an increase in admissions to the emergency room where you work, you will take notice of admissions during a full moon, but be inattentive to the moon when admissions occur during other nights of the month. A tendency to do this over time unjustifiably strengthens your belief in a relationship between the full moon and emergency-room admissions, and in lunar effects generally.
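The full-moon example can be made concrete with a small simulation. This is an illustrative sketch only: the lunar period, the busy-night probability, the observer's attention model, and all names are assumptions for the demonstration, not data from any actual study. By construction, busy nights are independent of the moon; the biased observer nonetheless ends up with a memory full of confirmations and nothing else.

```python
import random

random.seed(0)

NIGHTS = 3000
PERIOD = 30        # roughly one full moon a month (simplifying assumption)
BUSY_PROB = 0.2    # busy nights occur at random, independent of the moon

full_busy = full_quiet = other_busy = other_quiet = 0
remembered = []    # the evidence a biased observer actually retains

for night in range(NIGHTS):
    full = (night % PERIOD == 0)
    busy = random.random() < BUSY_PROB
    if full and busy:
        full_busy += 1
        remembered.append(night)   # salient: "a full moon, and we're swamped!"
    elif full:
        full_quiet += 1            # quiet full moon: passes unnoticed
    elif busy:
        other_busy += 1            # busy ordinary night: nobody checks the moon
    else:
        other_quiet += 1

rate_full = full_busy / (full_busy + full_quiet)
rate_other = other_busy / (other_busy + other_quiet)

print(f"busy rate on full-moon nights: {rate_full:.2f}")
print(f"busy rate on other nights:     {rate_other:.2f}")
print(f"observer recalls {len(remembered)} confirming nights and 0 disconfirming ones")
```

The two busy rates come out statistically indistinguishable, yet everything in the observer's memory points one way, which is exactly how noticing only the hits strengthens a belief the data do not support.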

This tendency to give more attention and weight to data that support our beliefs than we do to contrary data is especially pernicious when our beliefs are little more than prejudices. If our beliefs are firmly established on solid evidence and valid confirmatory experiments, the tendency to give more attention and weight to data that fit with our beliefs should not lead us astray as a rule. Of course, if we become blinded to evidence truly refuting a favored hypothesis, we have crossed the line from reasonableness to closed-mindedness.

Numerous studies have demonstrated that people generally give an excessive amount of value to confirmatory information, that is, to positive or supportive data. The "most likely reason for the excessive influence of confirmatory information is that it is easier to deal with cognitively" (Gilovich 1993). It is much easier to see how a piece of data supports a position than it is to see how it might count against the position. Consider a typical ESP experiment or a seemingly clairvoyant dream: Successes are often unambiguous or data are easily massaged to count as successes, while negative instances require intellectual effort to even see them as negative or to consider them as significant. The tendency to give more attention and weight to the positive and the confirmatory has been shown to influence memory. When digging into our memories for data relevant to a position, we are more likely to recall data that confirms the position (ibid.).

Researchers are sometimes guilty of confirmation bias by setting up experiments or framing their data in ways that will tend to confirm their hypotheses. They compound the problem by proceeding in ways that avoid dealing with data that would contradict their hypotheses. For example, some parapsychologists used to engage in optional starting and stopping in their ESP research. Experimenters might avoid or reduce confirmation bias by collaborating in experimental design with colleagues who hold contrary hypotheses, as Richard Wiseman (skeptic) and Marilyn Schlitz (proponent) have done.* Individuals have to continually remind themselves of this tendency and actively seek out data contrary to their beliefs. Since this is unnatural, it appears that the ordinary person is doomed to bias.
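The optional-stopping flaw mentioned above can also be sketched in a few lines. In this hedged illustration, every guess is pure chance; the only difference between the two conditions is that one "experimenter" is allowed to stop collecting data as soon as the running hit rate looks comfortably above chance. The deck size, thresholds, and trial counts are assumptions chosen for the demonstration, not the actual protocols used in parapsychology.

```python
import random

random.seed(1)

CHANCE = 0.2       # five-symbol deck, pure guessing (assumption)
MAX_TRIALS = 200
MIN_TRIALS = 10    # experimenter won't stop before some minimum run
RUNS = 2000

def run_experiment(optional_stopping):
    """Return the reported hit rate for one simulated ESP experiment."""
    hits = 0
    for t in range(1, MAX_TRIALS + 1):
        hits += random.random() < CHANCE
        # Optional stopping: quit and report the moment the running
        # hit rate looks comfortably above chance.
        if optional_stopping and t >= MIN_TRIALS and hits / t > CHANCE * 1.5:
            return hits / t
    return hits / MAX_TRIALS

honest = sum(run_experiment(False) for _ in range(RUNS)) / RUNS
biased = sum(run_experiment(True) for _ in range(RUNS)) / RUNS

print(f"fixed-length runs:      mean reported hit rate {honest:.3f}")
print(f"optional-stopping runs: mean reported hit rate {biased:.3f}")
```

Even though nothing but chance is at work, the optional-stopping condition reports a mean hit rate above chance, because runs that drift high by luck get frozen at their peak while unlucky runs are averaged over their full length.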

See also ad hoc hypothesis, backfire effect, cognitive dissonance, communal reinforcement, control study, motivated reasoning, selective thinking, and self-deception.

For examples of confirmation bias in action, see "alternative" health practice, curse, ESP, intuitive, lunar effect, personology, plant perception, the Sokal hoax, therapeutic touch, and thought field therapy.

To see confirmation bias at work, review the conspiracy theories offered for the JFK assassination or the 9/11 conspiracy theories. It is a good lesson to observe how easily intelligent people can see intricate connections and patterns that support their viewpoint and how easily they can see the faults in viewpoints contrary to their own. As long as one ignores certain facts and accepts speculation as fact, one can prove just about anything to one's own satisfaction. It is much harder cognitively, but a requirement of good science, to try to falsify a pet hypothesis.

further reading

books and articles

Belsky, Gary and Thomas Gilovich. Why Smart People Make Big Money Mistakes - and How to Correct Them: Lessons from the New Science of Behavioral Economics (Fireside, 2000).

Evans, Jonathan St. B. T. Bias in Human Reasoning: Causes and Consequences (Psychology Press, 1990).

Gilovich, Thomas. How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life (The Free Press, 1993).

Levine, Robert. The Power of Persuasion: How We're Bought and Sold (John Wiley & Sons, 2003).

Reason, James. Human Error (Cambridge University Press, 1990).

Shermer, Michael. The Borderlands of Science: Where Sense Meets Nonsense (Oxford University Press, 2002).

Shermer, Michael. Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time, 2nd ed. (Owl Books, 2002).


Coincidences: Remarkable or Random? by Bruce Martin

Smart People Believe Weird Things: Rarely does anyone weigh facts before deciding what to believe by Michael Shermer

Schlitz, M., Wiseman, R., Radin, D., and Watt, C. "Of Two Minds: Skeptic-Proponent Collaboration within Parapsychology." Proceedings of the 48th Annual Convention of the Parapsychological Association (2005), 171-177.

Last updated 29-Oct-2015

© Copyright 1994-2016 Robert T. Carroll * This page was designed by Cristian Popa.