Confirmation Bias

The Misconception: Your opinions are the result of years of rational, objective analysis.

The Truth: Your opinions are the result of years of paying attention to information which confirmed what you believed while ignoring information which challenged your preconceived notions.


Have you ever had a conversation in which some old movie was mentioned, something like “The Golden Child” or maybe even something more obscure?

You laughed about it, quoted lines from it, wondered what happened to the actors you never saw again, and then you forgot about it. Until…

You are flipping channels one night and all of a sudden you see “The Golden Child” is playing. Weird. The next day you are reading a news story, and out of nowhere it mentions forgotten movies from the 1980s, and holy shit, three paragraphs about “The Golden Child.” You see a trailer that night at the theater for a new Eddie Murphy movie, and then you see a billboard on the street promoting Charlie Murphy doing stand-up in town, and then one of your friends sends you a link to a post at TMZ showing recent photos of the actress from “The Golden Child.”

What is happening here? Is the universe trying to tell you something? No. This is called the frequency illusion.

Since the party and the conversation where you and your friends took turns saying “I-ah-I-ah-I want the kniiiife” you’ve flipped channels plenty of times; you’ve walked past lots of billboards; you’ve seen dozens of stories about celebrities; you’ve been exposed to a handful of movie trailers. The thing is, you disregarded all the other information, all the stuff unrelated to “The Golden Child.” Out of all the chaos, all the morsels of data, you only noticed the bits which called back to something sitting on top of your brain. A few weeks back, when Eddie Murphy and his Tibetan adventure were still submerged beneath a heap of pop culture at the bottom of your skull, you wouldn’t have paid any special attention to references to it.

If you are thinking about buying a new car, you suddenly see people driving them all over the roads. If you just ended a long-time relationship, every song you hear seems to be written about love. If you are having a baby, you start to see them everywhere. The frequency illusion is often confused with confirmation bias, and the two are pretty similar, but the major difference is that confirmation bias involves actively seeking out confirming evidence, not just passively noticing it.

Check any Amazon.com wish list, and you will find people rarely seek books which challenge their notions of how things are or should be. During the 2008 U.S. presidential election, Valdis Krebs at orgnet.com analyzed purchasing trends on Amazon. People who already supported Obama were the same people buying books which painted him in a positive light. People who already disliked Obama were the ones buying books painting him in a negative light. Just like with pundits, people weren’t buying books for the information, they were buying them for the confirmation.

Krebs has researched purchasing trends on Amazon and the clustering habits of people on social networks for years, and his research shows what psychological research into confirmation bias predicts: you want to be right about how you see the world, so you seek out information which confirms your beliefs and avoid contradictory evidence and opinions.

Half a century of research has placed confirmation bias among the most dependable of mental stumbling blocks. Journalists looking to tell a certain story must avoid the tendency to ignore evidence to the contrary; scientists looking to prove a hypothesis must avoid designing experiments with little wiggle room for alternate outcomes. Without confirmation bias, conspiracy theories would fall apart. Did we really put a man on the moon? If you are looking for proof we didn’t, you can find it.

“If one were to attempt to identify a single problematic aspect of human reasoning that deserves attention above all others, the confirmation bias would have to be among the candidates for consideration. Many have written about this bias, and it appears to be sufficiently strong and pervasive that one is led to wonder whether the bias, by itself, might account for a significant fraction of the disputes, altercations, and misunderstandings that occur among individuals, groups, and nations.”

– Raymond S. Nickerson

In a 1979 University of Minnesota study by Mark Snyder and Nancy Cantor, people read about a week in the life of an imaginary woman named Jane. Throughout the week, Jane did things which showed she could be extroverted in some situations and introverted in others. After a few days the subjects were asked to return, and the researchers divided them into two groups. The scientists asked people in each group to help decide whether Jane would be suited for a particular job. One group was asked if she would be a good librarian; the other group was asked if she would be a good real-estate agent. People then searched their memories for examples that might suggest she was right for that position. In the librarian group, people easily remembered all the moments that made her seem like an introvert, ignoring the moments she seemed more extroverted, and said she seemed perfect for that career. The real-estate group did the same thing, but upside down, searching the same kind of memories for different information and coming to the opposite conclusion. When the groups were then asked if she would be good at the other profession, most people stuck with their original assessment, saying she wasn’t suited for the other job at all, that she was too introverted or too extroverted, depending on the original question.

The study suggests that even in your memories you fall prey to confirmation bias, recalling those things which support your beliefs and forgetting those things which debunk them.

An Ohio State study in 2009 showed people spend 36 percent more time reading an essay if that essay aligns with their opinions. Another Ohio State study in 2009 showed subjects clips of the parody show “The Colbert Report,” and people who considered themselves politically conservative consistently reported that Colbert only pretends to be joking and genuinely means what he says.

“Thanks to Google, we can instantly seek out support for the most bizarre idea imaginable. If our initial search fails to turn up the results we want, we don’t give it a second thought, rather we just try out a different query and search again.”

– Justin Owings

A popular method for teaching confirmation bias, first introduced by P. C. Wason in 1960, is to show the following numbers to a classroom: 2, 4, 6

The teacher then asks the class to guess why those numbers are in that particular order, and to guess the teacher’s secret rule for selecting them that way. The teacher then shows the numbers 10, 12, 14 and asks again what rule might be in play, then reveals 22, 24, 26. The students are then tasked with coming up with three numbers of their own using the rule they think is in play, and the teacher says “yes” or “no” depending on whether each set matches the rule. When students think they have it figured out, they say their numbers out loud. They typically offer sets like 6, 8, 10 or 32, 34, 36. The teacher says “yes” over and over again, and most people take that confirmation to mean they have figured out the rule, but they haven’t. The teacher then reveals that 3, 9, 555 also follows the rule, as does 1, 2, 3. To figure out the rule, students would have had to offer sets like 2, 2, 2 or 9, 8, 7, which the teacher would say did not fit. Only with enough guesses that play against what they think the rule may be, focused on disconfirmation instead of confirmation, do students finally figure out the original rule: any three numbers in ascending order.
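The exercise above can be sketched as a small simulation. This is an illustrative toy, not Wason’s actual procedure: the function name and the sample guesses are made up, but the hidden rule (any three numbers in ascending order) is the one the exercise describes.

```python
def hidden_rule(triple):
    """The teacher's secret rule: any three numbers in ascending order."""
    a, b, c = triple
    return a < b < c

# Guesses that merely confirm a narrower hypothesis ("even numbers
# increasing by two") all get a "yes," so they teach the guesser nothing.
confirming_guesses = [(6, 8, 10), (32, 34, 36), (100, 102, 104)]

# Guesses designed to break that hypothesis are the informative ones:
# a "yes" for 1, 2, 3 or 3, 9, 555 rules out "even numbers by two,"
# and a "no" for 2, 2, 2 or 9, 8, 7 shows order matters.
disconfirming_guesses = [(1, 2, 3), (3, 9, 555), (2, 2, 2), (9, 8, 7)]

for guess in confirming_guesses + disconfirming_guesses:
    print(guess, "->", "yes" if hidden_rule(guess) else "no")
```

Every confirming guess prints “yes,” which is exactly why they feel like progress while revealing nothing; only the disconfirming guesses narrow down what the rule actually is.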

The exercise is intended to show how you tend to come up with a hypothesis and then work to prove it right instead of working to prove it wrong. Once satisfied, you stop searching. In psychology they actually call this the “makes sense stopping rule.” When you wonder why something happens or what the truth may be, you stop looking for answers once your presumptions are satisfied.

You seek out safe havens for your ideology, friends and coworkers of like mind and attitude, media outlets guaranteed to play nice. Whenever your opinions or beliefs are so intertwined with your self-image that you couldn’t pull them away without damaging your core concepts of self, you avoid situations which might harm those beliefs.

“The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it.”

– Francis Bacon

Punditry is a whole industry built on confirmation bias. Rush Limbaugh and Keith Olbermann, Glenn Beck and Arianna Huffington, Rachel Maddow and Ann Coulter – these people provide fuel for beliefs; they pre-filter the world to match existing world-views. If their filter is like your filter, you love them. If it isn’t, you hate them.

Whether pundits are telling the truth, or vetting their opinions, or thoroughly researching their topics is beside the point. You watch them not for information, but for confirmation.

“Be careful. People like to be told what they already know. Remember that. They get uncomfortable when you tell them new things. New things…well, new things aren’t what they expect. They like to know that, say, a dog will bite a man. That is what dogs do. They don’t want to know that man bites a dog, because the world is not supposed to happen like that. In short, what people think they want is news, but what they really crave is olds…Not news but olds, telling people that what they think they already know is true.”

– Terry Pratchett, through the character Lord Vetinari, in his novel “The Truth: A Novel of Discworld”

Over time, by never seeking the antithetical, through accumulating subscriptions to magazines, stacks of books, and hours of television, you can become so confident in your world-view that no one could dissuade you.

Remember, there’s always someone out there willing to sell eyeballs to advertisers by offering a guaranteed audience of people looking for validation. Ask yourself if you are in that audience.

In science, you move closer to the truth by seeking evidence to the contrary. Perhaps the same method should inform your opinions as well.


I wrote a whole book full of articles like this one: You Are Now Less Dumb

Watch the beautiful new trailer here.

Go deeper into understanding just how deluded you really are and learn how you can use that knowledge to be more humble, better connected, and less dumb in the sequel to the internationally bestselling You Are Not So Smart. Preorder now at: Amazon | B&N | BAM | Indiebound | iTunes


Sources:

  • Knobloch-Westerwick, S., & Meng, J. (2009, June). Looking the other way: Selective exposure to attitude-consistent and counterattitudinal political information. Communication Research, 36(3), 426–448.
  • Krebs, V. (2008). New political patterns. Retrieved December 2010 from http://www.orgnet.com/divided.html.
  • Krebs, V. (2000, March). Working in the connected world: Book network. IHRIM Journal, 87–90.
  • LaMarre, H. L., Landreville, K. D., & Beam, M. A. (2009, April). The irony of satire: Political ideology and the motivation to see what you want to see in The Colbert Report. The International Journal of Press/Politics, 14(2), 212–231.
  • Marks, M. J., & Fraley, R. C. (2006, January). Confirmation bias and the sexual double standard. Sex Roles, 54(1/2), 19–26.
  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
  • Owad, T. (2006, January 4).
  • Snyder, M., & Cantor, N. (1998). Understanding personality and social behavior: A functionalist strategy. In D. T. Gilbert, S. T. Fiske, & G. Lindzey (Eds.), The handbook of social psychology: Vol. 1 (4th ed., pp. 635–679). Boston: McGraw-Hill.