What Is Confirmation Bias?
The Quick Answer
In Critical Thinking, confirmation bias is overrating ideas that support a preconception and underrating ideas that contest it.
Here's a short example to illustrate confirmation bias: imagine someone who firmly believes a dietary supplement improves their health and then encounters a well-designed study showing it has no measurable effect.

Due to confirmation bias, this person might scrutinize the study, searching for flaws or alternative explanations that discredit its findings. They may question the study's sample size, methodology, or funding sources to discount its results. At the same time, they might continue seeking out and emphasizing anecdotal evidence or studies that align with their belief in the supplement's effectiveness.
Easy Definition of Confirmation Bias
Don't just use the evidence that fits your theory. If you do, that's confirmation bias.
Academic Definition of Confirmation Bias
Confirmation bias is selective thinking in which information that confirms a preconception is (1) automatically noticed, (2) actively sought, (3) overvalued, and (4) accepted without reservation. Conversely, information that contradicts the preconception is (1) automatically ignored, (2) not sought, (3) undervalued, and (4) rejected out of hand. In summary, confirmation bias occurs when someone has already reached a conclusion and shapes the evidence, knowingly or unknowingly, to make it fit. Confirmation bias will also cause people to recall memories selectively or interpret events in a way that supports their preconception.
An Example of Confirmation Bias
Decision-making among the "anti-vaxxers"
In July 2021, in the middle of the COVID-19 pandemic, a high proportion of citizens around the world (reported as 19% in the US) stated that they would never take a COVID-19 vaccine. Having most likely absorbed some misinformation or overplayed statistics about the COVID-19 vaccines, these individuals developed a distrust of the vaccines and, thereafter, sought only information that confirmed their distrust. Conversely, messaging designed to highlight the benefits of the vaccines was either avoided or not believed.
Even when faced with overwhelming evidence of the efficacy of the vaccines (in the form of reduced hospitalizations), many of the anti-vaxxers were so rooted in their positions about the vaccines that they simply ignored this evidence.
This is a great example of how confirmation bias – the mother of all cognitive biases – can affect decision-making. In essence, people will believe what they want to believe, even when saturated with overwhelming evidence that contradicts their preconceived idea. Minds that already have the answer they want will take shortcuts to reach that answer. The anti-vaxxers, in effect, brainwashed themselves.
But, what if the anti-vaxxers were right? What if those opposed to the anti-vaxxers' views were absorbing only information that supported the vaccines and dismissed the anti-vaxxers' observations? What if the anti-anti-vaxxers were the ones with confirmation bias?
Well, the best way to overcome confirmation bias is to slow down your thinking, pause and reflect, and seek external views. My strong sense was that the anti-vaxxers did not do that, instead feeding their thinking through news "echo-chambers" that simply reflected their initial stance.
So, let's make a call. Were the anti-vaxxers wrong? Yes. Had they succumbed to confirmation bias? Yes.
Read about Base-rate Fallacy and COVID-19
Read about Availability Bias and COVID-19
Another Example of Confirmation Bias
The moon-landing conspiracy theory
On 21st July 1969, Neil Armstrong (the Commander of Apollo 11) became the first man to set foot on the moon. According to a survey by Gallup, 6% of the US population believes the moon landing was faked. In Russia, another poll found that over a quarter of the population does not believe the moon landing occurred. In fact, across the globe, this remains a particularly persistent conspiracy theory. So, is there anything in it? Let's investigate.
The Apollo programme was huge. At current prices, it would have cost around $110 billion. Including the astronauts, scientists, engineers and technicians, more than 400,000 skilled workers contributed to the programme. To date, not one of them has even hinted that the landings were faked.
The best way to keep a secret is to tell no one. Four hundred thousand people isn't "no one." It's a rock concert. In fact, it's eight huge rock concerts. Because of the way guarding a secret works (i.e., the rigour with which it's protected is reduced after each telling), the chances of keeping the Apollo crew, their wives and their families quiet about a faked landing for over 50 years are very small. The chances of keeping 400,000 people quiet are infinitesimally small. This alone ought to tell us that the moon landing was not faked.
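The claim above can be put into rough numbers with a toy back-of-envelope model (a sketch for illustration only; the per-person leak rate is an assumption chosen here, not a figure from this article). It treats each insider as having a small, independent chance of leaking the secret in any given year:

```python
# Toy model: if each of N insiders independently has probability p of
# leaking a secret in any given year, the chance the secret survives
# Y years is (1 - p) raised to the (N * Y) power.
def p_secret_survives(n_people: int, p_leak_per_year: float, years: int) -> float:
    return (1 - p_leak_per_year) ** (n_people * years)

# Even a wildly generous one-in-a-million annual leak rate per person
# leaves a 400,000-person secret essentially no chance of lasting 50 years.
survival = p_secret_survives(400_000, 1e-6, 50)
print(f"{survival:.2e}")  # on the order of 2e-9: effectively zero
```

The point of the sketch is not the exact rate but the shape of the result: because the exponent multiplies people by years, even absurdly loyal insiders make a 50-year, 400,000-person secret vanishingly improbable.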
"Three may keep a secret, if two of them are dead."
(Benjamin Franklin, 1706–1790)
The moon landing happened. Fact. Those who genuinely believe the moon landing was faked (i.e., not those seeking to showcase their intellect by defending a hopeless argument) are allowing confirmation bias to affect their reasoning. Confirmation bias is causing them to promote information that fits their preconception of a faked landing and disregard information that doesn't fit. In essence, they want the landing to be faked, so they shape the evidence to match their theory. For example, conspiracists can easily recall the film clip of the American flag waving on the moon. As there is no atmosphere (and so no wind) on the moon, they believe the waving flag cannot be on the moon. So the moon landing must have been faked.
The more serious conspiracists will also cite space-travel experts who state that the Van Allen radiation belts, solar flares, solar wind, coronal mass ejections and cosmic rays make a trip to the moon impossible. You could almost forgive them for believing it. But they should, of course, seek a view from the scientists who overcame these issues before making a judgement. Conspiracists will also tell you that the USA had good reasons to fake the landing: they needed to win the space race against the Soviet Union; they needed to convince US citizens that NASA was a good use of their taxes; and they needed a distraction from the Vietnam War. They'll add that the USA only ever claimed to land on the moon (six times in all) during President Nixon's term, and he was a proven liar.
So, in this very quick look at the moon-landing conspiracy, we've seen the conspiracists enthuse over evidence that fits their theory and ignore that which doesn't. It's a classic example of confirmation bias. We've also seen the conspiracists:
- Put too much value in a story that comes to mind easily – in this case, the "flag waving in the wind" story (see Availability Bias).
- Blindly take the word of experts (see Appeal to Authority Fallacy).
- Adopt fallacious reasoning based on the USA's motivation and Nixon's personal qualities (see Ad Hominem Argument).
Oh, and one more thing: after being planted, a flag suspended from a horizontal pole will continue to flap about as it settles from the planting process. And in a vacuum, where there is no air to dampen its motion, the flag takes much longer to settle.
Another Example of Confirmation Bias
Find the evidence to support a case for a war
In 1999, Rafid Ahmed Alwan (an Iraqi citizen who defected from Iraq to Germany) became a Western intelligence agent codenamed "Curveball." He was of interest to the West because he claimed to have worked as a chemical engineer at a plant that manufactured mobile biological-weapon laboratories as part of Iraq's weapons of mass destruction (WMD) capability.
Many intelligence officers were unconvinced by Curveball's assertions. For example, his German handlers (those who elicited the information from him) repeatedly said he was "out of control". Also, a US official who was assigned to investigate his claims suspected he was "a lying alcoholic". Later, a CIA official described him as "a guy trying to get his green card...and playing the system for what it was worth". Despite repeated warnings from numerous intelligence analysts who questioned Rafid's claims, the US Government used his information to build a case for the 2003 invasion of Iraq. The weighting placed on Rafid's information was very apparent. President George W. Bush said in 2003: "We know that Iraq, in the late 1990s, had several mobile biological weapons labs."
The war ultimately cost 600,000 lives and trillions of dollars without uncovering a single WMD capability. So, how did the West get it so wrong?
This entire intelligence failure has its roots in one place: the political decision to invade Iraq had been made early by George Bush and was supported by Britain's Prime Minister Tony Blair. As a result, there was a political imperative to find facts to fit the story that Iraq had WMD. And any information that could support the story was shoe-horned in. This was confirmation bias at play, which – as any analyst will tell you – is the most prevalent and dangerous of all the biases. Despite all the observations by those closest to Curveball, confirmation bias ruled, and the information he provided was treated as "high-grade intelligence." The system (politicians, intelligence chiefs and analysts) was effectively looking for supporting evidence and ignoring non-supporting evidence.
There's a lot more to this story, and there are members of the intelligence community who still believe there are WMD yet to be found in Iraq. They might be right. Absence of evidence is not evidence of absence. But, let's be clear on one thing: The West didn't have sufficient good intelligence to go to war, and it over-played and shaped what it did have to justify the case for war. The case to go to war was riddled with confirmation bias.
Another Example of Confirmation Bias
You're fattist – no, you're fattest
Imagine there are two applicants for a job selling medical supplies to hospitals:
Job Candidate A: A fat 50-year-old with greasy hair and a degree from a standard university. He has 10 years' experience in the industry.
Job Candidate B: A fit 25-year-old with just one poor exam pass in woodwork. His father worked in the industry for 10 years.
The interview panel's musings about Candidate A: "It's not exactly a top-level university, is it? And he could be overqualified. Being 50, he probably won't integrate well into the company. I'm not sure his degree is well aligned to our business."
The panel's musings about Candidate B: "I sense his potential. I think he's looking for a long-term career. It's in his blood. He could even fix that doorframe in the coffee room."
Look at Candidate A's description. He'd lost the job at "Fat." Thereafter, the panel simply interpreted or ignored the remainder of the facts to fit their opening impression of Candidate A.
In most cases, it will be quite obvious that those with confirmation bias are shoehorning in evidence that fits or ignoring evidence that doesn't. It's usually an active defence against being proven wrong. But, sometimes, they don't do it knowingly. They just have too much confidence in their own beliefs, which makes convincing them of their bias difficult.
A Practical Application for Understanding Confirmation Bias
Undermine their best evidence
There's something called a White Crow Event. It's when a single piece of evidence emerges that is absolute proof that a theory is correct. It gets its name from this idea: as soon as you see one white crow, you know for certain that "not all crows are black." That statement is 100% true the moment you see a white one. Here's another example. Imagine we're trying to prove it's possible to contact the dead. As soon as one person is proven to have done it once, that's a White Crow Event. All debate is over. It's possible.

Why do you need to know about White Crow Events? Well, most arguments are not won by presenting evidence of a White Crow Event, i.e., something that will win the argument outright. They are won by the person who presents the most convincing story involving all the facts at hand. This is really great news because it means anything your opponent says to support their argument is going to be open to interpretation. As soon as they present something to support their argument, you can often jump straight in with something like:
"That's confirmation bias. You're making the evidence fit your theory. That could be indicative of a hundred things."
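The white-crow logic is simply universal-claim falsification: a single counterexample disproves an "all X are Y" statement, no matter how much confirming evidence came before it. A minimal sketch (the function name and data are illustrative only):

```python
def all_crows_black(observed_crows):
    """A universal claim survives only while no counterexample exists."""
    return all(colour == "black" for colour in observed_crows)

# A thousand black crows never prove the claim outright...
print(all_crows_black(["black"] * 1000))              # True (so far)
# ...but a single white crow refutes it permanently.
print(all_crows_black(["black"] * 1000 + ["white"]))  # False
```

Note the asymmetry: confirming observations only keep the claim alive, while one disconfirming observation kills it. That asymmetry is exactly what confirmation bias blinds people to.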
Be aware that the most common response when telling someone they have confirmation bias is for them to claim you have it. There's an unwritten law here. The first person to use the term "confirmation bias" to undermine their opponent achieves the most effect, so make sure you're first to the punch.
Summary of Confirmation Bias
If you think someone is interpreting the facts to support their theory, tell them they are displaying confirmation bias and that they could easily infer different conclusions from the evidence.