Confirmation bias
We demand extremely strong evidence for ideas that do not fit our beliefs, while accepting extremely weak evidence for ideas that do fit our beliefs.
Confirmation bias is a cognitive bias that leads people to search for, perceive, interpret, and remember information selectively, favoring their existing beliefs. Simply put, we demand ironclad proof from our opponents, while readily accepting our own views at face value.
This bias is not an isolated character flaw or a sign of low intelligence. Psychological studies have documented it in people with a wide range of education and professional experience. It affects their evaluation of scientific data, medical decisions, political views, and everyday judgments.
Contents
- Historical context
- Three mechanisms of one bias
- Classical experiments
- Neurobiological basis
- Related cognitive phenomena
- Confirmation bias and cognitive dissonance
- Manifestations in science and medicine
- Politics, media, and echo chambers
- Factors that increase bias
- Bias in everyday decisions
- Bias reduction techniques
- Relationship with other cognitive biases
- Bias in professional contexts
- Rationality and the limits of the concept
Historical context
Long before the advent of experimental psychology, thinkers noted that people cling to facts that suit them. The English philosopher Francis Bacon wrote in the early 17th century: people "notice coincidences and overlook lapses; they retain the former in their memory, but forget and pass over the latter." In his treatise Novum Organum (1620), Bacon described the "idols of the tribe" — innate errors of human perception, among which the most prominent is the tendency to confirm existing judgments.
Systematic study of the phenomenon began in the mid-20th century. In 1960, British psychologist Peter Wason published the results of a series of experiments that became classics of cognitive science. The term confirmation bias became firmly established in the scientific literature after Raymond Nickerson's major 1998 review in the Review of General Psychology.
Three mechanisms of one bias
Researchers identify three relatively independent processes that together form a complete picture of biased thinking.
Biased search for information
The first and most obvious mechanism is that people choose which sources to read, which people to listen to, and which questions to ask, and this choice unconsciously tilts toward what already aligns with existing beliefs. Wason's "2-4-6" experiment demonstrated this clearly back in 1960: subjects were shown the sequence 2-4-6 and asked to discover the rule by which it was constructed, proposing their own triples and receiving feedback on each. Participants overwhelmingly proposed triples that confirmed their hypothesis, rather than testing ones that could refute it.
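The logic of the task translates directly into code. Here is a minimal sketch (Python, purely illustrative; the hidden rule and the typical hypothesis are the classic ones from the study, everything else is invented) showing why purely positive tests can never distinguish a too-narrow hypothesis from the broader hidden rule:

```python
# Illustrative sketch of Wason's "2-4-6" task (hypothetical code, not his protocol).
# The hidden rule is "any strictly ascending triple"; the typical subject's
# hypothesis is the narrower "each number increases by 2".

def hidden_rule(t):        # the experimenter's actual rule
    return t[0] < t[1] < t[2]

def my_hypothesis(t):      # the subject's too-narrow hypothesis
    return t[1] - t[0] == 2 and t[2] - t[1] == 2

confirming_probes = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]  # positive tests only
disconfirming_probe = (1, 2, 3)                           # violates the hypothesis

# Positive tests pass under BOTH rules, so they can never tell them apart:
for t in confirming_probes:
    assert hidden_rule(t) and my_hypothesis(t)

# A single negative test exposes the difference:
print(hidden_rule(disconfirming_probe))    # True:  the hidden rule allows it
print(my_hypothesis(disconfirming_probe))  # False: the hypothesis forbids it
```

Every triple that fits "increases by 2" also fits "any ascending triple," so confirmations pile up while the hypothesis stays wrong; only a probe that violates the hypothesis can reveal the difference.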
The search for confirmation is structured so that people are often unaware of it. Clicking on search results, subscribing to channels, and talking to loved ones all act as filters that quietly screen out inconsistent data.
Biased interpretation
The second mechanism is triggered when a person already has inconvenient information in hand. Then, they don’t reject it outright — instead, they scrutinize it for methodological flaws, question the author’s competence, and declare the sample unrepresentative. They simply don’t make the same claims about research that supports their views.
This very mechanism was documented by Lord, Ross, and Lepper in a now-iconic 1979 experiment. They recruited 48 students — some supported the death penalty as a crime deterrent, others opposed it. Both groups were presented with two studies: one confirmed the deterrent effect of the death penalty, the other refuted it. Both groups found the study that supported their position convincing and found flaws in the opposite one. As a result, their original views not only persisted but became even more polarized.
Biased memorization
The third mechanism operates after the event has occurred. Memory doesn't work like a video recording — it reconstructs the past with each playback. Details that match our expectations are recalled more accurately and in greater detail, while those that contradict them are blurred or lost entirely. This phenomenon is related to the concept of mnemonic confirmation: people are more likely to recall instances where their predictions were correct and less likely to recall instances where they were wrong.
Classical experiments
Wason's selection task
In 1966, Wason developed the "selection task" — one of the most cited tests in the history of cognitive psychology. Subjects were shown four cards bearing the symbols "E," "K," "4," and "7," each with a letter on one side and a number on the other. Their task was to test the rule "if a card has a vowel on one side, it has an even number on the other" by naming only those cards that had to be turned over to check it.
The correct answer was "E" and "7": these are the cards that could violate the rule. However, only about 10% of participants gave the correct answer. Most chose "E" and "4" — the cards that would confirm, rather than disprove, the rule. Wason attributed this to confirmation bias: participants were looking for confirmation of their hypothesis, not refutation.
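The falsification logic is easy to spell out explicitly. The following is a hedged restatement in Python (my own illustration, not part of the original study):

```python
# Which cards can falsify the rule "if a card has a vowel on one side,
# it has an even number on the other"? (Illustrative sketch only.)

def can_falsify(visible):
    if visible.isalpha():
        # A visible vowel can falsify: the hidden number might be odd.
        # A visible consonant never can: the rule says nothing about it.
        return visible in "AEIOU"
    # A visible even number never can: any letter on the back complies.
    # A visible odd number can: the hidden letter might be a vowel.
    return int(visible) % 2 == 1

print([card for card in ["E", "K", "4", "7"] if can_falsify(card)])  # ['E', '7']
```

The popular choice "4" is logically useless: whatever is on its other side, the rule survives. Turning it over feels informative only because it could confirm.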
Experiment with the death penalty
A 1979 study by Lord, Ross, and Lepper revealed something even more alarming. When both groups — death penalty supporters and opponents — were presented with the same set of data, their assessments didn't converge. On the contrary, the divergence between them grew. This phenomenon is known as attitude polarization: the more mixed evidence is presented to people with strong opinions, the further apart their positions drift.
Crucially, this result doesn’t arise because one party is lying or deliberately cheating. Each participant is subjectively convinced that they are evaluating the data objectively.
Active selection of evidence
A 2022 study published in the journal eLife showed that bias manifests itself not only in interpretation but also in the process of information gathering itself. Subjects performed perceptual tasks with two choice phases, between which they were allowed to freely request additional information. The more confident a person was in their initial answer, the more persistently they requested the information that supported that decision. Confidence thus strengthens bias — and this makes it particularly persistent in situations where a person feels competent.
Neurobiological basis
What’s going on in the brain?
Modern neuroimaging techniques have made it possible to look at the mechanism of the bias directly. An international team of scientists from Virginia Tech and University College London used functional MRI in combination with behavioral tests. It turned out that the posterior medial prefrontal cortex functions as a kind of "belief filter": activity in this area reflects the willingness to incorporate new information into an existing worldview. When information aligned with the participant's position, this area responded significantly more strongly than when it did not.
A separate 2020 study, published in Nature Communications, used magnetoencephalography (MEG). The authors found that when people are highly confident in a decision, the neural processing of confirming information is enhanced, while the processing of disconfirming information is suppressed. In other words, at the neural level confident people literally register fewer contradictory signals.
Persuasion as a threat
In 2016, neuroscientists Jonas Kaplan, Sarah Gimbel, and Sam Harris conducted an experiment with politically committed volunteers in an fMRI scanner. When subjects were presented with arguments that contradicted their political views, the brain regions that became active were the same ones engaged when perceiving a physical threat. This explains why people react so emotionally to disagreement: the brain treats a challenge to one's beliefs as a danger signal.
This reaction has an evolutionary explanation. Belonging to a group with shared beliefs was a prerequisite for survival throughout most of human history. Exclusion from the community meant a real threat to life. The brain still reacts to social challenges as if physical safety were at stake.
Related cognitive phenomena
Disconfirmation bias
The concept of disconfirmation bias describes an asymmetry in critical analysis: arguments that contradict one’s beliefs are examined significantly more thoroughly than those that support them. Studies have shown that participants spent more time evaluating inconsistent arguments and formulated more criticisms of them than of arguments that supported their position.
The boomerang effect
In some settings, presenting disconfirming evidence not only fails to persuade people to change their minds, but actually makes their beliefs even more extreme. This "boomerang" effect has been described as an extreme form of confirmation bias. Mathematical models show that it occurs when new information is perceived not as data, but as an attack on one’s identity.
Belief perseverance
Belief perseverance is the retention of beliefs even after the evidence on which they were based is proven false. A classic example: subjects were led to believe that a certain person was "well suited" for a risky profession, then told that this characterization was fictitious. Their ratings of the person still remained higher than those of a control group that had never heard the fictitious description.
Illusory correlation
People tend to "see" a connection between two phenomena, even when there isn’t one, if such a connection is consistent with their existing beliefs. This phenomenon — illusory correlation — explains the persistence of many stereotypes: instances that confirm the stereotype are noticed and remembered; instances that refute it slip through the cracks.
Confirmation bias and cognitive dissonance
Confirmation bias and cognitive dissonance are related, though not identical, phenomena. Cognitive dissonance, described by Leon Festinger in 1957, is the discomfort experienced when simultaneously holding two contradictory beliefs. Confirmation bias serves as one of the primary ways to resolve this discomfort: it’s easier to ignore the contradiction than to revise a belief.
Data collected in 2022 during a study of political ideology showed that exposure to politically inconsistent information significantly worsened participants’ moods. Moreover, this negative emotional shift predicted a subsequent increase in adherence to their original views. In other words, the negative mood associated with "incorrect" information itself becomes a motivation to reject it.
Manifestations in science and medicine
Reproducibility and experimental design
Scientists are no less susceptible to bias than the general public. A researcher passionate about their own hypothesis is prone to noticing results that confirm it and dismissing anomalies as artifacts. To combat this, science employs blind and double-blind experiments, randomization, and independent replication.
Blinding deprives the researcher of knowledge of which group received the active intervention, thereby removing the opportunity for biased interpretation. This doesn't mean the researcher is dishonest; it means their brain works the same way as everyone else's.
Diagnostics and clinical reasoning
In medicine, confirmation bias manifests itself in the phenomenon of diagnostic anchoring: a physician who has formed an early hypothesis about a patient's illness begins to interpret new symptoms in favor of that hypothesis and underestimates data pointing to an alternative diagnosis. This is a documented cause of diagnostic errors in clinical practice.
A similar process occurs in the judicial system: investigators who are committed to a particular version of events may inadvertently ignore evidence that contradicts that version — a phenomenon known as "tunnel vision" in investigative work.
Politics, media, and echo chambers
Polarization of political views
Confirmation bias is one of the persistent drivers of political polarization. Research shows that people with different political views not only evaluate facts differently, but often disagree on what constitutes a fact. According to a 2019 Pew Research Center survey, 73% of Americans said that Democratic and Republican voters disagree not only over plans and policies, but also cannot agree on basic facts.
Echo chambers in the digital environment
Recommendation algorithms and social media mechanisms create structures in which people predominantly receive information that aligns with their pre-existing views. This isn’t the result of malicious intent — platforms are optimized for user engagement, and familiar and enjoyable content is inherently more engaging. The result is a self-reinforcing cycle: beliefs reinforce the choice of sources, and the choice of sources reinforces beliefs.
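The loop is simple enough to simulate. The toy model below (Python; the parameters are invented and no real platform's algorithm is implied) lets an agent pick sources with probability falling off sharply with ideological distance and drift toward what it consumes; the belief stays pinned near its starting point, and exposure to distant positions stays rare:

```python
# Toy model of an echo-chamber feedback loop (entirely hypothetical).
import random

random.seed(1)
belief = 0.8                                  # current position on a 0..1 axis
sources = [i / 10 for i in range(11)]         # outlets at positions 0.0 .. 1.0
far_reads = 0                                 # exposures to distant viewpoints

for step in range(200):
    # Selection bias: weight falls off sharply with ideological distance.
    weights = [1.0 / (0.05 + abs(s - belief)) ** 2 for s in sources]
    chosen = random.choices(sources, weights=weights)[0]
    far_reads += abs(chosen - 0.8) > 0.3      # count reads far from the start
    belief += 0.05 * (chosen - belief)        # drift toward consumed content

print(round(belief, 2), far_reads)  # belief stays near 0.8; far reads stay rare
```

Nothing in the model is malicious; preferring agreeable sources plus updating on what one reads is enough to lock the belief in place.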
A study of political echo chambers during the COVID-19 pandemic in 2020–2021 found that right-leaning users formed significantly denser and more isolated information clusters: approximately 80% of the audience reached by far-right accounts themselves held right-wing views, while the same figure for left-leaning accounts was approximately 40%.
Disinformation and the persistence of myths
One of the practical problems associated with this bias is the exceptional persistence of false beliefs. Once a person has accepted a false statement, a refutation is received with even greater skepticism than the original message. The "boomerang" effect has also been described in debunking myths about vaccines, political conspiracy theories, and pseudoscientific medical practices.
Factors that increase bias
Not all beliefs and not all situations are equally vulnerable. Research identifies several conditions under which confirmation bias is more pronounced.
- The emotional significance of the issue. The more important a belief is to one’s personal identity, the stronger the defensive reaction to inconsistent data.
- High confidence in being right. Confident decisions generate a more pronounced neural "filter" against contradictory information, as confirmed by MEG data.
- Group membership. When a belief marks membership in a social group, revising it threatens that membership — and the brain reacts to it as a threat.
- Durability of belief. Deeply held beliefs, formed over years, are more resistant to revision than recently formed ones.
- Ambiguity of the evidence. The more ambiguous the evidence, the more room for biased interpretation. Clear facts are harder to reinterpret.
Bias in everyday decisions
Investments and financial behavior
An investor convinced of an asset's prospects will read analytics that support their choice and ignore warnings to the contrary. Behavioral economists identify this pattern as post-decision confirmation bias: after purchasing a stock, a person begins to actively search for data justifying the purchase and systematically underestimates signals to sell.
Interpersonal relationships
Our first impression of someone initiates a confirmation cycle: our formed opinion determines which behavioral traits we notice later. A positive initial assessment leads us to "see" evidence of sympathy; a negative one leads us to collect evidence of dislike. This makes initial judgments about people disproportionately influential.
Patients’ medical decisions
A patient convinced of the effectiveness of an alternative treatment method will notice improvements and attribute them to that method, while any deterioration is attributed to external factors. This is why anecdotal experience is so convincing to the patient — and so unreliable as a source of medical data.
Bias reduction techniques
Institutional mechanisms
In science, blinding, randomization, and independent replication developed historically precisely as defenses against unintentional bias on the part of the researcher. The adversarial system in legal proceedings serves a similar function: the prosecution and defense deliberately present opposing interpretations of the same facts, on the assumption that a more accurate picture will emerge from the confrontation.
Individual strategies
At the individual level, several approaches have been shown to be effective.
- "Consider the opposite." Before making a decision, deliberately ask, "When will I be wrong?" This technique reduces bias by shifting attention from searching for confirmation to searching for refutation.
- The "what if the opposite" strategy. When testing a hypothesis, first try to disprove it, as the rare successful participants in Wason's tasks did by testing the rule through negation.
- Pre-registration. In science, publicly registering hypotheses and methods before collecting data eliminates the possibility of retroactively adjusting them to the results.
- Bias awareness training. A single training session — a 30-minute video or a 90-minute educational game — showed a significant reduction in confirmation bias, with the effect persisting after two months.
The limits of debiasing
It's important to understand that completely eliminating this bias is unrealistic. Confirmation bias is partly functional — it speeds up information processing and reduces cognitive load. The goal isn't to subject every belief to constant doubt, but to be able to recognize situations where the stakes are high and asymmetry in evidence evaluation can be costly.
Relationship with other cognitive biases
Confirmation bias does not exist in isolation; it is linked to a whole network of related phenomena.
- Halo effect. A first positive impression creates a general positive "halo" that then colors all subsequent assessments.
- Hindsight bias. After an event has occurred, people are convinced they "knew it all along," because memory is reconstructed to fit the known outcome.
- Irrational primacy effect. Information received first has a disproportionately large influence, since it becomes the "frame" through which everything that follows is interpreted.
- Motivated reasoning. A broader phenomenon of which confirmation bias is a special case: reasoning whose goal is not to find the truth but to justify a conclusion desired in advance.
Bias in professional contexts
Intelligence and threat analysis
In the field of intelligence analysis, confirmation bias has been documented as a systemic threat. Analysts who have formed a preliminary picture of a situation tend to interpret incoming data as confirming that picture. This is why professional intelligence communities adopt structured analytical techniques, such as the Analysis of Competing Hypotheses (ACH), developed by Richards Heuer at the CIA in the 1970s.
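The core move of ACH is to rank hypotheses by how little evidence contradicts them rather than by how much supports them. A minimal sketch of that scoring idea follows (Python; the hypotheses, evidence items, and consistency scores are all invented for illustration, and real ACH worksheets are considerably richer):

```python
# Sketch of the Analysis of Competing Hypotheses (ACH) scoring idea.
# Key move: rank hypotheses by how little evidence CONTRADICTS them,
# not by how much evidence confirms them. All data below is invented.

hypotheses = ["H1: insider leak", "H2: external hack", "H3: accident"]

# consistency[e][h]: +1 if evidence item e is consistent with hypothesis h,
# -1 if inconsistent, 0 if neutral.
consistency = [
    [+1, +1, -1],   # e1: access used valid insider credentials
    [+1, -1,  0],   # e2: no traces of external intrusion found
    [ 0, +1, -1],   # e3: only high-value files were copied
]

# Fewest contradictions ranks first.
contradictions = [sum(row[h] == -1 for row in consistency) for h in range(3)]
for h in sorted(range(3), key=lambda h: contradictions[h]):
    print(contradictions[h], hypotheses[h])
```

Note that H1 and H2 here collect the same number of confirmations; it is the count of contradictions that separates them, which is exactly the anti-confirmation-bias device.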
Management and organizational decisions
Executives who have made a strategic decision find it difficult to reconsider even when negative signals mount. This pattern has been described as one of the factors of the "sunk cost fallacy": the more invested in a decision, the more actively people seek arguments in its favor.
Judicial system
"Tunnel vision" in investigative work — when an investigation focuses on a specific suspect and begins to gather evidence against them — is considered one of the documented causes of miscarriages of justice. The Innocence Project in the United States found that the erroneous conviction of innocent people was often accompanied by this systemic bias in the investigative process.
Rationality and the limits of the concept
Adaptive function
Some researchers point out that confirmation bias cannot be categorized as a clear-cut "error" in thinking. Under conditions of limited cognitive resources and the need to act quickly, relying on previously validated beliefs reduces the computational load. The "confirm, don’t disprove" strategy works quite reliably in a stable environment, where past experience is a good predictor of the future.
The problem arises when the environment changes and the cognitive strategy cannot keep up with the changes — or when complex systemic judgments are involved that require precisely the kind of precision that bias undermines.
Relationship with Bayesian thinking
According to Bayesian probability theory, belief updating in response to new evidence should obey Bayes’ theorem: prior probabilities are adjusted proportionally to the strength of the new evidence. Confirmation bias violates this principle: the weight of new evidence is determined not by its quality, but by its consistency with prior beliefs. Confirmatory evidence is overweighted, while disconfirmatory evidence is underweighted. Models that describe this mathematically reproduce effects observed in experiments, including attitude polarization and the boomerang effect.
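One common way to write such a model is to let the agent scale each piece of evidence's log-likelihood ratio by a larger weight when it points in the direction of the current belief. The sketch below (Python; the weighting scheme and all numbers are illustrative assumptions rather than a specific published model) shows two agents with opposite priors diverging on a perfectly balanced stream of evidence:

```python
# Asymmetric belief updating in log-odds form (illustrative model).
import math

def update(log_odds, llr, w_confirm=1.0, w_disconfirm=0.4):
    # Scale the log-likelihood ratio down when it opposes the current belief.
    confirming = (llr > 0) == (log_odds > 0)
    return log_odds + llr * (w_confirm if confirming else w_disconfirm)

evidence = [+1.0, -1.0] * 5                  # perfectly balanced LLR stream

a, b = +0.5, -0.5                            # two agents with opposite weak priors
for llr in evidence:
    a, b = update(a, llr), update(b, llr)

prob = lambda lo: 1 / (1 + math.exp(-lo))
print(round(prob(a), 2), round(prob(b), 2))  # 0.97 0.03: same data, opposite certainty
```

A Bayes-optimal agent (both weights equal to 1) would finish exactly at its prior here, because the balanced evidence cancels out; the asymmetric weights turn the same stream into polarization.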
Confirmation bias is one of the most enduring and widely documented phenomena in cognitive psychology. Four centuries after Bacon, decades of experimental testing, and neuroimaging data all contribute to a coherent picture: the human brain is wired to reinforce existing beliefs more easily than to challenge them. This realization isn’t a reason for intellectual paralysis, but rather a basis for stricter demands on one’s own thinking where it truly matters.