Cognitive Biases & Anchoring Bias

1. Dr. Muratova Nazira

2. Subhash Raibole

Shyam Karthick, Dheivasigamani Sudha, Kathiravan Mahalingam,

Gnanamani Hariprasad, Swaraj Parmeshwar, Aravindhkumar,

Raut Vivek Suresh, Aishwarya Jyotzsna, Sekar Rajesh,

Selvaraj Sureka, Prajwal Muralidhar, Asim Ayesha

Malik Areeba, Hussain Nadir

(1. Lecturer, International Medical Faculty, Osh State University, Osh, Kyrgyz Republic.

2. Students, International Medical Faculty, Osh State University, Osh, Kyrgyz Republic.)

 

Abstract:

Cognitive biases are increasingly recognized as a major contributor to diagnostic errors in clinical medicine. Among them, anchoring bias is one of the most prevalent and impactful. Anchoring bias refers to the tendency of clinicians to rely heavily on the initial piece of information or first impression when making diagnostic decisions, often failing to adjust this judgment when new evidence emerges. This review explores anchoring bias in detail, including its background, etiology, pathogenesis, clinical manifestations, diagnostic approach, and strategies for prevention. Understanding this bias is essential for improving clinical reasoning, minimizing errors, and enhancing patient safety. By integrating awareness, structured diagnostic approaches, and reflective thinking, healthcare professionals can significantly reduce the impact of anchoring bias in medical practice.

 

Introduction:

Clinical decision-making is a cornerstone of medical practice. Physicians routinely assess patient symptoms, interpret clinical findings, and arrive at diagnoses under varying degrees of uncertainty. Ideally, this process should be objective, analytical, and evidence-based. However, human cognition is inherently imperfect and influenced by biases that can lead to systematic errors.

Anchoring bias is one of the most important cognitive biases affecting clinicians. It occurs when excessive weight is given to the initial clinical impression, leading to insufficient adjustment even when new or conflicting information becomes available. This bias can result in misdiagnosis, delayed treatment, and adverse patient outcomes.

In modern healthcare systems, where time pressure, high patient volume, and complex cases are common, anchoring bias becomes even more relevant. Recognizing and addressing this bias is crucial for improving diagnostic accuracy and ensuring high-quality patient care.

 

Context and Background:

The concept of anchoring bias originates from cognitive psychology and behavioral economics. It was first described by psychologists Amos Tversky and Daniel Kahneman, who demonstrated that individuals tend to rely heavily on initial information when making judgments.

In clinical settings, anchoring bias manifests when a diagnosis is formed early in the patient encounter, leading the practitioner to interpret all subsequent information in a way that supports this initial impression while alternative diagnoses receive inadequate consideration. This cognitive trap is particularly common in high-stress environments such as emergency medicine, internal medicine, and primary care, where rapid decision-making is often required. Once an "anchor" is set, the clinician may unintentionally downplay or ignore "red flags" that contradict the first instinct, which can lead to diagnostic errors. Recognizing this bias is a critical component of clinical reasoning, as it encourages the use of "diagnostic pauses" to re-evaluate the patient's data from a neutral perspective.

 

Why It Matters

Diagnostic errors account for a significant proportion of medical errors, and cognitive biases contribute to a large share of these mistakes. Among these mental shortcuts, anchoring bias stands out as one of the leading causes, often occurring when a clinician's initial impression of a patient prevents them from correctly processing later evidence that contradicts the first diagnosis. This "cognitive lock-in" can lead to delayed treatment or incorrect management, highlighting the need for structured diagnostic checklists to counteract these natural human tendencies.

 

Etiology and Pathogenesis:

Etiology (Causes of Anchoring Bias)

Anchoring bias arises due to multiple interacting factors:

  1. Cognitive Factors

The human brain's reliance on intuitive thinking, often referred to as System 1 thinking, provides a rapid, automatic, and emotional response to complex situations, yet it is this very speed that often leads to anchoring bias. Because we possess a limited cognitive capacity, we naturally employ mental shortcuts, or heuristics, to process vast amounts of clinical data without becoming overwhelmed. While these shortcuts are essential for efficiency in a busy medical environment, they can cause a clinician to latch onto the first piece of information they receive, failing to transition into the more analytical and effortful System 2 thinking required to vet a diagnosis thoroughly.

  2. Environmental Factors

Environmental factors such as time constraints, a heavy workload, and emergency situations significantly exacerbate the risk of anchoring bias. In these high-pressure scenarios, the "cognitive load" on a clinician increases, forcing a heavier reliance on rapid, intuitive heuristics rather than slow, analytical reasoning. When a physician is juggling multiple critically ill patients or facing a crowded waiting room, the brain naturally seeks the path of least resistance to reach a decision quickly. Consequently, the first plausible diagnosis identified is often "anchored" upon, as the mental energy required to pivot and explore rarer or more complex differential diagnoses is limited by the urgency of the environment.

  3. Individual Factors

Individual factors play a significant role in the susceptibility to anchoring bias, as overconfidence in one's initial clinical intuition can lead to a premature closing of the diagnostic process. Fatigue and stress further compromise a clinician's ability to engage in the effortful, analytical "System 2" thinking required to challenge an early assumption. Interestingly, experience level does not grant immunity; while junior doctors may anchor due to a limited repertoire of differential diagnoses, senior doctors can fall into the same trap by over-relying on "pattern recognition" from past cases. This suggests that regardless of seniority, a high cognitive load combined with physical exhaustion creates a fertile ground for mental shortcuts to override thorough clinical reasoning.

  4. Systemic Factors

Systemic or environmental gaps frequently reinforce anchoring bias, particularly when poor communication during handovers leads a receiving clinician to adopt a previous provider's potentially flawed diagnosis as an absolute truth. This is compounded by incomplete patient data, which forces the brain to fill in the blanks using biased heuristics rather than objective evidence. Furthermore, a lack of structured diagnostic protocols—such as differential diagnosis checklists or "diagnostic pauses"—removes the necessary friction that would otherwise slow down intuitive thinking and allow for a more analytical review of the case. Without these systemic safeguards, the initial "anchor" remains unchallenged throughout the patient's care trajectory.

 

Pathogenesis (Mechanism of Development):

The development of anchoring bias follows a predictable cognitive trajectory that often leads to diagnostic pitfalls:

  1. Initial Impression Formation: the clinician builds a mental model based on early symptoms or the first available test results, for instance assuming that a patient with fever in an endemic area must have malaria.

  2. Fixation on Anchor: the "malaria" label becomes the dominant framework through which all other data are viewed.

  3. Selective Information Processing: supporting evidence (such as headache) is emphasized, while contradictory evidence (such as the absence of splenomegaly or a negative smear) is ignored or minimized.

  4. Failure to Adjust: even as new findings emerge, they do not significantly alter the clinician's initial diagnosis.

  5. Diagnostic Error: the result is an incorrect or dangerously delayed diagnosis, because the true cause was never fully explored.

 

Cognitive Basis and Mechanism

The psychological mechanism behind anchoring bias is rooted in the dual-process theory of cognition, which describes the tension between two distinct modes of thought. System 1 Thinking is fast, intuitive, and relies on rapid pattern recognition based on past experience; while efficient, it is highly prone to bias. In contrast, System 2 Thinking is slow, analytical, logical, and deliberate. While it is far less prone to error, it requires significant mental effort and "cognitive fuel."

Anchoring bias occurs when a clinician over-relies on System 1, leading to an initial hypothesis being accepted prematurely without sufficient activation of System 2 to re-evaluate the data. This creates a cycle of diagnostic fixation: a first impression is formed (the anchor), followed by selective attention to supporting data, and finally an inadequate adjustment of the diagnosis even when new, contradictory findings emerge.

Causes of Anchoring Bias in Clinical Settings

Several critical factors contribute to the reinforcement of anchoring bias in clinical practice, often intersecting to create a "perfect storm" for diagnostic error. These factors can be categorized into environmental pressures and individual cognitive states:

1. Environmental and Situational Pressures

  • Time Pressure: In emergency settings or during periods of high patient load, clinicians are forced into rapid decision-making. This speed necessitates a reliance on System 1 thinking, where the first plausible "anchor" is accepted to save time.

  • Incomplete Information: Anchoring often occurs when an early diagnosis is made before a full history, physical examination, or laboratory results are available. Once that initial mental scaffold is built, it is difficult to dismantle even when the full data set arrives.

  • Handover Errors: A common systemic trigger is accepting a previous provider’s diagnosis without a fresh reassessment. This "inherited anchoring" propagates errors through the care continuum.

2. Cognitive and Individual Factors

  • Overconfidence: Experienced clinicians may trust their initial intuition excessively due to years of successful pattern recognition. This can lead to a premature closing of the diagnostic process.

  • Cognitive Load: High levels of fatigue, chronic stress, and the necessity of multitasking drain the "mental fuel" required for System 2 analytical thinking.

  • The "Certainty Effect": The human brain naturally prefers the comfort of a definitive diagnosis over the ambiguity of an undifferentiated symptom list, leading it to latch onto the first available explanation.

 

Clinical Examples of Anchoring Bias

1. Chest Pain: Gastritis vs. Myocardial Infarction (MI)

  • The Anchor: "Gastritis" (likely based on the patient's description of "burning" or "indigestion").

  • The Error: When objective evidence (ECG changes) appears, it is either ignored or dismissed as a non-specific finding because it doesn't fit the gastritis narrative.

  • Consequence: Missing a myocardial infarction can lead to irreversible cardiac muscle damage or death.

2. Fever: Malaria vs. Sepsis

  • The Anchor: "Malaria" (based on geography and common patterns).

  • The Error: Even when the CBC shows leukocytosis (high white blood cell count) and a left shift—which point strongly toward a bacterial source—the clinician remains "stuck" on the parasitic diagnosis.

  • Consequence: A delay in starting broad-spectrum antibiotics, potentially allowing the infection to progress to septic shock.

3. Trauma: Concussion vs. Intracranial Hemorrhage

  • The Anchor: "Simple Concussion" (based on the initial stable presentation).

  • The Error: Subtle neurological changes (like increasing lethargy or a slight pupillary change) are minimized as "post-concussive symptoms" rather than recognized as signs of increasing intracranial pressure.

  • Consequence: Failure to catch an expanding hematoma (such as an epidural or subdural bleed) until the patient's condition becomes critical.

 

Impact on Patient Care

The impact of anchoring bias on modern medicine is profound, directly linking cognitive theory to patient outcomes. When a clinician "anchors" on an initial impression, the resulting cognitive tunnel vision creates a cascade of systemic failures that affect both the patient's health and the healthcare system's efficiency.

The Clinical Cascade of Consequences

  • Delayed Diagnosis: The most immediate effect is the "time-to-treatment" gap. While the clinician pursues an incorrect anchor, the true underlying pathology (e.g., an evolving sepsis or an internal bleed) continues to progress.

  • Incorrect Treatment: Patients may be subjected to unnecessary medications or procedures—such as anticoagulants for a suspected MI when the true cause is an aortic dissection—which can cause secondary harm.

  • Increased Complications: Delays and wrong treatments lead to higher rates of morbidity (long-term illness) and mortality (death).

  • Systemic Inefficiency: Longer hospital stays and the cost of "un-doing" the wrong treatment path significantly increase healthcare expenditures.

 

Relationship with Other Cognitive Biases

Anchoring bias rarely acts in isolation; it often forms the foundation for a cascade of cognitive errors that reinforce a clinician's initial, potentially flawed, hypothesis. Confirmation bias is perhaps its most frequent partner, occurring when a doctor actively seeks out information that supports their "anchor" while subconsciously filtering out or discrediting any data that contradicts it. This synergy makes it incredibly difficult to pivot away from an incorrect path because the clinician only "sees" the evidence that proves them right.

Premature closure represents the point where the diagnostic process effectively ends because the clinician is satisfied with the initial anchor and stops searching for other possibilities. This is often described as the most common error in clinical reasoning, as it shuts down the analytical "System 2" thinking before all the facts are in. When a physician decides they have the answer, the curiosity required to catch a rare or complex condition simply vanishes.

Availability bias further complicates this by pushing the clinician toward a diagnosis that is easily remembered, perhaps due to a recent similar case or a particularly dramatic patient outcome they witnessed previously. When a memorable past case acts as the anchor, the doctor may "force" the current patient's symptoms to fit that specific mold. Together, these three biases create a powerful psychological inertia that can lead to significant diagnostic errors and compromised patient safety.

Recognition of Anchoring Bias

Recognizing the internal and external signals of anchoring bias is a vital skill for maintaining diagnostic accuracy in a clinical setting. A clinician should immediately suspect they have anchored when the current diagnosis fails to explain all of the patient's symptoms or when new laboratory and imaging data directly contradict the initial impression. Furthermore, if a patient is not responding to the prescribed treatment as expected, it serves as a major clinical indicator that the original "anchor" may be incorrect and requires a full analytical reassessment.

Several cognitive red flags serve as early warning signs that a mental shortcut is taking over the reasoning process. Phrases such as "this must be the same as the last case" or "I already know the diagnosis" indicate that availability bias and premature closure are likely influencing the encounter. These statements suggest the clinician has stopped gathering new information and is instead relying on past patterns that may not apply to the current patient.

Another significant red flag is the act of ignoring or downplaying abnormal findings, such as a slightly elevated white blood cell count or an atypical physical exam result, simply because they do not fit the established narrative. When a diagnosis is made very early in an encounter without a full evaluation, the risk of this "cognitive lock-in" is at its highest. Maintaining a high index of suspicion for these behaviors allows a medical team to pause and deliberately engage System 2 thinking before a diagnostic error can cause patient harm.

 

Strategies to Prevent Anchoring Bias

To combat anchoring bias effectively, clinicians must move from automaticity to deliberate reflection by intentionally pausing to reconsider their initial findings. This process is often sparked by a simple but powerful question: "What else could this be?" By forcing the brain to expand the differential diagnosis to include at least three to five possibilities, the physician manually overrides the tendency of System 1 thinking to settle on the most obvious or available answer.

The use of structured checklists and diagnostic time-outs provides a systematic safeguard against missed information, ensuring that every patient receives a thorough evaluation regardless of how "classic" their presentation seems. These protocols are particularly effective when paired with a culture of seeking second opinions, as discussing a complex case with a colleague often introduces a fresh perspective that can shatter a persistent anchor. Regular reassessment of the patient’s condition is also mandatory, especially when there is no clinical improvement, as this lack of progress is a primary indicator that the original working diagnosis may be incorrect.

In modern medical education, awareness training has become a cornerstone of the curriculum to help students recognize their own cognitive vulnerabilities. Teaching methods such as case-based learning and high-fidelity simulation training allow residents to experience the "trap" of anchoring in a safe environment, while reflective practice and morbidity and mortality meetings provide a space to analyze past errors without judgment. By integrating these systematic approaches into clinical guidelines, the medical community encourages evidence-based decision-making that prioritizes accuracy over speed, ultimately reducing the morbidity and mortality associated with cognitive bias.

 

Real-Life Case Study (Example)

In this clinical case study, the patient’s initial presentation of fever, cough, and breathlessness led to an immediate anchor of community-acquired pneumonia. Because pneumonia is a common diagnosis with overlapping symptoms, the clinician initiated standard antibiotic treatment without considering the broader differential diagnosis. This early fixation created a mental framework that initially blinded the medical team to alternative pathologies, demonstrating how an anchor can prematurely close the diagnostic process.

As the case progressed, clear red flags emerged that should have triggered a shift from System 1 intuitive thinking to a more analytical System 2 review. The lack of clinical improvement despite appropriate antibiotics, the development of hemoptysis, and the presence of significant weight loss are all indicators that the initial pneumonia diagnosis was insufficient. In a typical anchoring scenario, these findings are often minimized as "slow-to-resolve" symptoms rather than being recognized as evidence of a different disease process entirely.

The final revelation of pulmonary tuberculosis highlights a classic failure to reassess the patient's condition when the clinical trajectory deviated from the expected path. The error analysis shows that by ignoring these specific red flags and failing to perform a diagnostic time-out, the team allowed the initial anchor to dictate the care plan for too long. This delay in diagnosing tuberculosis not only risks the patient's health through progressive lung damage but also poses a significant public health risk due to the potential for community transmission.

 

Clinical Manifestations:

Anchoring bias is not a biological pathology but rather a cognitive one that manifests through specific patterns of clinical error and behavior. The most prominent manifestation is diagnostic fixation, where a clinician remains mentally wedded to a single diagnosis even as new and contradictory evidence emerges. This fixation creates a filter that causes abnormal findings—such as an unexpected lab result or a unique physical sign—to be dismissed as "noise" or clinical outliers rather than being recognized as evidence that the initial theory is incorrect.

Furthermore, this bias typically results in an incomplete differential diagnosis, where the clinician fails to generate or maintain a robust list of alternative possibilities. Because the mind has already "solved" the case, the investigative process essentially stops, leading to a dangerous period of inappropriate management where the patient receives treatment for a condition they do not have.

The ultimate consequence of these behaviors is a significantly delayed diagnosis, where the correct underlying cause is only identified after the patient fails to improve or experiences a critical clinical deterioration. By recognizing these behavioral patterns as "symptoms" of a flawed reasoning process, medical teams can implement checks to ensure that management plans are constantly being validated against the full spectrum of available patient data.

 

Diagnostic Approach:

Recognizing anchoring bias requires a high level of clinical self-awareness, particularly when a patient is not improving as expected or when the clinical picture does not perfectly align with the working diagnosis. Clinicians must remain vigilant for new symptoms that emerge during the course of treatment, as these are often the first signs that the initial "anchor" was incorrect. Overconfidence in a first impression is perhaps the most dangerous behavioral signal, as it often leads to the dismissal of contradictory data and a premature halt to the diagnostic process.

To mitigate these risks, healthcare teams can employ several structured diagnostic strategies, such as expanding the differential diagnosis to include at least three to five possibilities and adopting a "rule-out the worst-case scenario" approach. Implementing a formal diagnostic time-out allows the team to pause and critically ask if any evidence has been ignored before finalizing a plan. The use of standardized checklists and clinical decision support systems (CDSS) provides an objective framework that reduces omission errors and ensures every patient receives a systematic evaluation regardless of the clinician's current cognitive load.

Long-term prevention and management of these cognitive traps rely on robust medical education and healthy workplace cultures. Integrating the study of cognitive biases into the medical curriculum and using simulation training allows providers to practice "de-biasing" in a controlled environment. Furthermore, promoting team-based decision-making and multidisciplinary discussions helps dilute individual biases by introducing diverse perspectives. Addressing systemic issues like excessive workload and fatigue is also essential, as these factors directly deplete the mental energy required for the slow, analytical thinking that keeps anchoring bias at bay.

 

Conclusion:

Anchoring bias is a significant and common cognitive error in clinical practice that can lead to serious diagnostic and therapeutic consequences. It arises from an over-reliance on initial impressions and inadequate adjustment in response to new information. Despite its prevalence, anchoring bias is preventable through awareness, structured clinical reasoning, and systematic approaches to diagnosis.

By incorporating strategies such as differential diagnosis expansion, diagnostic time-outs, and reflective practice, clinicians can minimize the impact of anchoring bias. Ultimately, improving awareness of cognitive biases is essential for enhancing patient safety, reducing medical errors, and promoting high-quality healthcare delivery.

 

References:

  • Tversky A, Kahneman D. Judgment under Uncertainty: Heuristics and Biases. Science. 1974.

  • Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Academic Medicine. 2003.

  • Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Archives of Internal Medicine. 2005.

  • Norman GR, Eva KW. Diagnostic error and clinical reasoning. Medical Education. 2010.

  • Singh H, Meyer AN, Thomas EJ. The frequency of diagnostic errors in outpatient care. BMJ Quality & Safety. 2014.

  • Croskerry P. From mindless to mindful practice — cognitive bias and clinical decision making. New England Journal of Medicine. 2013.

  • Kahneman D. Thinking, Fast and Slow. Farrar, Straus and Giroux; 2011.

  • Schiff GD et al. Diagnostic error in medicine: analysis of 583 physician-reported errors. Archives of Internal Medicine. 2009.
