
Perspectives

Teaching Critical Thinking: A Case for Instruction in Cognitive Biases to Reduce Diagnostic Errors and Improve Patient Safety

Royce, Celeste S. MD; Hayes, Margaret M. MD; Schwartzstein, Richard M. MD

Academic Medicine 94(2):187–194, February 2019. DOI: 10.1097/ACM.0000000000002518

Abstract

Diagnostic errors contribute to as many as 70% of medical errors. Prevention of diagnostic errors is more complex than building safety checks into health care systems; it requires an understanding of critical thinking, of clinical reasoning, and of the cognitive processes through which diagnoses are made. When a diagnostic error is recognized, it is imperative to identify where and how the mistake in clinical reasoning occurred. Cognitive biases may contribute to errors in clinical reasoning. By understanding how physicians make clinical decisions, and examining how errors due to cognitive biases occur, cognitive bias awareness training and debiasing strategies may be developed to decrease diagnostic errors and patient harm. Studies of the impact of teaching critical thinking skills have mixed results but are limited by methodological problems.

This Perspective explores the role of clinical reasoning and cognitive bias in diagnostic error, as well as the effect of instruction in metacognitive skills on improvement of diagnostic accuracy for both learners and practitioners. Recent literature questioning whether teaching critical thinking skills increases diagnostic accuracy is critically examined, as are studies suggesting that metacognitive practices result in better patient care and outcomes. Instruction in metacognition, reflective practice, and cognitive bias awareness may help learners move toward adaptive expertise and help clinicians improve diagnostic accuracy. The authors argue that explicit instruction in metacognition in medical education, including awareness of cognitive biases, has the potential to reduce diagnostic errors and thus improve patient safety.

Recognition of the role of diagnostic errors in patient morbidity and mortality has recently increased, as highlighted by the 2015 report Improving Diagnosis in Health Care, in which the National Academies of Sciences, Engineering, and Medicine defined diagnostic error as “the failure to establish an accurate and timely explanation of the patient’s health problem(s) or communicate that explanation to the patient.”1 Despite the attention given to the function of health care systems as a cause of medical error, relatively little has been done to address the cognitive component of diagnostic error, which may contribute to as many as 70% of medical errors.2–5

Prevention of diagnostic errors is more complex than building safety checks into health care systems; it requires an understanding of the clinical reasoning and cognitive processes through which diagnoses are made. Clinical reasoning—the process of applying cognitive skills, knowledge, and experience to diagnose and treat patients—is inherently difficult to assess, which makes cognitive errors difficult to detect. The steps of clinical reasoning usually occur rapidly, are rarely documented or explained, and may not be apparent even in the mind of the clinician.6,7

When a diagnostic error is recognized, it is imperative to identify where and how the mistake in clinical reasoning occurred. Cognitive biases, or predispositions to respond to data on the basis of prior experience or the exigencies of current conditions, can contribute to diagnostic errors. Although Norman et al8 recently argued that knowledge deficits are the primary cause for diagnostic errors, there is substantial evidence to suggest that cognitive biases contribute to diagnostic errors. In the majority of real-world malpractice cases attributed to diagnostic error, the errors are not due to ignorance but, rather, to the failure to consider the correct diagnosis.3,9 Gandhi et al5 found 64% of closed malpractice claims to be due solely to diagnostic error, with 79% of those cases including a “failure of judgment.” The role of cognitive bias in diagnostic error is underappreciated by physicians, who may be unfamiliar with how these assumptions influence their decision making.6 Case studies of diagnostic delay and misdiagnosis illustrate the central role of cognitive bias in diagnostic failure, showing that errors arising from cognitive bias play a role in over 50% of identified cases of diagnostic error in ambulatory clinics and in up to 83% of cases involving physician-reported diagnostic errors.9–12 By examining how errors due to cognitive biases occur, strategies may be developed to avoid mistakes and patient harm.

In this Perspective, we explore the role of cognitive bias in diagnostic error. We examine the effect of instruction in critical thinking and metacognitive skills on the development of diagnostic accuracy for both learners and practitioners. We suggest that developing these skills may help learners and clinicians move toward adaptive expertise and improve diagnostic accuracy. We examine the literature that questions the benefits of teaching clinical reasoning skills to increase diagnostic accuracy, and we identify methodological problems with those studies. Lastly, we examine evidence suggesting that metacognitive practices result in better patient care and outcomes. We argue that explicit instruction by medical educators about metacognition and cognitive biases as components of critical thinking has the potential to help reduce diagnostic errors and thus improve patient safety.

The Diagnostic Process

When considering how to prevent diagnostic errors that are due to cognitive processes, it is important to understand how physicians make clinical decisions. Cognitive psychologists have proposed that problem solving and decision making occur through a dual process model: intuitive, rapid, pattern-based decision making, termed System 1; and more analytic, logical reasoning, termed System 2.13,14 Although many descriptions of dual processing exist in the literature,15 there is general agreement that System 1 is the use of pattern recognition, rules of thumb, or mental short cuts, known as heuristics, to make quick, almost instantaneous decisions. System 2 is the more analytic approach to problem solving and is typically employed when confronted with an unfamiliar problem, a difficult decision, or contradictory evidence. The dual process theory suggests that the two systems function in sequence: Heuristics are used to immediately solve the problem, and analytic reasoning may (or may not) be employed to alter the original impression.16

Metacognition—the capacity for self-reflection on the process of thinking and self-regulation in monitoring decision making—can be described as the purposeful engagement of System 2 problem solving through reflection and deliberative examination of one’s own reasoning. Metacognitive strategies often result in activation of System 2 decision making because the process of reflection may prompt a more analytic examination of available data. For example, when you are asked which animal causes the most human deaths after you view a documentary on shark attacks, your immediate answer, using System 1 processing, is likely to be “sharks.” On reflection (a metacognitive practice), you might engage System 2 processing and arrive at the correct answer, which is the mosquito.17 Because analytic reasoning requires effort—as well as an understanding of and the ability to engage in hypothetico-deductive and/or inductive reasoning—errors may arise simply due to the expediency of depending on heuristics. As Tversky and Kahneman18 note, “people rely on a limited number of heuristic principles which reduce … complex tasks of assessing probabilities … to simpler operations” which can “lead to severe and systematic errors.” The biological plausibility of the dual process model has been demonstrated using functional MRI, brain glucose utilization, and studies of patients with neurological lesions.13,16,19

The theory of adaptive expertise complements the dual process theory with the idea of expert practice—that is, of balancing efficiency and innovation in clinical problem solving.20,21 In this model, the routine expert is an individual at any level of training who appropriately uses preexisting knowledge to quickly solve routine, familiar, or uncomplicated problems. In contrast, the adaptive expert is able to employ a deep conceptual understanding and engage in reflection to create novel solutions for complicated or unfamiliar problems, thus adding to his or her knowledge base, reasoning capacity, and ability to solve cases not previously encountered.22 Expert practice requires reflection for growth; without engagement in this metacognitive process, practice improvement is stalled, and the chance of diagnostic errors occurring increases.23 Adaptive expertise is not a static competency; rather, it develops as the individual’s knowledge and problem-solving skills grow. In this theory, a clinician of any experience level who possesses foundational knowledge may make appropriate diagnoses not only in simple scenarios but also in more complex, uncertain, or unfamiliar cases by employing logic and reasoning. Conversely, an experienced clinician may arrive at an incorrect diagnosis if he or she fails to appreciate the need for reflection or innovation when confronted with a complex problem, a novel presentation, or contradictory data.

The dual process and adaptive expertise models can be used together to explain how routine experts differ from adaptive experts in their approaches to diagnosing a complex problem. Routine experts may rapidly and correctly arrive at a diagnosis, drawing on previous experience and knowledge to employ heuristic illness scripts. They may not recognize the need to use analytic reasoning strategies when faced with an unfamiliar problem or data that do not fit into the solution proposed by the heuristic, or they may use analysis primarily to switch to another previously encountered pattern. This type of dual process approach to diagnosis is predicated on adequate clinical knowledge, experience, the lack of distracting cognitive overload, and the ability to engage in reflection or metacognition when indicated, and it results in the efficiency experienced clinicians bring to clinical decision making.23,24 Adaptive expertise relies on the ability to engage in both types of thinking and adds the step of innovation—that is, designing novel solutions to new or complex problems not encountered previously by drawing creatively on prior experience and knowledge. Adaptive experts, therefore, balance efficiency and innovation in response to changing conditions, using both System 1 and System 2 approaches to problem solving and applying their foundational knowledge and learned experience to formulate novel solutions.

Learners are rarely efficient in their clinical decision making. They may try to employ heuristics in an effort to be efficient, but they may be more prone to errors than experienced clinicians because their limited experience leaves them without the knowledge or self-regulation needed to recognize when heuristics fail. However, both novices and experienced clinicians can experience diagnostic error due to cognitive bias. For example, a clinician may conclude that a postoperative patient with dyspnea has a pulmonary embolism by depending on a heuristic (i.e., postoperative patients are at increased risk for thromboembolic disease), resulting in the cognitive bias of premature closure (acceptance of an early impression as the diagnosis without adequate verification or consideration of other explanations)—and a missed diagnosis of pulmonary edema, which would be clear from a more detailed evaluation of the patient. Warning signs of clinical situations in which the interaction between heuristics and cognitive bias may lead to diagnostic error include the failure to generate more than one possible diagnosis or the failure to account for all the data. These “red flags” should prompt the clinician to further analyze the case, a cognitive process that represents metacognition in the moment. Just as the study of basic science prepares medical students for future learning of complex subjects through the development of a framework for clinical knowledge,25 an awareness of how cognitive biases may interact with heuristics can provide a scaffolding for learning metacognitive reflective strategies and may allow learners to understand both the value and risk inherent in the use of heuristics.

Lastly, decisions about diagnoses are made in the clinical context, using the physician’s understanding of base rates of disease, likelihood ratios, and pretest probabilities. The physician’s personal experiences, in addition to his or her understanding of the sensitivity and specificity of a diagnostic test, contribute to the interpretation of test results and determination of a diagnosis. Without an awareness of potential bias based on anecdotal experience (the “N of one”), even the most experienced physician can make a diagnostic error. In a recent study, Rottman26 demonstrated that physicians use Bayesian reasoning and are more likely to make a correct diagnosis when their use of probabilistic reasoning is based on their understanding of base rates, likelihood ratios, and test sensitivities (i.e., their knowledge from experience); when given the results of a test and informed of its actual sensitivity, however, they are more likely to suffer from premature closure.
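
To make this probabilistic reasoning concrete, the following sketch works through a single Bayesian update in odds form using a likelihood ratio. It is an illustrative example only: the function name, the pretest probability, and the test characteristics are assumptions chosen for teaching purposes, not figures drawn from the studies cited in this article.

    # Illustrative sketch (Python): Bayesian updating with likelihood ratios.
    # All numeric values are hypothetical teaching values, not data from the
    # studies cited in this article.
    def post_test_probability(pretest_prob, sensitivity, specificity, positive=True):
        """Convert a pretest probability into a posttest probability."""
        if positive:
            lr = sensitivity / (1 - specificity)          # positive likelihood ratio (LR+)
        else:
            lr = (1 - sensitivity) / specificity          # negative likelihood ratio (LR-)
        pretest_odds = pretest_prob / (1 - pretest_prob)  # probability -> odds
        posttest_odds = pretest_odds * lr                 # Bayes' rule in odds form
        return posttest_odds / (1 + posttest_odds)        # odds -> probability

    # Hypothetical case: a postoperative patient with dyspnea, an assumed 15%
    # pretest probability of pulmonary embolism, and a test assumed to have
    # sensitivity 0.85 and specificity 0.90.
    print(post_test_probability(0.15, 0.85, 0.90, positive=True))   # approximately 0.60
    print(post_test_probability(0.15, 0.85, 0.90, positive=False))  # approximately 0.03

Worked through this way, a positive result raises the probability of disease substantially while a negative result lowers it without eliminating it, and both conclusions hinge on the estimate of pretest probability, which is precisely the step most vulnerable to anchoring, availability bias, and the N-of-one anecdote described above.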

Ultimately, clinical reasoning requires integration of multiple approaches including heuristics, Bayesian principles of clinical epidemiology, inductive reasoning based on thorough understanding of mechanisms of disease, and, as we will discuss below, the ability to reflect on and correct for the effect of cognitive biases (Figure 1).

Figure 1:
Multiple factors may be involved in clinical decision making, although the approaches employed by a given clinician may depend on the clinical situation, his or her level of training, and his or her comfort level with different problem-solving strategies. Clinical reasoning typically draws on foundational clinical knowledge, epidemiology, and evidence-based medicine. The clinician may depend on heuristics to reach a diagnosis, or may engage in more deliberate processes (e.g., inductive reasoning, Bayesian reasoning, hypothetico-deductive reasoning) and further reflect on the decision-making process through metacognitive practices. Learners should be encouraged to understand the role of all these processes to mature and develop their critical thinking and clinical reasoning skills.

Evidence for the Role of Cognitive Mistakes in Diagnostic Errors

One of the challenges of examining the role of cognitive processing in diagnostic error is that most cognitive mistakes are made in a subset of cases. These mistakes can arise at multiple steps in the diagnostic process. For example, an unusual presentation of a common illness, the presence of comorbidities, or patient characteristics that change the perceived base rate can lead to incorrect or incomplete diagnoses due to the cognitive biases of anchoring (fixation on specific features of a patient’s initial presentation, failure to adjust with new information), framing (decisions affected by the clinical context in which a problem is considered or by the analysis provided by a prior provider), or ascertainment (thinking shaped by what the physician hopes or expects to find). Cognitive overload may contribute to faulty reasoning strategies.

In daily practice, most cases are “routine,” with an easily recognizable diagnosis or a classical presentation of a common problem. For these cases, using System 1 decision making is quick, accurate, and appropriate. In analyzing the causes of diagnostic errors, studying how physicians arrive at the diagnosis in classic presentations (even of unusual conditions) is not useful in discriminating between use of heuristics and analytic reasoning, and will not help identify whether a knowledge deficit or a reasoning deficit is the source of an error. Rather, to detect cognitive bias, what must be examined is how physicians arrive at the diagnosis in atypical presentations, where cognitive biases may be unmasked in confronting a difficult diagnosis.27

For example, in one study residents were asked to evaluate computer-based cases of differing complexity.28 One of the cases consisted of a classic presentation of carbon monoxide poisoning. This is an example of a diagnosis that senior internal medicine residents are likely to recognize halfway through the case vignette. No amount of analytical reasoning will change a clinician’s mind about this diagnosis, and for experienced clinicians, there is no reason to employ a more analytic approach.

In contrast, when presented with cases with atypical presentations or with conflicting or complex data for which the diagnosis is less certain, experienced physicians may have no better diagnostic accuracy than medical students or residents. Using four cases in which contradictory information was introduced midway through each case, Krupat et al29 found that diagnostic accuracy was not different among faculty physicians, residents, and medical students. The more experienced physicians tended to persist with their initial impressions despite the additional discordant information. Thus, the decision that sufficient information has been gathered may lead to premature closure and diagnostic error.26 Unusual presentations, although representing a minority of cases, are the ones that may lead to substantial patient harm. In these cases, physicians would benefit from better awareness of cognitive processing and application of rigorous analytic reasoning.

In patient care, a knowledge deficit is an uncommon cause for misdiagnosis. In an analysis of closed claims data from over 23,000 malpractice cases in Massachusetts, 20% of total cases were attributed to diagnostic errors.9 In 73% of these diagnostic error cases, there was an identifiable lapse in clinical reasoning. In contrast, only 3% of total cases were attributed to a knowledge deficit; in these cases, the error occurred not because the doctor was unfamiliar with the diagnosis but, rather, because the doctor did not consider the diagnosis. Similar results were found in an analysis of primary care malpractice claims where 72.1% of successful claims were related to diagnostic errors.3 The errors ultimately attributed to faulty clinical reasoning occurred in the failure to obtain or update a patient and family history, to perform an adequate physical exam, to order appropriate diagnostic tests, and/or to refer patients appropriately. Although taking an incomplete history or performing an inadequate physical exam is not a cognitive mistake, the failure to recognize the need to update the history or pursue further information is a key component of cognitive biases such as premature closure and confirmation bias (looking for confirming evidence to support a hypothesis rather than seeking disconfirming evidence to refute it).

Chart reviews and other quality improvement initiatives have demonstrated the frequency of diagnostic errors due to cognitive mistakes. An emergency medicine review of the charts of patients presenting with abdominal pain found that 35% had diagnostic errors, with 69% of those errors due to incomplete history taking, incorrect or unindicated testing, or lack of follow-up on abnormal test results.30 Delayed or missed diagnoses are also common for diagnoses that may have unusual presentations, such as tuberculosis, HIV-associated disease, cancer, and cardiovascular disease.31 “Secret shopper” programs, which use standardized patients to visit outpatient clinics, have demonstrated a 10% to 15% error rate with common diseases.31 In the inpatient setting, 83% of diagnostic errors have been found to be preventable,32 whereas autopsy studies have consistently shown a 10% to 20% rate of missed diagnoses.33,34

Cognitive bias is less well recognized as a root cause of diagnostic error than are failures of health care systems. Physicians openly acknowledge and address medical infrastructure factors, but they may not be comfortable discussing cognitive mistakes, which are often perceived as individual failings.35 For example, physicians recognize cognitive overload from excessive automated electronic medical record alerts as a cause for delay in diagnosis or care.36 However, physicians’ familiarity with other forms of cognitive bias and their contribution to diagnostic error may be limited.35

Studies of real-world cases have demonstrated the effect that cognitive bias can have on decision making, leading to faulty judgment and possible risk or harm to patients. In obstetrics, for example, transient increases in unscheduled cesarean deliveries were attributed to availability bias following catastrophic cases of uterine rupture37 or neonatal hypoxic ischemic encephalopathy.38 A systematic review39 of the literature on cognitive bias in practicing physicians found that overconfidence, anchoring, availability bias (judging the likelihood of an event based on the ease of mental retrieval), and tolerance of risk were associated with diagnostic inaccuracies or suboptimal management. Chart reviews from the Netherlands found that cases with faulty information processing due to cognitive biases, such as premature closure, confirmation bias, and overconfidence, were more likely to lead to diagnostic error and patient harm than were cases with faulty or incomplete information gathering.40

Physicians who display more reflective capacity, a form of metacognition, may have better patient outcomes. Yee et al41 found that obstetricians who scored higher on reflective capacity tests had higher rates of successful attempts of vaginal birth after cesarean delivery. Additionally, Moulton et al42 found that surgeons attributed procedural errors to a suspension of metacognitive self-monitoring during surgery.

The malpractice and diagnostic error literatures clearly demonstrate a role for improved clinical reasoning and suggest that educational interventions for teaching critical thinking are needed. Such interventions may attempt to improve metacognitive strategies, teach cognitive bias mitigation strategies, or increase awareness of cognitive bias.

Norman et al8 recently suggested that educational strategies to recognize and address cognitive bias have been unsuccessful so far. Demonstrating efficacy of any educational intervention in terms of patient safety or outcomes is difficult. Blumenthal-Barby and Krieger,43 in a review of the literature on cognitive bias and heuristics in medical decision making, pointed out that few studies of cognitive bias in learners had ecological validity. Most studies were based on experimental case vignettes rather than clinical decision making, and the cognitive biases studied were limited to a few—framing, omission (the tendency to judge adverse outcomes of actions as worse than adverse outcomes of inaction), relative risk (the tendency to prefer to choose an intervention when given the relative risk rather than the absolute risk), and availability biases.43 The applicability of many such studies to clinical reasoning and decision making is questionable.

For example, in a study attempting to assess whether reflection improves diagnostic accuracy, Norman et al28 divided second-year residents into a “speed cohort” and “reflect cohort.” Participants were asked to read a series of computer-based cases and make the diagnosis. The speed cohort was instructed to do this “as quickly as possible,” while the reflect cohort was instructed to be “thorough and reflective.” The authors found no significant difference in the two cohorts’ diagnostic accuracy and concluded that encouraging reflection and increased attention to analytic thinking does not increase diagnostic accuracy. However, the experimental conditions used (computer modules with a timer displaying elapsed duration of the exercise) are not an accurate representation of a busy emergency department, which the authors were trying to replicate. Further, the intervention did not include any explicit instruction on cognitive biases, metacognitive strategies, or other techniques for reflection. Although the average difference in time spent on each case by the cohorts (20 seconds) was statistically significant, this difference is unlikely to be meaningful with respect to the thinking processes employed or to real-world experience.

Conversely, others have found that training in reflective practice may improve diagnostic accuracy: In a study of internal medicine residents, Mamede et al44 demonstrated improved diagnostic accuracy in first- and second-year residents. Instruction in reasoning skills, probabilistic decision making, and Bayesian reasoning may improve diagnostic accuracy by decreasing the effects of cognitive biases—in particular premature closure, neglect of base rates of disease, and inappropriate reliance on heuristics.45,46

Effect of Metacognitive Strategies on Diagnostic Accuracy

When evaluating the available data on the efficacy of teaching metacognitive skills to improve clinical reasoning and avoid diagnostic error, it is important to recognize the level of expertise in the study population. Attempts to teach strategies to raise awareness of cognitive bias in clinical reasoning—sometimes referred to as cognitive forcing strategies, debiasing strategies, or cognitive bias mitigation—have shown conflicting results with different groups.6,47,48 Studies involving medical students have demonstrated limited improvement in diagnostic accuracy, which may be due to knowledge deficits inherent in early-stage learners: Debiasing strategies are unlikely to improve diagnostic accuracy or speed in the short term if knowledge deficits exist.24,47,48 Furthermore, students may not have enough experience to fall victim to anchoring or availability biases. Therefore, although novice students are not immune to cognitive bias, they may not immediately benefit from instruction in metacognitive techniques or debiasing strategies.

As students advance in their training and transition to residency, they acquire knowledge and experience, and they become more likely to problem solve using pattern recognition and heuristics. However, as trainees acquire experience and develop illness scripts, they also become more prone to making diagnostic errors due to availability bias and anchoring.44 Findings from efforts to teach metacognitive skills to residents are interesting and consistent with this developmental stage.28,49–53 For example, Monteiro et al52 found that higher-achieving residents benefited from instructions to “reflect before answering” when answering test questions at all levels of difficulty, whereas lower-achieving residents benefited only when answering easier questions, demonstrating that even a short instruction to reflect on decisions may improve diagnostic accuracy for those with a baseline of knowledge. The inverse relationship that Norman et al28 found between diagnostic accuracy and time required to diagnose cases suggests either that residents are in the process of developing both knowledge and metacognitive skills or that they are spending more time because they simply do not know the answers.

Educational interventions providing detailed instruction in recognizing common cognitive biases and debiasing strategies have demonstrated both short- and long-term improvements in residents’ critical thinking skills.49,50 A longitudinal curriculum of metacognitive skills and debiasing strategies resulted in increased awareness of common cognitive biases, as well as improved discussions with patients, families, and colleagues. Importantly, this effect persisted at the one-year follow-up.49 In another study, introduction of cognitive bias awareness into peer review of cases of diagnostic errors resulted in the development and implementation of algorithms and protocols for avoiding affective bias (bias due to an emotional response), use of standardized neurological evaluations, and increased consultations for difficult cases.35

Metacognition Prompts Clinical Reasoning Strategies

Reliance on System 1 decision making is not the cause for all diagnostic errors; indeed, there is some evidence that more deliberative thinking can also result in errors.48 However, the inappropriate or unexamined use of heuristics can result in impaired decision making,54 and as Schulz55 notes, “when making a decision, making a wrong decision feels the same as making a right decision.” Even physicians who are familiar with the effects of cognitive biases and have an awareness of the pitfalls of dependence on heuristics may not believe themselves to be vulnerable to their influence, and they may only acknowledge a few of many instances of cognitive biases in their own decision making.56 Learning and practicing strategies to avoid biased thinking—that is, debiasing or cognitive forcing strategies—requires effort and vigilance.6 These strategies seem to work best when they disrupt the automaticity of clinical reasoning, requiring the clinician to reevaluate his or her initial thought processes through reconsideration of the evidence.57–59

Reflection on one’s reasoning is of paramount importance. Clinicians can and should be taught to examine heuristics, monitor their own reasoning for mistakes and biases, and self-regulate their thought processes. Cognitive bias awareness strategies, like other critical thinking skills, can be taught to learners in medicine as potential tools to advance patient safety and patient care. Learners can be taught cognitive bias awareness along with other strategies to promote critical thinking, such as the five microskills model known as the “one-minute preceptor,”60 to help them develop habits of mind and healthy skepticism about their own thought processes. Engaging in simple practices such as calling for a “diagnostic time-out” (an explicit pause to reflect on the thinking process leading to the diagnosis) at patient handoff or when confronted with a complex patient promotes reflection and metacognition for physicians at any level.60,61

Evidence for Better Diagnostic Accuracy and Patient Outcomes

Few studies have looked at the efficacy of cognitive bias awareness training in preventing diagnostic error and patient harm. A recent review of interventions to prevent diagnostic errors found that the vast majority of interventions were not educational: Only 11 of 109 studies included clinician education, and few reported any patient outcomes.62

Longitudinal and integrated curricula are effective at improving awareness of cognitive biases and use of reflective practice.49,63 Introduction of an integrated cognitive bias awareness curriculum for residents and practicing physicians at Maine Medical Center led to an increase in the reporting of diagnostic errors, as well as protocols to standardize patient care.64 At the University of Pennsylvania Perelman School of Medicine, participants in a longitudinal program designed to increase awareness of the role of cognitive bias in diagnostic error were able to identify the roles of different cognitive biases in diagnostic errors and to generate strategies to avoid similar errors in the future.65,66 These studies suggest that continuing education in cognitive biases and metacognition may improve patient outcomes.

Future Directions and Conclusions

As medical education shifts from a focus on content transfer to the development of critical thinking and problem-solving skills, educators must push learners to develop and practice the skills needed to reason from foundational principles and concepts to explain the history, physical examination, and laboratory data for a given patient.67,68 As students enter clinical clerkships and progress to residency programs, faculty must continue to reinforce these skills; when not practiced regularly, inductive reasoning and metacognitive skills atrophy. Faculty development efforts should emphasize techniques that incorporate critical thinking skills instruction without adding time to already-crowded graduate medical education curricula.69 Over time, efforts to teach reflective practice and cognitive bias awareness strategies may lead to better diagnostic habits and ultimately to improved patient safety.63 Teaching analytic approaches such as Bayesian reasoning, improving physicians’ understanding of probabilistic decision making with likelihood ratios, and increasing appreciation of diagnostic test specificity and sensitivity may also help to decrease diagnostic errors made because of cognitive mistakes.26,45 To help learners achieve adaptive expertise, educators must help them recognize the appropriate use and challenges of both heuristics and analytic reasoning, adapt their approach to diagnosis to the clinical scenario, and become comfortable with regular encounters with uncertainty.24

Workplace interventions may assist practicing clinicians in their efforts to avoid diagnostic errors. Global changes in medical practice have led to a clinical environment in which feedback on diagnostic accuracy is difficult to obtain, denying physicians the opportunity to learn from mistakes.33,34 Electronic medical records have the potential to facilitate feedback on diagnostic accuracy.31 Health care systems, government regulatory bodies, malpractice insurance companies, and third-party payers are likely to invest in programs to teach debiasing strategies and other approaches for improved diagnostic accuracy, including the development of clinical decision support tools.1,3,49,70 Computer-based training systems for assessment of clinical reasoning have been employed successfully to improve diagnostic accuracy.71

Further, a change in the culture of medicine regarding diagnostic error is needed. As a profession, physicians have become more comfortable with identifying and addressing health care systems factors that lead to medical error.35 Diagnostic error is perceived as more difficult to address and prevent. Physicians are reluctant to acknowledge their own diagnostic errors, especially when mistakes are often seen as personal failings and professional lapses.64 Small changes in both culture and communication may help establish a safer environment for admitting uncertainty in diagnoses and acknowledging errors. A simple change in name, as suggested by Singh, from “diagnostic errors” to “missed opportunities in diagnosis” may help destigmatize and depersonalize these errors.72 Techniques to promote a culture of safe and open communication should be employed, such as routinely incorporating a diagnostic time-out for difficult cases or at patient handoffs.60,61,73 Morbidity and mortality conferences should return to their original intent: identifying and learning from diagnostic errors, focusing on an exploration of reasoning rather than taking a punitive or judgmental approach to assigning blame.

Physicians should embrace the idea of uncertainty as a mark of a sophisticated approach to clinical medicine, rather than as an admission of ignorance or incompetence.74 Enlisting other health care professionals, patients, and families as meaningful partners in the diagnostic process is also potentially powerful for detection and prevention of diagnostic errors.70 Lastly, relabeling “differential diagnosis” as “diagnostic hypotheses,” as an expression of uncertainty and fallibility, would encourage testing and potentially changing one’s conclusions as a clinical scenario unfolds.75

Culture change is difficult. The effort in medical education to teach critical thinking skills and metacognitive strategies explicitly to promote a culture of patient safety is still in its early stages and has not yet conclusively demonstrated improved patient outcomes. Just as standardization of medical education as a science-based discipline helped bring unimagined improvements to medicine in the 20th century, the increased focus on the development of critical thinking skills may reap similar benefits in this century. Education is one pillar of the patient safety movement. Medical educators must continue to work to incorporate critical thinking skills training throughout the medical education continuum.

References

1. National Academies of Sciences, Engineering, and Medicine. Improving Diagnosis in Health Care. 2015. Washington, DC: National Academies Press.
2. Saber Tehrani AS, Lee H, Mathews SC, et al. 25-year summary of US malpractice claims for diagnostic errors 1986–2010: An analysis from the National Practitioner Data Bank. BMJ Qual Saf. 2013;22:672–680.
3. Schiff GD, Puopolo AL, Huben-Kearney A, et al. Primary care closed claims experience of Massachusetts malpractice insurers. JAMA Intern Med. 2013;173:2063–2068.
4. Kachalia A, Gandhi TK, Puopolo AL, et al. Missed and delayed diagnoses in the emergency department: A study of closed malpractice claims from 4 liability insurers. Ann Emerg Med. 2007;49:196–205.
5. Gandhi TK, Kachalia A, Thomas EJ, et al. Missed and delayed diagnoses in the ambulatory setting: A study of closed malpractice claims. Ann Intern Med. 2006;145:488–496.
6. Croskerry P. From mindless to mindful practice—Cognitive bias and clinical decision making. N Engl J Med. 2013;368:2445–2448.
7. Wachter RM. Why diagnostic errors don’t get any respect—And what can be done about them. Health Aff (Millwood). 2010;29:1605–1610.
8. Norman GR, Monteiro SD, Sherbino J, Ilgen JS, Schmidt HG, Mamede S. The causes of errors in clinical reasoning: Cognitive biases, knowledge deficits, and dual process thinking. Acad Med. 2017;92:23–30.
9. CRICO. Malpractice risks in the diagnostic process: 2014 CRICO strategies national CBS report. https://www.rmf.harvard.edu/Malpractice-Data/Annual-Benchmark-Reports/Risks-in-the-Diagnostic-Process. Published 2014. Accessed December 30, 2016.
10. Singh H, Giardina TD, Meyer AN, Forjuoh SN, Reis MD, Thomas EJ. Types and origins of diagnostic errors in primary care settings. JAMA Intern Med. 2013;173:418–425.
11. Schiff GD, Hasan O, Kim S, et al. Diagnostic error in medicine: Analysis of 583 physician-reported errors. Arch Intern Med. 2009;169:1881–1887.
12. Okafor N, Payne VL, Chathampally Y, Miller S, Doshi P, Singh H. Using voluntary reports from physicians to learn from diagnostic errors in emergency medicine. Emerg Med J. 2016;33:245–252.
13. Evans JS, Stanovich KE. Dual-process theories of higher cognition: Advancing the debate. Perspect Psychol Sci. 2013;8:223–241.
14. Kahneman D. Thinking, Fast and Slow. 2011. New York, NY: Farrar, Straus and Giroux.
15. Evans JS. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008;59:255–278.
16. Evans JS. The heuristic-analytic theory of reasoning: Extension and evaluation. Psychon Bull Rev. 2006;13:378–395.
17. World Health Organization. Neglected tropical diseases: Mosquito-borne diseases. www.who.int/neglected_diseases/vector_ecology/mosquito-borne-diseases/en. Accessed October 11, 2018.
18. Tversky A, Kahneman D. Judgment under uncertainty: Heuristics and biases. Science. 1974;185:1124–1131.
19. Goel V, Buchel C, Frith C, Dolan RJ. Dissociation of mechanisms underlying syllogistic reasoning. Neuroimage. 2000;12:504–514.
20. Sockalingam S, Mulsant BH, Mylopoulos M. Beyond integrated care competencies: The imperative for adaptive expertise. Gen Hosp Psychiatry. 2016;43:30–31.
21. Mylopoulos M, Regehr G. Putting the expert together again. Med Educ. 2011;45:920–926.
22. Mylopoulos M, Regehr G. Cognitive metaphors of expertise and knowledge: Prospects and limitations for medical education. Med Educ. 2007;41:1159–1165.
23. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81.
24. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2005;39:98–106.
25. Mylopoulos M, Woods N. Preparing medical students for future learning using basic science instruction. Med Educ. 2014;48:667–673.
26. Rottman BM. Physician Bayesian updating from personal beliefs about the base rate and likelihood ratio. Mem Cognit. 2017;45:270–280.
27. Reason J. Human Error. 1990. Cambridge, UK: Cambridge University Press.
28. Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: A controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89:277–284.
29. Krupat E, Wormwood J, Schwartzstein RM, Richards JB. Avoiding premature closure and reaching diagnostic accuracy: Some key predictive factors. Med Educ. 2017;51:1127–1137.
30. Medford-Davis L, Park E, Shlamovitz G, Suliburk J, Meyer AN, Singh H. Diagnostic errors related to acute abdominal pain in the emergency department. Emerg Med J. 2016;33:253–259.
31. Graber ML. The incidence of diagnostic error in medicine. BMJ Qual Saf. 2013;22(suppl 2):ii21–ii27.
32. Zwaan L, de Bruijne M, Wagner C, et al. Patient record review of the incidence, consequences, and causes of diagnostic adverse events. Arch Intern Med. 2010;170:1015–1021.
33. Shojania KG, Burton EC, McDonald KM, Goldman L. Changes in rates of autopsy-detected diagnostic errors over time: A systematic review. JAMA. 2003;289:2849–2856.
34. Winters B, Custer J, Galvagno SM Jr, et al. Diagnostic errors in the intensive care unit: A systematic review of autopsy studies. BMJ Qual Saf. 2012;21:894–902.
35. Reilly JB, Myers JS, Salvador D, Trowbridge RL. Use of a novel, modified fishbone diagram to analyze diagnostic errors. Diagnosis (Berl). 2014;1:167–171.
36. Singh H, Spitzmueller C, Petersen NJ, Sawhney MK, Sittig DF. Information overload and missed test results in electronic health record-based settings. JAMA Intern Med. 2013;173:702–704.
37. Riddell CA, Kaufman JS, Hutcheon JA, Strumpf EC, Teunissen PW, Abenhaim HA. Effect of uterine rupture on a hospital’s future rate of vaginal birth after cesarean delivery. Obstet Gynecol. 2014;124:1175–1181.
38. Dan O, Hochner-Celnikier D, Solnica A, Loewenstein Y. Association of catastrophic neonatal outcomes with increased rate of subsequent cesarean deliveries. Obstet Gynecol. 2017;129:671–675.
39. Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review. BMC Med Inform Decis Mak. 2016;16:138.
40. Zwaan L, Thijs A, Wagner C, Timmermans DR. Does inappropriate selectivity in information use relate to diagnostic errors and patient harm? The diagnosis of patients with dyspnea. Soc Sci Med. 2013;91:32–38.
41. Yee LM, Liu LY, Grobman WA. Relationship between obstetricians’ cognitive and affective traits and delivery outcomes among women with a prior cesarean. Am J Obstet Gynecol. 2015;213:413.e1–413.e7.
42. Moulton CA, Regehr G, Lingard L, Merritt C, MacRae H. Slowing down to stay out of trouble in the operating room: Remaining attentive in automaticity. Acad Med. 2010;85:1571–1577.
43. Blumenthal-Barby JS, Krieger H. Cognitive biases and heuristics in medical decision making: A critical review using a systematic search strategy. Med Decis Making. 2015;35:539–557.
44. Mamede S, van Gog T, van den Berge K, et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA. 2010;304:1198–1203.
45. Krynski TR, Tenenbaum JB. The role of causality in judgment under uncertainty. J Exp Psychol Gen. 2007;136:430–450.
46. Edgell SE, Harbison JI, Neace WP, Nahinsky ID, Lajoie AS. What is learned from experience in a probabilistic environment? J Behav Decis Mak. 2004;17:213–229.
47. Sherbino J, Kulasegaram K, Howey E, Norman G. Ineffectiveness of cognitive forcing strategies to reduce biases in diagnostic reasoning: A controlled trial. CJEM. 2014;16:34–40.
48. Sherbino J, Yip S, Dore KL, Siu E, Norman GR. The effectiveness of cognitive forcing strategies to decrease diagnostic error: An exploratory study. Teach Learn Med. 2011;23:78–84.
49. Ruedinger E, Mathews B, Olson APJ. Decision–diagnosis: An introduction to diagnostic error and medical decision-making. MedEdPORTAL. 2016;12:10378. https://doi.org/10.15766/mep_2374-8265.10378. Accessed November 8, 2018.
50. Hunzeker A, Amin R. Teaching cognitive bias in a hurry: Single-session workshop approach for psychiatry residents and students. MedEdPORTAL. 2016;12:10451. https://doi.org/10.15766/mep_2374-8265.10451. Accessed November 8, 2018.
51. Hess BJ, Lipner RS, Thompson V, Holmboe ES, Graber ML. Blink or think: Can further reflection improve initial diagnostic impressions? Acad Med. 2015;90:112–118.
52. Monteiro SD, Sherbino J, Patel A, Mazzetti I, Norman GR, Howey E. Reflecting on diagnostic errors: Taking a second look is not enough. J Gen Intern Med. 2015;30:1270–1274.
53. Schmidt HG, Mamede S, van den Berge K, van Gog T, van Saase JL, Rikers RM. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med. 2014;89:285–291.
54. McLaughlin K, Eva KW, Norman GR. Reexamining our bias against heuristics. Adv Health Sci Educ Theory Pract. 2014;19:457–464.
55. Schulz K. Being Wrong: Adventures in the Margin of Error. 2010. New York, NY: Ecco/HarperCollins.
56. Dhaliwal G. Premature closure? Not so fast. BMJ Qual Saf. 2017;26:87–89.
57. Mamede S, Schmidt HG. Reflection in medical diagnosis: A literature review. Health Prof Educ. 2017;3:15–25.
58. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: Origins of bias and theory of debiasing. BMJ Qual Saf. 2013;22(suppl 2):ii58–ii64.
59. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: Impediments to and strategies for change [published online ahead of print August 30, 2013]. BMJ Qual Saf. doi:10.1136/bmjqs-2012-001713
60. Neher JO, Gordon KC, Meyer B, Stevens N. A five-step “microskills” model of clinical teaching. J Am Board Fam Pract. 1992;5:419–424.
61. Trowbridge RL Jr, Rencic JJ, Durning SJ. Teaching Clinical Reasoning. 2015. Philadelphia, PA: American College of Physicians.
62. McDonald KM, Matesic B, Contopoulos-Ioannidis DG, et al. Patient safety strategies targeted at diagnostic errors: A systematic review. Ann Intern Med. 2013;158(5 pt 2):381–389.
63. Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: A longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf. 2013;22:1044–1050.
64. Graber ML, Trowbridge R, Myers JS, Umscheid CA, Strull W, Kanter MH. The next organizational challenge: Finding and addressing diagnostic error. Jt Comm J Qual Patient Saf. 2014;40:102–110.
65. Umscheid CA, Williams K, Brennan PJ. Hospital-based comparative effectiveness centers: Translating research into practice to improve the quality, safety and value of patient care. J Gen Intern Med. 2010;25:1352–1355.
66. Ogdie AR, Reilly JB, Pang WG, et al. Seen through their eyes: Residents’ reflections on the cognitive and contextual components of diagnostic errors in medicine. Acad Med. 2012;87:1361–1367.
67. Krupat E, Richards JB, Sullivan AM, Fleenor TJ Jr, Schwartzstein RM. Assessing the effectiveness of case-based collaborative learning via randomized controlled trial. Acad Med. 2016;91:723–729.
68. Schwartzstein RM, Roberts DH. Saying goodbye to lectures in medical school—Paradigm shift or passing fad? N Engl J Med. 2017;377:605–607.
69. Hayes MM, Chatterjee S, Schwartzstein RM. Critical thinking in critical care: Five strategies to improve teaching and learning in the intensive care unit. Ann Am Thorac Soc. 2017;14:569–575.
70. Commonwealth of Massachusetts Board of Registration in Medicine, Quality and Safety Division. Advisory: Diagnostic process in inpatient and emergency department settings. http://www.mass.gov/eohhs/docs/borim/cde-advisory.pdf. Published March 2016. Accessed October 5, 2018.
71. Kunina-Habenicht O, Hautz WE, Knigge M, Spies C, Ahlers O. Assessing clinical reasoning (ASCLIRE): Instrument development and validation. Adv Health Sci Educ Theory Pract. 2015;20:1205–1224.
72. Singh H. Editorial: Helping health care organizations to define diagnostic errors as missed opportunities in diagnosis. Jt Comm J Qual Patient Saf. 2014;40:99–101.
73. Mull N, Reilly JB, Myers JS. An elderly woman with “heart failure”: Cognitive biases and diagnostic error. Cleve Clin J Med. 2015;82:745–753.
74. Hatch S. Snowball in a Blizzard: A Physician’s Notes on Uncertainty in Medicine. 2016. New York, NY: Basic Books.
75. Simpkin AL, Schwartzstein RM. Tolerating uncertainty—The next medical revolution? N Engl J Med. 2016;375:1713–1715.
Copyright © 2018 by the Association of American Medical Colleges