Diagnostic Reasoning

A Universal Model of Diagnostic Reasoning

Croskerry, Pat MD, PhD

Academic Medicine. 2009;84(8):1022–1028. DOI: 10.1097/ACM.0b013e3181ace703

Abstract

Diagnostic reasoning is the most critical of a physician's skills. As Nuland1 notes, “It is every doctor's measure of his own abilities; it is the most important ingredient in his professional self-image.” Yet the rate at which doctors fail in this critical aspect of clinical performance is surprisingly high. Autopsy findings have consistently shown a 20% to 40% discrepancy with the antemortem diagnosis,2,3 and a third of these autopsies would not have taken place if the true diagnosis had been known.2 Despite improved technology and an improved evidence base in medicine, the misdiagnosis rate detected through autopsy studies has not changed significantly during the last century.4 The contribution of diagnostic error to patient morbidity and mortality is significant, but strategies for reducing it do not come easily to hand. The development of clinical decision support tools such as DXplain,5 ILIAD,6 Quick Medical Reference,7 ISABEL,8 and many others over the last five decades reflects the effort to augment and improve the diagnostic performance of clinicians.

Improving diagnostic reasoning would seem to be an important goal for the safety of patients; however, a major impediment has been the variety of approaches that have been taken toward understanding the clinical reasoning that underlies the diagnostic process. These cluster into two main groups (see List 1), following the historical division into intuitive or analytical approaches toward thinking, reasoning, and deciding.9,10 The various approaches that have been taken toward decision making have two implicit purposes: first, to explain the ways in which we think and, second, to generate a practical approach to decision making that has important clinical utility.

List 1 Comparison of Intuitive and Analytical Approaches to Decision Making

The intuitive approach leans heavily on the experience of the decision maker and, therefore, uses reasoning that depends on inductive logic. Experienced decision makers recognize overall patterns (gestalt effects) in the information presented and act accordingly—action is recognition primed.11 The experience of the decision maker will determine how well the information presented is interpreted as the decision maker seeks to make sense of the overall gestalt. Typically, such decisions are made under uncertainty; they employ heuristics or mental shortcuts,12 and they may be made quickly using thin-slice sampling (i.e., relying on instinctive first impressions).13 As we rarely have all of the information necessary to make an informed decision, such “rational” decisions have bounds or limitations,14 but we do the best we can under the circumstances. In recent years, the intuitive approach has also come to incorporate elements of evolutionary psychology—the view that some of our thinking is driven by cognitive modules that are hardwired in the Darwinian sense.15 Also, there is accumulating interest in the role of preattentive, or preconscious, mental processes—the view that perceptual analysis can effortlessly occur without deliberate intention or awareness and lead to judgment and action.16,17

The analytical approach, in contrast, takes place under more ideal conditions, where there are fewer boundaries and greater availability of resources, resulting in less uncertainty; decisions made under these circumstances approach normative reasoning and rationality more closely. If all the relevant variables and the parameters of test performance are known, then one can use the Bayesian method to calculate a fairly exact probability of a particular disease. The analytic reasoning mode is classically Popperian, with hypothesis testing and deductive reasoning; it is analytical, involves critical thinking, and is logically sound. Arborization, or multiple branching, is an algorithmic approach using a series of unambiguous branching points and is particularly useful for delegated decision making.18 Essentially, it is analytic decision making by proxy, the branching points having been researched and refined by experts in the field. The exhaustion strategy involves first collecting all possible relevant data and then searching through the data for a diagnosis. It characterizes the approach of novices, but it may also be employed when diagnoses are rare and esoteric,18 as well as under conditions of sleep deprivation and fatigue.19 Robust decision making is more analytical than intuitive. It adopts a systematic approach to remove uncertainty within the resources available to make safe and effective decisions.20 Cognitive continuum theory proposes that there is not a dichotomy but a continuum between intuitive and analytical approaches.21
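The Bayesian step is the most readily formalized part of the analytical approach. A minimal sketch follows (in Python; the article prescribes no implementation, and the function name and the illustrative pretest probability, sensitivity, and specificity values are hypothetical): given a pretest probability and the parameters of test performance, Bayes' theorem yields the posttest probability of disease after a positive result.

```python
def posttest_probability(pretest: float, sensitivity: float, specificity: float) -> float:
    """Probability of disease given a positive test result:
    P(D|+) = P(+|D)P(D) / [P(+|D)P(D) + P(+|not D)P(not D)]
    """
    true_positive = sensitivity * pretest                # P(+|D) * P(D)
    false_positive = (1 - specificity) * (1 - pretest)   # P(+|not D) * P(not D)
    return true_positive / (true_positive + false_positive)

# Illustrative numbers only: 10% pretest probability, 90% sensitivity, 80% specificity.
print(round(posttest_probability(0.10, 0.90, 0.80), 3))  # 0.333
```

Note that a positive result from a reasonably good test raises a 10% pretest probability only to about 33%, which is why knowing the parameters of test performance matters so much to this mode of reasoning.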

These various approaches have their origins in the fields of mathematics, philosophy, and, predominantly, psychology, and medical decision-making strategies have generally borrowed from them. However, after decades of research activity in medical decision making, no unifying approach has emerged and, correspondingly, it has not been possible to teach a consistent approach toward medical decision making. I propose a model here that brings together recent developments in cognitive psychological theory, these varied approaches to medical decision theory, and the realities of clinical practice.

System 1 and System 2 Processes

Description

Two fundamental approaches to reasoning, intuitive and analytical, have been formally recognized during the last 20 years.9,10 This dichotomy has since gathered momentum in the psychology literature and is now widely recognized as dual process theory, or System 1 and System 2 reasoning.22,23 The main characteristics of the two systems are listed in Table 1. They have recently been further elaborated and clustered by Evans.24

Table 1:
Principal Characteristics of Type 1 and Type 2 Decision-Making Processes

System 1 is an intuitive approach that proves effective much of the time. Importantly, it is highly context-bound, with the potential for ambient conditions to exert a powerful influence.23 In forming their early diagnostic impressions, physicians may be consciously or subconsciously influenced by a variety of factors, including patient characteristics (appearance, demeanor, degree of discomfort, communication issues, past experience with the patient), characteristics of the illness (acuity, severity, past experience with the presenting complaints), immediate issues in the medical environment (other patients' needs, workload, priority setting, interruptions, distractions), resource issues (availability of specific tests, procedures, consultants, hospital beds), overarching issues (professional, ethical, medicolegal), and others. Morbidity and mortality rounds often do not take account of these contextual and ambient conditions when cases are being reviewed; regrettably, this is also a feature of medicolegal investigations.

System 1 is characterized by heuristics and other mental shortcuts. For example, most physicians would have little difficulty recognizing the characteristic distribution and appearance of herpes zoster, or the combination of signs and symptoms of an acute myocardial infarct. Many diagnostic decisions in medicine are based on this type of pattern recognition, which is strongly related to how fully manifested the disease is (i.e., how characteristic and pathognomonic the presentation is for a particular illness). The system is fast, frugal, requires little effort, and frequently gets the right answer. But occasionally it fails, sometimes catastrophically. Predictably, it misses the patient who presents atypically, or when the pattern is mistaken for something else. In a major study of acute coronary syndrome, for example, the diagnostic error rate increased 10-fold when patients presented without the cardinal symptom of chest pain.25

System 2, in contrast, is engaged when the patient's signs and symptoms are not readily recognized as belonging to a specific illness category, or do not follow a particular script. For a patient presenting with a global headache, for example, there are a variety of diagnostic possibilities: muscle tension headache, rebound headache, migraine, subarachnoid hemorrhage, meningitis, and others. The degree of pathognomonicity is low, and uncertainty is correspondingly high; the various possibilities must now be teased out from each other in a systematic search. A System 2 approach is required—it is analytical, slow, and resource intensive, but more likely to get the correct diagnosis than would System 1.

Origins

The two systems have important and differing origins. System 1 is a passive, reflexive set of systems that may be triggered by context, images, emotions, and older parts of the brain—modules that evolved to cope with specific survival needs of our ancient evolutionary past. It is capable of parallel processing, and it is responsive to more than one feature at a time.23 Most System 1 responses seem to be evolutionarily adaptive, but they may not be instrumentally rational in modern contexts, leading to potential mismatches between our cognitive capabilities and prevailing environmental circumstances.26 It is this inherent vulnerability of intuitive thinking, and the use of heuristics that goes with it,27 that account for much of the error in System 1. Although System 1 reflects the innate responsivity of the brain, repetitive processing by System 2 can eventually lead to a System 1 response.23 For example, the first time a medical student sees a shingles rash, it will not be meaningful, but with repeat presentations the formulation of the diagnosis will eventually become reflexive.

In contrast, System 2 is the logical, rational software of the brain and only processes one channel at a time.23 It requires conscious activation. It is a linear system that is built through learning—the nurture part of our reasoning faculties. It becomes increasingly competent as we mature, socialize, and go through formal education. It is refined by training in critical thinking and logical reasoning.28

Operating Characteristics of the Model

The unmodified process

The first step in the diagnostic reasoning process is the presentation of the patient's symptoms and signs to the decision maker (see Figure 1). This is usually through direct contact between physician and patient, but it may take a less proximate form in which the patient's signs, symptoms, and results of investigations are relayed to a physician through an intermediary, such as from a junior house physician to an attending staff member, or from a family physician to a consultant. In the learning context, virtual patients or written descriptions of a patient's illness may be used to teach about the diagnostic process. In my experience, some fidelity of information is often lost in these second-hand accounts because of the intrusion of the first observer's thinking and interpretation biases, as well as a loss of context and ambient influences. A similar concern holds for the ecological validity of research in medical decision making that is removed from real clinical practice.

Figure 1:
Model for diagnostic reasoning based on pattern recognition and dual-process theory. The model is linear, running from left to right. The initial presentation of illness is either recognized or not by the observer. If it is recognized, the parallel, fast, automatic processes of System 1 engage; if it is not recognized, the slower, analytical processes of System 2 engage instead. Determinants of System 1 and 2 processes are shown in dotted-line boxes. Repetitive processing in System 2 leads to recognition and default to System 1 processing. Either system may override the other. Both system outputs pass into a calibrator in which interaction may or may not occur to produce the final diagnosis.

If salient features of the presentation are initially recognized, System 1 processes engage immediately and automatically. Thus, recognized visual presentations of illness or injury (e.g., dermatological conditions, dislocations, fractures, stigmata of particular diseases such as alcoholism, endocrine disorders, cardiovascular disorders) or recognized combinations of salient symptoms or findings (syndromes, toxidromes, illness scripts, compiled experiences) will trigger pattern-recognition types of responses in System 1. Importantly, this process is reflexive and unconscious—no deliberate thinking effort is involved. These responses can only occur through prior System 2 learning; the more pathognomonic the presentation, the stronger the response to it. Studies of expert decision making strongly support the success of this pattern-recognition approach.29,30 This recognition-primed processing forms the basis of a variety of approaches to decision making, as described above,13,31,32 and broader views of the control of conscious processes.16,33 As noted, these various approaches fall into two general categories: the intuitive approaches shown in List 1 are predominantly based on System 1, whereas the analytical ones are based on System 2. Approaches such as robust decision making20 and cognitive continuum theory21 are a combination of the two systems. Fuzzy trace theory is also a dual-process-type model of reasoning but with an emphasis on memory and perception of risk.34 It distinguishes two forms of representation: verbatim and gist. The latter has many characteristics of System 1 and appears similarly vulnerable to inconsistencies in reasoning and irrational biases.

In addition to the pattern-recognition response, other System 1 responses may be generated simultaneously, in parallel to that response. For example, often the first responses that physicians have toward patients involve their (the physicians') feelings, and these may vary in both intensity and polarity. Physicians may have positive feelings toward some patients but negative feelings toward others,35,36 and often they may be unaware that these preconscious affective dispositions can play a significant role in decision making.16,37 Other System 1 responses can be triggered simultaneously with the pattern-recognition response, such as heuristics (mental shortcuts, rules of thumb), intuitions, and others. Some known determinants of System 1 are shown in the model.23 System 1 decision making has been popularized in the book Blink38 as the rapid cognitive style mentioned earlier called thin slicing,13 although this approach may, in certain circumstances, prove perilous in medicine.39,40

If the presentation is not recognized, or if it is unduly ambiguous or there is uncertainty, System 2 processes engage instead. Now the system is an analytic one, attempting to make sense of the presentation through objective and systematic examination of the data, and by applying accepted rules of reasoning and logic. It is a linear processing system, slower than System 1, more costly in terms of investment of resources, relatively free of affective influences, but considerably less prone to error. Some of the factors that influence System 2 reasoning are shown in the model.23 If there are no subsequent modifications of System 1 or System 2 processing, their individual or blended outputs determine the calibration of response and the eventual veracity of the diagnosis.
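The unmodified flow just described can be summarized schematically. The sketch below (Python; purely illustrative and not part of the published model, with recognize, analyze, and atypical_features as hypothetical stand-ins for the processes in Figure 1) captures the default routing between the two systems and the System 2 rational override discussed in the next section:

```python
from typing import Callable, Optional

def diagnose(presentation: dict,
             recognize: Callable[[dict], Optional[str]],
             analyze: Callable[[dict], str],
             atypical_features: Callable[[dict, str], bool]) -> str:
    """Illustrative control flow for the dual-process model (Figure 1).

    Simplification: the System 1 override of System 2, and the blending of
    the two outputs in the calibrator, are not modeled here.
    """
    impression = recognize(presentation)      # System 1: fast, automatic pattern recognition
    if impression is not None:
        if atypical_features(presentation, impression):
            return analyze(presentation)      # System 2 surveillance forces a rational override
        return impression                     # calibrated System 1 output
    return analyze(presentation)              # not recognized: slow, analytical System 2
```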

Modifiers of the process

The model has several mechanisms for modifying its output. First, System 1 and System 2 may interact with each other so that the final output is a synthesis of the two.23 For example, the patient's initial presentation might trigger a System 1 response in the decision maker that subsequently sets up a System 2 analytical approach. The monitoring capacity of System 2 over System 1 allows it to reject the latter by applying a rational override. Thus, while the first look at a rash might trigger a shingles diagnosis, if there are aberrant or atypical features (it crosses the midline or does not follow a dermatomal distribution), System 2 can override and force a reassessment. Importantly, inattentiveness, distraction, fatigue, and cognitive indolence may all diminish System 2 surveillance and allow System 1 more latitude than it deserves. With fatigue and sleep deprivation, for example, the diagnostic error rate can increase fivefold.41 The monitoring capacity of System 2 depends on deliberate mental effort and works best when the decision maker is well rested, well slept, free from distraction, and focused on the task at hand. Metacognition, the ability to step back and reflect on what is going on in a clinical situation, is essentially System 2 monitoring in action, and may prevent a critical miss. Quirk42 defines metacognition as the act of “thinking about one's own and another's thinking and feeling.” It forces a monitoring step similar to the one that occurs when clinical decision support tools are used.

Second, System 1 may override otherwise sound reasoning developed by System 2. Consider, for example, a physician who may have attended a teaching session on a clinical decision rule for determining the pretest probability of pulmonary embolism, or who has read about it in a journal or discussed it with a colleague and decided that this is the most rational and optimal approach to follow in this particular clinical situation. Such decision rules are based on aggregate data and developed objectively through System 2 reasoning and investigation in the cold light of day, as in the arborization, multiple-branching approach. However, when the physician is faced with a particular patient in a real-life situation, the physician may choose to override the decision rule and follow his or her intuitive feelings. Occasionally, there may be virtue in this; “situational appreciation” in some circumstances may be important for determining what is appropriate in a particular situation,43 but as an overall strategy it may prove irrational. These override decisions are not uncommon in medicine and may underlie, in part, the difficulties in acceptance and incorporation of clinical decision rules, referred to as knowledge uptake, transfer, or translation. Essentially, these inconsistencies, quirks, self-deceptions, and variances in individual decision making represent departures from a rational approach; they occur for historical, habitual, emotional, situational, and a variety of other reasons. Thus, even though well-developed clinical decision guidelines may be shown to consistently outperform the decision-making capabilities of the individual physician, there may still persist an irrational belief in some individuals that they know best and can always do better for the patient. This overconfidence is a major source of diagnostic error.44,45
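One widely used example of such a rule (the article does not name a specific one, so the choice of the Wells criteria here is an assumption, though the weights and thresholds shown are the traditional published ones) illustrates how a System 2 instrument compresses aggregate evidence into a bedside score:

```python
def wells_pe_score(signs_of_dvt: bool, pe_most_likely: bool, heart_rate_over_100: bool,
                   immobilization_or_recent_surgery: bool, prior_dvt_or_pe: bool,
                   hemoptysis: bool, malignancy: bool) -> float:
    """Wells criteria for the pretest probability of pulmonary embolism."""
    return (3.0 * signs_of_dvt + 3.0 * pe_most_likely + 1.5 * heart_rate_over_100
            + 1.5 * immobilization_or_recent_surgery + 1.5 * prior_dvt_or_pe
            + 1.0 * hemoptysis + 1.0 * malignancy)

def pretest_category(score: float) -> str:
    """Traditional three-tier interpretation of the total score."""
    if score > 6:
        return "high"
    if score >= 2:
        return "moderate"
    return "low"

# Example: tachycardic patient with hemoptysis and no better alternative diagnosis.
score = wells_pe_score(False, True, True, False, False, True, False)
print(score, pretest_category(score))  # 5.5 moderate
```

An intuitive override of such a rule substitutes a single clinician's impression for the aggregate data on which the weights were derived, which is precisely the trade described above.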

In presenting the model, I am not suggesting that all medical reasoning and decision making fall neatly into one or the other system. As has been noted, instead of a discrete separation of the two systems, a cognitive continuum with oscillation occurring between System 1 and 2 has been proposed, resulting in varying degrees of efficiency and accuracy in judgment.21 When conflict occurs between competing goals of the two systems, it might be prudent and safer for the patient if the clinician applied a System 2 override of System 1.46 In recent reviews, the oversimplification that has occurred in dual-process approaches, and the inherent difficulties in accommodating the multiple and heterogeneous attributes of Systems 1 and 2, have been noted. Evans24 suggests that it might be more appropriate to talk about type 1 and type 2 processes, and Stanovich46 has recently proposed a subdivision of System 2 into algorithmic and reflective levels. The algorithmic mode engages the intellectual abilities and cognitive strategies of the decision maker, whereas the higher-level reflective mode involves beliefs, overall goals, and general knowledge.

Models of cognitive reasoning are relatively slow works in progress. Although the model proposed here may appear to simplify the complex processes at work in reasoning, decision making, and judgment, it nevertheless provides a basic framework for medical decision making within a sound theoretical structure, incorporating the disparate and diverse approaches that have been observed historically.

Clinical Relevance Versus Scientific Rigor of the Two Systems

The dual-processing theory can be used to bridge the current division of approaches toward clinical reasoning and decision making. Orthodox medical decision theorists have historically emphasized a scientifically rigorous approach toward decision making that is typically based on statistical and mathematical models. An inherent and prevailing assumption, concerning what are predominantly System 2 approaches, is that they can be employed by well-rested, well-slept decision makers under ideal conditions in which there are no distractions or untoward intrusion of affect and all the required data are available. However, as Reason47 notes, the cognitive reality often departs from this formalized ideal.

In many medical settings, decision makers function under suboptimal conditions. They may be hurried, distracted, fatigued, sleep deprived, and limited by resource constraints. Shortcuts and heuristic reasoning may come into play under conditions of cognitive busyness, overload, noisy signals, fatigue, and resource limitations.48–51 In many medical settings, workload is dynamic, often varying unpredictably, and providers must select strategies to maintain throughput of patients. One obvious strategy is to attenuate workload by using heuristics and shortcuts that achieve speed and frugality of cognitive effort—what has been referred to as the cognitive miser assumption.52 However, these strategies, which characterize System 1 approaches, are known to be vulnerable to a variety of cognitive and affective biases.53 Further, the areas of the brain believed to be the neuroanatomical substrate for System 2 reasoning—the anterior cingulate cortex, prefrontal cortex, and medial aspect of the temporal lobe54—are the same areas that suffer neurocognitive compromise through sleep deprivation.55 Thus, the combined effects that occur under adverse working conditions (i.e., the increased use of heuristics, together with the functional compromise of System 2) result in decrements in clinical performance, especially those aspects of performance associated with decision making.

Finally, many clinical situations are characterized by too many variables or unknowns, too many ethical and financial restrictions, or too many other resource limitations to ever allow a simple quantitative approach to guide a particular clinical decision, and actuarial models simply cannot be applied in many clinical situations.56 This is the clinical reality that medical decision makers face daily.

Thirty years ago, this dilemma was presciently recognized by Elstein57 as a clinical-statistical polarization. The prevailing perception among medical decision makers at that time was that there was no scientific worth without quantification and statistics (i.e., the System 2 approach), a view that is little changed today: “The broader community of medical decision making researchers,” stated Hamm,58 “has not embraced the topic of heuristics and biases approach with sustained enthusiasm.” Failure of the theory of heuristic strategies (System 1 approach) has been attributed to its weak predictive power, its inability to describe the judgment process in sufficient detail or to explain individual differences, and its failure to assist physicians in improving their decision making.59 So, the paradox remains: an important feature of clinicians' decision making is apparently disqualified from study by those who research the field of medical decision making.56 This situation does little to alleviate the discomfort of physicians whose “dilemma,” stated Hammond, “lies in the rivalry between intuition and analysis. Intuition offers an immediate if risky judgment; analysis, though safer, takes longer—if it can be done at all.”21 Periodically, articles are published describing selected heuristics and biases in clinical reasoning with caveats about their pitfalls and failures,60–62 but the overall breadth of the problem is not addressed. There are over 50 known cognitive biases,53 many with evident influence in clinical decision making,63 and a variety of affective biases.53,64 To date, there has been little research on the role these biases play in real clinical decision making.65

It is important to resolve this issue for several reasons. First, clinical decision making is a critical aspect of clinical performance. In fact, it is difficult to imagine anything of greater importance or relevance to patient outcomes and to patient safety. Yet, it seems that insufficient emphasis is being placed on core aspects of decision making that are integral to clinical practice. Second, the failure to conduct clinical research in this area has led to a general pessimism about developing strategies to overcome the undesirable effects of heuristics and biases—that is, to develop cognitive and affective de-biasing strategies.66 Again, as Elstein57 observed, “The more it is insisted that a clinical situation cannot be analyzed in terms of risks and likelihoods, estimated however roughly, the more investigation in these terms is discouraged.” Third, clinicians in training and those already in practice need a comprehensive approach to clinical decision making that facilitates their understanding of this complex process and allows them to gain insight and understanding into their own decision making. For the safety of patients, the imperative to think critically, reason, decide, and diagnose well always remains.

References

1 Nuland SB. How We Die: Reflections on Life's Final Chapter. New York, NY: Alfred A. Knopf; 1994.
2 Gawande A. Final cut. In: Complications: A Surgeon's Notes on an Imperfect Science. New York, NY: Henry Holt and Company; 2002:197–198.
3 Graber M. Diagnostic errors in medicine: A case of neglect. Jt Comm J Qual Patient Saf. 2005;31:106–113.
4 Lundberg GD. Low-tech autopsies in the era of high-tech medicine: Continued value for quality assurance and patient safety. JAMA. 1998;280:1273–1274.
5 Barnett GO, Cimino JJ, Hupp JA, Hoffer EP. DXplain. An evolving diagnostic decision-support system. JAMA. 1987;258:67–74.
6 Warner HR Jr. Iliad: Moving medical decision-making into new frontiers. Methods Inf Med. 1989;28:370–372.
7 Miller RA, Masarie FE Jr. Use of the Quick Medical Reference (QMR) program as a tool for medical education. Methods Inf Med. 1989;28:340–345.
8 Ramnarayan P, Cronje N, Brown R, et al. Validation of a diagnostic reminder system in emergency medicine: A multi-centre study. Emerg Med J. 2007;24:619–624.
9 Bruner J. Actual Minds, Possible Worlds. Cambridge, Mass: Harvard University Press; 1987.
10 Hammond KR. Intuitive and analytic cognition: Information models. In: Sage A, ed. Concise Encyclopedia of Information Processing in Systems and Organizations. Oxford, UK: Pergamon Press; 1990:306–312.
11 Klein G. Sources of Power. Cambridge, Mass: MIT Press; 1999.
12 Kahneman D, Slovic P, Tversky A. Judgment Under Uncertainty: Heuristics and Biases. Cambridge, UK: Cambridge University Press; 1982.
13 Ambady N, Rosenthal R. Thin slices of behavior as predictors of interpersonal consequences: A meta-analysis. Psychol Bull. 1992;111:256–274.
14 Simon HA. Rational choice and the structure of the environment. Psychol Rev. 1956;63:129–138.
15 Cosmides L, Tooby J. Cognitive adaptations for social exchange. In: Barkow JH, Cosmides L, Tooby J, eds. The Adapted Mind: Evolutionary Psychology and the Generation of Culture. Oxford, UK: Oxford University Press; 1992:163–228.
16 Bargh JA, Chartrand TL. The unbearable automaticity of being. Am Psychol. 1999;54:462–479.
17 Dijksterhuis A, Bos MW, Nordgren LF, van Baaren RB. On making the right choice: The deliberation-without-attention effect. Science. 2006;311:1005–1007.
18 Sackett DL, Haynes RB, Guyatt GH, Tugwell P. Clinical Epidemiology: A Basic Science for Clinical Medicine. Boston, Mass: Little, Brown and Company; 1991.
19 Croskerry P. Shiftwork, fatigue, and safety in emergency medicine. In: Croskerry P, Cosby KS, Schenkel S, Wears R, eds. Patient Safety in Emergency Medicine. Philadelphia, Pa: Lippincott Williams & Wilkins; 2008.
20 Ullman DG. Making Robust Decisions: Decision Management for Technical, Business, and Service Teams. Victoria, BC: Trafford Publishing; 2006.
21 Hammond K. Human Judgment and Social Policy: Irreducible Uncertainty, Inevitable Error, Unavoidable Injustice. New York, NY: Oxford University Press; 2000.
22 Sloman S. The empirical case for two systems of reasoning. Psychol Bull. 1996;119:3–22.
23 Stanovich KE. The Robot's Rebellion: Finding Meaning in the Age of Darwin. Chicago, Ill: The University of Chicago Press; 2004.
24 Evans JS. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008;59:255–278.
25 Brieger D, Eagle KA, Goodman SG, et al. Acute coronary syndromes without chest pain, an underdiagnosed and undertreated high-risk group: Insights from the Global Registry of Acute Coronary Events. Chest. 2004;126:461–469.
26 Simpson T, Carruthers P, Laurence S, Stich S. Nativism past and present. In: Carruthers P, Laurence S, Stich S, eds. The Innate Mind. New York, NY: Oxford University Press; 2005.
27 Gilovich T, Griffin D, Kahneman D. Heuristics and Biases: The Psychology of Intuitive Judgment. New York, NY: Cambridge University Press; 2002.
28 Risen J, Gilovich T. Informal logical fallacies. In: Sternberg RJ, Roediger HL, Halpern DF, eds. Critical Thinking in Psychology. New York, NY: Cambridge University Press; 2007:110–130.
29 Regehr G, Norman GR. Issues in cognitive psychology: Implications for professional education. Acad Med. 1996;71:988–1001.
30 Schmidt HG, Norman GR, Boshuizen HP. A cognitive perspective on medical expertise: Theory and implication. Acad Med. 1990;65:611–621.
31 Klein GA, Orasanu J, Calderwood R, Zsambok CE. Decision Making in Action: Models and Methods. Norwood, NJ: Ablex Publishing; 1993.
32 Dijksterhuis AP, Nordgren LF. A theory of unconscious thought. Perspect Psychol Sci. 2006;1:95–109.
33 Wilson T. Strangers to Ourselves: Discovering the Adaptive Unconscious. Cambridge, Mass: Belknap Press of Harvard University Press; 2004.
34 Reyna VF. How people make decisions that involve risk: A dual-process approach. Curr Dir Psychol Sci. 2004;13:60–66.
35 Groves JE. Taking care of the hateful patient. N Engl J Med. 1978;298:883–887.
36 Zajonc RB. Feeling and thinking: Preferences need no inferences. Am Psychol. 1980;35:151–175.
37 Schwarz N. Feelings as information: Moods influence judgments and processing strategies. In: Gilovich T, Griffin D, Kahneman D, eds. Heuristics and Biases: The Psychology of Intuitive Judgment. New York, NY: Cambridge University Press; 2002:534–547.
38 Gladwell M. Blink: The Power of Thinking Without Thinking. New York, NY: Little, Brown and Company; 2005.
39 Croskerry P. Critical thinking and decision making: Avoiding the perils of thin-slicing. Ann Emerg Med. 2006;48:720–722.
40 Groopman J. How Doctors Think. New York, NY: Houghton Mifflin Company; 2007.
41 Landrigan CP, Rothschild JM, Cronin JW, et al. Effect of reducing interns' work hours on serious medical errors in intensive care units. N Engl J Med. 2004;351:1838–1848.
42 Quirk M. Intuition and Metacognition in Medical Education: Keys to Developing Expertise. New York, NY: Springer Publishing Company; 2006.
43 Wiggins D. Deliberation and practical reasoning. In: Essays on Aristotle's Ethics. Berkeley, Calif: University of California Press; 1980:221–240.
44 Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med. 2007;121:S2–S23.
45 Croskerry P, Norman GR. Overconfidence in clinical decision making. Am J Med. 2007;121:S24–S29.
46 Stanovich KE. Distinguishing the reflective, algorithmic, and autonomous minds: Is it time for a tri-process theory? In: Evans J, Frankish K, eds. In Two Minds: Dual Processes and Beyond. Oxford, UK: Oxford University Press; 2009:55–88.
47 Reason J. Human Error. New York, NY: Cambridge University Press; 1990.
48 Gilbert DT, Pelham BW, Krull DS. On cognitive busyness: When person perceivers meet person receivers. J Pers Soc Psychol. 1988;54:733–740.
49 Gilbert DT, McNulty SE, Giuliano TA, Benson JE. Blurry words and fuzzy deeds: The attribution of obscure behavior. J Pers Soc Psychol. 1992;62:18–25.
50 Gigerenzer G, Czerlinski J, Martignon L. How good are fast and frugal heuristics? In: Gilovich T, Griffin D, Kahneman D, eds. Heuristics and Biases: The Psychology of Intuitive Judgment. New York, NY: Cambridge University Press; 2002:559–581.
51 Nicholson S. Choice context and decision-making: An application to voter fatigue. Available at: http://www.allacademic.com/meta/p266581_index.html. Accessed April 20, 2009.
52 Krueger J, Funder DC. Towards a balanced social psychology: Causes, consequences and cures for the problem-seeking approach to social cognition and behavior. Behav Brain Sci. 2004;27:313–376.
53 Baron J. Thinking and Deciding. 3rd ed. New York, NY: Cambridge University Press; 2000.
54 Lieberman MD, Jarcho JM, Satpute AB. Evidence-based and intuition-based self-knowledge: An fMRI study. J Pers Soc Psychol. 2004;87:421–435.
55 Durmer JS, Dinges DF. Neurocognitive consequences of sleep deprivation. Semin Neurol. 2005;25:117–129.
56 Croskerry P. The theory and practice of clinical decision-making. Can J Anesth. 2005;52(suppl 1):R1–R8.
57 Elstein AS. Clinical judgment: Psychological research and medical practice. Science. 1976;194:696–700.
58 Hamm RM. Theory about heuristic strategies based on verbal protocol analysis: The emperor needs a shave. Med Decis Making. 2004;24:681–686.
59 Poses RM, Cebul RD, Wigton RS. You can lead a horse to water—Improving physicians' knowledge of probabilities may not affect their decisions. Med Decis Making. 1995;15:65–75.
60 Bordage G. Why did I miss the diagnosis? Some cognitive explanations and educational implications. Acad Med. 1999;74(10 suppl):S138–S143.
61 Bornstein BH, Emler AC. Rationality in medical decision making: A review of the literature on doctors' decision-making biases. J Eval Clin Pract. 2001;7:97–107.
62 Klein JG. Five pitfalls in decisions about diagnosis and prescribing. BMJ. 2005;330:781–783.
63 Croskerry P. Achieving quality in clinical decision making: Cognitive strategies and detection of bias. Acad Emerg Med. 2002;9:1184–1204.
64 Croskerry P. Diagnostic failure: A cognitive and affective approach. In: Advances in Patient Safety: From Research to Implementation. Rockville, Md: Agency for Healthcare Research and Quality; 2006.
65 Croskerry P, Abbass A, Wu AW. How doctors feel: Affective issues in patients' safety. Lancet. 2008;372:1205–1206.
66 Croskerry P. The importance of cognitive errors in diagnosis and strategies to prevent them. Acad Med. 2003;78:1–6.
© 2009 Association of American Medical Colleges