Academic Medicine, February 2014, Volume 89, Issue 2
doi: 10.1097/ACM.0000000000000121
Commentaries

Deciding About Fast and Slow Decisions

Croskerry, Pat MD, PhD; Petrie, David A. MD; Reilly, James B. MD, MS; Tait, Gordon PhD

Author Information

Dr. Croskerry is professor and director, Critical Thinking Program, Division of Medical Education, Faculty of Medicine, Dalhousie University, Halifax, Nova Scotia, Canada.

Dr. Petrie is professor of emergency medicine and professor, Department of Emergency Medicine, Faculty of Medicine, Dalhousie University, and chief, Capital District Health Authority Department of Emergency Medicine, Halifax, Nova Scotia, Canada.

Dr. Reilly is associate director, Internal Medicine Residency, Allegheny General Hospital, Western Pennsylvania Hospital Educational Consortium, Pittsburgh, Pennsylvania, and assistant professor of medicine, Temple University School of Medicine, Philadelphia, Pennsylvania.

Dr. Tait is assistant professor, Departments of Surgery and Anesthesia, and staff scientist, Department of Anesthesia, Toronto General Hospital, University Health Network, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada.

Editor’s Note: This is a commentary on Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: A controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89:277–284; and on Schmidt HG, Mamede S, van den Berge K, van Gog T, van Saase JLCM, Rikers RMJP. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med. 2014;89:285–291.

Funding/Support: None reported.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Correspondence should be addressed to Dr. Croskerry, Critical Thinking Program, Division of Medical Education, Faculty of Medicine, Dalhousie University, 5849 University Ave., PO Box 15000, Halifax, Nova Scotia, Canada B3H 4R2; telephone: (902) 494-4147; fax: (902) 494-2278; e-mail: croskerry@eastlink.ca.

Abstract

Two reports in this issue address the important topic of clinical decision making. Dual process theory has emerged as the dominant model for understanding the complex processes that underlie human decision making. This theory distinguishes between the reflexive, autonomous processes that characterize intuitive decision making and the deliberate reasoning of an analytical approach. In this commentary, the authors address the polarization of viewpoints that has developed around the relative merits of the two systems. Although intuitive processes are typically fast and analytical processes slow, speed alone does not distinguish them. In any event, the majority of decisions in clinical medicine are not dependent on very short response times. What does appear relevant to diagnostic ease and accuracy is the degree to which the symptoms of the disease being diagnosed are characteristic ones.

There are also concerns about some methodological issues related to research design in this area of enquiry. Reductionist approaches that attempt to isolate independent variables may create such artificial experimental conditions that both external and ecological validity are sacrificed. Clinical decision making is a complex process with many independent (and interdependent) variables that cannot simply be separated out in a discrete fashion and studied apart from the real-time conditions of practice without compromising the fidelity of clinical practice. With these caveats in mind, the authors believe that research in this area should promote a better understanding of clinical practice and teaching by focusing less on the deficiencies of intuitive and analytical systems and more on their adaptive strengths.

In medical education, an important focus has developed in recent years on clinical decision making and especially the thought processes that underlie it. This is of considerably more than passing academic interest: how doctors think and how they make decisions have major implications for patient safety. Failures in the two most important aspects of clinical decision making, diagnosis and therapeutic management, may have devastating consequences for patients.

Two Processes of Decision Making

Fortunately, over the last few decades social scientists have made significant progress in understanding how we think. The dominant theory to have emerged divides decision making into two broad types of processes: intuitive (fast, reflexive, and requiring minimal cognitive resources) and analytical (slow, deliberate, and demanding more conscious effort); the characteristics of the two systems are now well described. Intuitive processes are largely based on pattern recognition, allowing us to save considerable time and effort by matching already-known patterns to particular decisions and actions. A very important point, not always appreciated, is that no reasoning per se occurs in the intuitive mode; the brain simply reacts to the perceived pattern. Given that we spend the vast majority of our time in the intuitive mode, and given the consensus in the psychology literature that the majority of faults in decision making occur there,1,2 it seems important to understand when and how to defer to the analytical mode. It has become clear that there are certain situations in which we can reasonably and comfortably trust our intuitions, and others in which it would be ill advised to use anything other than analytical reasoning.

A polarization of viewpoints about these two modes has occurred, with some arguing for the power of intuition3–5 and others cautioning against its perils1,2,6; however, this is not a useful dichotomy. It is simplistic to say, overall, that intuition is superior to analytical reasoning or vice versa. Human decision making involves both processes, and different situations require different approaches. For example, decisions that need to be made in a split second, those that depend on social and emotional intelligence, those that call for inspiration and creativity, and those that spontaneously drive a well-learned sequence of behavior may be effectively made in the intuitive mode. In contrast, those that have more precise requirements and no room for error, such as the staging of an aggressive cancer, can only be made analytically. The same should hold for any medical diagnosis that carries a significant outcome.

Less Is More?

In their classic work on diagnostic reasoning, Elstein et al7 repeatedly stressed the importance of a thorough and comprehensive workup to avoid premature diagnoses, and Zachary Cope, the eminent surgeon, said: “Spot diagnosis may be magnificent, but it is not sound diagnosis. It is impressive but unsafe.”8 But others have argued for less effort, not more. Gigerenzer’s5 approach to the management of patients with potential myocardial infarction argues for a simple decision tree that depends on whether or not a patient has chest pain. Certainly, such simple rules will capture the majority of patients suffering an acute coronary syndrome (ACS). However, a majority is not an acceptable benchmark in clinical medicine. In a series of over 20,000 cases, the absence of chest pain at presentation was shown to increase the chances of missing an ACS 10-fold.9 Those who favor less-effort strategies might want to ask themselves which approach they would prefer if they were experiencing an atypical ACS. The prevailing lesson in clinical medicine is that simple and quick approaches are often correct but do not always meet an acceptable standard.
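To make the trade-off concrete, below is a minimal sketch of a single-cue, fast-and-frugal rule of the kind Gigerenzer describes, applied to a handful of hypothetical presentations. It is our own illustration: the function name, fields, cases, and labels are invented and are not Gigerenzer’s actual rule or data from the cited registry.

    # Illustrative sketch only: a one-cue triage rule and a toy case series.
    # None of these values come from the commentary or from any validated rule.

    def fast_frugal_triage(presentation: dict) -> str:
        """Classify on a single cue: is chest pain present?"""
        return "work up for ACS" if presentation["chest_pain"] else "low priority"

    cases = [
        {"id": 1, "chest_pain": True,  "true_dx": "ACS"},
        {"id": 2, "chest_pain": True,  "true_dx": "musculoskeletal"},
        {"id": 3, "chest_pain": False, "true_dx": "ACS"},  # atypical ACS
        {"id": 4, "chest_pain": False, "true_dx": "reflux"},
    ]

    missed = [c["id"] for c in cases
              if c["true_dx"] == "ACS" and fast_frugal_triage(c) == "low priority"]
    print("ACS cases missed by the single-cue rule:", missed)  # -> [3]

The rule is fast and captures the typical presentations, but every atypical ACS in the toy series falls through, which is exactly why a majority is not an acceptable benchmark.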

Two reports in this issue of Academic Medicine investigate decision making using different approaches. In the first, by Norman et al,10 in which inexperienced physicians were presented with written clinical cases on computer screens in a laboratory setting, the authors found that simply encouraging slowing down and increasing attention to analytical thinking was insufficient to increase diagnostic accuracy. Whether the authors intended it or not, the conclusion most readers will draw from this result, and from that of a similar study that preceded it,11 is that diagnostic decisions made faster are more accurate. A further assumption is that if decisions are made quickly, they are likely made in the intuitive mode, and therefore making decisions intuitively is a good thing. Other than to support these assumptions, it is not clear why experimental efforts would be made to investigate putative relationships between diagnostic accuracy and speed, as speed is generally not an issue in medicine. In the other study in this issue that focuses on decision making, by Schmidt et al,12 a classic cognitive bias, availability, was investigated in senior internal medicine residents. Cognitive bias, in contrast, is a very important issue in medicine.

Equating Speed With Intuitive Reasoning

It is widely accepted that responses from the intuitive system are fast and analytic responses slower. The time required for an intuitive response varies, but some diagnoses (e.g., herpes zoster) can be made in a matter of milliseconds. However, it does not follow that fast responses are necessarily intuitive. If an emergency physician is shown an electrocardiogram of a patient with clear ST segment elevation in the inferior leads, and the patient is experiencing chest discomfort, it may reasonably be concluded that the patient is having an acute inferior infarct. Thus, if the clinician has sufficient knowledge and the clinical information given is adequate, then the diagnosis can be made in seconds but is not necessarily intuitive—some reasoning is involved to make the connection between the particular leads on the ECG, the area of the heart that might be damaged, and the chest symptoms.

The same holds for a variety of diagnoses in medicine that may present pathognomonically (i.e., the symptoms and signs are highly characteristic of the disease): herpes zoster, Colles fracture, otitis media, cerebrovascular accident, and many others. In fact, all diagnoses in medicine could be arranged along a continuum of “manifestness,” from pathognomonic at one end to vague and ill defined at the other. Although pathognomonic diagnoses are not initially intuitive, they will become so with repeated exposure. In Norman and colleagues’10 study, even if we allow a generous 10 seconds for a pattern recognition (intuitive) response to be made, the response times reported (in the 40- to 60-second range) leave more than enough time to make an analytic response. In the example given in that report’s Appendix, most physicians would very quickly make the diagnosis of carbon monoxide poisoning, simply because the blood level is toxic and fits the patient’s presentation. Further assessment of the patient might reveal other significant comorbidities, but, at least in a computer-based exercise such as this, simply seeing a high carbon monoxide blood value immediately makes the diagnosis; well-packaged problems facilitate good decision making. Thus, fast responses are not necessarily intuitive; fast analytic decisions may be made when all the necessary information is available. In an experimental study of board game experts,13 for example, participants were able to conduct an (analytic) conscious search within eight seconds.

A fundamental rule of human performance is that speed of response trades off against accuracy, known in psychology as the speed–accuracy trade-off (SATO). Mostly, speed of diagnosis is not critical in medicine except in certain circumstances (e.g., trauma cases, a rapidly deteriorating patient) or when therapeutic interventions are tightly coupled to the diagnosis. Under those conditions, the reliability of decision making will depend on the decision maker’s having an adequate medical knowledge base and a signal that is fairly distinct from the noise (see Figure 1). However, where there is uncertainty and overlap of signal and noise, diagnostic failure becomes more likely, as in the case of a patient presenting with headache, for which there are hundreds of causes, or chest pain, which has at least 25 different causes. Many presentations in primary and emergency care involve low signal-to-noise ratios along with atypical presentations and incomplete histories, and may also contain inaccurate or misleading information.
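As a rough numerical illustration of the signal-and-noise point (our own toy model, not data from the studies under discussion), the sketch below treats the disease and its benign mimics as two overlapping normal distributions and computes how often a decision made at the midpoint threshold will be wrong; the means and the equal-variance assumption are ours.

    import math

    def misclassification_rate(signal_mean: float, noise_mean: float, sd: float = 1.0) -> float:
        """Error probability when classifying at the midpoint between two
        equally likely, equal-variance normal distributions (toy signal detection)."""
        threshold = (signal_mean + noise_mean) / 2.0
        z = (threshold - noise_mean) / sd
        # P(a noise case falls above the threshold); by symmetry this also
        # equals the probability that a signal case falls below it.
        return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

    # Well-separated signal, as in a pathognomonic presentation (condition A in Figure 1)
    print(f"high separation: {misclassification_rate(3.0, 0.0):.1%} errors")  # ~6.7%
    # Heavily overlapping signal and noise (conditions B and C in Figure 1)
    print(f"heavy overlap:   {misclassification_rate(0.5, 0.0):.1%} errors")  # ~40.1%

In this toy model the error floor is set entirely by the overlap between the distributions, which is the sense in which low signal-to-noise presentations make diagnostic failure more likely whatever the speed of the decision.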

Figure 1

Except in pathognomonic presentations (condition A in Figure 1), it is fairly uncommon in clinical medicine to have all of the relevant information required to make a diagnosis available at the outset. However, that was what was provided to the residents in both the current study by Norman et al10 and the study that preceded it.11 The correlation observed between diagnostic accuracy and speed likely represents variance in knowledge across residents, given that they were all relatively inexperienced. Residents who knew more and had greater comfort with the material presented were likely to be sufficiently confident to respond more quickly. Thus, the observed relationship between speed and accuracy could simply reflect that trainees who are not sure of the answer are more likely to take longer and are also more likely to get the incorrect answer. Admonishments to slow down and reflect will not be effective in improving accuracy when the trainee’s medical knowledge is inadequate to answer the question. Had the cases presented in these studies been more typical of the lower signal-to-noise ratios more common in clinical practice (conditions B and C in Figure 1), it seems unlikely that any speed–accuracy correlation would hold. Clinically prudent physicians are more than aware of SATO and would consider a fast response in such cases dangerous and ill advised. This feature of the study is critical for its interpretation and, we believe, may render it fundamentally flawed. Three of the four of us who wrote this commentary are senior physicians, and none of us would encourage residents under our supervision to make speedy diagnoses, even in cases that appeared straightforward. In many respects, this makes a discussion of speed and accuracy of diagnosis in clinical medicine redundant.

External and Ecological Validity

Even if the conclusions drawn from Norman and colleagues’10 findings were valid, another major difficulty concerns the methodology. Clinical decision making is a highly complex process with multiple independent variables known to be influential: context, ambient conditions, patient characteristics, disease characteristics, and characteristics of the decision maker (gender, personality, age, intellectual ability, knowledge, experience, mood, well-being, degree of fatigue, sleep deprivation, and others). While reductionism remains the classic scientific approach to minimizing or eliminating the influence of independent variables, the conditions under which decision making was examined in that study10 and the previous one11 are so detached from clinical practice that external and ecological validity are markedly reduced and any conclusions become extremely tenuous.

The major difficulty with such reductionist approaches to clinical decision making is that, in separating out independent variables, one is removing the very environment that characterizes the process being studied. For example, studying the characteristics of cholera toxin in the laboratory reveals very little about a cholera epidemic, which is influenced by independent variables not usually found in a Petri dish (population demographics, group immunity, crowding, local water supply, sanitation, and climate).14 Failure to consider the role of relevant independent variables leads to a loss in both external validity and ecological validity. In contrast, the methodology of the study by Schmidt et al12 is acceptable in this regard. With a single discrete manipulation, they demonstrate powerful effects of a cognitive bias (intuitively and unconsciously triggered) on residents’ clinical decision making. Furthermore, they were also able to demonstrate cognitive debiasing by encouraging reflection, a deliberate analytical intervention. Slowing down and being reflective have been demonstrated in other studies by Mamede and her colleagues15 to improve diagnostic accuracy and other outcomes. Importantly, in teaching the strategies for optimal clinical decision making, we not only need to encourage slowing down but also to explain why it is necessary—that is, what is involved in the analytic process. Finally, given the changes that occur in decision making with experience, any generalizability of observations on novice physicians under artificial conditions10,11 to real decision making in clinical practice is likely limited.

An Issue of Balance?

We began this commentary by making a general point about patient safety and how important it is to understand the process of clinical decision making. Given the polarization of viewpoints that has developed around that process, some have portrayed dual-process theory as an either/or dilemma: One type of process leads to error all the time; the other leads to accuracy all the time. Such framing drives the debate towards the merits of fast versus slow thinking rather than towards how to optimize patient safety. Therefore, as we explain in Table 1, the imperative should be to minimize the practice and teaching that appear in the error-prone approach column and to maximize the practice and teaching that appear in the error-reduction approach column. That is, we should focus on which approach is better rather than on which process is worse. Despite the concerns we have expressed about Norman and colleagues’10 study and its predecessor,11 we strongly encourage further studies in the important field of clinical decision making. They should investigate the operating characteristics of the decision maker under both natural and experimental conditions, as well as the influence of the many contextual variables in clinical medicine that we have mentioned. However, the clinical applicability of findings remains paramount.

Table 1

References

1. Myers DG. Intuition: Its Powers and Perils. New Haven, Conn: Yale University Press; 2002.

2. Kahneman D. Thinking, Fast and Slow. Toronto, Ontario, Canada: Doubleday; 2011.

3. Klein G. The Power of Intuition. New York, NY: Doubleday; 2004.

4. Gladwell M. Blink: The Power of Thinking Without Thinking. New York, NY: Little, Brown and Company; 2005.

5. Gigerenzer G. Gut Feelings: The Intelligence of the Unconscious. New York, NY: Viking Penguin; 2007.

6. Novella S. Your Deceptive Mind: A Scientific Guide to Critical Thinking Skills. Chantilly, Va: The Great Courses; 2012.

7. Elstein AS, Shulman LS, Sprafka SA. Medical Problem Solving: An Analysis of Clinical Reasoning. Cambridge, Mass: Harvard University Press; 1978.

8. Silen W. Cope’s Early Diagnosis of the Acute Abdomen. 15th ed. New York, NY: Oxford University Press; 1979.

9. Brieger D, Eagle KA, Goodman SG, et al. Acute coronary syndromes without chest pain, an underdiagnosed and undertreated high-risk group: Insights from the Global Registry of Acute Coronary Events. Chest. 2004;126:461–469.

10. Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: A controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89:277–284.

11. Sherbino J, Dore KL, Wood TJ, et al. The relationship between response time and diagnostic accuracy. Acad Med. 2012;87:785–791.

12. Schmidt HG, Mamede S, van den Berge K, van Gog T, van Saase JLCM, Rikers RMJP. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med. 2014;89:285–291.

13. Wan X, Nakatani H, Ueno K, Asamizuya T, Cheng K, Tanaka K. The neural basis of intuitive best next-move generation in board game experts. Science. 2011;331:341–346.

14. Fang FC, Casadevall A. Reductionistic and holistic science. Infect Immun. 2011;79:1401–1404.

15. Mamede S, Splinter TA, van Gog T, Rikers RM, Schmidt HG. Exploring the role of salient distracting clinical features in the emergence of diagnostic errors and the mechanisms through which reflection counteracts mistakes. BMJ Qual Saf. 2012;21:295–300.

© 2014 by the Association of American Medical Colleges
