The Importance of Cognitive Errors in Diagnosis and Strategies to Minimize Them

Pat Croskerry, MD, PhD

The recent article by Graber et al.1 provides a comprehensive overview of diagnostic errors in medicine. There is, indeed, a long overdue and pressing need to focus on this area. They raise many important points, several of which deserve extra emphasis in the light of recent developments. They also provide an important conceptual framework within which strategies may be developed to minimize errors in this critical aspect of patient safety. Diagnostic errors are associated with a proportionately higher morbidity than is the case with other types of medical errors.2–4

The no-fault and system-related categories of diagnostic errors described1 certainly have the potential for reduction. In fact, very simple changes to the system could result in a significant reduction in these errors. However, the greatest challenge, as they note, is the minimization of cognitive errors, and specifically the biases and failed heuristics that underlie them. Historically, an unduly pessimistic mood has prevailed toward tackling cognitive bias and finding ways to minimize or eliminate it.

The cognitive revolution in psychology that took place over the last 30 years gave rise to an extensive empirical literature on cognitive bias in decision making, but this advance has been ponderously slow to enter medicine. Decision-making theorists in medicine have clung to normative, often robotic, models of clinical decision making that have little practical application in the real world. What is needed, instead, is a systematic analysis of what Reason5 has called “flesh and blood” decision making: the real decision making that occurs at the front line, when resources are in short supply, when time constraints apply, and when shortcuts are being sought. When we look closely at exactly what cognitive activity occurs as these clinical decisions are being made, we may be struck by how far it is removed from what normative theory describes. Although it seems certain we would be less likely to fail patients diagnostically if we followed rational, normative models of decision making, and although such models deserve “a prominent place in Plato's heaven of ideas,”6 they are impractical at the sharp end of patient care. Cognitive diagnostic failure is inevitable when the exigencies of the clinical workplace do not allow such Olympian cerebral approaches.

Medical decision makers and educators have to do three things: (1) appreciate the full impact of diagnostic errors in medicine and the contribution of cognitive errors in particular; (2) refute the inevitability of cognitive diagnostic errors; and (3) dismiss the pessimism that surrounds approaches for lessening cognitive bias.

For the first, the specialties in which diagnostic uncertainty is most evident and in which delayed or missed diagnoses are most likely are internal, family, and emergency medicine; this is borne out in findings from the benchmark studies of medical error.2–4 However, all specialties are vulnerable to this particular adverse event. The often impalpable nature of diagnostic error perhaps explains why it does not appear in lists of serious reportable events.7 For the second, there needs to be greater understanding of the origins of the widespread inertia that prevails against reducing or eliminating cognitive errors. This inertia may exist because such errors appear to be so predictable, so widespread in all walks of life, so firmly entrenched, and, therefore, probably hardwired. Although the evolutionary imperatives that spawned them may have served us well in earlier times, it now seems we are left with cognitively vestigial approaches to the complex decision making required of us in the modern world. Although “cognitive firewalls” may have evolved to quarantine or avoid cognitive errors, they are clearly imperfect8 and will require ontogenetic assistance (i.e., cognitive debiasing) to avoid their consequences. Accepting this, we should say less about biases and failed heuristics and more about cognitive dispositions to respond (CDRs) to particular situations in various predictable ways. Removing the stigma of bias clears the way toward accepting the capricious nature of decision making, and perhaps goes some way toward exculpating clinicians when their diagnoses fail.

An understanding of why clinicians have particular CDRs in particular clinical situations will throw considerable light on cognitive diagnostic errors. The unmasking of cognitive errors in the diagnostic process then allows for the development of debiasing techniques. This should be the ultimate goal, and it is not unrealistic.

Certainly, a number of clear strategies exist for reducing the memory limitations and excessive cognitive loading1 that can lead to diagnostic errors, but the most important strategy may well lie in familiarizing clinicians with the various types of CDRs and how they might be avoided. A recent extensive trawl of the medical and psychological literature revealed at least 30 CDRs,9 and there are probably more (List 1). This catalogue gives some idea of the extent to which cognitive bias influences decision making and provides a working language to describe it. The failure of decision support systems to improve clinical diagnosis, noted by Graber et al.,1 should come as no surprise: it is likely due to insufficient awareness of the influence of these CDRs, which is often subtle and covert.10 There appears to have been an historic failure to fully appreciate, and therefore capture, where the most significant diagnostic failures are coming from.

List 1. Cognitive Dispositions to Respond (CDRs) That May Lead to Diagnostic Error*

Not surprisingly, all CDRs are evident in emergency medicine, a discipline that has been described as a “natural laboratory of error.”11 In this milieu, decision making is often naked and raw, its flaws highly visible. Nowhere in medicine is rationality more bounded by relatively poor access to information and by limited time to process it, all within an environment renowned for its error-producing conditions.12 It is where heuristics dominate; without them, emergency departments would inexorably grind to a halt.13 Best of all, for those who would like to study real decision making, it is where heuristics can be seen to fail, sometimes catastrophically. Approximately half of all litigation brought against emergency physicians arises from delayed or missed diagnoses.14

If we accept the pervasiveness and predictability of the CDRs that underlie diagnostic cognitive error, then we are obliged to search for effective debiasing techniques. Despite the prevailing pessimism, it has been demonstrated that, using a variety of strategies15,16 (Table 1), CDRs can be overcome for a number of specific biases.16–23 It appears that there are, indeed, cognitive pills for cognitive ills,22 which makes intuitive sense. This is fortunate, for otherwise, how would we learn to avoid pitfalls, develop expertise, and acquire clinical acumen, particularly if the predisposition for certain cognitive errors is hardwired? However, medical educators should be aware that if the pills are not sufficiently sugared, they may not be swallowed.

Table 1. Cognitive Debiasing Strategies to Reduce Diagnostic Error*

Yates et al.24 have summarized some of the major impediments that have stood in the way of developing effective cognitive debiasing strategies; these impediments are not insurmountable. The first step is to overcome the bias against overcoming bias. Metacognition will likely be the mainstay of this approach. A recent cognitive debiasing technique using cognitive forcing strategies is based on metacognitive principles10 and appears to be teachable to medical undergraduates and postgraduates.25 Essentially, the strategy requires first that the learner be aware of the various cognitive pitfalls, and second that specific forcing strategies be developed to counter them; the sketch below illustrates this two-part structure.
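To make the two-part structure concrete, here is a minimal sketch, assuming a simple pairing of pitfalls with countering questions. The four CDR names are well-known examples from the broader catalogue;9 the checklist itself, the wording of the questions, and the function name are hypothetical teaching aids, not the published forcing strategies.

```python
# Purely illustrative sketch: step 1 of the strategy is awareness of named
# pitfalls (CDRs); step 2 pairs each pitfall with a question that forces the
# clinician to counter it. The pairings below are assumptions for
# illustration, not the forcing strategies published in reference 10.

FORCING_CHECKLIST = {
    "anchoring": "Which findings do NOT fit my first impression?",
    "premature closure": "What else could this be before I stop thinking?",
    "availability": "Am I favoring this diagnosis because I saw it recently?",
    "search satisficing": "Could a second, coexisting problem be present?",
}

def run_forcing_checklist(working_diagnosis: str) -> None:
    """Walk a learner through each pitfall (awareness) and its forcing question."""
    print(f"Working diagnosis: {working_diagnosis}")
    for pitfall, question in FORCING_CHECKLIST.items():
        print(f"  {pitfall}: {question}")

if __name__ == "__main__":
    run_forcing_checklist("musculoskeletal chest pain")
```

The only point of the sketch is that awareness of the pitfalls and the forcing step that counters them are distinct, teachable components; a real curriculum would draw both from the published catalogue and strategies.9,10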

Much of clinical decision making, as Reason5 notes, is where “the cognitive reality departs from the formalized ideal.” This cognitive reality is extremely vulnerable to error. The problem is that cognitive error is high-hanging fruit and difficult to get at, and there will be a tendency to pursue more readily attainable goals. There is a story about a jogger who came across a man on his knees under a streetlight one evening. He explained that he had dropped his wedding ring. The jogger offered to help him search, and he accepted. With no luck after a half hour, the jogger asked the man if he was sure he had dropped the ring at the place where they were searching. The man replied that he actually dropped it several yards away in the shadows. “Then why are we looking here?” asked the jogger. “Because the light is better,” came the reply.

Real solutions to cognitive diagnostic errors lie in the shadows, and they will be difficult to find. One very clear goal in reducing diagnostic errors in medicine is to first describe, analyze, and research CDRs in the context of medical decision making, and to then find effective ways of cognitively debiasing ourselves and those whom we teach. Not only should we be able to reduce many cognitive diagnostic errors, but we may also be pleasantly surprised to find how many can be eliminated.

REFERENCES

1. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what's the goal? Acad Med. 2002;77:981–92.
2. Brennan TA, Leape LL, Laird NM, et al. Incidence of adverse events and negligence in hospitalized patients: results of the Harvard Medical Practice Study I. N Engl J Med. 1991;324:370–6.
3. Wilson RM, Runciman WB, Gibberd RW, et al. The Quality in Australian Health Care Study. Med J Aust. 1995;163:458–71.
4. Thomas EJ, Studdert DM, Burstin HR, et al. Incidence and types of adverse events and negligent care in Utah and Colorado. Med Care. 2000;38:261–71.
5. Reason J. Human Error. New York: Cambridge University Press, 1990.
6. Simon HA. Alternative visions of rationality. In: Arkes HR, Hammond KR (eds). Judgment and Decision Making: An Interdisciplinary Reader. New York: Cambridge University Press, 1986: 97–113.
7. Serious reportable events in patient safety: A National Quality Forum consensus report. Washington, D.C.: National Quality Forum, 2002.
8. Cosmides L, Tooby J. Consider the source: the evolution of adaptations for decoupling and metarepresentation. In: Sperber D (ed). Metarepresentation. Vancouver Studies in Cognitive Science. New York: Oxford University Press, 2001.
9. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med. 2002;9:1184–204.
10. Croskerry P. Cognitive forcing strategies in clinical decision making. Ann Emerg Med. 2003;41:110–20.
11. Bogner MS (ed). Human Error in Medicine. New Jersey: Lawrence Erlbaum Associates, 1994.
12. Croskerry P, Wears RL. Safety errors in emergency medicine. In: Markovchick VJ, Pons PT (eds). Emergency Medicine Secrets, 3rd ed. Philadelphia: Hanley and Belfus, 2002: 29–37.
13. Kovacs G, Croskerry P. Clinical decision making: an emergency medicine perspective. Acad Emerg Med. 1999;6:947–52.
14. Data from the U.S. General Accounting Office, the Ohio Hospital Association, and the St. Paul (MN) Insurance Company, 1998 〈http://hookman.com/mp9807.htm〉. Accessed April 24, 2003.
15. Fischhoff B. Debiasing. In: Kahneman D, Slovic P, Tversky A (eds). Judgment under Uncertainty: Heuristics and Biases. New York: Cambridge University Press, 1982: 422–44.
16. Arkes HR. Impediments to accurate clinical judgment and possible ways to minimize their impact. In: Arkes HR, Hammond KR (eds). Judgment and Decision Making: An Interdisciplinary Reader. New York: Cambridge University Press, 1986: 582–92.
17. Nathanson S, Brockner J, Brenner D, et al. Toward the reduction of entrapment. J Appl Soc Psychol. 1982;12:193–208.
18. Schwartz WB, Gorry GA, Kassirer JP, Essig A. Decision analysis and clinical judgment. Am J Med. 1973;55:459–72.
19. Slovic P, Fischhoff B. On the psychology of experimental surprises. J Exp Psychol Hum Percept Perform. 1977;3:544–51.
20. Edwards W, von Winterfeldt D. On cognitive illusions and their implications. In: Arkes HR, Hammond KR (eds). Judgment and Decision Making: An Interdisciplinary Reader. New York: Cambridge University Press, 1986: 642–79.
21. Wolf FM, Gruppen LD, Billi JE. Use of a competing-hypothesis heuristic to reduce pseudodiagnosticity. J Med Educ. 1988;63:548–54.
22. Keren G. Cognitive aids and debiasing methods: can cognitive pills cure cognitive ills? In: Caverni JP, Fabre JM, Gonzales M (eds). Cognitive Biases. New York: Elsevier, 1990: 523–52.
23. Plous S. The Psychology of Judgment and Decision Making. Philadelphia: Temple University Press, 1993.
24. Yates JF, Veinott ES, Patalano AL. Hard decisions, bad decisions: on decision quality and decision aiding. In: Schneider S, Shanteau J (eds). Emerging Perspectives in Judgment and Decision Making. New York: Cambridge University Press, 2003.
25. Croskerry P. Cognitive forcing strategies in emergency medicine. Emerg Med J. 2002;19(suppl 1):A9.
26. Croskerry P. The feedback sanction. Acad Emerg Med. 2000;7:1232–8.
27. Hogarth RM. Judgment and Choice: The Psychology of Decision. Chichester, England: Wiley, 1980.
    © 2003 Association of American Medical Colleges