Academic Medicine: August 2003 - Volume 78 - Issue 8
Articles

Cognitive Underpinnings of Diagnostic Error

Gordon, Ruthanna PhD; Franklin, Nancy PhD

Author Information

Dr. Gordon is a graduate student, and Dr. Franklin is associate professor, Department of Psychology, State University of New York at Stony Brook, Stony Brook, New York.

Correspondence and requests for reprints should be sent to Dr. Gordon or Dr. Franklin, Department of Psychology, State University of New York, Stony Brook, NY 11794-2500. Dr. Gordon's e-mail is 〈rrgordon@ic.sunysb.edu〉; Dr. Franklin's is 〈nfranklin@notes.cc.sunysb.edu〉.

This article and the preceding one are responses to the article by Croskerry in this issue.

Abstract

Unfortunately, the general limits of cognitive performance extend to diagnostic situations. The authors remain optimistic about reducing cognition-based error, but not as optimistic as Croskerry (see his accompanying article), since the reality is that predictable patterns of error will persist.

As we learn more about how physicians arrive at diagnoses, intuition suggests that we will make steady progress toward eliminating diagnostic errors. Sadly, this is not the case. Even among experts, some kinds of bias and some kinds of error will never be eliminated. Our goal in last year's Academic Medicine article1 was to address the inevitability that human decision makers will make predictable kinds of errors, while also arguing that there is still much room for optimism about the rate at which they occur.

We agree with Croskerry2 that metacognitive training offers the greatest hope for reducing cognitive error in individual practitioners. People can be led, for example, to use their knowledge and recognize analogies more effectively than they had previously.3 Warnings and metacognitive monitoring have also shown some (albeit limited) success in reducing heuristic-based errors.4 Continuing education, electronic databases and decision aids, and situational simulation are valuable means of improving accuracy, as is the implementation of system-level solutions, a point we make in our earlier article.1

However, decisional heuristics and the pitfalls that come with them are not going away, no matter how expert, intelligent, and vigilant we are. They come with being human, and they normally increase accuracy and reduce demands on working memory. Yet they are difficult to penetrate cognitively and thus difficult to monitor or disregard (and disregarding them can lead to errors as well). Because of this, metacognitive training provides no guarantee of success. Other principles governing our behavior also challenge our rational thinking. For example, people sometimes devalue information that contradicts their current point of view,5 or change their view (sometimes to the polar opposite decision) in response to minor changes in framing.6 Finally, errors in perception and memory present very different kinds of challenges for correction, and many lie beyond conscious awareness.

Thus, although we, like Croskerry,2 are optimistic about improving diagnostic accuracy, we stress the need to be realistic. Croskerry's point2 seems to be largely that the goal should be an ambitiously optimistic one. Indeed, only by believing that cognition-based errors are highly reducible can medicine create a culture in which the proper effort is made to train and monitor the cognitive act of diagnosis. However, the literature teaches us that the outcomes we can realistically expect fall short of what should remain our ambitious goals. We encourage our colleagues to pay more attention to the cognitive underpinnings of diagnostic error and to trust that diagnostic errors of all types can be reduced. We encourage researchers to continue empirically testing cognitive training procedures in real-time medical contexts, with the hope that medical decision making will become a standard part of medical curricula.7 Significant improvement, even if it falls short of perfection, is a very worthy goal.

REFERENCES

1. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what's the goal? Acad Med. 2002;77:35–46.

2. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78:775–780.

3. Gick ML, Holyoak KJ. Analogical problem solving. Cogn Psychol. 1980;12:306–55.

4. Lichtenstein S, Fischhoff B. Training for calibration. Organ Behav Hum Perform. 1980;26:149–71.

5. Kunda Z. The case for motivated reasoning. Psychol Bull. 1990;108:480–98.

6. Kahneman D, Tversky A. On the study of statistical intuitions. In: Kahneman D, Slovic P, Tversky A (eds). Judgment Under Uncertainty: Heuristics and Biases. Cambridge: Cambridge University Press, 1982:493–508.

7. Hall KH. Reviewing intuitive decision-making and uncertainty: the implications for medical education. Med Educ. 2002;36:216–24.

© 2003 Association of American Medical Colleges
