The accuracy of medical diagnosis is often degraded when diagnosis deviates from the principles of normative decision making. Although diagnostic errors can potentially be reduced by metacognitive training, that proposal (put forth by Croskerry in an accompanying article) needs validation and extensive exploration.
Dr. Graber is chief, Medical Service at the VA Medical Center, Northport, New York, and professor and vice chair, Department of Medicine, Stony Brook University Health Sciences Center, School of Medicine, Stony Brook, New York.
Correspondence and requests for reprints should be sent to Dr. Graber, Department of Medicine, Stony Brook University Health Sciences Center, Stony Brook, NY 11794-8430; e-mail: 〈email@example.com〉.
This article and the following one are responses to the preceding article by Croskerry.
In his article in this issue of Academic Medicine, Croskerry is correct to emphasize how frequently medical diagnosis departs from the pristine boundaries of normative decision making.1 Many, or perhaps even most, medical diagnoses arise instead from Reason's “flesh and blood” processes that rely heavily on subconscious framing of the problem, extensive use of simplifications, and diagnostic assignments based on heuristic thinking with its inherent biases. Croskerry's recent compilation of these “cognitive dispositions to respond”2 (see List 1 of his article in this issue for an abbreviated version) is an excellent starting place to examine the many ways in which the process of arriving at a diagnosis deviates from normative techniques.
Croskerry goes on to propose that diagnostic accuracy could be improved if we could “debias” clinicians through metacognitive training. My colleagues and I wholeheartedly agree,3 but we wonder whether this can be accomplished. Croskerry cites examples of successful debiasing experiments and holds up the existence of medical experts as proof that debiasing can succeed. There are a few examples of training exercises that seem to succeed in debiasing learners. However, these have typically been classroom demonstrations, with testing done in close temporal proximity to the actual training. It is an unproven assumption that success in improving the accuracy of decision making in a laboratory setting, or in a field outside medicine, guarantees the same results in the sometimes unique world of medical decision making. Finally, according to current paradigms, the “expert” becomes so through an overwhelming mastery of content-specific knowledge, not through training in the art of metacognition.
We agree with Croskerry that, in theory, metacognitive training should improve diagnostic accuracy, but the general pessimism regarding this approach exists for good reason. Metacognitive training cannot be presented as a proven success story—this technique needs to be validated through research and application. Can we train physicians to employ metacognition during the decision-making process? Can we actually show that diagnostic accuracy improves? Which techniques work best? During which stage of medical education should these skills be taught? Is there a downside if we teach clinicians to doubt their first impressions?
The challenge for those of us who believe in the potential of metacognitive training to improve diagnostic accuracy is to begin addressing these questions and to validate this approach in the real world of day-to-day medical care.
1. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78:775–780.
2. Croskerry P. Achieving quality in clinical decision making: cognitive strategies and detection of bias. Acad Emerg Med. 2002;9:1184–1204.
3. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what's the goal? Acad Med. 2002;77:981–992.