Reflecting Upon Reflection in Diagnostic Reasoning

Ilgen, Jonathan S., MD, MCR; Bowen, Judith L., MD; Eva, Kevin W., PhD

doi: 10.1097/ACM.0000000000000415
Letters to the Editor

Disclosures: None reported.

To the Editor:

We are writing regarding the commentary by Croskerry et al1 in the February 2014 issue. The commentary synthesizes important issues confronting investigators interested in the many factors that affect diagnostic reasoning. However, we were surprised by the assertion that the paper-based cases used by Norman et al2 were “so detached from clinical practice as to markedly reduce … validity to the point of making any conclusions extremely tenuous,”1 particularly in view of Croskerry and colleagues’ advocacy of cognitive biases as a principal source of diagnostic error.

Theories of cognitive bias stem almost entirely from Tversky and Kahneman’s3 paper-based studies with undergraduate psychology students, which were devoid of clinical context. Evidence of cognitive bias in medicine rests largely on retrospective reviews of adverse events4 and small studies using paper cases.5 A naturalistic setting may add the appearance of ecological validity, as Croskerry et al suggest, but countless variables (limited sampling of context-dependent skills, the common lack of absolute certainty regarding the correct diagnosis, and the empirically established fact that multiple reasoning processes, both analytic and nonanalytic, are active whenever a judgment is made6) confound the ability to infer reasoning from observed behavior in such settings.7 Paper-based cases are not without drawbacks, but they offer excellent psychometric properties, yield learning outcomes similar to those of simulated patient-based cases,8 and correlate with performance in practice.7

Our surprise was heightened given that Croskerry et al provided positive commentary on a study by Schmidt et al5 that also used paper-based cases. The difference appears to be that the study by Schmidt et al demonstrated an influence of the availability heuristic. We are less convinced, however, by Croskerry and colleagues’ interpretation that the “deliberate analytical intervention” of reflection is a robust mechanism for optimizing diagnostic performance. Schmidt et al demonstrated a benefit of reflection on a subset of cases in which availability bias had been induced through a deliberate nonanalytic intervention. That does not invalidate the argument that the reasoning processes that create biases exist because they generally offer a useful path toward diagnostic success: not the only path, but a useful one. Deliberate analytic interventions might help in some cases but may prove detrimental in others.

We recently compared diagnostic accuracy between participants encouraged to use either first impressions or reflection.9 Our nearly 400 clinician participants (students, residents, and faculty) completed a computer-based assessment using cases drawn from the same collection of paper-based cases used by Schmidt and colleagues.5 Before giving their answers, participants were instructed either to “trust [their] sense of familiarity” or to engage in structured reflection.9 Those in the reflection condition spent nearly three times longer solving the clinical cases (evidence that they followed directions), yet accuracy was identical between the two groups. Consistent with Norman et al,2 we found no evidence that reflection increased accuracy on either straightforward or complex cases.

This is not to say that we should give up on reflection, as this exercise may improve learning among novice clinicians. It is just one path to success, however, and it remains unclear how experienced clinicians could be expected to consciously identify “certain situations where we can reasonably and comfortably trust our intuitions, and others where it would be ill advised to use anything other than analytical reasoning.”1

Jonathan S. Ilgen, MD, MCR

Assistant professor, Division of Emergency Medicine, University of Washington, School of Medicine, Seattle, Washington; ilgen@u.washington.edu.

Judith L. Bowen, MD

Professor, Department of Medicine, Oregon Health & Science University, School of Medicine, Portland, Oregon.

Kevin W. Eva, PhD

Professor and director of education research and scholarship, Department of Medicine, and senior scientist, Centre for Health Education Scholarship, University of British Columbia, Vancouver, British Columbia, Canada.

References

1. Croskerry P, Petrie DA, Reilly JB, Tait G. Deciding about fast and slow decisions. Acad Med. 2014;89:197–200
2. Norman G, Sherbino J, Dore K, et al. The etiology of diagnostic errors: A controlled trial of system 1 versus system 2 reasoning. Acad Med. 2014;89:277–284
3. Tversky A, Kahneman D. Judgment under uncertainty: Heuristics and biases. Science. 1974;185:1124–1131
4. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med. 2005;165:1493–1499
5. Schmidt HG, Mamede S, van den Berge K, van Gog T, van Saase JL, Rikers RM. Exposure to media information about a disease can cause doctors to misdiagnose similar-looking clinical cases. Acad Med. 2014;89:285–291
6. Jacoby LL. A process dissociation framework: Separating automatic from intentional uses of memory. J Mem Lang. 1991;30:513–541
7. Ilgen JS, Humbert AJ, Kuhn G, et al. Assessing diagnostic reasoning: A consensus statement summarizing theory, practice, and future needs. Acad Emerg Med. 2012;19:1454–1461
8. La Rochelle JS, Durning SJ, Pangaro LN, Artino AR, van der Vleuten CP, Schuwirth L. Authenticity of instruction and student performance: A prospective randomised trial. Med Educ. 2011;45:807–817
9. Ilgen JS, Bowen JL, McIntyre LA, et al. Comparing diagnostic performance and the utility of clinical vignette-based assessment under testing conditions designed to encourage either automatic or analytic thought. Acad Med. 2013;88:1545–1551
© 2014 by the Association of American Medical Colleges