Academic Medicine: September 2009 - Volume 84 - Issue 9
doi: 10.1097/ACM.0b013e3181b185c6
Letters to the Editor

How Should the ACGME Core Competencies Be Measured?

Lurie, Stephen MD, PhD; Mooney, Christopher MA; Lyness, Jeffrey MD


In Reply:

We respectfully disagree with Dr. Bell that the ACGME core competencies could have a significantly more meaningful role in formative than in summative evaluation. Our review found that, irrespective of rating method or venue, faculty are unable to reliably differentiate the six core competencies in their assessment of learners. Thus, attempts to provide feedback will continue to fail to reflect the core competencies, regardless of whether the assessment is intended to be formative or summative. While it might be argued that formative assessment should be more personalized and less psychometrically rigorous than summative assessment, our review found no evidence that faculty can reliably differentiate the core competencies under any circumstances.

Dr. Bell suggests that a more appropriate assessment would involve the construct of patient-centeredness, as demonstrated by the quality of a resident’s consideration of patients’ larger psychosocial context. While we wholeheartedly agree with this idea in principle, we are not certain how it could be measured reliably. It is likely that this higher-order skill would prove just as elusive and frustrating to assess as the current ACGME core competencies.

As we noted in our article, educational theory is valuable for setting a common educational context and vocabulary. But to lead to useful measurement tools, any educational theory must accommodate itself to pervasive cognitive biases in the way that human beings assess one another, as well as to the inherent statistical uncertainty in measurement. Thus, rather than continuing to debate the most appropriate abstract “competence,” we recommend that current assessment tools be rigorously examined for precisely what information they are, in fact, able to provide about learners. This process could lead to an evidence-based model of professional competence. While such a model might not seem as theoretically elegant as one based purely on abstract theory, it would have the advantage of leading to directly interpretable outcome measures that could be meaningfully compared across time, learners, and educational settings.

Stephen Lurie, MD, PhD

Director of assessment, Office of Curriculum and Assessment, University of Rochester School of Medicine and Dentistry, Rochester, New York; (stephen_lurie@urmc.rochester.edu).

Christopher Mooney, MA

Information analyst, Office of Curriculum and Assessment, University of Rochester School of Medicine and Dentistry, Rochester, New York.

Jeffrey Lyness, MD

Director of curriculum, Office of Curriculum and Assessment, University of Rochester School of Medicine and Dentistry, Rochester, New York.

© 2009 Association of American Medical Colleges
