Realizing the Promise of Competency-Based Medical Education

Holmboe, Eric S. MD

doi: 10.1097/ACM.0000000000000515
Commentaries

Competency-based medical education (CBME) places a premium on both educational and clinical outcomes. The Milestones component of the Next Accreditation System represents a fundamental change in medical education in the United States and is part of the drive to realize the full promise of CBME. The Milestones framework provides a descriptive blueprint in each specialty to guide curriculum development and assessment practices.

From the beginning of the Outcomes project in 1999, the Accreditation Council for Graduate Medical Education and the larger medical education community recognized the importance of improving their approach to assessment. Work-based assessments, which rely heavily on the observations and judgments of clinical faculty, are central to a competency-based approach. The direct observation of learners and the provision of robust feedback have always been recognized as critical components of medical education, but CBME systems further elevate their importance. Without effective and frequent direct observation, coaching, and feedback, the full potential of CBME and the Milestones cannot be achieved. Furthermore, simply using the Milestones as end-of-rotation evaluations to “check the box” to meet requirements undermines the intent of an outcomes-based accreditation system.

In this Commentary, the author explores these challenges, addressing the concerns raised by Williams and colleagues in their Commentary. Meeting the assessment challenges of the Milestones will require a renewed commitment from institutions to meet the profession’s “special obligations” to patients and learners. All stakeholders in graduate medical education must commit to a professional system of self-regulation to prepare highly competent physicians to fulfill this social contract.

Dr. Holmboe is senior vice president, Milestone Development and Evaluation, Accreditation Council for Graduate Medical Education, Chicago, Illinois.

Editor’s Note: This is a Commentary on Williams RG, Dunnington GL, Mellinger JD, Klamen DL. Placing constraints on the use of the ACGME Milestones: a commentary on the limitations of global performance ratings. Acad Med. 2015;90:404–407.

Funding/Support: None reported.

Other disclosures: Dr. Holmboe receives royalties from Mosby-Elsevier for a textbook on assessment and serves on the boards of Medbiquitous and the National Board of Medical Examiners.

Ethical approval: Reported as not applicable.

Correspondence should be addressed to Dr. Holmboe, Accreditation Council for Graduate Medical Education, 515 N. State St., Chicago, IL 60611; telephone: (312) 755-5087; e-mail: eholmboe@acgme.org.

In their Commentary, Williams and colleagues1 sound a timely alarm about important assessment issues related to the Milestones component of the Next Accreditation System (NAS). First, they argue that without sufficient direct observation by faculty, the full potential of the Milestones cannot be realized. Second, they posit that using the reporting Milestones as typical end-of-rotation evaluations to simply “check the box” to meet accreditation requirements also will undermine the goals and purpose of the Milestones. These issues deserve our attention. In this Commentary, I explore these assessment challenges in further detail.

The Need for Direct Observation and Feedback

Evidently it is not deemed necessary to assay students’ and residents’ clinical performance once they have entered the clinical years. Nor do clinical instructors more than occasionally show how they themselves elicit and check the reliability of the clinical data. To a degree that is often at variance with their own professed scientific standards, attending staff all too often accept and use as the basis for discussion, if not recommendations, findings reported by students and residents without ever evaluating the reporter’s mastery of the clinical methods utilized or the reliability of the data obtained.2

The above passage, emphatically highlighting the need for direct observation, was written by the late, great George Engel in 1976 as part of an editorial response to an article demonstrating significant deficiencies in the clinical skills of students and residents.2,3 Engel emphasized that the absence, or insufficiency, of direct observation and feedback, which are essential to professional development, leads to deficient outcomes. Furthermore, as he realized then and as recent research has clearly demonstrated, self-assessment without observation and feedback is ineffective.4,5 In addition, Anders Ericsson’s6,7 work on the importance of deliberate practice and coaching in multiple fields is applicable to medical education, especially in a competency-based model. Deliberate practice involves repetitively working on well-defined tasks under the watchful eye of a coach who provides informative feedback to help learners continuously improve and refine their competence. Recently, Boud and Molloy8 argued that effective feedback truly occurs only when the recipient has actually attempted the action. Thus, direct observation and mentored deliberate practice are, and always have been, central to the development of expertise in medical education.

The Reality of Direct Observation and Feedback in Practice

Despite sound educational theory and robust empirical evidence about the need for longitudinal observation, coaching, and feedback, the quality and quantity of direct observation have been persistently insufficient across the medical education continuum.9 For example, during my time at the American Board of Internal Medicine, when we asked programs how often formal observation occurred (e.g., with a mini-Clinical Evaluation Exercise or another tool), they reported that almost a third of all postgraduate year 2 and 3 residents were never formally observed over an entire academic year. Disciplines such as surgery and anesthesia can take advantage of the operating room to perform ongoing direct observation,10 but they may not observe other important competencies, such as history taking and communication, in nonoperative settings. The persistent lack of direct observation continues to undermine the quality of the educational experience and reduces the probability that graduates are truly prepared to enter unsupervised practice.

Williams and colleagues rightly highlight the challenges of performing direct observation, citing the common and seemingly intractable problems of faculty time and assessment skills. Others have argued that efforts such as the Milestones are unfunded mandates.11 The unfunded mandate rationale, however, is exceedingly difficult to justify when one considers the public investment in graduate medical education (GME)—$15 to $21 billion a year from Medicare, Medicaid, and other sources—in conjunction with the strong empirical evidence base supporting the role of direct observation, feedback, and coaching in professional development.12,13

The Milestones also incorporate patient input into the assessment process, usually captured through patient experience surveys (i.e., patients’ observations of the learner). More important, patients are entitled to appropriate levels of supervision of physicians-in-training to ensure high-quality care and safety in the learning environment. We too often ignore the fact that patients sit squarely in the middle of the medical education system, and faculty can adjust levels of supervision, and delegate authority for additional components of care, only through robust assessment of key clinical competencies in authentic care situations. Effective assessment requires direct observation rather than the usual proxies: morning report presentations, case presentations on “rounds” or in classrooms, sign-outs, and other techniques removed from the bedside. As Kogan and colleagues14 recently pointed out, we must view rater cognition (skill) as both an educational and a clinical care issue. Patients must be the subject, not the object, of assessment.

The availability of time for faculty to conduct direct observation is mostly a system issue that rests squarely with the institution, which must assume responsibility for the portion of the social contract between the profession and the public related to training physicians and other health care professionals to meet 21st-century health care needs. Recently, Grover and colleagues11 argued that changes should not be made to the indirect medical education (IME) payments received by training institutions from Medicare. Their primary argument revolved around the importance of the research and clinical care missions of academic health centers. To that, I would add strong support for the importance of the education mission, given its critical and direct relationship with the clinical care provided today and into the future. Regardless of whether changes are made to current IME funding, more support should be directed to clinical teaching and assessment to ensure high-quality care and education.

Overcoming Obstacles to Direct Observation and Feedback

As Asch and colleagues15 demonstrated in their landmark study of residency training in obstetrics, the quality of care a resident experiences and delivers is very likely to be the quality of care that she or he will deliver as a practicing physician. Ensuring that patients receive safe and high-quality care in conjunction with learners receiving safe and high-quality training is an ethical and moral responsibility of training institutions.16 Similarly, faculty must provide appropriate levels of supervision and mentorship, but those involved in clinical education, including members of the clinical competency committee, must be given adequate time and support to meet these responsibilities and engage in ongoing professional development around clinical care, teaching, and assessment.17

Clinician–educator faculty constitute the essential core of a competency-based medical education (CBME) system. One of the most important skills in the clinician–educator’s toolbox is direct observation, which itself requires ongoing practice and feedback. This year marks the 50th anniversary of the modern Hippocratic Oath; it is time for CBME, and the robust assessment needed to realize its full potential, to serve as a catalyst to embrace our social contract and rediscover our “special obligations” to patients and learners.18 To meet these obligations, we must invest in our clinician–educator faculty so that they have the time and the teaching and assessment competencies needed to fulfill our public obligation to produce a highly competent health care workforce.

Competencies, Milestones, and the NAS

The implementation of outcomes-based medical education, using the competencies as a conduit to transformation, has been challenging for residency and fellowship programs. Many educators have struggled to translate the conceptual definitions and descriptions of the competencies into meaningful changes in both the curriculum and assessment tools. One reason for this struggle is the lack of shared mental models, or frames of reference, regarding the competencies. The Milestones were developed collaboratively within each specialty to create the core blueprint, or roadmap, of the discipline in narrative, developmental language. However, the primary purposes and role of the Milestones in the NAS must be clear.

First, the Milestones are not “the” NAS but, rather, one of nine elements.19 Second, the Milestones should serve as a framework to inform and guide the development of curricula, the choice of assessment methods and instruments, and the assessment judgments of the clinical competency committee. Medicine is not a static profession, and the Accreditation Council for Graduate Medical Education (ACGME) fully recognizes that the Milestones will need to be refined and revised. Nor do the Milestones define the totality of a discipline; rather, they are key elements of a larger “whole” of clinical competence. Many educators agree that residents must possess this subset of skills to progress to unsupervised practice within each specialty. Substantial professional judgment on the part of faculty is still required to assess a resident’s overall fitness for practice. Systematic measurement of these key elements can enhance our ability to assure the public of the effectiveness of our efforts and can promote continuous improvement across the entire GME system. At this time, and for the foreseeable future, the Milestones should be used only as a framework for formative assessment. We are truly in version 1.0, and much effort must be invested over the next three to five years to explore what works, for whom, in what circumstances, and why across the specialties.20

The Milestones are designed to be synthetic, drawing together multiple assessments, and to guide subsequent mentored deliberate practice over the ensuing educational experiences. For this reason, the reporting Milestones were never intended to serve as evaluation tools for short rotations (e.g., 1–2 months) but, rather, to guide the synthesis of multiple assessments twice a year by a group of experienced educators on the clinical competency committee. Evaluation forms used for short rotations should be based primarily on the purpose of that rotation in relation to the outcomes expected in the specialty. Mapping the purpose and objectives of a rotation to the reporting Milestones can be a useful exercise for the program and faculty and can help guide the development of evaluation forms better aligned with the rotation’s purpose. Some specialties have developed more granular milestone frameworks and are using them as an item bank to build evaluation forms better aligned with the goals of their rotations and curricular experiences.21 Other specialties are working with entrustable professional activities (EPAs), which define important end states for disciplines and can be very useful in informing and guiding curriculum, assessment, and supervision decisions.22

Our collective goal in medical education is to produce physicians who can successfully enter unsupervised practice and continue their trajectory toward expertise and mastery. Competencies, Milestones, and EPAs do not function independently but, rather, serve as meaningful frameworks in the context of an ACGME-accredited residency or fellowship program to produce a talented, whole physician. Medical school, residency, and fellowship education will be most effective when the output is a whole physician who effectively integrates all competencies, however defined, into his or her practice. Patient care skills do not work in isolation from medical knowledge, professionalism, interpersonal skills and communication, systems-based practice, and practice-based learning and improvement, and vice versa. The care of patients and populations is a dynamic, integrated process.

The NAS, the Clinical Learning Environment Review, and the Milestones are all tools to facilitate and promote innovation and continuous improvement in GME in the United States. We are entering a period of transformation that requires collective effort among all the key stakeholders in medical education and that can feel, like any change, uncomfortable. Only by working together, through dialogue and across organizations, can the full potential of outcomes-based medical education be realized and the unintended consequences eloquently summarized by Williams and colleagues be minimized.

Acknowledgments: The author thanks Dr. Thomas Nasca and John Potts for their helpful review and comments.

References

1. Williams RG, Dunnington GL, Mellinger JD, Klamen DL. Placing constraints on the use of the ACGME Milestones: A commentary on the limitations of global performance ratings. Acad Med. 2015;90:404–407.
2. Engel GL. Editorial: Are medical schools neglecting clinical skills? JAMA. 1976;236:861–863.
3. Wiener S, Nathanson M. Physical examination. Frequently observed errors. JAMA. 1976;236:852–855.
4. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: A systematic review. JAMA. 2006;296:1094–1102.
5. Eva KW, Regehr G. Self-assessment in the health professions: A reformulation and research agenda. Acad Med. 2005;80(10 suppl):S46–S54.
6. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81.
7. Ericsson KA. An expert-performance perspective of research on medical expertise: The study of clinical performance. Med Educ. 2007;41:1124–1130.
8. Boud D, Molloy E. Feedback in Higher and Professional Education: Understanding It and Doing It Well. Oxon, England: Routledge; 2013.
9. Holmboe ES. Faculty and the observation of trainees’ clinical skills: Problems and opportunities. Acad Med. 2004;79:16–22.
10. Baker K. Determining resident clinical performance: Getting beyond the noise. Anesthesiology. 2011;115:862–878.
11. Grover A, Slavin PL, Willson P. The economics of academic medical centers. N Engl J Med. 2014;370:2360–2362.
12. Iglehart JK. Financing graduate medical education—mounting pressure for reform. N Engl J Med. 2012;366:1562–1563.
13. Chandra A, Khullar D, Wilensky GR. The economics of graduate medical education. N Engl J Med. 2014;370:2357–2360.
14. Kogan JR, Conforti LN, Iobst WF, Holmboe ES. Reconceptualizing variable rater assessments as both an educational and clinical care problem. Acad Med. 2014;89:721–727.
15. Asch DA, Nicholson S, Srinivas S, Herrin J, Epstein AJ. Evaluating obstetrical residency programs using patient outcomes. JAMA. 2009;302:1277–1283.
16. Egener B, McDonald W, Rosof B, Gullen D. Perspective: Organizational professionalism: Relevant competencies and behaviors. Acad Med. 2012;87:668–674.
17. Nasca TJ. Graduate medical education financing and the role of the volunteer educator. Virtual Mentor. 2011;13:769–774.
18. Holmboe E, Bernabeo E. The “special obligations” of the modern Hippocratic Oath for 21st century medicine. Med Educ. 2014;48:87–94.
19. Accreditation Council for Graduate Medical Education. Next Accreditation System. http://www.acgme.org/acgmeweb/tabid/435/ProgramandInstitutionalAccreditation/NextAccreditationSystem.aspx. Accessed August 18, 2014.
20. Pawson R. The Science of Evaluation: A Realist Manifesto. London, England: Sage; 2013.
21. Nabors C, Peterson SJ, Forman L, et al. Operationalizing the internal medicine milestones—an early status report. J Grad Med Educ. 2013;5:130–137.
22. Ten Cate O. AM last page: What entrustable professional activities add to a competency-based curriculum. Acad Med. 2014;89:691.
© 2015 by the Association of American Medical Colleges