Academic Medicine: June 2007 - Volume 82 - Issue 6
doi: 10.1097/ACM.0b013e31805556f8
From the Editor

Redirecting the Assessment of Clinical Competence

Whitcomb, Michael E. MD


In the late 1990s, the Accreditation Council for Graduate Medical Education (ACGME) initiated the Outcome Project to ensure that physicians graduating from residency programs are competent to practice in the specialties of their training. In one of the first editorials I wrote for this journal, I challenged the basic premise underlying the design of the project.1 My argument was not with the project's stated objective but, rather, with the ACGME's approach to achieving it. Specifically, I took exception to the notion that one could determine the clinical competence of graduating residents by assessing how they perform in individual domains of specialty or subspecialty practice (defined by the ACGME as core competencies).

I continue to maintain that documenting that a graduating resident has mastered, at some predetermined level, the knowledge, skills, and attitudes associated with each of the core competencies, while informative, does not ensure that the individual is a competent physician. Something more is needed: graduating residents must be able to translate and integrate their knowledge, skills, and attitudes so they can perform the complex tasks required to deliver high-quality medical care. Determining that residents have taken this last crucial step is the responsibility of the residency program's faculty, who must find better ways to critically observe residents as they care for patients in a variety of clinical settings and circumstances.

The three lead articles in this issue of Academic Medicine, all by respected and experienced medical educators, reinforce this view. Their perspectives on the assessment of clinical competence are timely and valuable because they address the most important challenge facing the medical education community: ensuring that residents can provide high-quality medical care when they enter practice. However, since the accreditation of residency programs is a high-stakes event, program directors and faculty are unlikely to change how they assess their residents' clinical competence until the ACGME no longer mandates the approach required by the Outcome Project. I hope that those directly associated with the ACGME will take these articles' messages to heart, since the articles make clear the need to reform that organization's way of determining residents' competence.

In his article, Klass sets forth the fundamental concept of how clinical competence should be assessed when he notes that the increasing emphasis being placed on the performance of practitioners in the community reflects “a change in the conception of the competent doctor from someone who possesses the right attributes to someone who does the right thing.” Given that, the determination of clinical competence must shift from assessing the attributes physicians possess (knowledge, skills, and attitudes) to assessing what they actually do when they encounter patients in real clinical situations. To be clear, assessing performance in the ACGME's core competency domains provides information about the attributes residents possess. But such assessment says nothing about their ability to actually provide high-quality care to patients in real-life situations.

Even though Huddle and Heudebert come to the issue from a very different vantage point, they advance the argument set forth by Klass. They assert that assessment methods that capture knowledge and skills—which they acknowledge are the “building blocks” of competence—cannot elucidate clinical competence per se. They argue that “the measurable bits of performance that follow from anatomizing clinical competence according to discrete learning objectives do not and cannot add back together to constitute the skill and ability of the competent physician.” To them this conclusion is self-evident, because the “bite-sized” elements of performance identified by discrete learning objectives (derived from the core competencies) do not capture a physician's ability to choose whether and when to use those elements in real clinical situations.

What distinguishes their view from Klass' is that they do not simply challenge the validity of the ACGME's approach. They go further and argue that using the ACGME's framework of core competencies for the accreditation of residency programs and the certification of physicians (as adopted by the American Board of Medical Specialties [ABMS] for its Maintenance of Certification Program) actually threatens to undermine the apprenticeship model of clinical training that has existed in the United States for well over a century. In their view, focusing assessment on performance in individual domains of medicine, rather than on performance in caring for patients, will so distract program directors and faculty from what they should be doing—observing residents caring for patients in a variety of clinical settings and under different clinical circumstances—that it will further erode the quality of clinical training. Indeed, why should program directors and faculty worry about having residents interact with master clinicians if the measure of their residents' clinical competence and the accreditation status of their programs are to be based on the performances of the residents in individual competency domains? Huddle and Heudebert's concerns should be treated very seriously, since these authors are on the front line of residency training at a major academic health center.

Finally, from Europe, ten Cate and Scheele bring a similar perspective to the issue by proposing an interesting design for a competency-based GME program. They introduce the concept of identifying a set of “entrustable professional activities” that link training objectives (assigned to specific patient-care experiences) with a resident's demonstrated ability, and they suggest that providing “statements of awarded responsibility” linking training to performance will achieve the desired program outcomes. Their proposal aligns with Huddle and Heudebert's view that assessing a set of “competencies” risks disassociating the experiences of caring for patients in clinical settings from the assessment of residents' clinical competence. And like Huddle and Heudebert, who suggest that GME programs should employ attending physicians whose only responsibility would be the ongoing evaluation of residents' performances, they argue that the central innovation in postgraduate training should be approaches that support the individuals who supervise residents' clinical activities.

In my view, the authors of the three articles make clear how the clinical competence of residents should be assessed as they progress through their training and into practice. In a very real sense, these authors' core messages are identical, and the fact that they come at the issue from such different vantage points underscores the validity of what they have to say. I hope that those responsible for the current initiatives designed to assess clinical competence will take the authors' views seriously and reexamine those initiatives.

The public expects the academic medicine community to act in the interest of patients. What can be more important than ensuring that the doctors the public encounters are competent to care for them? Thus, the medical education community should petition the ACGME and ABMS to abandon the use of core competencies as the main measure of clinical competence, and work to develop approaches for assessing competence based on careful observations of residents' performances in providing patient care.

Michael E. Whitcomb, MD


Reference

1. Whitcomb ME. Competency-based graduate medical education? Of course! But how should competency be assessed? Acad Med. 2002;77:359–360.


© 2007 Association of American Medical Colleges
