Economics, Education, and Health Systems Research: Research Report

Accreditation Council for Graduate Medical Education Competencies and the American Board of Anesthesiology Clinical Competence Committee: A Comparison

Rose, Steven H. MD; Burkle, Christopher M. MD

doi: 10.1213/01.ane.0000189099.13286.97

Graduate medical education officials, program directors, program faculty, residents, and fellows have expended considerable time, thought, and energy to understanding and implementing Accreditation Council for Graduate Medical Education (ACGME) competencies. As educators responsible for documenting the competence of anesthesiology residents, we were struck by the similarities between the ACGME Outcome Project and the long-standing requirement of the American Board of Anesthesiology (ABA) for a Clinical Competence Committee Report.

The ABA requires “every residency training program to file, on forms provided by the Board, an Evaluation of Clinical Competence in January and July on behalf of each resident who has spent any portion of the prior six months in Clinical Anesthesia training in or under the sponsorship of the residency program and its affiliates” (1). They further stipulate that “The Program Director or Department Chair must not chair the department Clinical Competence Committee” (1). Instead, a broadly representative group of faculty is charged with assessing resident competence through this process.

Residents receive credit for training when a signed satisfactory Certificate of Clinical Competence has been forwarded to the ABA. If a resident receives an unsatisfactory Certificate of Clinical Competence, he or she will receive credit for this training only if the next Certificate of Clinical Competence is satisfactory. The ABA Booklet of Information states, “If a resident receives consecutive Certificates of Clinical Competence that are not satisfactory, additional training is required” (1).
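The credit rule described above can be sketched as a small decision procedure. This is our own illustration, not an ABA form or algorithm; the function name and the encoding of reports as a list of booleans are assumptions made for clarity.

```python
def credit_for_training(reports: list[bool]) -> list[bool]:
    """Given each six-month report (True = satisfactory), return whether
    the corresponding training period earns credit, per the rule quoted
    above: an unsatisfactory report earns credit only if the next report
    is satisfactory; consecutive unsatisfactory reports mean the period
    is not credited and additional training is required.

    Hypothetical sketch for illustration only, not the Board's process.
    """
    credited = []
    for i, satisfactory in enumerate(reports):
        if satisfactory:
            credited.append(True)
        else:
            # Credit only if a following report exists and is satisfactory.
            credited.append(i + 1 < len(reports) and reports[i + 1])
    return credited
```

For example, the sequence unsatisfactory-then-satisfactory still credits both periods, whereas two consecutive unsatisfactory reports credit neither.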

According to Frank Hughes, Executive Vice President of the ABA, the requirement for clinical competence reports in anesthesiology residencies was initiated in 1946. Trainees with plans to apply for ABA certification were required to submit reports every 6 mo, and residency directors or preceptors were required to submit a report on the competence of each resident annually. This practice continued until 1975, when the ABA introduced the formal Clinical Competence Committee Report. In 1977, the ABA mandated that program directors file a Clinical Competence Committee Report in July and January for each resident who began training on or after July 1, 1977 and spent any portion of the preceding 6 mo in clinical anesthesia training.

Some attributes addressed on the ABA Clinical Competence Committee Report are considered “essential” (Table 1).

Table 1:
Essential Attributes – American Board of Anesthesiology Clinical Competence

It is important to note that if one or more of the essential attributes is unsatisfactory, the overall Clinical Competence Committee Report for that resident must be unsatisfactory as well.

A second group of attributes is also reported on the Clinical Competence Committee Report. These differ from the essential attributes in that one or more may be deemed unsatisfactory without necessarily leading to an unsatisfactory assessment for overall clinical competence (Table 2).

Table 2:
Acquired Attributes – American Board of Anesthesiology Clinical Competence Committee Report
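The distinction between essential and acquired attributes amounts to a simple aggregation rule, which can be sketched as follows. The attribute names and data structures here are our own hypothetical placeholders, not the Board's actual forms or terminology.

```python
def overall_report(essential: dict[str, bool], acquired: dict[str, bool]) -> str:
    """Aggregate attribute ratings (True = satisfactory) into an overall
    assessment, per the rule described above. Illustrative sketch only.
    """
    # Any unsatisfactory essential attribute forces an unsatisfactory report.
    if not all(essential.values()):
        return "unsatisfactory"
    # Acquired attributes are reported but do not, by themselves, force an
    # unsatisfactory overall assessment (the committee may still so judge).
    return "satisfactory"
```

Note the asymmetry: essential attributes gate the overall result, whereas acquired attributes inform it without determining it.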

The ACGME Outcome Project, undergoing phased implementation over a several-year period, was initiated to describe and measure aspects of medical competence using a variety of metrics. It is currently in the second of four phases. Phase 1 (7/01–6/02) was intended to give programs an opportunity to form an initial response to the changes in requirements. Phase 2 (7/02–6/06) was designed to sharpen the focus and definition of the competencies and assessment tools. Phase 3 (7/06–6/11) will focus on full integration of the competencies and their assessment with learning and clinical care. The fourth and final phase (7/11 and beyond) involves expansion of the competencies and their assessment to develop models of excellence (2). The underlying premise of the project is that measuring performance in the ACGME competencies will better assess residents across a spectrum of desirable behaviors and attitudes for physicians (Table 3) (2).

Table 3:
Accreditation Council for Graduate Medical Education Competencies


Each element on which residents are evaluated through the ABA Clinical Competence Committee Report was reviewed and linked to one or more of the ACGME Competencies by the authors, both of whom are experienced anesthesia educators. There was no limit on the number of links that could be established between the two systems of evaluation.


The results of this comparison are listed in Table 4. As expected, given the overlap between the two systems, there was substantial redundancy in the links. It is apparent that the ACGME Outcome Project addresses, in a fairly comprehensive manner, the attributes that must be measured and recorded on the ABA Clinical Competence Committee Report.

Table 4:
Accreditation Council for Graduate Medical Education (ACGME) Competencies Indexed with Anesthesiology Clinical Competence Reports


The ability to link elements of the ACGME Outcome Project with those of the ABA Clinical Competence Committee Report should not be surprising. Many of the characteristics that define a competent physician have changed little over centuries (3). So, what are we to make of this within anesthesiology? Is the ACGME Outcome Project simply the emperor's new clothes, an exercise in relabeling time-honored characteristics of quality physicians as defined by the ABA? If so, is a bureaucratic approach to complying with yet another required element of an accredited program appropriate? We think not and, in many ways, find the similarities between the ACGME Outcome Project and the well-established requirement for a Clinical Competence Committee Report in anesthesiology residencies reassuring. We describe our reasoning below.

Although some competencies we are asked to evaluate for the ACGME have been influenced by technical advances (for example the need to acquire skill in accessing medical information using highly sophisticated information technology), the basic qualities of a competent physician seem to be remarkably consistent. These qualities are reflected in the essential and acquired components of the ABA Clinical Competence Committee Report. In some ways, the commonality of characteristics described by these different groups of medical educators (the ABA and ACGME), over an extended period of time, validates the Outcome Project.

So what is different? We believe there are at least two important differences when the anesthesiology Clinical Competence Committee Report and ACGME Outcome Project are compared. The first involves methodology. Anesthesiology Clinical Competence Committees are mandated to meet at least every 6 months to evaluate the progress of each resident. The process through which the Clinical Competence Committee makes these decisions may vary among programs. However, the ABA does not (yet) require methodology beyond the traditional “global evaluation.” The ACGME Outcome Project may provide the tools necessary to make the work of assessing clinical competence more fair and comprehensive as additional metrics from the ACGME “toolbox” are added to measure various aspects of performance. Use of a variety of metrics to assess each resident in each of the six ACGME competencies likely results in a more thorough and consistent process to assess competence.

The second important difference involves the fundamental goals of each process. Although the Anesthesiology Clinical Competence Committee may have utility in providing formative feedback, its primary purpose is to provide summative feedback; in essence, to determine whether the resident's performance was sufficient to earn credit for a block of training. Although the ACGME Outcome Project can be used for summative evaluation, it has far greater utility than the Clinical Competence Committee Report in providing formative feedback in a fair and consistent manner. This point is important. Because the vast majority of residents meet the minimal standards for credited training, the Clinical Competence Committee Report serves, in most cases, more to document satisfactory performance than to improve resident performance.

The ACGME and the ABA are in some ways similar. Both have a major impact on the structure of residency training in anesthesiology. However, it is important to recognize that the mission of each body is fundamentally different. The ACGME (through the Residency Review Committee for Anesthesiology) is the accrediting body for residency programs. It ensures that residency programs substantially comply with the general and program-specific requirements for accreditation. The ABA is the certifying board for individual anesthesiologists. It examines and certifies individual physicians who have completed an accredited residency in anesthesiology in the United States.

The influence of the ACGME Outcome Project extends well beyond residents and fellows training in ACGME-accredited programs. For example, the American Board of Medical Specialties has used this competency model in the Maintenance of Certification process. It is possible this framework will also prove useful in the process through which physicians seek credentials and in the granting of procedural and other privileges.

The Society for Education in Anesthesia (SEA) has been active in clarifying and promoting compliance with the ACGME Outcome Project. Information regarding “best practices” is available on the SEA website and efforts to further validate the various competencies are continuing (4).

We believe the Clinical Competence Committee Report may represent an important component of a broader and more sophisticated implementation of the ACGME Outcome Project. Although we appreciate the differences between the two assessment tools, their similarities should lead to improvement in resident evaluation and in the provision of feedback. The most important advances involve the use of a more diverse set of instruments to assess and document each competency and to encourage residents to perform beyond minimally acceptable standards. The next step in the evolution of this process may include establishing methods to better direct residents to address areas in which their performance is satisfactory but could be improved. Further, the correlation between the ACGME and ABA assessment tools suggests that both tools may be improved by merging common elements while retaining the unique elements of each.


1. The American Board of Anesthesiology, Inc. Booklet of Information, Feb 2005. Available at
2. Accreditation Council for Graduate Medical Education. Available at
3. Medical Professionalism Project. Medical professionalism in the new millennium: a physician charter. Lancet 2002;359:520–2.
4. Society for Education in Anesthesia. Available at
© 2006 International Anesthesia Research Society