Johnston, Karen C. MD, MSc
Dr. Johnston is associate professor of neurology and health evaluation sciences and director of the neurology residency program, University of Virginia School of Medicine, Charlottesville, Virginia.
This article is a revised version of an abstract presented on March 6, 2003 at the 2003 Accreditation Council for Graduate Medical Education Annual Educational Conference in Chicago, Illinois.
Correspondence and requests for reprints should be addressed to Dr. Johnston, University of Virginia Health System, Department of Neurology, #800394, Charlottesville, VA 22908; phone: (434) 924-5323; fax: (434) 982-1726; e-mail: 〈firstname.lastname@example.org〉.
The general competencies mandated by the Accreditation Council for Graduate Medical Education (ACGME) Outcome Project have resulted in new training requirements for most residency programs. To determine the training program changes necessary because of these new standards, the neurology residency program at the University of Virginia developed a simple grid-like instrument that links the objectives for residents’ major rotations with the six ACGME general competencies. This instrument, created in 2002, helped the program develop specific training elements related to the general competencies that were identified as missing from the residency. The instrument was then converted to an evaluation tool that allows attending physicians to assess individual residents’ competencies for each objective in all major rotations. The author describes the assessment and evaluation instruments, called the Self-Assessment and Vital Evaluation (SAVE), and their usefulness in the University of Virginia neurology residency program’s initial response to the new standards. She also suggests that these instruments, with some modifications, may be of value to other residency programs.
The Accreditation Council for Graduate Medical Education (ACGME) developed its Outcome Project in an attempt to increase the focus on competency-based residency training, with an emphasis on the demonstrated success (outcomes) of training, as opposed to experience-based training, with an emphasis on the residency program’s potential to educate.1 This new approach requires that programs demonstrate that each resident has attained the appropriate level of competency in six areas: patient care, medical knowledge, professionalism, systems-based practice, practice-based learning and improvement, and interpersonal and communication skills.
Phase 1 of the Outcome Project ran from July 2001 through June 2002. During this period, residency programs were asked to assess their curricula and form an initial response to the new requirements. In this article, I describe the initial self-assessment of our neurology residency at the University of Virginia and present the instrument we developed in 2002 that facilitated our adjustment to the new standards. We later converted the instrument into an evaluation tool that allows us to consider the six general competencies at all levels of our neurology residents’ training.
The director of the neurology residency training program and colleagues at the University of Virginia developed a simple grid-like instrument as the program’s self-assessment tool. It links descriptions of the objectives for each major rotation of the program to each of the six general competencies that the residents should achieve (see Chart 1). Across the top of the grid are the six competencies, and down the left side are the objectives for the specific rotation being evaluated. The competencies judged most central to a given objective were marked with an X. For example, for the objective “Obtain a neurological history and exam,” we determined that patient care, medical knowledge, and interpersonal and communication skills captured the key components of that objective.
A second form, the evaluation form, was developed by blocking out the competencies that were less imperative to a given objective (the ones not marked with an X in the self-assessment tool; see Chart 2). The spaces that remained reflected the specific competencies to be evaluated for each of the objectives listed. For example, for the same objective mentioned previously, “Obtain a neurological history and exam,” an attending physician would grade the resident’s level of competency in patient care, medical knowledge, and interpersonal and communication skills. A resident could obtain excellent information from the patient, supporting correct patient-care decisions and demonstrating excellent medical knowledge (and thus earning high scores in both), yet still receive a low score in interpersonal and communication skills if he or she was rude to the patient or otherwise ineffective in communicating.
Using the Instruments
For the self-assessment instrument, four major rotations were identified for the first-year neurology residents: inpatient wards (and night float), inpatient and outpatient epilepsy, neurological intensive care unit, and neurology outpatient unit. Four major rotations were identified for the second-year neurology residents: adult consults, neurology outpatient unit, neurological intensive care unit, and inpatient and outpatient pediatric neurology. Three rotations were identified for the most senior neurology residents: adult neurology inpatient ward senior, inpatient and outpatient psychiatry, and neurology outpatient unit. For each major rotation, the individual objectives of the rotation (previously developed) were crossed with the six competencies (see Chart 1).
The identification of the competencies in each rotation allowed us to capture components of the training that were already a focus of the program (e.g., patient care, ethics). Evaluation strategies were then developed based on that information. We next developed new evaluation forms that included assessment of the competencies already being taught (e.g., ethical decision making, health care resource utilization). In addition, competencies that were not being covered in individual rotations were identified, and new training programs and methods of evaluating those competencies were then developed (e.g., end-of-life care).
Other residency programs at our institution adopted the grid and modified the content to reflect the objectives of their major rotations. Several programs found the modified grid very useful for organizing their planned response to the new general competencies.
We then converted the instrument into the competency evaluation form (see Chart 2). Attending physicians score residents as meeting the competency (indicated with a + sign), not meeting the competency or needing improvement (indicated with a − sign), or exceeding the competency for their level of training (indicated with a ++ sign). These evaluation forms were initiated in July 2003.
After a 30-minute training session with the faculty, the Self-Assessment and Vital Evaluation (SAVE) replaced the previous 24-category checklist evaluation (with four performance choices for each category) for the major rotations. Faculty participation has been excellent. The detail of the comments has improved, as the faculty appear to be considering a broader range of training components. The completed evaluations have provided the director with more specific information on the strengths and weaknesses of individual residents. Preliminary feedback suggests that the SAVE form consistently takes longer to fill out than the previous form did and that several attending physicians would prefer to evaluate all six competencies for all objectives.
Benefits of the Instruments
The two instruments, which as a pair are called the Self-Assessment and Vital Evaluation (SAVE), have been extremely useful in helping the University of Virginia’s neurology residency program form an initial response to the ACGME Outcome Project, initiate programmatic changes to meet the new standards, and evaluate residents.
The main intent of this article is to share a simple grid-like self-assessment instrument that can make it easier for programs to consider the role of the general competencies in each training objective. With some modifications, it may be of use to residency programs in other specialties as well. Such an instrument may help identify ongoing teaching in areas emphasized by the competencies and allow the development of methods to assess those competencies. It may also guide program development in areas not currently emphasized or evaluated. Conversion of this instrument to a competency-based, objective-specific evaluation instrument such as the one described here may also be useful to some programs.