A Competitive Objective Structured Clinical Examination Event to Generate an Objective Assessment of Anesthesiology Resident Skills Development

Rebel, Annette MD; DiLorenzo, Amy N. MA; Fragneto, Regina Y. MD; Dority, Jeremy S. MD; Rose, Greg MD; Nguyen, Dung MD; Hassan, Zaki-Udin MBBS; Schell, Randall M. MD, MACM

doi: 10.1213/XAA.0000000000000282
Case Reports: Education

Residency programs are charged with teaching, assessing, and documenting resident competency for a multitude of skills throughout the course of residency training. An innovative, competition-based objective structured clinical examination event was designed in our department to objectively assess the skill level of anesthesiology residents. After conducting the identical event for 2 years in postgraduate year 1 (PGY1) and postgraduate year 2 (PGY2) residents, we tested the hypothesis that the event can provide adequate standardization to appropriately document progression in technical and nontechnical skills. Twenty-one residents participated in both events during their PGY1 and PGY2 years (n = 10 in 2012/2013; n = 11 in 2013/2014). The PGY1 participants in 2012 were retested in 2013 (as PGY2 residents) during an identical event, and their performance was compared as a group and on an individual level; the PGY1 residents in 2013 were similarly retested in 2014. Four workstations were analyzed to determine whether improvement in performance occurred between the PGY1 and the PGY2 years: (1) preoperative assessment, (2) operating room anesthesia station checkout, (3) peripheral IV and endotracheal tube placement, and (4) transfer of care in the postanesthesia care unit. The performances of PGY1 and PGY2 residents were compared. The assessments were performed by anesthesiology faculty using checklists, time to complete task, and Likert scale ratings. Data analysis showed improved technical anesthesia skills (operating room setup, peripheral IV, and endotracheal tube placement) and more complete anesthesia-related information management in the preoperative assessment and postoperative transition of care in the postanesthesia care unit in PGY2 residents compared with the PGY1 performance of the same residents. The described event is a valuable tool for objective assessment of multiple anesthesia skills and possible milestones during residency.

From the Department of Anesthesiology, University of Kentucky, Lexington, Kentucky.

Accepted for publication September 24, 2015.

Funding: None.

The authors declare no conflicts of interest.

This report was previously presented, in part, at meetings of the International Anesthesia Research Society and the Society for Education in Anesthesiology.

Address correspondence to Annette Rebel, MD, Department of Anesthesiology, University of Kentucky, 800 Rose St., N202, Lexington, KY 40536. Address e-mail to arebe2@email.uky.edu.

The Accreditation Council for Graduate Medical Education is changing the focus of education away from a time-based to a competency-based system by implementing specialty-specific milestones.1 In the past, anesthesiology educators have mostly relied on observations in clinical practice for resident skills assessment.2 With the anesthesiology milestones implementation in 2014, the objective evaluation of resident skill development and attainment of milestones has become more challenging. The value of the milestones hinges on providing a meaningful longitudinal assessment of the learner throughout their training.3,4 To address the need for an objective longitudinal assessment tool, we previously developed and described a competition-based objective structured clinical examination (OSCE) event to assess resident skill levels in several anesthesia tasks.5 After conducting the event for several consecutive years and documenting the feasibility of running the identical event for postgraduate year 1 (PGY1) and postgraduate year 2 (PGY2) residents, we reviewed our residents’ performance data to determine whether the event could be used to objectively document growth in resident skill levels between the first and the second year of residency training. We hypothesized that the OSCE event would enable us to document improvement in resident performance between PGY1 and PGY2 residents.

METHODS

The study was performed in the Department of Anesthesiology at the University of Kentucky in Lexington, Kentucky. The institutional review board of the University of Kentucky reviewed the protocol and granted an exemption for this project. All participating residents consented to the deidentified analysis of their data obtained during the event.

The Anesthesia Olympics event was developed by the authors and previously described.5 The event consists of 6 workstations (WSs) presented as OSCEs. The content of the WSs was chosen by faculty consensus based on expectations of what an anesthesiology intern should be able to do after 1 month of introductory anesthesia training. The OSCE event was presented to the departmental education committee, which consists of 10 anesthesiology educators with broad involvement in the residency curriculum. The education committee reviewed the OSCE design, provided feedback, and came to consensus on it. In addition to the presenter, one education committee member also served as a station mentor in the OSCE event. Faculty assessors are board-certified anesthesiologists with simulation training and experience and have participated in this OSCE event since its inception. The faculty underwent a group orientation to the global goals and objectives of the event and an individualized review of the checklists and other assessment tools for their assessment stations. The assessor for WS1 (preoperative assessment) met with the standardized patient (SP) days before the event to familiarize the SP with the WS outline and objectives.

The event was conducted at the beginning of the PGY1 year. At our institution, PGY1 anesthesia residents spend their first clinical month in the Department of Anesthesiology. During this month, the residents undergo a structured didactic curriculum, including simulation, and are integrated into clinical practice with more experienced residents in the operating room (OR). At the end of the first clinical month, we conduct the described event to assess the residents’ skill level at this point in their training. The identical event is repeated at the beginning of the PGY2 year, approximately 12 months after the PGY1 event. The residents had identical PGY1 rotations between the events, configured in an individualized sequence: monthly rotations in internal medicine, surgery, pediatrics, critical care medicine, cardiology consult, pulmonary consult, chronic pain clinic, transfusion medicine, and acute pain service, and a total of 2 months of anesthesiology. The residents had an additional month of OR anesthesia in July of their PGY2 year, just preceding the PGY2 event, for a total of 3 months of OR anesthesia. Although the sequence of rotations differed, at the conclusion of the 12 months between the events there was no difference in clinical training among the participants. Before each event, residents were informed about the goals and outcomes to be measured at each WS and about the competitive aspect of the events.

The 4 OSCEs included in the data analysis are explained subsequently.

WS1 = Preoperative Assessment (SP)

The resident encounters a trained actor (SP) in a preoperative clinic setting and is expected to perform an anesthesia-focused history and physical examination, create and discuss an anesthesia plan with the patient, and address the patient’s questions. The same SP was used in both the PGY1 and the PGY2 years.

The performance is rated for completeness using a checklist, and the outcome is reported as a completeness score expressed as a percentage of the maximal score.

Anesthesiology Milestones subcompetencies evaluated during this WS are as follows: Patient Care 1: Preanesthetic Patient Evaluation, Assessment, and Preparation; Patient Care 2: Anesthetic Plan and Conduct; and Interpersonal and Communication Skills 1: Communication with Patients and Families.

WS2 = OR Anesthesia WS Checkout

The resident is expected to perform a preanesthesia WS checkout. The machine checkout reveals a malfunction because of a failed leak test, and the resident must correctly identify and resolve the leak source. The leak sources in the PGY1 and PGY2 events were different but similarly complex. The performance is rated by (1) time (seconds) needed to complete an OR station setup as recommended by textbook literature,6 and (2) the ability to identify the leak source (yes/no). The Anesthesiology Milestones subcompetency evaluated during this WS is as follows: Patient Care 9: Technical skills: Use and Interpretation of Monitoring and Equipment.

WS3 = Peripheral IV Placement and Airway Management

The resident obtains peripheral IV (PIV) access (Vascular Access Model; Blue Phantom, Sarasota, FL) and performs mask ventilation and successful endotracheal intubation (Ambu Airway Management Trainer; Ambu, Glen Burnie, MD) using simulation equipment. The performance is rated by time (seconds) needed to complete an endotracheal intubation and PIV cannula placement. The Anesthesiology Milestones subcompetency evaluated during this WS is as follows: Patient Care 8: Technical skills: Airway Management.

WS4 = Transfer of Care

After receiving and reviewing the anesthesia preoperative evaluation and the intraoperative anesthesia record, the resident transports an SP on a stretcher to the postanesthesia care unit (PACU) and gives a report to a confederate PACU nurse. The performance is rated for completeness using a checklist, and the outcome is reported as a completeness score expressed as a percentage of the maximal score. In addition, the ability to communicate effectively with the other care provider is assessed using several Likert scales, which are summed into a communication score and reported as a percentage of the maximal score. The Anesthesiology Milestones subcompetency evaluated during this WS is as follows: Interpersonal and Communication Skills 2: Communication with Other Professionals.
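
For illustration only, the following minimal Python sketch shows one way a checklist-based completeness score and a Likert-based communication score, as described above, could be reduced to percentages of the maximal score. The item names, number of items, and ratings are hypothetical and are not taken from the actual assessment instruments.

```python
# Hypothetical sketch of the checklist and Likert scoring described above;
# item names and ratings are illustrative, not the authors' actual instruments.

# Handoff checklist: True = item addressed during the PACU report
checklist = {
    "patient identity and procedure confirmed": True,
    "allergies and relevant history reported": True,
    "airway management described": True,
    "intraoperative course and medications summarized": False,
    "fluids, blood loss, and urine output reported": False,
    "postoperative plan and concerns stated": True,
}
completeness_score = sum(checklist.values()) / len(checklist)

# Likert ratings (1-5) of communication quality, summed and normalized
likert_ratings = [4, 5, 3, 4]                 # hypothetical assessor ratings
communication_score = sum(likert_ratings) / (5 * len(likert_ratings))

print(f"completeness:  {completeness_score:.0%}")   # 67% for this example
print(f"communication: {communication_score:.0%}")  # 80% for this example
```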

Two additional, previously described WSs of the event (general anesthesia induction sequence and intraoperative crisis simulation) were excluded from the data analysis because of inconsistency in the WS evaluator, OSCE design, and assessment tool over the period of study. Because of faculty unavailability, the faculty mentors of these 2 WSs changed between the PGY1 and the PGY2 events, and the design of both WSs changed over time, requiring adjustments of the assessment tools.

Data are reported as mean ± SD. A paired t test was used for statistical analysis. The results of the leak test in WS2 (yes/no) were assessed statistically with an exact 1-tailed binomial calculation for hypothesis testing, using a probability value of 0.5.
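
As a minimal sketch of the statistics named above (not the authors’ analysis code), the following Python example applies a paired t test to hypothetical paired scores and computes an exact 1-tailed binomial tail probability with a null probability of 0.5; all numbers are illustrative, and scipy is assumed to be available.

```python
# Sketch of the statistical methods described above, using hypothetical data.
from math import comb

import numpy as np
from scipy import stats

# Hypothetical paired completeness scores for the same 10 residents
pgy1 = np.array([0.70, 0.78, 0.81, 0.66, 0.74, 0.79, 0.72, 0.83, 0.76, 0.75])
pgy2 = np.array([0.86, 0.90, 0.88, 0.82, 0.85, 0.93, 0.87, 0.91, 0.84, 0.89])

# Mean +/- SD, as reported in the tables
print(f"PGY1 {pgy1.mean():.2f} +/- {pgy1.std(ddof=1):.2f}, "
      f"PGY2 {pgy2.mean():.2f} +/- {pgy2.std(ddof=1):.2f}")

# Paired t test comparing the same residents' PGY1 and PGY2 scores
t_stat, p_value = stats.ttest_rel(pgy2, pgy1)
print(f"paired t test: t = {t_stat:.2f}, p = {p_value:.4f}")

def exact_one_tailed_binomial(k: int, n: int, p0: float = 0.5) -> float:
    """Exact probability of observing >= k successes in n trials under p0."""
    return sum(comb(n, i) * p0**i * (1 - p0)**(n - i) for i in range(k, n + 1))

# Example: chance probability of a yes/no outcome occurring in 20 of 21 residents
print(f"exact 1-tailed binomial (20 of 21, p0 = 0.5): "
      f"p = {exact_one_tailed_binomial(20, 21):.6f}")
```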

RESULTS

Twenty-one residents were included in the data analysis (n = 10 in 2012/2013; n = 11 in 2013/2014). The OSCE event design in 2012 was identical to the events in 2013 and 2014. The PGY1 participants in 2012 were retested in 2013 (as PGY2 residents), and their performance was compared as a group and on an individual level. The performance of the PGY1 residents in 2013 was compared with their performance as PGY2 residents in 2014. Because of resident availability (leave time, absence because of clinical needs, sick leave), not every resident was able to participate in both events. Data from 3 residents were excluded from the analysis because they did not participate in both events (PGY1 and PGY2), precluding longitudinal analysis of skill growth.

WS1 = Preoperative Assessment (SP)

Table 1.

Figure 1.

The completeness scores of all participants in the PGY1 and PGY2 event are shown as individual performances (Fig. 1). The majority of participants demonstrated improved ability to complete a preoperative anesthesia assessment. The mean completeness score of PGY2 residents was significantly higher than their PGY1 score (0.88 ± 0.06 vs 0.76 ± 0.08, respectively, P < 0.001; Table 1).

WS2 = OR Anesthesia WS Checkout

Figure 2.

The times needed to complete an OR WS checkout for all participants in the PGY1 and PGY2 events are shown as individual performances (Fig. 2). The majority of participants showed improved ability to complete the WS checkout. The mean time to complete the checkout for PGY2 residents was significantly shorter compared with their performance as PGY1 residents (646.9 ± 178.9 vs 799.6 ± 194.4 seconds, respectively, P = 0.012), and more residents were able to identify the leak source during the checkout in the PGY2 year (n = 20; 95.2%) than in the PGY1 year (n = 13, 61.9%; P < 0.001; Table 1).

WS3 = PIV Placement and Airway Management

Figure 3.

The times needed to successfully place an endotracheal tube (ETT) and insert a PIV for all participants in the PGY1 and PGY2 events are shown as individual performances (Fig. 3). The majority of participants demonstrated improvement between the PGY1 and the PGY2 with shorter times to complete the tasks in the PGY2. The mean time to successful ETT placement for PGY2 residents was significantly shorter compared with their PGY1 performance (154.1 ± 56.7 vs 186.7 ± 50 seconds, respectively; P = 0.017; Table 1). The mean time to successful PIV placement for PGY2 residents was also significantly shorter compared with their time needed as PGY1 residents (140.3 ± 71.2 vs 217.4 ± 77.9 seconds, respectively; P = 0.011; Table 1).

WS4 = Transfer of Care

The completeness and communication scores for all participants in the PGY1 and PGY2 events are shown as individual performances (Fig. 4). The majority of participants showed improved ability to effectively transfer patient care with decreased loss of information during the transition. The mean completeness score of PGY2 residents was significantly higher than their PGY1 score (0.77 ± 0.18 vs 0.56 ± 0.17, respectively; P = 0.018; Table 1). The communication score that assessed the quality of communication during the transfer of care improved from the PGY1 to PGY2, indicating enhanced communication and professional behavior. Mean communication scores increased from 0.77 ± 0.12 to 0.91 ± 0.10 (P < 0.001; Table 1).

Figure 4.

At each WS, at least one resident did not show improved performance at the PGY2 event compared with their PGY1 scores. Analysis of individual performance did not identify a subset of “low-performing” residents; rather, different residents performed worse on different WSs when compared with their previous performance. Explanations for this observation may depend on the WS content. At WS1 (preoperative assessment), the slight decline in resident performance was related to an incomplete physical examination, and the resident received specific feedback about this. At WS2 (OR Anesthesia WS Checkout), 5 residents were not able to decrease the time needed to complete an OR WS checkout, possibly indicating that they were prioritizing completeness over speed. Residents’ ability to identify machine malfunction improved from the PGY1 to the PGY2 event. At WS3 (ETT and PIV placement), several residents did not improve their procedure times from the PGY1 to the PGY2 event. A possible explanation is that more time was spent checking equipment and optimizing the view for ETT placement because resident skill and expectations for “good technique” might have outweighed the emphasis on completing the task quickly. Although a very junior resident might not see the need to check the laryngoscope before use and might be satisfied with borderline visualization of the vocal cords, a more experienced anesthesia resident might take longer to ensure all equipment is present and functional and to optimize airway exposure before placing the ETT. Similar reasons may explain the observations for PIV placement. At WS4 (Transfer of Care in PACU), 2 residents showed a minor decline in performance. One resident, although declining in overall score, still performed at a high level on the station, whereas the other resident continued to show deficits in completeness and clear communication. On the basis of the findings of this WS, this resident received individualized counseling and assistance to improve the PACU transfer of care.

DISCUSSION

The main finding of our study was that the competitive OSCE-based event designed by our educational group was able to document an increase in anesthesia task-related skill levels and improved nontechnical, professional abilities in more advanced anesthesia residents (PGY2) when compared with beginning residents (PGY1). This OSCE event, therefore, can be used as a valid objective skills assessment tool for documentation of skill growth and milestones attainment in anesthesiology residents.

The transition to competency-based education with milestones to measure progress and milestones attainment requires many anesthesiology residency programs to adjust their educational evaluation process. An objective method to assess an individual’s performance-based clinical skills is needed to guide the growth of residents throughout residency.7 However, identifying tools that provide objective assessments has been a challenge for resident educators.8–10 Traditionally, the core of resident evaluation has been the observation of clinical work, formally known as workplace-based assessment. Although workplace-based assessment is important, the performance is context-specific, assessors are inconsistent, and standardization is lacking.11 Therefore, workplace-based assessment alone may not allow residency programs to effectively measure the growth and progression of their residents’ clinical skills. The Accreditation Council for Graduate Medical Education Outcome Project has modeled the mandated milestones on the Dreyfus model, describing developmental stages beginning at level 1 (novice) and progressing through level 2 (advanced beginner), level 3 (competent), level 4 (proficient, meeting the expectation for a resident ready for independent practice), and level 5 (expert, the highest obtainable level, with mastery being an aspirational goal not reached by most practicing physicians).1,3,12 A meaningful evaluation in the context of the milestones project demands a longitudinal objective evaluation process.3,4,13 Our project documents such an assessment method for evaluating junior anesthesia residents’ skills in certain milestone areas. Starting with resident performance at milestone level 1 or 2, our project ensures that junior residents have mastered some of the essential tasks expected of them based on their training year.1 Documenting the learner’s level at the beginning of training and repeating the identical event after 12 months of clinical training (not exclusively in anesthesiology) allowed the longitudinal assessment needed to document growth.

The assessment tools used in the event are certainly not perfect because we still depend on faculty observation and interpretation.11 Checklists are effective for scoring completeness but fail to capture the ability to address a clinical problem by pattern recognition rather than a less sophisticated step-by-step approach.13–15 However, several assessment tools were included in the event, such as time to complete a task, checklists to measure completeness, and modified Likert scales to assess performance quality. Although our study did not directly address the impact of multiple assessors and a combination of assessment tools on measurement validity, applying the general principle from previous studies on assessment tools, we expect that the reliability of this evaluation method is improved through the use of several assessors and assessment tools.16

The anesthesiology milestones cover a broad spectrum of anesthesia skills. Faculty consensus was used to determine which tasks a junior resident should be able to master during the first year of training. Competence-based advancement assumes that we are able to accurately measure the attainment of important competencies.8 The OSCEs were chosen to assess clinical and practical skills rather than factual knowledge.17 To increase the value of the overall assessment, multiple tasks were assessed so that the evaluation would not be too context-specific. Our choice of procedural and nonprocedural WSs demonstrated growth in resident skill level from the first to the second residency year. However, the choice of OSCEs will depend on each individual program’s design and assessment needs.

The final question for this project is how the data generated from the event can be used for resident evaluation. In addition to the obvious documentation of growth in skill levels related to the milestones, the event allows the resident to be an active participant in the learning process. Although the milestones provide an evolving roadmap for the creation of a competent anesthesiologist, the learners need to be informed about their individual skill levels and what is expected to advance to the next level.18 Clear expectations have to be set to allow the residents to monitor their own advancement.8 The residency program leadership also benefits from the data obtained from this assessment method. In addition to identifying residents in need of timely remediation, it allows the identification of residents progressing more rapidly than expected. The identification of advanced residents can help to facilitate their individual growth and create a peer resource for fellow residents.8

Although our project and the data obtained allowed our program to address our anesthesiology residency’s need for objective assessment tools for the described milestones, other programs might benefit from the described OSCE event as a path to identifying their own needs and milestone gaps and then designing appropriate OSCEs and assessment tools. Although several publications have documented the use of low- and high-fidelity simulation for multiple anesthesiology-related skills,14,19 a comprehensive approach is needed to use such skill assessments for evaluating milestone attainment and growth.

CONCLUSIONS

In summary, we demonstrated that the previously described competition-based OSCE event provides objective assessment data both for formative feedback and for documentation of growth in resident skill levels as measured by milestone acquisition. Future directions include the development of training level-specific WSs to assess residents at higher training levels and to monitor their progress throughout the entire residency.

REFERENCES

1. ACGME milestone project. Available at: http://acgme.org/acgmeweb/Portals/0/PDFs/Milestones/AnesthesiologyMilestones.pdf. Accessed September 20, 2015.
2. Epstein RM. Assessment in medical education. N Engl J Med 2007;356:387-96.
3. Carraccio CL, Benson BJ, Nixon LJ, Derstine PL. From the educational bench to the clinical bedside: translating the Dreyfus developmental model to the learning of clinical skills. Acad Med 2008;83:761-7.
4. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—rationale and benefits. N Engl J Med 2012;366:1051-6.
5. Rebel A, DiLorenzo A, Fragneto RY, Dority JS, Rose GL, Nguyen D, Hassan ZU, Schell RM. Objective assessment of anesthesiology resident skills using an innovative competition-based simulation approach. A A Case Rep 2015;5:79-87.
6. Butterworth JF IV, Mackey DC, Wasnick JD. The anesthesia machine. In: Morgan & Mikhail’s Clinical Anesthesiology. 5th ed. New York, NY: McGraw-Hill; 2013. Available at: http://accessmedicine.mhmedical.com.ezproxy.uky.edu/content.aspx?bookid=564&Sectionid=42800534. Accessed May 26, 2015.
7. Ebert TJ, Fox CA. Competency-based education in anesthesiology: history and challenges. Anesthesiology 2014;120:24-31.
8. Bordley DR, Smith LG, Wiese JG. Competency-based advancement: risky business. Am J Med 2010;123:188-91.
9. Crossley J, Humphris G, Jolly B. Assessing health professionals. Med Educ 2002;36:800-4.
10. Snadden D. Portfolios—attempting to measure the unmeasurable? Med Educ 1999;33:478-9.
11. Crossley J, Johnson G, Booth J, Wade W. Good questions, good answers: construct alignment improves the performance of workplace-based assessment scales. Med Educ 2011;45:560-9.
12. Dreyfus SE. The five-stage model of adult skill acquisition. Bull Sci Technol Soc 2004;24:177-81.
13. Cooney CM, Redett RJ III, Dorafshar AH, Zarrabi B, Lifchez SD. Integrating the NAS Milestones and handheld technology to improve residency training and assessment. J Surg Educ 2014;71:39-42.
14. McEvoy MD, Hand WR, Furse CM, Field LC, Clark CA, Moitra VK, Nietert PJ, O’Connor MF, Nunnally ME. Validity and reliability assessment of detailed scoring checklists for use during perioperative emergency simulation training. Simul Healthc 2014;9:295-303.
15. Turner JL, Dankoski ME. Objective structured clinical exams: a critical review. Fam Med 2008;40:574-8.
16. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA 2009;302:1316-26.
17. Hastie MJ, Spellman JL, Pagano PP, Hastie J, Egan BJ. Designing and implementing the objective structured clinical examination in anesthesiology. Anesthesiology 2014;120:196-203.
18. Carraccio C, Burke AE. Beyond competencies and milestones: adding meaning through context. J Grad Med Educ 2010;2:419-22.
19. Blum RH, Boulet JR, Cooper JB, Muret-Wagstaff SL; Harvard Assessment of Anesthesia Resident Performance Research Group. Simulation-based assessment to identify critical gaps in safe anesthesia resident performance. Anesthesiology 2014;120:129-41.
Copyright © 2016 International Anesthesia Research Society