The Open Mind

Beyond the “E” in OSCE

Rebel, Annette MD*; Hester, Douglas L. MD; DiLorenzo, Amy MA*; McEvoy, Matthew D. MD; Schell, Randall M. MD, MACM*

doi: 10.1213/ANE.0000000000003317

Many educators are familiar with Objective Structured Clinical Examinations (OSCEs) as an element of skill assessment during medical school,1 but the OSCE is not a common component of training or assessment for residents during graduate medical education in anesthesiology in the United States.2 The addition of OSCEs to the board certification process in anesthesiology is not a new concept. However, the OSCE format of the Applied Examination by the American Board of Anesthesiology (ABA) differs from previously implemented examination practices. The Royal College of Anaesthetists in the United Kingdom has a complex OSCE-based certification process (18 stations, 5 minutes each) covering a broad spectrum of skills, including resuscitation, technical skills, anatomy, history taking, physical examination, communication skills, anesthetic hazards, the interpretation of x-rays, and the use of high-fidelity simulation.3 The Israeli Board of Anesthesiology certification includes a 5-station OSCE portion (15 minutes each) covering trauma management, resuscitation, crisis resource management in the operating room, regional anesthesia, and mechanical ventilation.4

In the United States, trainees and educators may be more familiar with the US Medical Licensing Examination (USMLE) OSCE process. The ABA OSCE differs from the USMLE OSCE in examination duration and complexity. While the USMLE OSCE is a much longer examination (approximately 8 vs 1.5 hours) and focuses on basic physician skills (focused history taking, physical examination skills, and telephone consultation encounters), the examination content of the ABA OSCE includes demonstration of technical anesthesia skills and an emphasis on higher level behaviors (communication and professionalism).5

Figure 1. The diagram describes the potential Objective Structured Clinical Examination (OSCE) benefits for the learner, the teacher, and the program.

Although the recent inclusion of OSCEs in the Applied Examination by the ABA is likely to result in more programs providing their graduating residents with board preparation sessions for this new type of high-stakes examination, a recent survey demonstrated that only approximately 30% of anesthesiology residency training programs are currently using the OSCE as a part of their training curriculum.2 It is probable that many programs implementing OSCEs will utilize the ABA content outline5 and format to develop and provide practice experiences with the primary intention of preparing their graduating residents for the Applied Examination. We posit that there are other potential benefits of implementing OSCEs throughout the entire residency continuum beyond that of preparation for the ABA Applied Examination (Figure 1). Although not yet widely recognized, even among educators, the benefits of OSCE implementation offer increased value to learners, educators, and training programs.

THE “E” IN OSCE

The current model of development and assessment of medical competence has been classified into 4 stages of learner capability: “knows,” “knows how,” “shows how,” and “does.”6 The most basic competency level, “knows,” is usually assessed by a knowledge test such as a multiple-choice examination. A higher level of medical competency, “knows how,” can be partially addressed in an oral examination setting.6,7 Several medical specialties have provided evidence that OSCEs can be used to test the higher levels of competency and the ABA has added an OSCE to the primary certification process in an attempt to assess the applicant’s competency at a level (“shows how”) not previously achieved.7

The ABA OSCE examination format includes professionalism and communication components, including informed consent, treatment options, periprocedural complications, ethics, communication with other professionals, and practice-based learning and improvement. It also includes technical skills and interpretation, such as the identification of anatomy via ultrasound and interpretation of transesophageal echocardiogram and patient monitors.5

After examining the reasons for implementing the OSCE and the competency assessment gaps that prompted the change in the ABA certification process, one might query whether OSCEs are only beneficial as high-stakes summative examinations. We propose that OSCE-based assessment at recurring intervals throughout residency training could provide valuable information for the resident, teacher, and program in the progression toward competency. Individual residents may benefit from increased educational experiences.6 Educators may refine their educational interventions and their own teaching skills. Programs may learn more about best practices for training the next generation of anesthesiologists.

BEYOND THE “E” IN OSCE

The potential advantages of regular OSCE-based resident skill assessment during residency training go well beyond "E," that is, the high-stakes summative examination. As a robust form of experiential learning that is objective, standardized, and clinically relevant, OSCEs used for formative assessment during training have benefits for the learner, the teacher, and the program (Figure 1).

For the Learner

Skill Assessment.

Summative evaluations may be used to measure the outcome of instruction at the end of a training period. A pass/fail format of OSCEs creates a high-stakes environment for the participant. However, in addition to summative evaluation, OSCE assessment can provide important formative feedback. The difference between summative and formative assessment is based on the desired use of the evaluation.8,9 By moving beyond a simple pass/fail format, learners can receive specific feedback based on their performance. Such formative assessment would enrich the learning process for the learners and facilitate higher levels of performance.10

Figure 2. The OSCE definition can be expanded to address more learning opportunities. OSCE indicates Objective Structured Clinical Examination.

By using OSCEs in a recurring fashion during training, learners receive frequent objective assessments that delineate skill levels and reveal deficits. For example, an OSCE requiring ultrasound identification of sciatic nerve anatomy on a simulated patient provides the learner an opportunity to self-reflect on this objective assessment of demonstrated skills apart from a pass/fail environment. With mentor assistance, the resident could develop individualized learning plans (ILPs) to fill any identified skill gaps. By expanding the number and frequency of OSCEs, learners can benefit from additional objective, standardized, and clinically relevant assessments and learning opportunities (Figure 2).

Demonstration of Milestone Progression.

Intermittent OSCE-based skill assessment may provide the learner with clear evidence of skill growth throughout the training process. By anchoring OSCE content within the specific Milestones (https://www.acgme.org/Portals/0/PDFs/Milestones/AnesthesiologyMilestones.pdf), learners may better perceive their own skill acquisition. Furthermore, they may be able to chart their own courses of study based on their individual performances. One example might be airway management and progression from recognition of adequacy of ventilation (Milestone level 1) to identifying and correcting problems associated with 1-lung ventilation (Milestone level 4). The objective, standardized, and clinically relevant characteristics of OSCE-based skill assessment may be advantageous for the learner and the education process by relying on actual demonstration of proficiency to achieve specific Milestone levels.

One obvious advantage of using simulation and formative OSCEs is that patients are protected from the inexperience of a trainee who is learning new skills and encountering challenges beyond their current level of competence. Using repetitive OSCE-based assessment during the residency may provide more objective data for competency documentation and may result in patients receiving more standardized care.

Improved Learning and Retention.

The learning gains of repeated testing versus repeated study ("testing effect") have been demonstrated for knowledge acquisition and knowledge retention. Testing dramatically improves long-term memory for tested material.11 Repeated testing may also provide more favorable skills retention than repeated practice.12 Using the previous sciatic nerve block example, OSCEs can provide a blend of knowledge demonstration (identifying structures) and skills demonstration (needle positioning by ultrasound). Pairing a formative OSCE experience in a test–retest manner with purposeful and systematic practice between testing sessions is an example of deliberate practice, a powerful way to improve performance. Therefore, an OSCE ought to be considered an experience that facilitates learning and not solely a form of assessment. Adding OSCE-based assessment to a multimodal teaching and learning environment (including clinical practice, didactics, and simulation) can facilitate the learning process at multiple stages of development for both skills and knowledge.13

Feedback, Self-Reflection, and Goal Setting.

There are challenges to providing learners with sufficient feedback during clinical training.9 OSCEs offer the opportunity to provide feedback to the learner in a formative manner after direct observation of skill performance, with the goal of narrowing the gap between actual and desired performance. Learners may benefit from feedback that inherently compares them to others who have undergone the same OSCE.10 Furthermore, a primary source of learning is the result of reflective practices.14 OSCEs provide an environment for this to occur. The availability of objective data and comparison of individual performance with peers at a similar level of training can help determine program- and specialty-specific performance expectations.15 Moreover, embedding and encouraging self-reflection in the learning environment will promote life-long learning, and planned times for developing this skill before and after OSCEs might provide significant benefit for trainees.16

More specifically, OSCEs can allow the use of ILPs based on each learner.17 With a specific study plan for each resident, benefits for education can be maximized with targeted educational activities. An ILP using the mnemonic “Important, Specific, Measurable, Accountability, Realistic, Timeline” has been described as a successful approach.18 For example, instead of offering a central venous catheter workshop where all residents receive identical didactic and skills training, OSCE performance creates the chance for individualized educational intervention for catheter placement; 1 resident may need more help with ultrasound skills while another may need more time with modified Seldinger technique and suturing. Areas of needed remediation can be discovered. An OSCE assessment can provide measurable data for the ILP, generate realistic goals, and create an environment of accountability in a defined time frame. Over the course of a residency, trainee progression through Milestones can both be demonstrated and facilitated.

OSCE Practice.

One expected benefit of using OSCEs during training is the opportunity for deliberate practice with formative assessment before high-stakes summative assessment. A recent survey documented that many anesthesiology residency programs are planning to offer OSCEs for their senior residents with the intent to prepare them for primary board certification.2 However, OSCEs such as those used by the ABA require complex evaluation of validity and reliability in both the creation of the testing environment and in the assessment tools used.19 Such a detailed format may not be required to serve as a practice opportunity. OSCEs used for formative assessment do not require the same psychometric evaluation. As such, many programs could provide OSCEs that are high-quality, that are appropriate for formative assessment, and that allow for deliberate practice without needing to have them undergo testing at the same level as that required for summative assessments.

For the Educator

Assessing Competence.

Educators can use OSCE performance data to assess learner competence and identify skill gaps. This stems naturally from the unique value OSCEs offer residents. The difference here is that in addition to assisting an individual resident based on OSCE performance progression, program directors and educators can collect population-level data to inform best educational practices for groups of residents. That is, the performance of entire groups of residents on OSCEs (eg, by training level or entire program) can serve as a feedback mechanism for educators on how effectively they are teaching certain aspects of knowledge, skill, and practice throughout residency and whether certain pedagogical approaches are more effective than others.

Direct clinical supervision and faculty intervention may mask deficits in learner clinical skills. The random nature of caseloads and call duties may allow inequalities in experience that translate into inequalities in education. Furthermore, personality differences between faculty and residents have been shown to affect subjective evaluations.20 An OSCE curriculum for a residency program would allow assessment and demonstration of necessary skills. For example, an OSCE might require an advanced resident to evaluate a simulated patient with a periprocedural complication, conduct a focused evaluation, formulate an action plan, and discuss their plan with the patient. Depending on their performance, residents might demonstrate that their prior training is sufficient for this task. If all CA-3s perform at a Milestone level of 3 or 4, while most CA-1s perform at a level of 2, a program director could consider how faculty are educating residents in this skill; however, the program director might also be satisfied that the global experiences between these 2 years remain sufficient for achieving appropriate progression over time for this Milestone. In fact, as skills are expected to grow over time, tailoring such OSCE assessments to expected Milestone progression would be an objective improvement in learner and educator assessment in many programs.

Protecting Patients.

OSCEs allow the teacher to adjust the level of supervision to the level of training and expected skill without the ethical dilemma of needing to avoid patient harm. Thus, the high-stakes nature of some anesthetic situations can be taught and tested in a risk-free environment. Furthermore, these experiences can be repeated without any risk to a patient, and thus without the ethical dilemma posed in the clinical setting, where such "practice" brings no benefit to the patient.21

Improving Teaching.

When an educator administers and evaluates an OSCE over time, he or she will also be learning. Although the focus at first seems primarily on the learner, an educator benefits through the evaluation of teaching efficacy and through mastery of the given topic.22 Based on this feedback, the teacher can modify not only the OSCE itself but also his or her depth of knowledge and efficacy of interaction with the learners.

For the Program

Longitudinal Skill Assessment and Milestones.

The Anesthesiology Milestones Project defined 25 subcompetencies with corresponding Milestones grouped into the 6 general competencies. When OSCE experiences are designed with the Milestones as a guide, learner performance data from OSCE assessment can help the Clinical Competency Committee assign appropriate Milestone levels that are reported to the Accreditation Council for Graduate Medical Education (ACGME) and then shared with the ABA. Because the ACGME requires programs to report Milestone performance levels biannually for each learner, programs may find it difficult to obtain sufficient evaluative data for every subcompetency. OSCE-based assessment can supplement clinical evaluations and provide an opportunity to address accreditation-required experiences that may be difficult for a program to offer as predictable encounters in the clinical environment for all residents. Furthermore, due to the standardized nature of the OSCE format, repetitive assessment may be a more trustworthy approach for objectively measuring growth in knowledge, skills, and practice.

For example, it may be difficult to allow residents to demonstrate the ability to disclose an adverse event (Milestone Interpersonal and Communication Skills 1) or to manage ethical dilemmas (Milestone Professionalism 2) with independence (level 4). When considering the fifth subcompetency of the "patient care" general competency, program directors may not have observable data about level 4 performance ("identifies and manages clinical crises appropriately with conditional independence; assumes increasing responsibility for leadership of crisis response team"). An OSCE with a carefully scripted crisis would allow a resident to show graded progression from level 1 to level 4, the goal of the Next Accreditation System with the Milestones. By nature, these kinds of events occur randomly in the clinical setting, and the resident will often be influenced by the faculty member who is present during such encounters. Additionally, programs may choose to design OSCEs to address program-specific Milestone gaps, thereby ensuring that the residents will have sufficient educational opportunities to develop and demonstrate competence in these areas.

Curriculum Evaluation and Needs Assessment.

An OSCE can be designed to evaluate a specific curriculum to determine if its educational goals were achieved.23,24 For example, the ABA OSCE Content Outline includes the application of ultrasonography for identifying relevant anatomy when performing transversus abdominis plane block. The collective performance of a class of residents may demonstrate adequacy of a regional anesthesia curriculum for this procedure. Conversely, if curriculum deficiencies are identified, interventions can be designed to address the deficit by adding clinical exposure or creating a training workshop. Using this aspect of OSCEs can aid in repetitive programmatic needs assessment to identify “gaps” between the current and desired conditions, a best practice when planning future educational offerings. In fact, this may encourage educators and programs to uncover and address “unperceived gaps” that formerly would have been overlooked. For example, an OSCE was recently used to evaluate point-of-care ultrasound skills of residents in 1 training program. The OSCE performance of the residents demonstrated a need to implement a comprehensive curriculum in point-of-care ultrasound skills.23 Subsequent measurements obtained by using the same OSCE over time may allow a program to document that learning gaps have been closed.24

FUTURE DIRECTIONS

Based on a recent survey, 89% of anesthesiology residency program directors agreed that it is important to practice OSCEs in preparation for the ABA Applied Certification Examination. However, only 31% were currently providing mock OSCE experiences. Of those not providing mock OSCE experiences, 75% reported plans to start one.2 Most anesthesiology training programs are expected to move beyond the barriers of financial resources, faculty expertise, and time to implement OSCEs to prepare their residents for the ABA certification examination. Some of the barriers to OSCE development might be overcome by establishing a national working group formed to create and share OSCEs that include valid and reliable assessment tools. Development of scoring rubrics, and their consistent implementation across raters, remains a challenge for those who wish to move beyond using the Milestones as a key scoring guide. Further research and development should explore the specific content and subcompetencies best addressed by OSCEs. In addition, the question of how to appropriately integrate OSCEs as a teaching and assessment method needs to be answered. OSCEs may provide a satisfactory way to meet the elusive goal of serial competency assessment throughout resident training. Moving beyond the mere examination understanding of "E" in OSCE could create multifaceted and significant educational benefits for learners, teachers, programs, and the profession.

DISCLOSURES

Name: Annette Rebel, MD.

Contribution: This author helped with the conception and design of the article and helped write the article.

Name: Douglas L. Hester, MD.

Contribution: This author helped with the conception and design of the article and helped write the article.

Name: Amy DiLorenzo, MA.

Contribution: This author helped with the conception and design of the article and helped write the article.

Name: Matthew D. McEvoy, MD.

Contribution: This author helped with the conception and design of the article and helped write the article.

Name: Randall M. Schell, MD, MACM.

Contribution: This author helped with the conception and design of the article and helped write the article.

This manuscript was handled by: Edward C. Nemergut, MD.

REFERENCES

1. Wallace J, Rao R, Haslam R. Simulated patients and objective structured clinical examinations: review of their use in medical education. Adv Psychiatr Treat. 2002;8:342–350.
2. Isaak RS, Chen F, Arora H, Martinelli SM, Zvara DA, Stiegler MP. A descriptive survey of anesthesiology residency simulation programs: how are programs preparing residents for the new American Board of Anesthesiology APPLIED Certification Examination? Anesth Analg. 2017;125:991–998.
3. Royal College of Anaesthetists. Structure and marking. Available at: https://www.rcoa.ac.uk/primary-frca-oscesoe/structure-and-marking. Accessed January 2, 2018.
4. Berkenstadt H, Ziv A, Gafni N, Sidi A. Incorporating simulation-based objective structured clinical examination into the Israeli National Board Examination in Anesthesiology. Anesth Analg. 2006;102:853–858.
5. The American Board of Anesthesiology. Applied examination: objective structured clinical examination. Available at: http://www.theaba.org/PDFs/APPLIED-Exam/APPLIED-OSCE-ContentOutline. Accessed November 20, 2017.
6. Ebert TJ, Fox CA. Competency-based education in anesthesiology: history and challenges. Anesthesiology. 2014;120:24–31.
7. Rathmell JP, Lien C, Harman A. Objective structured clinical examination and board certification in anesthesiology. Anesthesiology. 2014;120:4–6.
8. Kibble JD. Best practices in summative assessment. Adv Physiol Educ. 2017;41:110–119.
9. Burgess A, Mellis C. Feedback and assessment for clinical placements: achieving the right balance. Adv Med Educ Pract. 2015;6:373–381.
10. Boulet JR, Murray D. Review article: assessment in anesthesiology education. Can J Anaesth. 2012;59:182–192.
11. Rowland CA. The effect of testing versus restudy on retention: a meta-analytic review of the testing effect. Psychol Bull. 2014;140:1432–1463.
12. Sennhenn-Kirchner S, Goerlich Y, Kirchner B, et al. The effect of repeated testing vs repeated practice on skills learning in undergraduate dental education. Eur J Dent Educ. 2018;22:e42–e47.
13. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med. 2013;88:1178–1186.
14. Sternlieb JL. A guide to introducing and integrating reflective practices in medical education. Int J Psychiatry Med. 2015;49:95–105.
15. Rebel A, DiLorenzo AN, Fragneto RY, et al. A competitive objective structured clinical examination event to generate an objective assessment of anesthesiology resident skills development. A A Case Rep. 2016;6:313–319.
16. Azer SA, Guerrero AP, Walsh A. Enhancing learning approaches: practical tips for students and teachers. Med Teach. 2013;35:433–443.
17. Weimer M. Learner-Centered Teaching: Five Key Changes to Practice. 2002.San Francisco, CA: Jossey-Bass.
18. Li ST, Paterniti DA, Co JP, West DC. Successful self-directed lifelong learning in medicine: a conceptual model derived from qualitative analysis of a national survey of pediatric residents. Acad Med. 2010;85:1229–1236.
19. Hastie MJ, Spellman JL, Pagano PP, Hastie J, Egan BJ. Designing and implementing the objective structured clinical examination in anesthesiology. Anesthesiology. 2014;120:196–203.
20. Schell RM, DiLorenzo AN, Li HF, Fragneto RY, Bowe EA, Hessel EA 2nd. Anesthesiology resident personality type correlates with faculty assessment of resident performance. J Clin Anesth. 2012;24:566–572.
21. Ivascu NS, Meltzer EC. Teacher and trustee: examining the ethics of experiential learning in transesophageal echocardiography education. Anesth Analg. 2018;126:1077–1080.
22. Stibane T, Sitter H, Neuhof D, et al. Feedback promotes learning success! Which kind of feedback for the faculty is given by an interdisciplinary OSCE with focus on decision-making? GMS J Med Educ. 2016;33:Doc53.
23. Rebel A, Srour H, DiLorenzo A, et al. Ultrasound skill and application of knowledge assessment using an innovative OSCE competition-based simulation approach. J Educ Perioper Med. 2016;18:E404.
24. Dagnone JD, Hall AK, Sebok-Syer S, et al. Competency-based simulation assessment of resuscitation skills in emergency medicine postgraduate trainees - a Canadian multi-centred study. Can Med Educ J. 2016;7:e57–e67.
Copyright © 2018 International Anesthesia Research Society