Training and evaluation in anesthesia must reflect the level of professional competence expected from trainees, including the communication, knowledge, technical skills, clinical reasoning, and values used in daily practice for the benefit of the individual and community being served.1 A well-recognized model for the development and assessment of medical competence defines 4 stages of progressive capability, each of which builds on the prior stage.2 These 4 levels, (1) knows, (2) knows how, (3) shows how, and (4) does, can be assessed accordingly. Historically, medical examinations and board licensing have relied on written and oral assessment tools that could evaluate only the “knows” and “knows how” levels of capability. With the introduction of the Objective Structured Clinical Examination (OSCE), trainees were tested at a higher level of competency.3
Simulation-based training and assessment are increasingly being used in the field of anesthesia to assess the “shows how” level of competence.4 Advancement in this field has been the result of improved simulation technologies, enhancements in the field of performance assessment, and greater acceptance by clinicians and educators. However, in some areas of anesthesia practice, such as the assessment of regional anesthesia procedural skills, simulation-based assessment remains either controversial or poorly implemented. Given the increasing use of regional anesthesia in clinical practice, the need for effective assessment tools is evident. This article describes 7 years of experience using the OSCE for the assessment of regional anesthesia skills during the Israeli National Board Examination in Anesthesiology. Regional anesthesia is 1 of 5 stations in this examination; the other stations include trauma management, resuscitation, intensive care medicine, and critical events in the operating room.
Process of Examination Development
The development of simulation-based examination stations was a collaborative effort led by the Israel Board Examination in Anesthesiology with the assistance of simulation experts from the Israel Center for Medical Simulation and experts in psychometrics and performance assessment from Israel's National Institute for Testing and Evaluations.5,6 The content of the examination was defined and developed according to the steps and criteria described by Newble.7 First, a range of regional anesthesia techniques considered core knowledge for residents completing their training was compiled on the basis of expert opinion. The second step involved the definition of tasks for each of the clinical conditions. The tasks were selected on the basis of >80% consensus among 10 experts using a variation of the Delphi technique.8 In the third step of the process, the tasks were incorporated into hands-on simulation-based examination stations in the OSCE format. The scenarios were first tested on junior attending anesthesiologists before their implementation in the actual examination. Development is an ongoing process, and improvements to the scenarios are made on the basis of examiners' feedback after each examination period.
Types of Regional Anesthesia Procedures
The chosen scenarios included interscalene blocks, axillary blocks, sciatic blocks, regional anesthesia for awake fiberoptic intubation, popliteal blocks, ankle blocks, cervical blocks, and epidural anesthesia. Ultrasound-guided techniques were not assessed because of the variability in trainees' access to such equipment during their training.
The scenarios were integrated in 20-minute examination stations conducted in a simulated clinic room located at the Israel Center for Medical Simulation. All examinees were familiar with the simulation center as a result of a formal introduction session provided 2 weeks before the examination.
A role-playing actor simulated the patient and the examinee was briefed on the scenario with instructions to simulate performance of a specified block on the patient. The scenarios were conducted in a stepwise manner with the examinee being assessed on:
- Patient positioning
- Relevant anatomy and topographic landmarks
- Needle positioning including puncture site, needle direction, and expected insertion depth
- Description of the use of a nerve stimulator in the identification of the correct site for injection of local anesthetic
- Planned type and amount of local anesthetic injected
- Expected distribution of anesthesia and duration of block
- Use of the block for postoperative pain relief
- Complications: these were role-played by the simulated patient, who had been briefed on them in advance. Complications included, but were not limited to, pain during local anesthetic injection, shortness of breath, and convulsions.
Sterile technique was not expected because of time constraints and the mock nature of the scenario. One examination scenario is presented in the Appendix.
Scoring and Assessment
Each examinee was assessed by 2 examiners from other anesthesia training programs, thereby avoiding prior knowledge of the candidate and minimizing the potential for bias. Examiners were briefed on the examination process, and the specific scenario was e-mailed to each examiner 2 days before the examination date. The examiners assessed performance both by direct observation of the execution of the block and by asking for explanations of relevant dilemmas arising during the scenario. Scoring was conducted via a checklist that included 2 to 3 critical items, usually pertaining to the management of life-threatening complications; incorrect performance or omission of these critical items led to failure of this part of the examination. In addition, each examiner gave an overall score for the candidate's performance, rated on a 4-point scale (1 = unsatisfactory, 2 = acceptable, 3 = good, and 4 = very good). The pass criteria for each scenario were a minimum of 70% on the checklist items, including a correct response to all critical items, and a holistic score of 2 or more from each examiner.
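The pass rule described above can be sketched in code. This is a minimal illustration of the decision logic only; the function, item names, and data structures are hypothetical assumptions, not the board's actual scoring software (scoring was performed by examiners on paper checklists).

```python
def passes_station(checklist, critical_items, holistic_scores, threshold=0.70):
    """Apply the described pass criteria for one scenario.

    checklist       -- dict mapping item name -> True if performed correctly
    critical_items  -- names of the 2-3 critical items; all must be correct
    holistic_scores -- one overall score (1-4) per examiner; each must be >= 2
    threshold       -- minimum fraction of checklist items performed correctly
    """
    fraction_correct = sum(checklist.values()) / len(checklist)
    all_critical_ok = all(checklist[item] for item in critical_items)
    examiners_satisfied = all(score >= 2 for score in holistic_scores)
    return fraction_correct >= threshold and all_critical_ok and examiners_satisfied


# Illustrative checklist: 5 of 6 items correct (~83%), both (hypothetical)
# critical items handled, examiners rate the candidate 3 and 2.
checklist = {
    "positioning": True,
    "landmarks": True,
    "needle_direction": True,
    "drug_choice_and_dose": False,
    "seizure_management": True,       # critical item
    "toxicity_recognition": True,     # critical item
}
critical = ["seizure_management", "toxicity_recognition"]
```

Note that the rule is conjunctive: a candidate above the 70% threshold still fails if a single critical item is missed or if either examiner's holistic score is below 2.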
The described OSCE simulation-based regional anesthesia assessments have been conducted 14 times over the last 7 years. During this period, 308 senior residents in the final year of their training have been examined (mean of 22 participants per examination period). Presentation of the descriptive statistics for each of the scenarios used in the examination is beyond the scope of this report. The total pass rate was 83% (257 of 308), ranging from 73% to 91% across examination periods. The interrater correlations for the total, critical, and global scores were 0.84, 0.88, and 0.75, respectively.
There is limited use of OSCE-type simulated scenarios in the area of national accreditation,9,10 and to our knowledge we are the first to describe their use in the evaluation of regional anesthesia. There is increasing evidence that simulation-based assessment of residents captures additional information about examinees; for example, examinees who do well in oral and written examinations do not necessarily do well in simulation evaluations, and vice versa.11 Unfortunately, the setup of this OSCE does not allow us to gather information on the construct validity of the scenarios or to compare the results with written or oral examinations. However, the ongoing process of scenario development we describe should ensure face and content validity, and during the last 7 years gradual improvements were incorporated into the scenarios. Based on examiners' feedback, scenario realism was increased by involving the actor in reporting signs and symptoms related to the regional anesthesia or its complications; definitions of acceptable performance were revised on the basis of disagreements between examiners; and the difficulty of scenarios was adjusted on the basis of interscenario reliability measures. The keys to success we identified were a structured process of examination development and evolution; examiners' involvement in defining the assessment conditions, tasks, and scenarios; and cooperation with experts in psychometric evaluation. In summary, our experience is that until realistic regional anesthesia simulators become available that allow examinees to puncture skin, receive tactile feedback from tissue interfaces, and accurately locate nerves via electrical stimulation or ultrasound guidance,12,13 the examination methodology presented here may serve as an assessment tool.
Such methodology helps bridge the gap by allowing trainees to demonstrate not only that they “know how” but also that they can “show how” to perform regional anesthesia, and as such it brings the examination closer to clinical reality.
1. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA 2002;287:226–35
2. Miller GE. The assessment of clinical skills/competence/ performance. Acad Med 1990;65:S63–73
3. Hodges B. OSCE! Variations on a theme by Harden. Med Educ 2003;37:1134–40
4. Boulet J, Murray D. Simulation-based assessment in anesthesiology: requirements for practical implementation. Anesthesiology 2010;112:1041–52
5. Berkenstadt H, Ziv A, Gafni N, Sidi A. Incorporating simulation-based objective structured clinical examination into the Israeli national board examination in Anesthesiology. Anesth Analg 2006;102:853–8
6. Berkenstadt H, Ziv A, Gafni N, Sidi A. The validation process of incorporating simulation-based accreditation into the anesthesiology Israeli national board exams. Isr Med Assoc J 2006;8:728–33
7. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ 2004;38:199–204
8. Schwid HA, Rooke GA, Carline J, Steadman RH, Murray WB, Olympio M, Tarver S, Steckner K, Wetstone S. Evaluation of anesthesia residents using mannequin-based simulation: a multiinstitutional study. Anesthesiology 2002;97:1434–44
9. Bergqvist D, Liapis C, Wolfe JN. The developing European Board Vascular Examination. Eur J Vasc Endovasc Surg 2004;27:339–40
10. Taghva A, Mir-Sepassi G, Zarghami M. A brief report on the implementation of an objective structured clinical examination (OSCE) in the 2006 Iranian board of psychiatry examination. Iran J Psychiatry Behav Sci 2007;1:39–40
11. Savoldelli GL, Naik VN, Joo HS, Houston PL, Graham M, Yee B, Hamstra SJ. Evaluation of patient simulator performance as an adjunct to the oral examination for senior anesthesia residents. Anesthesiology 2006;104:475–81
12. Hayter MA, Friedman Z, Bould MD, Hanlon JG, Katznelson R, Borges B, Naik VN. Validation of the Imperial College Surgical Assessment Device (ICSAD) for labour epidural placement. Can J Anaesth 2009;56:419–26
13. Grottke O, Ntouba A, Ullrich S, Liao W, Fried E, Prescher A, Deserno TM, Kuhlen T, Rossaint R. Virtual reality-based simulator for training in regional anaesthesia. Br J Anaesth 2009;103:594–600
APPENDIX: AXILLARY BLOCK EXAMINATION SCENARIO (the scenario presented is an extended version and not all items are used in each examination period)
It is 08:30 and you have been called to the postanesthesia recovery unit. A 22-year-old man has undergone reimplantation of 2 digits on the right hand after traumatic amputation and you have been requested to perform an axillary block for postoperative pain relief. The patient has no relevant medical history and has normal coagulation studies.
You are expected to demonstrate the performance of this block including patient positioning, identification of relevant anatomy, point of needle insertion, direction of needle advancement, and volume of local anesthesia to be used.