A Survey of Simulation Utilization in Anesthesiology Residency Programs in the United States

Rochlen, Lauryn R. MD; Housey, Michelle MPH; Gannon, Ian MD; Tait, Alan R. PhD; Naughton, Norah MD; Kheterpal, Sachin MD, MBA

doi: 10.1213/XAA.0000000000000304
Case Reports: Education

Given the evolution of competency-based education and evidence supporting the benefits of incorporating simulation into anesthesiology residency training, simulation will likely play an important role in the training and assessment of anesthesiology residents. Currently, few data are available regarding the status of simulation-based curricula across US residency programs. In this study, we assessed simulation-based training and assessment in US anesthesiology programs using a survey designed to elicit information regarding the type, frequency, and content of the simulation courses offered at the 132 Accreditation Council for Graduate Medical Education-accredited anesthesiology training programs. The response rate for the survey was 66%. Although most of the responding programs offered simulation-based courses for interns and residents and during CA-1 orientation, the curriculum varied greatly among programs. Approximately 40% of responding programs use simulation for resident assessment and remediation. The majority of responding programs favored standardized simulation-based training as part of residency training (89%), and the most commonly perceived obstacles to doing so were time, money, and human resources. The results from this survey highlight the large variations in simulation-based training and assessment among training programs. They also confirm that many program directors feel that standardizing some components of simulation-based education and assessment would be beneficial. Given the positive impact simulation has on skill retention and operating room preparedness, it may be worthwhile to consider developing a standard curriculum.

From the Department of Anesthesiology, University of Michigan, Ann Arbor, Michigan.

Accepted for publication November 24, 2015.

Funding: Department of Anesthesiology, University of Michigan, Ann Arbor, Michigan.

The authors declare no conflicts of interest.

Address correspondence to Lauryn R. Rochlen, MD, Department of Anesthesiology, University of Michigan, 1500 E. Medical Center Dr., 1H247, University Hospital, SPC 5048, Ann Arbor, MI 48103. Address e-mail to rochlenl@med.umich.edu.

Simulation is rapidly becoming an important component of residency training.1,2 Currently, anesthesiology residents are required to participate in at least 1 simulated clinical experience each year.1,3 Furthermore, objective structured clinical examinations are being added to the APPLIED board examinations, and simulation-based education is now a recommended component of the revised American Board of Anesthesiology Maintenance of Certification program.

With the transformation to a competency-based education model incorporating the Next Accreditation System and the Accreditation Council for Graduate Medical Education (ACGME) Milestones Project, anesthesiology residency programs were charged with creating their own approach to competency assessment.4 Although each program currently has the flexibility to define its own curricular needs and to develop assessment methods that fit within its resources, there is no standardized approach ensuring the full breadth of experience.

Simulation-based instruction and assessment may prove to be the ideal conduit to a more standardized approach for instruction and assessment. The results of a meta-analysis of simulation-based instruction during anesthesiology residency published by Lorello et al.5 in 2014 confirm the benefits of simulation-based instruction over no intervention and nonsimulation-based instruction (i.e., small group discussion, assigned readings, instructional videos). This study concluded that the anesthesiology education community needs to pursue further clarification concerning the optimal design features for simulation curricula.

Surveys of simulation use in general surgery residency programs in 2004 and again in 2006 shared similar conclusions.6,7 In response, the American College of Surgeons and the Society of American Gastrointestinal and Endoscopic Surgeons developed the Fundamentals of Laparoscopic Surgery (FLS) to provide a standardized national assessment of laparoscopic skills. This educational program includes a simulation-based technical skills assessment with standard, validated measures using trained evaluators. Successful completion of the FLS course is now a requirement for primary certification in general surgery by the American Board of Surgery.1,8–10

Although there are national surveys of generalized simulation availability and use in US academic health centers, little data are currently available on the structure and function of simulation curricula among US anesthesiology residency programs. Therefore, this study was designed to assess the current state of simulation-based training and assessment in US anesthesiology residency programs.

METHODS

Institutional review board (IRB) exemption was approved by the University of Michigan Medical School IRB (Ann Arbor, MI). This survey was developed by anesthesiology faculty without input from the ACGME, Anesthesiology Resident Review Committee, American Society of Anesthesiologists, or other organizations.

Survey Development

A 44-question survey was developed based on previous surveys of education in anesthesiology and surgery residency programs.6 Survey items were designed to elicit information regarding the type, frequency, and content of the simulation courses offered at each institution. Survey categories also evaluated the use of simulation for remediation and assessment, perceived obstacles to running simulation courses, faculty involvement, and simulation center location and resources. Questions were predominantly closed ended, with the option to select multiple answers and enter free text if desired (Appendix 1).

Survey items were reviewed by 3 anesthesiology faculty experienced with the simulation-based training program at the University of Michigan. Discussion was held among this group, and consensus was achieved regarding selection of items and response options, survey length, and assessment of content validity. The survey was further analyzed for face validity by additional clinical and research faculty not involved with simulation. Finalized survey content was uploaded to online survey software (Qualtrics, Provo, UT).

Survey Distribution

An e-mail cover letter was sent to all 132 ACGME-accredited anesthesiology residency programs in the United States on August 1, 2014. The website for each program was reviewed to identify a simulation program director or anesthesia faculty member with primary responsibility for simulation-based education within the program. If simulation faculty leads could be identified, the e-mail was directed to them; otherwise, it was directed to the residency program director or residency program coordinator listed on the ACGME website as of June 1, 2014. Only 1 e-mail was sent per program to prevent duplicate responses.

The e-mail contained a link to the online survey, instructions on how to complete the survey, IRB exemption notification, and contact information to use in the event of questions or concerns. Those receiving the e-mail were given the option of forwarding the survey on to someone within their department who they felt was better suited to complete it. No program identifiers were included in the survey, and, therefore, responses remained anonymous. Follow-up e-mails were sent at 1- and 2-month intervals to improve compliance. The survey was available online for 100 days and closed on November 9, 2014.

Statistical Analysis

Data for each survey response were downloaded from Qualtrics directly into SPSS version 21.0 (SPSS Inc., Chicago, IL) for analysis. The full survey tool is available in Appendix 1. Survey responses were grouped into themes for analysis: course structure, assessment/remediation, and faculty. Partial responses were included in the analysis; variations in sample sizes were due to missing values for individual questions. Quantitative data, including characteristics of the educational programs, were summarized using descriptive statistics (frequencies and percentages). Differences based on the size of the residency program were also examined: programs with 2 to 7 residents per class were considered “small,” programs with 8 to 13 residents “medium,” and programs with 14 or more residents “large.” Secondary exploratory analysis included qualitative analysis of open-ended questions and free-text responses.
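The size grouping and the varying-denominator frequency summaries described above can be sketched in a few lines of Python. This is an illustrative reconstruction only; the function names and sample data are hypothetical and not taken from the study's analysis code, which was performed in SPSS.

```python
from collections import Counter

def size_category(residents_per_class):
    """Bucket a program by residents per class, mirroring the
    small/medium/large cutoffs described in the Methods."""
    if residents_per_class <= 7:
        return "small"
    if residents_per_class <= 13:
        return "medium"
    return "large"

def summarize(responses):
    """Return (frequency, percentage) for each answer, skipping missing
    values so that denominators vary by question, as in the paper."""
    answered = [r for r in responses if r is not None]
    counts = Counter(answered)
    n = len(answered)
    return {answer: (count, round(100 * count / n))
            for answer, count in counts.items()}

# Hypothetical responses to one yes/no item; None = question skipped.
responses = ["Yes", "Yes", "No", None, "Yes"]
print(summarize(responses))   # percentages use the 4 answered responses
print(size_category(10))      # falls in the 8-13 "medium" bucket
```

Note that a partial response contributes to the denominator only for the questions it answered, which is why the reported percentages in the Results carry different denominators.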

RESULTS

Sixty-six percent (87 of 132) of programs responded. Not all programs responded to every question, however, resulting in varying denominators for the percentages reported throughout the Results section. Table 1 illustrates the characteristics of the responding programs. As shown, most programs responding to the survey reported having a designated faculty member as simulation director (83% [72 of 80]).

Table 1.

Course Structure

Individuals from 55% (42 of 77) of programs reported that they provided simulation courses during internship, 83% (67 of 81) during CA-1 orientation, and 96% (81 of 84) as part of residency courses throughout residency. Fifty-one percent (38 of 74) of programs provided simulation courses in all 3 categories. Of those programs providing simulation courses to interns, a plurality (46% [19 of 41]) provide between 3 and 7 courses per year, whereas 39% (16 of 41) provide 1 to 2 courses and 15% (6 of 41) provide >7 courses per year. Among programs with simulation courses for anesthesiology residents, 61% (47 of 77) offer 3 to 7 courses, 21% (16 of 77) offer 1 to 2 courses, and 18% (14 of 77) offer >7 courses per year. As Figure 1 indicates, programs varied with regard to the structure and content of simulation courses. Table 2 describes the rotations during which simulation opportunities are provided.

Table 2.

Figure 1.

Results showed that half of the surveyed programs involve residents in curriculum development, course facilitation, and debriefing. However, residents were less likely to be involved with simulator programming, interdisciplinary simulation, and independent learning (<40%).

As shown in Figure 2, programs rated time as the most significant obstacle to instituting and maintaining simulation courses. Similar obstacles were reported regardless of program size, although small and medium programs were more likely than large programs to report money and simulator availability as the greatest obstacles.

Figure 2.

Figure 3.

The majority of responding programs (89% [64 of 72]) agreed that standardized components should be incorporated into residency training. Figure 3 shows the components that programs were most interested in standardizing. Small programs were most interested in standardized perioperative clinical scenarios, whereas medium programs favored standardization of crisis resource management objectives. Large programs also considered crisis resource management objectives as an important component to standardize as well as assessments.

Assessment/Remediation

Forty percent (30 of 75) of programs reported using simulation for assessment of resident performance when requested by the clinical competency committee (60% [18 of 30]) and/or before transition to independent practice (43% [13 of 30]).

For Milestones assessment, 79% (45 of 57) of programs reported that they plan to or are currently using simulation. Open-ended responses indicate that many of these program directors are unsure how to implement this or are still working out the details; a few commented that they were developing scenarios focused on communication and professional behavior. Programs already incorporating Milestones evaluations into simulation use scenarios focused on communication and professional behavior, with checklists embedded into the evaluation. One program preferred to keep the simulation center a safe environment associated with learning and did not use simulation for Milestones assessment.

Forty-seven percent (35 of 74) reported using simulation for remediation of marginally performing residents. Remediation was most often initiated at the request of the clinical competency committee (78% [25 of 32]), but a surprisingly high proportion of programs have residents who self-refer to simulation to address their training needs (50% [16 of 32]).

Faculty

Half of programs (50% [35 of 70]) reported having between 3 and 7 faculty involved in the simulation curriculum. Only 48% (27 of 56) indicated that their faculty received any form of compensation for their involvement. Compensation, when provided, was most often monetary (67% [18 of 27]) or in the form of nonclinical time (56% [15 of 27]). Sixty-five percent (46 of 71) of programs reported having faculty who had received training in simulation-based instruction, of which 73% (33 of 45) stated that 1 or more of their faculty had attended an instructional course.

DISCUSSION

The goal of this survey was to assess the current state of simulation-based curricula in US anesthesiology residency programs. The results highlight significant variations in the use of simulation for training and assessment and in the resources devoted to simulation; however, many of the challenges are common among programs. The results also support the hypothesis that many programs believe standardization of some components of simulation-based education and assessment would be beneficial. The high response rate of 66% achieved during this project is a testament to the relevance of simulation to anesthesiology residency programs.11

In 2011, the Anesthesiology Resident Review Committee introduced a required simulated clinical experience into the core program requirements. Although they did not provide any standard criteria for this experience, they did encourage programs to include surgery and nursing in the simulation and recommended meaningful debriefing. Although simulation is used in all programs that responded to the survey, there is significant variation in how simulation curricula are used and implemented.

A survey of general surgery residency programs demonstrating wide variation in curricula eventually resulted in the creation of the mandatory FLS program.6,7,10 These surveys found that the primary obstacles for instituting a standardized curriculum were equipment, staffing, and variation in training practices. In response, the Society of American Gastrointestinal and Endoscopic Surgeons developed the FLS educational and assessment program, which includes web-based instruction and standardized assessment.8,9 Successful completion of the FLS course is now a requirement for primary certification in general surgery by the American Board of Surgery.1 Follow-up studies of general surgery residents have revealed greater 2-year skill retention, higher primary certification examination pass rates, and improved performance in the operating room.12,13 Recent research in the surgery literature has assessed resident readiness to transition to independent practice using simulation-based assessment.14

Although there is much to learn from the development of the FLS program and its subsequent impact on training and practice, the scope of anesthesia practice involves many aspects in addition to mastery of technical skills. For example, instruction on anesthesia nontechnical skills, crisis resource management, and group debriefing are important educational objectives that must be considered.15–18 This survey shows that programs vary significantly with regard to the structure of course content and types of simulation courses offered as well as simulation faculty training and support. This is not surprising given that although there is solid evidence outlining the advantages of incorporating simulation into residency training, there is still much to learn regarding the objectives, content, and administration of these courses.3,5,19–21 Although we feel that determining which components of simulation-based instruction to standardize would be difficult, an initial step could be to require standardized faculty training on debriefing and teaching nontechnical skills.

It has become clear that supporting a simulation-based curriculum requires extensive resources. Results from this survey show that, not surprisingly, the most common obstacles to instituting a viable simulation program perceived by program directors, independent of program size, include time, human resources, and financial support.1 In addition to identifying the necessary resources, understanding and sharing best practices to optimize limited resources will also be key to sustaining these efforts over time.

This survey did not address details regarding programs’ use of simulation for Milestones assessment, and further studies are underway to evaluate this topic in more detail. What this survey does reveal is that some centers have begun incorporating simulation into Milestones assessment and remediation, and others recognize the potential difficulties in implementing such a program. One center stated that it prefers to keep the simulation center a safe place for learning and will not use simulation for assessment. Given the large variation in practice and the lack of a standard assessment paradigm, it may prove difficult to generalize among programs.

Close to 90% of the programs completing this survey reported that they would be in favor of standardizing at least some components of simulation-based education. With the large amount of simulation research available, a guide for educators and program directors to develop an efficient and high-impact simulation program should be an obtainable goal. Working from a standard curriculum can also provide for higher standards for faculty training and less variability among faculty instructors.22 It could be hypothesized that if standard scenarios, programs, and debriefing points were available, faculty members’ time could be used more efficiently and effectively.22,23

The limitations of this survey are those inherent in surveys in general. The response rate of 66% was encouraging given that this was an online survey of busy practitioners; however, nonresponse bias is possible because programs with active simulation programs may have been more likely to respond. In addition to the positions listed in Table 1, survey respondents included directors of education, assistant program directors, and residents; it is therefore possible that some respondents did not have access to the most accurate information regarding their simulation programs. We opted to maintain the anonymity of the residency programs to encourage participation. Although this makes it difficult to generalize among programs based on their available resources, it reduced the potential for reporting bias. Although we acknowledge that there was no way to ensure that programs did not respond more than once, we believe that this likelihood was low. Response rates varied by question, and no question required a forced response. Furthermore, the survey did not address qualitative information regarding simulation programs. Finally, because of survey length limitations and the absence of measurement standards, the current data cannot assess the effectiveness of the simulation programs currently in use.

There is much opportunity for future work in this area. Results from this survey and the evidence highlighting the current variations can guide future research on the topic of standardizing simulation use during residency. One question to consider is defining optimal exposure to simulation-based training in regard to time, content, and repetition.5,20 A next step could be a feasibility study to determine whether incorporating standard training protocols is possible or how it can be most effectively developed and instituted.

The results of this survey confirm that there is large variation among anesthesiology residency programs regarding their simulation-based curricula. The majority of program directors responding believe that standardizing some component of simulation-based training would be beneficial. Currently, there is no plan to integrate a standard training component to the current residency training. Given the positive impact simulation has on skill retention and operating room preparedness, it may be worthwhile to consider developing a standardized program for resident training.

APPENDIX 1

Survey

Please select the position that best applies to who is completing this survey.

  • ◯ Residency program director
  • ◯ Residency program coordinator
  • ◯ Simulation program director
  • ◯ Simulation instructor
  • ◯ Other ____________________

How many residents do you have per residency class?

  • ◯ 2–7
  • ◯ 8–13
  • ◯ 14–19
  • ◯ 20–25
  • ◯ >25

Does your program have a designated simulation program director?

  • ◯ Yes
  • ◯ No
  • ◯ Don’t know

The next set of questions relates to intern and resident simulation.

Do you provide anesthesiology-associated simulation courses for anesthesiology interns?

  • ◯ Yes
  • ◯ No
  • ◯ Don’t know/NA

If yes, how many anesthesiology-associated courses per year does each intern attend?

  • ◯ 1–2
  • ◯ 3–5
  • ◯ 5–7
  • ◯ >7

If yes, please describe the intern course structure and content. Select all that apply.

  • ◻ Procedural skills
  • ◻ Interdisciplinary
  • ◻ Principles of crisis resource management
  • ◻ Perioperative clinical scenarios
  • ◻ Debriefing
  • ◻ Other ____________________

Do you provide simulation courses for anesthesiology residents?

  • ◯ Yes
  • ◯ No

If yes, how many courses per year does each resident attend?

  • ◯ 1–2
  • ◯ 3–5
  • ◯ 5–7
  • ◯ >7

If yes, please describe the resident course structure and content. Select all that apply.

  • ◻ Procedural skills
  • ◻ Interdisciplinary
  • ◻ Principles of crisis resource management
  • ◻ Perioperative clinical scenarios
  • ◻ Debriefing
  • ◻ Other ____________________

Do you provide simulation opportunities during CA-1 orientation?

  • ◯ Yes
  • ◯ No

If yes, please describe the orientation course structure and content. Select all that apply.

  • ◻ Procedural skills
  • ◻ Operating room preparation
  • ◻ Induction and intubation for general anesthetic
  • ◻ Perioperative clinical scenarios
  • ◻ Principles of crisis resource management
  • ◻ Debriefing
  • ◻ Other ____________________

Do you provide simulation opportunities during specific rotations? Select all that apply.

  • ◻ Airway management
  • ◻ TEE
  • ◻ Regional
  • ◻ Neuraxial
  • ◻ ICU
  • ◻ Other ____________________

Does your program utilize simulation for assessment of resident performance?

  • ◯ Yes
  • ◯ No

If yes, what are the situations where you use simulation for assessment of residents? Select all that apply.

  • ◻ Readiness for specialty rotations
  • ◻ Following completion of specialty rotations
  • ◻ End-of-year assessment
  • ◻ Assessment before transition to semi-independent practice
  • ◻ Request of Clinical Competency Committee
  • ◻ Professionalism
  • ◻ Other ____________________

Do you intend to use or are you currently using simulation to assist with resident assessment for Milestones?

  • ◯ Yes
  • ◯ No
  • ◯ Don’t know

If yes, please describe how you intend to use or are currently using simulation for resident assessment for Milestones.

Does your program utilize simulation for remediation of poorly performing residents?

  • ◯ Yes
  • ◯ No

If yes, please describe the process you use for remediation of residents.

  • ◻ Poor performance in specialty rotations
  • ◻ Professionalism
  • ◻ Request of the Clinical Competency Committee
  • ◻ Resident’s request
  • ◻ Result of medical or technical error
  • ◻ Other ____________________

Are residents involved in any of the following related to simulation? Please select all that apply.

  • ◻ Curriculum development
  • ◻ Simulator programming
  • ◻ Course facilitation
  • ◻ Debriefing
  • ◻ Independent learning
  • ◻ Interdisciplinary simulation ____________________
  • ◻ Other ____________________

Do you have CRNAs to help with staffing to support resident education?

  • ◯ Yes
  • ◯ No

Please rank order what you find to be significant obstacles to running/instituting a simulation-based curriculum. Rank 1 is the greatest obstacle. (Drag and drop item to place in rank order.)

  • ______ Time
  • ______ Money
  • ______ Human resources
  • ______ Curriculum development
  • ______ Simulator availability
  • ______ Other

Do you feel that a standardized component of simulation-based training should be incorporated into residency curriculum?

  • ◯ Yes
  • ◯ No

If yes, please select which components you feel should be standardized. Select all that apply.

  • ◻ Perioperative clinical scenarios
  • ◻ Crisis resource management objectives
  • ◻ Debriefing points
  • ◻ Procedural skills
  • ◻ Assessment
  • ◻ Other ____________________

The next set of questions pertains to the simulation faculty instructors.

How many faculty members in your department are involved with simulation?

  • ◯ 1–2
  • ◯ 3–5
  • ◯ 5–7
  • ◯ >7

Are the faculty compensated for their involvement?

  • ◯ Yes
  • ◯ No
  • ◯ Don’t know

If yes, please describe how faculty members are compensated. Select all that apply.

  • ◻ Monetary
  • ◻ Nonclinical time
  • ◻ Promotion
  • ◻ Other ____________________

Do simulation faculty receive training in simulation-based instruction?

  • ◯ Yes
  • ◯ No
  • ◯ Don’t know/NA

If yes, please describe the type of training your faculty received. Select all that apply.

  • ◻ Attended an instructor course
  • ◻ Attended instructor classes as part of a larger conference
  • ◻ Observed other simulation centers
  • ◻ Other ____________________

The next set of questions pertains to your simulation center.

Is your simulation center accredited by any of the following organizations? Select all that apply.

  • ◻ American Society of Anesthesiologists
  • ◻ American College of Surgeons
  • ◻ Society for Simulation in Healthcare
  • ◻ Other ____________________
  • ◻ No accreditation

If endorsed by the ASA, do you offer MOCA simulation courses?

  • ◯ Yes
  • ◯ No

How would you best describe the location of your simulation center?

  • ◯ Off campus
  • ◯ On campus, different building from main OR
  • ◯ On campus, same building as main OR
  • ◯ Other ____________________

Please select the support staff available at your simulation center. (Select all that apply.)

  • ◻ Administrative—scheduling
  • ◻ Administrative—research
  • ◻ Technical—simulator programming
  • ◻ Technical—Moulage, supplies, setup
  • ◻ Actors/standardized patients
  • ◻ Other ____________________

Please select the types of simulators you use at your simulation center. (Select all that apply.)

  • ◻ High fidelity, full patient
  • ◻ Low fidelity, full patient
  • ◻ Partial task trainers
  • ◻ Virtual reality
  • ◻ Screen-based modules
  • ◻ Standardized patients
  • ◻ Other ____________________

REFERENCES

1. Levine AI, Schwartz AD, Bryson EO, Demaria S Jr. Role of simulation in U.S. physician licensure and certification. Mt Sinai J Med 2012;79:140–53.
2. Weinger MB, Burden AR, Steadman RH, Gaba DM. This is not a test!: misconceptions surrounding the maintenance of certification in anesthesiology simulation course. Anesthesiology 2014;121:655–9.
3. Matveevskii AS, Gravenstein N. Role of simulators, educational programs, and nontechnical skills in anesthesia resident selection, education, and competency assessment. J Crit Care 2008;23:167–72.
4. Ebert TJ, Fox CA. Competency-based education in anesthesiology: history and challenges. Anesthesiology 2014;120:24–31.
5. Lorello GR, Cook DA, Johnson RL, Brydges R. Simulation-based training in anaesthesiology: a systematic review and meta-analysis. Br J Anaesth 2014;112:231–45.
6. Kapadia MR, DaRosa DA, MacRae HM, Dunnington GL. Current assessment and future directions of surgical skills laboratories. J Surg Educ 2007;64:260–5.
7. Korndorffer JR Jr, Stefanidis D, Scott DJ. Laparoscopic skills laboratories: current assessment and a call for resident training standards. Am J Surg 2006;191:17–22.
8. Fried GM. FLS assessment of competency using simulated laparoscopic tasks. J Gastrointest Surg 2008;12:210–2.
9. Okrainec A, Soper NJ, Swanstrom LL, Fried GM. Trends and results of the first 5 years of Fundamentals of Laparoscopic Surgery (FLS) certification testing. Surg Endosc 2011;25:1192–8.
10. Scott DJ, Dunnington GL. The new ACS/APDS Skills Curriculum: moving the learning curve out of the operating room. J Gastrointest Surg 2008;12:213–21.
11. Cunningham CT, Quan H, Hemmelgarn B, Noseworthy T, Beck CA, Dixon E, Samuel S, Ghali WA, Sykes LL, Jetté N. Exploring physician specialist response rates to web-based surveys. BMC Med Res Methodol 2015;15:32.
12. Mashaud LB, Castellvi AO, Hollett LA, Hogg DC, Tesfay ST, Scott DJ. Two-year skill retention and certification exam performance after fundamentals of laparoscopic skills training and proficiency maintenance. Surgery 2010;148:194–201.
13. Sroka G, Feldman LS, Vassiliou MC, Kaneva PA, Fayez R, Fried GM. Fundamentals of laparoscopic surgery simulator training to proficiency improves laparoscopic performance in the operating room: a randomized controlled trial. Am J Surg 2010;199:115–20.
14. D’Angelo AL, Cohen ER, Kwan C, Laufer S, Greenberg C, Greenberg J, Wiegmann D, Pugh CM. Use of decision-based simulations to assess resident readiness for operative independence. Am J Surg 2015;209:132–9.
15. Fletcher GC, McGeorge P, Flin RH, Glavin RJ, Maran NJ. The role of non-technical skills in anaesthesia: a review of current literature. Br J Anaesth 2002;88:418–29.
16. Flin R, Patey R, Glavin R, Maran N. Anaesthetists’ non-technical skills. Br J Anaesth 2010;105:38–44.
17. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007;2:115–25.
18. Murray DJ. Current trends in simulation training in anesthesia: a review. Minerva Anestesiol 2011;77:528–33.
19. Pott LM, Randel GI, Straker T, Becker KD, Cooper RM. A survey of airway training among U.S. and Canadian anesthesiology residency programs. J Clin Anesth 2011;23:15–26.
20. Weinger MB. The pharmacology of simulation: a conceptual framework to inform progress in simulation research. Simul Healthc 2010;5:8–15.
21. Blum RH, Boulet JR, Cooper JB, Muret-Wagstaff SL; Harvard Assessment of Anesthesia Resident Performance Research Group. Simulation-based assessment to identify critical gaps in safe anesthesia resident performance. Anesthesiology 2014;120:129–41.
22. Kinnear J, Smith B, Akram M, Wilson N, Simpson E. Using expert consensus to develop a simulation course for faculty members. Clin Teach 2015;12:27–31.
23. Paige JT, Arora S, Fernandez G, Seymour N. Debriefing 101: training faculty to promote learning in simulation-based training. Am J Surg 2015;209:126–31.
Copyright © 2016 International Anesthesia Research Society