A Systematic Review of Simulation for Multidisciplinary Team Training in Operating Rooms

Cumin, David BE, PhD; Boyd, Matt J. MBChB, MA, PhD; Webster, Craig S. BSc, MSc, PhD; Weller, Jennifer M. MBBS, MClinED, MD, FRCA, FANZCA

Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare: June 2013 - Volume 8 - Issue 3 - p 171–179
doi: 10.1097/SIH.0b013e31827e2f4c
Review Article

Summary Statement: Current simulation training initiatives predominantly occur in uniprofessional silos and do little to integrate the different disciplines working in the operating room (OR). The objective of this review was to determine the current status of work describing simulation for full OR multidisciplinary teams, including barriers to conducting OR multidisciplinary team training and factors contributing to successful courses. We found a total of 18 articles from 10 research groups. Various scenarios and simulators were used, and training sessions were generally perceived as realistic and beneficial by participants despite rudimentary integration of surgical and anesthetic models. Measures of performance involved a variety of both technical and nontechnical ratings of the simulations. Challenges to conducting the simulations included recruitment, model realism, and financial costs. Future work should focus on how best to overcome the barriers to implementation of team training interventions for full OR teams, particularly on how to engage senior staff to aid recruitment.

From the Centre for Medical and Health Sciences Education, University of Auckland, Auckland, New Zealand.

Reprints: David Cumin, BE, PhD, Centre for Medical and Health Sciences Education, University of Auckland, Private Bag 92-019, Auckland 1142, New Zealand (e-mail: d.cumin@auckland.ac.nz).

Supported by a grant from the Health Workforce New Zealand Innovations fund.

The authors disclose no conflicts of interest.

Simulation is commonly used for team training in health care, including training for operating room (OR) team members from nursing,1 anesthesia,2 and surgery.3 Observational research in the OR reports that failures in teamwork and communication are common and lead directly to compromised patient care and reduced productivity.4 Furthermore, directives from the US Institute of Medicine call for teams that work together to be trained together.5 This suggests a need for combined training of the 3 primary disciplines (the subteams of nursing, anesthesia, and surgery) that comprise an OR team.

Although there are reviews of simulation-based team training in obstetrics,6 emergency medicine,7,8 and the 3 individual OR subteams,1–3 we could find no review of the literature on OR multidisciplinary team (OR-MDT) training. We therefore undertook a systematic review of the literature to determine the current status of simulation-based team training involving full OR teams. We defined OR-MDT training as an initiative involving participants from the surgical, anesthesia, and nursing teams (ie, a team consisting of at least 1 surgeon, 1 anesthetist or nurse anesthetist, and 1 nurse). We were particularly interested in the types of scenarios and training used, the environment and simulator(s) used, any outcome measures used (including participant perceptions of simulation realism and course effectiveness as well as technical and nontechnical skill measures), and barriers to undertaking this form of simulation-based training.

METHODS

Search Strategy

We searched 5 literature databases (PubMed, EMBASE, SCOPUS, PsycINFO, and ERIC) and the Cochrane Library using 3 categories of search terms: those intended to capture articles reporting OR-MDT simulations, those involving interactions between participants from multiple disciplines, and those involving participants forming full OR teams. These terms were combined with Boolean operators to form “Simulat* AND (team* OR interprofessional OR multiprofessional OR interdisciplinary OR multidisciplinary) AND (medical OR doctor OR clinician OR surgeon OR anaesthetist OR anesthetist OR anesthesiologist OR nurse)” as the search strategy for each database. No limits were placed on the results. Abstracts of returned articles were read, and articles were excluded according to the criteria described below.
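The review does not state how the query was submitted to each database interface. As a minimal illustrative sketch only, the PubMed arm of the strategy could be reproduced programmatically with Biopython's Entrez utilities; the e-mail address and result limit below are assumptions for illustration, not details from the study.

```python
# Illustrative sketch: submit the review's search string to PubMed via
# Biopython's Entrez utilities. E-mail address and retmax are assumptions.
from Bio import Entrez

Entrez.email = "reviewer@example.org"  # NCBI requires a contact address

query = (
    "Simulat* AND (team* OR interprofessional OR multiprofessional "
    "OR interdisciplinary OR multidisciplinary) AND (medical OR doctor "
    "OR clinician OR surgeon OR anaesthetist OR anesthetist "
    "OR anesthesiologist OR nurse)"
)

handle = Entrez.esearch(db="pubmed", term=query, retmax=10000)
record = Entrez.read(handle)
handle.close()

print(record["Count"], "records returned")
print(record["IdList"][:10])  # first 10 PubMed IDs for abstract screening
```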

Exclusion Criteria

We eliminated all articles (a) that involved simulations in domains other than health care, (b) in which there was no opportunity for genuine team interaction because there was only one participant working alone or with confederates (we define a confederate as an appropriately skilled accomplice working with the simulation faculty), and (c) that involved teams in which not all 3 OR subteams were represented. The reference lists of included articles were searched to identify further publications, and authors were contacted for clarification of information where necessary.

Data Extraction

Data were extracted from the included articles systematically, according to the following categories: (a) affiliation of the primary author such that the number of active research groups could be estimated, (b) number and composition of participants and teams involved in the simulations, (c) types of scenarios and training provided, (d) the environment (in situ or at a simulation center) and details of the simulators used, (e) outcome measures of the study, and (f) indications of barriers to conducting simulation for full OR teams. The outcome measures were divided into participant ratings of the simulations and courses, procedural skills or medical management scores (technical skills), teamwork or communication scores (nontechnical skills), and other measures.

RESULTS

A total of 5359 unique articles were returned by the search. The number of articles excluded and the reasons for exclusion are shown in Figure 1. An additional 2 articles were identified from the reference lists of included articles, giving a total of 18 included articles from 10 different research groups. The majority (4242) of returned results did not involve health care simulation. A large proportion of the remaining results (529) involved simulation with individual participants, and a similar number (570) involved simulations for health care teams with participants from disciplines outside the OR or with incomplete OR teams.

FIGURE 1

Details of the center affiliations, participants and teams, scenarios and training provided in the included articles are listed in Table 1. Some centers published more than 1 article from the same scenarios and training, leaving 11 distinct training programs. These ranged from a single simulated scenario, to half-day courses, to a series of 8 scenarios over multiple days. Where reported, all programs involved simulations of crisis scenarios, the most common of which were cardiac arrhythmias or arrest (5 reports), malignant hyperthermia (3 reports), and massive bleeding (3 reports).

TABLE 1

The simulation environment and simulators used are reported in Table 2. Of the 11 different programs, 6 were conducted in a simulation center and 5 were carried out in the workplace (in situ). Seven different combinations of simulators for the anesthesia and surgical teams were reported, but 3 articles did not describe the simulators used. The simulators used for anesthesia were all full-body manikins, and the surgical simulators were either part-task trainers or laparoscopic simulators. Two articles did not report the use of a specific simulator for the surgical team.9,10 In 3 of the 5 articles where simulators for both the anesthesia and surgical teams were used, the simulators were arranged for practical reasons rather than anatomic realism.11–13 For example, 1 article13 showed the laparoscopic simulator positioned separately from the manikin, such that it appeared there were 2 patients lying side by side. Five articles reported integrating the anesthetic and surgical simulators in an anatomically plausible manner (anatomic integration)14–18; that is, the combined simulator enabled participants to position themselves physically as they would in the clinical environment. None of the research groups integrated the physiology of the anesthesia and surgical simulators such that changes to the physiology of one simulator would automatically lead to corresponding physiologic changes in the other (physiologic integration); for example, major blood loss from the model used by the surgical team leading to cardiovascular instability in the model used by the anesthesia team.
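As a purely hypothetical sketch of what physiologic integration could look like (none of the reviewed simulators offered this; the classes and the linear coupling below are illustrative inventions, not any vendor's interface), blood loss registered by a surgical model could be used to update the vital signs displayed by an anesthesia manikin:

```python
# Hypothetical illustration of physiologic integration between a surgical
# model and an anesthesia manikin. All names and the linear dose-response
# are invented for illustration only.
from dataclasses import dataclass


@dataclass
class SurgicalModel:
    """Tracks cumulative blood loss (mL) in the surgical field."""
    blood_loss_ml: float = 0.0

    def bleed(self, volume_ml: float) -> None:
        self.blood_loss_ml += volume_ml


@dataclass
class AnesthesiaManikin:
    """Displays heart rate (beats/min) and mean arterial pressure (mm Hg)."""
    heart_rate: float = 75.0
    map_mmhg: float = 90.0

    def apply_hypovolemia(self, blood_loss_ml: float,
                          blood_volume_ml: float = 5000.0) -> None:
        # Crude linear coupling: losing circulating volume raises heart rate
        # and lowers mean arterial pressure.
        fraction_lost = min(blood_loss_ml / blood_volume_ml, 1.0)
        self.heart_rate = 75.0 + 60.0 * fraction_lost
        self.map_mmhg = 90.0 * (1.0 - 0.7 * fraction_lost)


# One "tick" of an integration loop: surgical bleeding drives manikin vital signs.
surgical = SurgicalModel()
manikin = AnesthesiaManikin()
surgical.bleed(1500.0)                             # major hemorrhage in the surgical field
manikin.apply_hypovolemia(surgical.blood_loss_ml)  # manikin becomes tachycardic and hypotensive
print(manikin.heart_rate, manikin.map_mmhg)
```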

TABLE 2

Eleven studies included measures of participant perceptions of the simulations (Table 3). Participants generally rated the scenarios as realistic (reported in 7 articles9,10,12,14,18–20) and believed that they behaved in the simulations much as they would in the clinical environment (reported in 5 articles).12,14,18,19,21 One article14 asked a more detailed question about participants’ perceptions of model realism, and responses were relatively negative in comparison with ratings of other elements (Table 3). Participants in the same study and 1 other18 rated the scenarios as appropriately challenging. Participants in 10 studies9,11–14,18,19,21–23 regarded the simulations as useful for learning. Results were conflicting on the question of whether the simulations would lead to a change in practice (reported in 4 articles).13,18,22,23 No articles reported objective measures of behavior in the clinical environment.

TABLE 3

Four articles reported measurements of technical skills (Table 4). Three focused on surgical and nursing skills (all from the same research group).14,16,17 One compared overall medical management of the “patient” with and without the aid of a checklist.18 Five articles from 2 research groups reported measurements of nontechnical skills (Table 4). For both technical and nontechnical skill ratings, the simulations were used to evaluate interrater reliability of the measurement tools,15,16 compare participant self-ratings with expert ratings,15–17,24 measure differences over time,14,20,24 and measure differences between disciplines.14,17 Other measures included participants’ attitudes to safety before and after training,14,22,23 participants’ perception of the use of checklists,18 and policy changes resulting from the simulations.9 Only 1 article9 reported an objective measure of any sort in the clinical environment.

TABLE 4

Five articles reported possible barriers to conducting OR-MDT simulations.11,14,17,19,22 Three common themes emerged: problems recruiting participants, the fidelity of the models, and the financial cost involved. The logistical difficulty of recruitment, particularly of senior surgical staff, was the most frequently cited challenge.11,17,19,22 Achieving organizational and leadership support for the courses was considered vital to overcoming some of the recruitment challenges.11,14,21,22 In some articles, difficulty engaging participants was linked to the limited technology available to create realistic and engaging simulators,14,21,22 particularly for the surgical subteam. Two articles also cited the large financial cost and time commitment required to conduct the courses as a challenge to running simulations for whole OR teams.17,21

Despite the barriers mentioned previously, a number of factors contributed to the success of the reported OR-MDT training and research. Prescenario familiarization, to appreciate the limitations of the models and environment and to clarify the course objectives, was reported as important in 6 articles.10,13,17,21,22,25 One study26 of OR-MDT training using action research methodology identified 2 aspects that contributed to its success: allowing enough time for participants to learn and ensuring the environment was “psychologically safe,”27 allowing nurses and physicians to feel a sense of equality. Furthermore, all courses were designed and debriefed by experienced personnel; the authors appear to have assumed that such experience is important for success. Didactic lectures were used in some courses and not others, as was video playback of the simulations during debriefing. The contribution of these aspects to course success was not elaborated on, and they did not seem to affect participant ratings (Table 3). Many of the features mentioned previously adhere to expert guidance for simulation-based training in general.28

DISCUSSION

This systematic review has identified 18 articles from 10 research groups reporting the use of simulation for OR-MDT training. Simulations were conducted in simulation centers or in situ, and, where reported, all involved crisis scenarios. Different combinations of models were used in many reports for the surgical and anesthesia teams, with limited anatomic and no physiologic integration. Participants generally found the simulations to be realistic and of high educational value, whereas the potential for transfer of learning to the clinical environment was less clear. A range of outcome measures were used, including technical and nontechnical skills, and attitudes to safety.

One might expect more reports of OR-MDT training, especially given the number of surgeries performed each year throughout the world29 and the extent to which simulation is used in the separate disciplines of anesthesia, surgery, and nursing.1–3,30,31 It is also surprising that OR-MDT simulations have not been more extensively researched, given evidence that failures in whole-team communication lead to compromised patient care.4 Possible reasons for the lack of OR-MDT simulation identified in this review included the difficulty of recruitment, the limited fidelity of surgical models, and the financial cost of running the simulations.

All reviewed articles used manikins for the anesthesia team and either part-task trainers or laparoscopic simulators for the surgical team. This is not surprising because these are currently the most widely available models for training uniprofessional teams in anesthesia and surgery.32,33 There is interest from some researchers in completely virtual reality–based or screen-based OR simulation34,35 and even in combining modalities,36–39 but these are technical reports only and did not involve participants. None of the studies included in this review integrated the physiology of the simulators used for the anesthesia and surgical teams. It is unclear how much fidelity or integration of the simulators for anesthetic and surgical teams is required to engage participants in an authentic clinical experience or for effective learning. It is possible that, without integration, conflicting anatomic or physiologic indicators may emphasize the artificial nature of the simulation and lead participants to reject it as unrealistic. However, the realism of the models may not be as important as the overall realism of the simulation. More work is needed to establish how much fidelity is required, because this affects the cost of the simulations and will help to standardize them.40

A number of outcome measures were used in the reviewed articles, with little standardization in the wording of questions or in the response scales used for participant perceptions. Likewise, a variety of measurement tools were used for technical and nontechnical skills. This makes results hard to compare between studies. The lack of standardization is not surprising, given the relative infancy of some of these measures and the active work currently being conducted to validate tools.41,42

A limitation of this study is that it focused on reports of OR-MDT training in the literature. There may be many more centers around the world actively involved in OR-MDT simulations that have not formally reported their work. An international survey of simulation centers conducted in 2002 found that only 22% of the estimated active centers had published work related to their simulation activities.43 Conducting a similar survey to update these results would be a valuable contribution to understanding the tools and measures used by those who are practicing but not publishing. A further limitation of this review is that articles were not excluded on the basis of study quality. We decided to include all articles to give as broad an overview as possible of what had been done in the past. Moreover, a meta-analysis was not performed on the data retrieved from included articles. Comparing studies requires some consistency in the chosen outcome measures and, because the questions differed between articles in wording and in scale, it was not possible to combine the data meaningfully. Standardization of outcome measurement tools is an area for future research. There are challenges to be met in creating valid measurement tools,44 but such tools are required to allow comparison of results from different interventions. Future work should also focus on measurement in the clinical environment, to verify transfer of training, to confirm that the simulations are genuine analogs of the clinical environment, and to measure any impact on patient outcomes.

CONCLUSIONS

Little research involving simulation-based OR-MDT training is being conducted, despite the clear need for such initiatives. This review found that previous work has included a variety of scenarios, all involving crises, multiple combinations of simulators for the anesthesia and surgical subteams, and several different measurement tools. This makes comparison of studies and their results difficult. Standardization of simulations and tools was not mentioned as a challenge to carrying out the simulations. Barriers to OR-MDT training initiatives include recruitment, the fidelity of surgical models, and cost. The first of these barriers is linked to engagement of senior staff and hospital management. More work is required to convince organizational leaders of the mandate for OR-MDT training so that initiatives can address issues of teamwork between OR subteams.

REFERENCES

1. Cant RP, Cooper SJ. Simulation-based learning in nurse education: systematic review. J Adv Nurs 2010; 66: 3–15.
2. LeBlanc VR. Review article: simulation in anesthesia: state of the science and looking forward. Can J Anaesth 2012; 59: 193–202.
3. Milburn JA, Khera G, Hornby ST, Malone P, Fitzgerald JEF. Introduction, availability and role of simulation in surgical education and training: review of current evidence and recommendations from the association of surgeons in training. Int J Surg 2012; 10: 393–398.
4. Lingard L, Espin S, Whyte S, et al. Communication failures in the operating room: an observational classification of recurrent types and effects. Qual Saf Health Care 2004; 13: 330–334.
5. Committee on Quality of Health Care in America, Institute of Medicine. In: Kohn LT, Corrigan JM, Donaldson MS, eds. To Err is Human: Building a Safer Health System. Washington, DC: National Academy Press; 2000.
6. Merién AER, van de Ven J, Mol BW, Houterman S, Oei SG. Multidisciplinary team training in a simulation setting for acute obstetric emergencies. Obstet Gynecol 2010; 115: 1021–1031.
7. McFetrich J. A structured literature review on the use of high fidelity patient simulators for teaching in emergency medicine. Emerg Med J 2006; 23: 509–511.
8. McLaughlin S, Fitch MT, Goyal DG, et al. Simulation in graduate medical education 2008: a review for emergency medicine. Acad Emerg Med 2008; 15: 1117–1129.
9. O’Regan N. Simulation of malignant hyperthermia improves patient safety. Can J Anaesth 2010; 57: S177.
10. Volk MS, Ward J, Irias N, Navedo A, Pollart J, Weinstock PH. Using medical simulation to teach crisis resource management and decision-making skills to otolaryngology housestaff. Otolaryngol Head Neck Surg 2011; 145: 35–42.
11. Helmreich RL, Davies JM. Human factors in the operating room: interpersonal determinants of safety, efficiency and morale. Baillieres Clin Anaesthesiol 1996; 10: 277–295.
12. Sexton B, Marsch S, Helmreich R, Betzendoerfer D, Kocher T, Scheidegger D. Participant evaluation of team oriented medical simulation. In: Henson LC, Lee AC, eds. Simulators in Anesthesiology Education. New York: Plenum Press; 1998: 109–110.
13. Paige J, Kozmenko V, Morgan B, et al. From the flight deck to the operating room: an initial pilot study of the feasibility and potential impact of true interdisciplinary team training using high-fidelity simulation. J Surg Educ 2007; 64: 369–377.
14. Koutantji M, McCulloch P, Undre S, et al. Is team training in briefings for surgical teams feasible in simulation? Cogn Technol Work 2008; 10: 275–285.
15. Sevdalis N, Davis R, Koutantji M, Undre S, Darzi A, Vincent CA. Reliability of a revised NOTECHS scale for use in surgical teams. Am J Surg 2008; 196: 184–190.
16. Sevdalis N, Undre S, Henry J, et al. Development, initial reliability and validity testing of an observational tool for assessing technical skills of operating room nurses. Int J Nurs Stud 2009; 46: 1187–1193.
17. Undre S, Koutantji M, Sevdalis N, et al. Multidisciplinary crisis simulations: the way forward for training surgical teams. World J Surg 2007; 31: 1843–1853.
18. Ziewacz JE, Arriaga AF, Bader AM, et al. Crisis checklists for the operating room: development and pilot testing. J Am Coll Surg 2011; 213: 212–219.
19. Kozmenko V, Paige J, Chauvin S. Initial implementation of mixed reality simulation targeting teamwork and patient safety. Stud Health Technol Inform 2008; 132: 216–221.
20. Paige JT, Chauvin SW. Transforming the operating room team through simulation training. Semin Colon Rectal Surg 2008; 19: 98–107.
21. Paige JT, Kozmenko V, Yang T, et al. The Mobile Mock Operating Room: bringing team training to the point of care. Adv Patient Saf 2008; 3: 1–17.
22. Flanagan B, Joseph M, Bujor M, Marshall S. Attitudes to safety and teamwork in the operating theatre, and the effects of a program of simulation-based team training. In: Anca JM, Jr. ed. Multimodal Safety Management and Human Factors. Aldershot, UK: Ashgate Publishing Limited; 2007: 211–220.
23. Stevens LM, Cooper JB, Raemer DB, et al. Educational program in crisis management for cardiac surgery teams including high realism simulation. J Thorac Cardiovasc Surg 2012; 144: 17–24.
24. Paige JT, Kozmenko V, Yang T, et al. Attitudinal changes resulting from repetitive training of operating room personnel using high-fidelity simulation at the point of care. Am Surg 2009; 75: 584–590.
25. Paige JT, Kozmenko V, Yang T, et al. High fidelity, simulation-based training at the point of care improves teamwork in the operating room. J Am Coll Surg 2008; 207: S87–S88.
26. Forsythe L. Action research, simulation, team communication, and bringing the tacit into voice. Simul Healthc 2009; 4: 143–148.
27. Edmondson AC. Psychological safety and learning behavior in work teams. Adm Sci Q 1999; 44: 350–383.
28. Salas E, Burke CS. Simulation for training is effective when... Qual Saf Health Care 2002; 11: 119–120.
29. Weiser TG, Regenbogen SE, Thompson KD, Haynes AB, Berry WR, Gawande AA. An estimation of the global volume of surgery: a modelling strategy based on available data. Lancet 2008; 372: 139–144.
30. Flin R, Yule S, Paterson-Brown S, Maran N, Rowley D, Youngson G. Teaching surgeons about non-technical skills. Surgeon 2007; 5: 86–89.
31. Gaba DM, Howard SK, Fish KJ, Smith BE, Sowb YA. Simulation-based training in anesthesia crisis resource management (ACRM): a decade of experience. Simul Gaming 2001; 32: 175–193.
32. Sutherland LM, Middleton PF, Anthony A, et al. Surgical simulation: a systematic review. Ann Surg 2006; 243: 291–300.
33. Cumin D, Merry AF. Simulators for use in anaesthesia. Anaesthesia 2007; 62: 151–162.
34. LeRoy Heinrichs W, Youngblood P, Harter PM, Dev P. Simulation for team training and assessment: case studies of online training with virtual worlds. World J Surg 2008; 32: 161–170.
35. Marks S, Windsor JA, Wünsche B. Enhancing virtual environment-based surgical teamwork training with non-verbal communication. In: Ranchordas A, Pereira J, Richard P, eds. GRAPP 2009 - Proceedings from the 4th International Conference on Computer Graphics Theory and Applications, February 5–8, 2009 Lisboa, Portugal. Setubal, Portugal: INSTICC Press; 2009; 361–366.
36. von Lubitz DKJE, Beier K-P, Freer J, et al. Simulation-based medical training: the Medical Readiness Trainer concept and the preparation for civilian and military field operations. In: Richir S, ed. Proceedings of the Laval Conference on Virtual Reality. Angers, France; 2001.
37. Medical Readiness Trainer Team. Immersive virtual reality platform for medical training: a “killer-application”. Stud Health Technol Inform 2000; 70: 207–213.
38. Semeraro F, Frisoli A, Bergamasco M, Cerchiari EL. Virtual reality enhanced mannequin (VREM) that is well received by resuscitation experts. Resuscitation 2009; 80: 489–492.
39. Scerbo MW, Belfore LA II, Garcia HM, et al. A virtual operating room for context-relevant training. Proc Human Factors Ergon Soc 2007; 1: 507–511.
40. Cumin D, Weller JM, Henderson K, Merry AF. Standards for simulation in anaesthesia: creating confidence in the tools. Br J Anaesth 2010; 105: 45–51.
41. Weller JM, Frengley R, Torrie J, et al. Evaluation of an instrument to measure teamwork in multidisciplinary critical care teams. BMJ Qual Saf 2011; 20: 216–222.
42. Nadler I, Sanderson PM, Liley HG. The accuracy of clinical assessments as a measure for teamwork effectiveness. Simul Healthc 2011; 6: 260–268.
43. Morgan PJ, Cleave-Hogg D. A worldwide survey of the use of simulation in anesthesia. Can J Anaesth 2002; 49: 659–662.
44. Paige JT, Kozmenko V, Yang T, et al. High-fidelity, simulation-based, interdisciplinary operating room team training at the point of care. Surgery 2009; 145: 138–146.
Keywords:

Clinical; Simulation; Simulators; Teams; Training

© 2013 Society for Simulation in Healthcare