Cardiopulmonary Resuscitation Team Simulation Sessions
The simulator is deployed at different hospital locations to recreate cardiac arrest scenarios. The simulation team consists of 2 simulation nurse educators, an anesthesiologist, and an intensivist well versed in resuscitation medicine. The program uses a high-fidelity simulator (Laerdal SimMan 3G, Laerdal Medical, Stavanger, Norway). One or more webcams with portable microphones are installed at strategic vantage points to capture the activities of CRT members. Other equipment includes a standard hospital code cart stocked with mock drugs, medical supplies (central line and airway kits), and a Lifepak 12 defibrillator (Medtronic, Redmond, WA). A central line task trainer (Simulab Corporation, Seattle, WA) was initially used for all codes at our institution, but after we converted to intraosseous lines for all in-hospital resuscitations, an intraosseous line insertion kit (EZ-IO Intraosseous Infusion System, Vidacare Corporation, San Antonio, TX) became part of our standard training equipment. Two-way VHF radios with discreet headsets allow instructors to communicate and coordinate efforts during the scenario. A video projector, external speakers, compact screen, and teaching aids are used for video-assisted debriefing sessions. The equipment is stored and transported in a dedicated case designed for this purpose (Fig. 3).
The clinical details of the cardiac arrest scenarios are arranged with unit personnel before activating the emergency response system. This arrangement varies somewhat by location and allows supervisory personnel to provide critical input in choosing a specific clinical scenario or in testing a particular aspect of the system they believe to be especially vulnerable.
Between 2 and 6 simulated CRT practice sessions (“mock codes”) are conducted each month. Although the sessions are unannounced, instructors monitor hospital emergencies and unit workload before the exercises to ensure that the simulation does not conflict with real emergencies or periods of intense clinical workload. Each 30-minute session is divided into the following segments: resuscitation (8 minutes), program introduction (for newcomers) and brief review of nontechnical skills concepts (5 minutes), video-assisted debrief (12 minutes), and “roll-up” (5 minutes).
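The 30-minute session structure described above can be encoded as a simple configuration for planning purposes. This is our own illustrative sketch; the segment identifiers are invented labels, while the durations follow the text:

```python
# Hypothetical encoding of the mock-code session schedule.
# Segment names are illustrative; durations (minutes) are from the article.
SESSION_SEGMENTS = [
    ("resuscitation", 8),
    ("introduction_and_nontechnical_review", 5),
    ("video_assisted_debrief", 12),
    ("roll_up", 5),
]

# Sanity check: the segments must fill exactly one 30-minute session.
SESSION_LENGTH_MIN = sum(minutes for _, minutes in SESSION_SEGMENTS)
```

A configuration like this makes it easy to verify that any future adjustment to segment lengths still fits the 30-minute window.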
During the resuscitation, the activated CRT encounters the manikin in either pulseless electrical activity or ventricular fibrillation and must manage the scenario as they would an actual patient in cardiac arrest. Task and contextual fidelity are preserved as much as possible by providing a realistic case stem and by using equipment normally located in the specific clinical environment. Realism is further enhanced by blending simulation modalities such as a femoral line trainer (for central line placement), an intraosseous line insertion kit, and medical actors (who represent family members or other medical personnel). The simulation proceeds undisturbed, with the instructors only observing and recording the event. They interact with the participants only to prevent hazardous situations or to help with nonstandard clinical tasks, for example, applying modified defibrillation pads or inserting a central venous catheter using the task trainer.
The resuscitation is followed by a 5-minute introduction to the program. During this introduction, the simulation team prepares the recording, software, and audiovisual equipment for the video-assisted debrief.
The facilitator begins the debrief with a statement emphasizing the blame-free, safe, and solution-focused nature of the event. Posters are used to facilitate review of the CRT structure, provider roles, and responsibilities and to illustrate some basic nontechnical skills concepts (Figs. 1 and 2). A video of the event is then projected on the wall (or portable screen). Using the video as an aid, the simulation team leads the CRT in a discussion focusing mainly on leadership and teamwork behaviors, eliciting or offering suggestions for improvement. Technical concepts that are absolutely critical to resuscitation, such as rapid rhythm identification and the quality of CPR, are also discussed. Hazards and system problems identified by the CRT members are also addressed during the debrief to better understand needs and barriers. The session concludes with an open discussion, during which each participant and observer is asked in turn to share one "take-home" point or comment.
After each exercise, the training team meets to debrief. First, an internal review of the conduct of the exercise is performed, highlighting positive and negative occurrences. Notes are taken and later incorporated into the simulation log, an electronic file used to document every event, the resources used, the participants involved, and the lessons learned. Suggestions for improvement are incorporated into future exercises. In addition, any systems problems related to the resuscitation effort that are detected during the scenario are documented in the same simulation log. The Systems Engineering Initiative for Patient Safety (SEIPS) model is used to classify the observations and understand the relationships among the different components of the system. Structural hazards, such as unsuitable equipment or physical space at a particular location, or process redesign needs are discussed among the respective stakeholder groups, and solutions are deployed as needed. Typically, a deficiency identified during the conduct of the event is discussed during the debrief. The simulation team later follows up with the responsible party (eg, unit manager, engineering department, or pharmacy department), and a solution is crafted, which is then implemented during the following event and refined. Once deemed adequate and stable, the tool or redesigned process is finally implemented in real cardiac arrest responses. Other structural or process deficiencies that are detected through these exercises are discussed among the critical care committee members, and solutions are crafted with multidisciplinary involvement. For example, proposed changes to the configuration of the code cart were first vetted by this committee and later implemented and iteratively refined.
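A simulation log entry of the kind described above might be structured as a small record with each observed hazard tagged by a SEIPS work-system category. The sketch below is hypothetical: the class names, field names, and category labels are our own illustrations, not the actual log format used by the program:

```python
from dataclasses import dataclass, field

# The five SEIPS work-system categories (labels are our own shorthand).
SEIPS_CATEGORIES = {
    "person", "tasks", "tools_technologies",
    "physical_environment", "organization",
}


@dataclass
class Hazard:
    """A defect observed during a mock code, tagged with a SEIPS category."""
    description: str
    category: str
    responsible_party: str  # eg, unit manager, engineering, pharmacy

    def __post_init__(self):
        # Reject observations that are not classified under the model.
        if self.category not in SEIPS_CATEGORIES:
            raise ValueError(f"Unknown SEIPS category: {self.category}")


@dataclass
class LogEntry:
    """One record in the simulation log: event, resources, people, lessons."""
    date: str
    location: str
    participants: list = field(default_factory=list)
    hazards: list = field(default_factory=list)
    lessons_learned: list = field(default_factory=list)
```

Structuring the log this way would let the team filter accumulated observations by SEIPS category when deciding which stakeholder group owns a given fix.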
A total of 72 simulated unannounced cardiac arrest sessions were conducted between October 2010 and September 2013 at various locations throughout the medical center and at different times of the day, including the late evening. More than 300 providers participated in these sessions, including 87 physicians, 100 nurses, 21 respiratory therapists, and 10 administrative staff. A total of 335 hours were devoted to the program, including preparation, process improvement, and didactic activities.
We detected several environmental, teamwork, human-machine interface, culture, and policy hazards and defects. Some of the problems that were addressed through the program were well known to providers, and the program simply created a formal mechanism by which solutions could be implemented. For example, several experienced nurse managers who routinely respond to codes had raised concerns about occasional delays in establishing intravenous access during resuscitations. After implementation of this program, the suggestion to introduce intraosseous devices was presented to the critical care committee, then tested and further refined during the simulated exercises, and finally implemented in real codes. Other hazards were latent, and the program served to uncover them before any real harm occurred. For example, a simulated event at a newly opened gastrointestinal endoscopy unit revealed that the wall oxygen outlet in one of the procedure rooms was far away from the head of the bed, making it impossible for the bag-valve-mask device to reach the patient. The room layout was reconfigured as a result of this exercise. Other examples of problems detected and their solutions are presented in Table 1.
We describe a large-scale program that uses simulation to improve the quality of CPR in the hospital setting. With the conduct of more than 70 unannounced in situ simulated cardiac arrest events, we have identified and mitigated latent hazards and defects in the hospital emergency response system.
Most studies aimed at improving CRT performance have focused on proficiency with advanced cardiovascular life support treatment protocols. However, our interviews indicated that the greatest barriers to delivering safe and effective care were systems vulnerabilities and ineffective teamwork rather than specific knowledge or skill deficits.
As stated in the US Institute of Medicine report “To Err Is Human,” most errors in patient care are caused by faulty systems, processes, and conditions that lead people to make mistakes or fail to prevent them.25 Our program brings many components of the system together and tests the processes (care and noncare related) needed to achieve the desired outcome. Through these exercises, we are able to systematically study the linkages among the different components of the system, the processes, and the outcomes, to design, implement, and evaluate solutions. As such, the program constitutes a robust quality improvement tool.
Traditional approaches to quality measurement and improvement often rely on the model conceptualized by Avedis Donabedian, in which the system (structure) is linked to the outcomes through the process of care.26 Efforts to improve the quality of care have traditionally focused heavily on the provider and on tools and technologies, all components of the structure, and to some extent, on the processes of care. For example, there are numerous studies evaluating the efficacy of different drug dosing regimens for CPR and of different resuscitation devices.27,28 Even the American Heart Association’s Basic and Advanced Life Support courses focus heavily on the individual provider’s knowledge of resuscitation algorithms and on technical skills such as bag-mask ventilation, chest compressions, and the use of the defibrillator.4 Unfortunately, this approach ignores other components of the health care delivery system such as the physical space, hospital policies, and non–patient care processes, all of which are important in providing safe and high-quality resuscitation.
The SEIPS model may be used to complement and expand the Donabedian framework by further characterizing structure into 5 categories.24 According to this model, the person (or team) performs a range of tasks using various tools and technologies. The performance of these tasks occurs within a certain physical environment and under specific organizational conditions. It also recognizes the importance of other processes beyond the actual health care delivery process, such as equipment maintenance, supply chain, cleaning, and information flow. Finally, it adds employee and organizational outcomes to the list of important outcomes to consider (Fig. 4). The SEIPS model is useful because it encourages an appreciation of the entire system of care and the interactions among the different components, as opposed to focusing on one aspect of the system at a time.24
Using this framework, we identified several relationships among the different components of the structure (called “work system” in the SEIPS model) and among these components and other processes and outcomes. For example, in the case of our resuscitation team, a long and difficult-to-read hospital policy regarding the CRT provided an organizational barrier to a well-conducted resuscitation because team members rarely knew which providers needed to respond or what their roles and responsibilities were. This forced them to spend the critical initial minutes of the resuscitation organizing the team and distributing tasks. In fact, teams that are assembled ad hoc perform worse than teams that are structured before the resuscitation event.29 By summarizing and clarifying the hospital policy before the crisis occurs, our program provides a structure to the team with tasks and responsibilities defined, thus allowing team members to focus on the resuscitation as soon as they arrive on the scene. Similarly, our program has identified certain factors in the physical environment that constituted barriers to effective and efficient care, such as areas of the hospital that were inaccessible to members of the CRT or signs and maps that were outdated.
An example of a care-related process that we redesigned is communication with the laboratory during codes. Under the previous system, each laboratory test required an individual computerized order by the team leader. There was significant variability in which tests were ordered and in how samples were collected, labeled, and sent to the laboratory and how results were reported. Under the redesigned system, a preassembled code laboratory package is included in every code cart. The “laboratory pack” contains all necessary supplies, including tubes, a needleless transfer device, and labels. The samples are hand delivered by designated personnel. A standard set of studies (“code labs”) is now automatically run by the laboratory when the package is received, and the results are called back to the unit. This process was initially tested during simulated codes and, after several iterations, has been implemented in real codes. Other systems issues detected and addressed by this program include the procurement and widespread use of step stools for CPR and the incorporation of intraosseous vascular access into clinical care.
As stated by Carayon et al,24 by making it “easy to do things right and hard to do things wrong,” a systems redesign approach may be more successful at changing provider behavior because the focus is shifted away from the provider and therefore goes against a “culture of blame” and moves toward a “culture of continuous quality improvement.” Our program requires the direct involvement of all providers and ancillary staff in a quality improvement activity, in this case, CPR. The different participants are allowed and even encouraged to participate and become invested in the process of organizational change without risking loss of control or autonomy, an issue especially relevant to medical professionals.24
Finally, our program has allowed us to collect resuscitation process data (performance data) that are generally not otherwise available or difficult to access. For example, by conducting unannounced codes at different times of the day, we have been able to assess the paging system and CRT response times. Such granular data are important when redesigning processes of care but very difficult to capture during day-to-day clinical activities.
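Paging-to-arrival intervals gathered across mock codes can be summarized with only a few lines of code. The function below is our own illustrative sketch of that kind of analysis, not the program's actual tooling; the parameter names are hypothetical:

```python
from statistics import median

def response_time_summary(page_times, arrival_times):
    """Return per-event CRT response intervals (seconds) and their median.

    page_times / arrival_times: parallel lists of timestamps, in seconds,
    recording when the code page went out and when the team arrived.
    """
    if len(page_times) != len(arrival_times):
        raise ValueError("timestamp lists must be the same length")
    intervals = [arrive - page
                 for page, arrive in zip(page_times, arrival_times)]
    if any(i < 0 for i in intervals):
        raise ValueError("arrival cannot precede the page")
    return {"per_event": intervals, "median_s": median(intervals)}
```

The median is used rather than the mean because response-time distributions collected at different hours and locations tend to be skewed by a few outliers.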
The disadvantages of our program should be acknowledged. The fact that the sessions occur in the middle of a working day poses a challenge because participants may find themselves torn between attending to their clinical duties and participating in this simulated learning experience. Being cognizant of this, we strive to keep the sessions short and focused and to create an atmosphere that is nonthreatening and perceived to be valuable by all who participate. This has allowed us to achieve and sustain adequate engagement from the majority of providers. In addition, we do not objectively measure the educational benefit to learners. Rather than burden the participants with questionnaires or surveys, we opted to focus on assessing the impact of the program by monitoring longitudinal patient outcomes and other consequences of process changes. An example of the latter is a decreased incidence of laboratory order entry errors after the implementation of the new code cart laboratory pack. Lastly, the impact of this program on team performance during real codes and on patient outcomes has not been studied. In-hospital cardiac arrests are rare events, and the number of events that would need to be analyzed to demonstrate significant differences is beyond the scope of this project.
In summary, we describe here an ongoing program that uses in situ simulation to identify and mitigate latent hazards and defects in the hospital emergency response system. The SEIPS model provides a framework for describing and analyzing the structure, processes, and outcomes related to these events.
1. Go AS, Mozaffarian D, Roger VL, et al. Heart disease and stroke statistics—2013 update: a report from the American Heart Association. Circulation 2013;127(1):e6–e245.
2. Abella BS, Alvarado JP, Myklebust H, et al. Quality of cardiopulmonary resuscitation during in-hospital cardiac arrest. JAMA 2005;293(3):305–310.
3. Ornato JP, Peberdy MA, Reid RD, Feeser VR, Dhindsa HS; NRCPR Investigators. Impact of resuscitation system errors on survival from in-hospital cardiac arrest. Resuscitation 2012;83(1):63–69.
4. Field JM, Hazinski MF, Sayre MR, et al. Part 1: executive summary: 2010 American Heart Association guidelines for cardiopulmonary resuscitation and emergency cardiovascular care. Circulation 2010;122(18 Suppl 3):S640–S656.
5. Morrison LJ, Neumar RW, Zimmerman JL, et al. Strategies for improving survival after in-hospital cardiac arrest in the United States: 2013 consensus recommendations: a consensus statement from the American Heart Association. Circulation 2013;127(14):1538–1563.
6. Marsch SC, Muller C, Marquardt K, et al. Human factors affect the quality of cardiopulmonary resuscitation in simulated cardiac arrests. Resuscitation 2004;60(1):51–56.
7. Lighthall GK, Poon T, Harrison TK. Using in situ simulation to improve in-hospital cardiopulmonary resuscitation. Jt Comm J Qual Patient Saf 2010;36(5):209–216.
8. Mondrup F, Brabrand M, Folkestad L, et al. In-hospital resuscitation evaluated by in situ simulation: a prospective simulation study. Scand J Trauma Resusc Emerg Med 2011;19:55.
9. Walker ST, Sevdalis N, McKay A, et al. Unannounced in situ simulations: integrating training and clinical practice. BMJ Qual Saf 2013;22(6):453–458.
10. Reid PP, Compton WD, Grossman JH, Fanjiang G, eds. Building a Better Delivery System: A New Engineering/Health Care Partnership. Washington, DC: National Academies Press; 2005:15.
11. Gurses AP, Kim G, Martinez EA, et al. Identifying and categorising patient safety hazards in cardiovascular operating rooms using an interdisciplinary approach: a multisite study. BMJ Qual Saf 2012;21(10):810–818.
12. Martinez EA, Marsteller JA, Thompson DA, et al. The Society of Cardiovascular Anesthesiologists’ FOCUS initiative: Locating Errors through Networked Surveillance (LENS) project vision. Anesth Analg 2010;110(2):307–311.
13. Good ML. Patient simulation for training basic and advanced clinical skills. Med Educ 2003;37(Suppl 1):14–21.
14. Toback SL, Fiedor M, Kilpela B, Reis EC. Impact of a pediatric primary care office-based mock code program on physician and staff confidence to perform life-saving skills. Pediatr Emerg Care 2006;22(6):415–422.
15. Siassakos D, Fox R, Crofts JF, Hunt LP, Winter C, Draycott TJ. The management of a simulated emergency: better teamwork, better performance. Resuscitation 2011;82(2):203–206.
16. Ten Eyck RP, Tews M, Ballester JM, Hamilton GC. Improved fourth-year medical student clinical decision-making performance as a resuscitation team leader after a simulation-based curriculum. Simul Healthc 2010;5(3):139–145.
17. Andreatta P, Saxton E, Thompson M, Annich G. Simulation-based mock codes significantly correlate with improved pediatric patient cardiopulmonary arrest survival rates. Pediatr Crit Care Med 2011;12(1):33–38.
18. Garden AL, Mills SA, Wilson R, et al. In situ simulation training for paediatric cardiorespiratory arrest: initial observations and identification of latent errors. Anaesth Intensive Care 2010;38(6):1038–1042.
19. Kobayashi L, Dunbar-Viveiros JA, Sheahan BA, et al. In situ simulation comparing in-hospital first responder sudden cardiac arrest resuscitation using semiautomated defibrillators and automated external defibrillators. Simul Healthc 2010;5(2):82–90.
20. O’Leary F, McGarvey K, Christoff A, et al. Identifying incidents of suboptimal care during paediatric emergencies—an observational study utilising in situ and simulation centre scenarios. Resuscitation 2014;85(3):431–436.
21. Lighthall GK, Mayette M, Harrison TK. An institutionwide approach to redesigning management of cardiopulmonary resuscitation. Jt Comm J Qual Patient Saf 2013;39(4):157–166.
22. Herzer KR, Rodriguez-Paz JM, Doyle PA, et al. A practical framework for patient care teams to prospectively identify and mitigate clinical hazards. Jt Comm J Qual Patient Saf 2009;35(2):72–81.
23. Pronovost PJ, Holzmueller CG, Martinez E, et al. A practical tool to learn from defects in patient care. Jt Comm J Qual Patient Saf 2006;32(2):102–108.
24. Carayon P, Schoofs Hundt A, Karsh BT, et al. Work system design for patient safety: the SEIPS model. Qual Saf Health Care 2006;15(Suppl 1):i50–i58.
25. Homsted L. Institute of Medicine report: to err is human: building a safer health care system. Fla Nurse 2000;48(1):6.
26. Donabedian A. An Introduction to Quality Assurance in Health Care. New York, NY: Oxford University Press; 2003.
27. Olasveengen TM, Sunde K, Brunborg C, Thowsen J, Steen PA, Wik L. Intravenous drug administration during out-of-hospital cardiac arrest: a randomized trial. JAMA 2009;302(20):2222–2229.
28. Scholten M, Szili-Torok T, Klootwijk P, Jordaens L. Comparison of monophasic and biphasic shocks for transthoracic cardioversion of atrial fibrillation. Heart 2003;89(9):1032–1034.
29. Hunziker S, Tschan F, Semmer NK, et al. Hands-on time during CPR is affected by the process of teambuilding: a prospective randomised simulator-based trial. BMC Emerg Med 2009;9:3.
Keywords: Quality improvement; Human factors; Simulation; Clinical microsystem; Hospital medicine
© 2015 Society for Simulation in Healthcare