
Empirical Investigations

Using Simulation to Orient Code Blue Teams to a New Hospital Facility

Villamaria, Frank J. MD, MPH; Pliego, Jose F. MD; Wehbe-Janek, Hania PhD; Coker, Neil BS, EMT-P; Rajab, M Hasan PhD, MPH; Sibbitt, Stephen MD; Ogden, Paul E. MD; Musick, Keith RN, MSN; Browning, Jeff L. PhD; Hays-Grudo, Jennifer PhD

Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare: December 2008 - Volume 3 - Issue 4 - p 209-216
doi: 10.1097/SIH.0b013e31818187f3

Abstract

In-hospital cardiac arrest resuscitation attempts are estimated to occur at a rate between 250,000 and 750,000 per year in the United States.1–3 This translates to between 1% and 2% of hospital admissions.4 Reports on the effectiveness of resuscitation attempts demonstrate a wide variation in success rates. Recently, a study revealed that the time to defibrillation is related to survival proportions for in-hospital cardiac arrest.5 Survival to discharge after in-hospital cardiac arrest has been estimated to range from 0% to 59% (reported variation likely because of patient population, clinical subgroup, study design, and method of data collection).3,6 Recognized variables that may impact survival include time of recognition of cardiac arrest, medical emergency team arrival, initiation of cardiopulmonary resuscitation (CPR), administration of first medication, compliance with time standards,6,7 time from onset of cardiac arrest to defibrillation shock,6,8,9 dedicated medical emergency teams,10,11 and level of training.2,12

Most hospitals have not adopted standards for resuscitation response times. In 2000, the American Heart Association (AHA) recommended guidelines that include the goal of delivering the first electrical shock within 2 minutes of identification of cardiac arrest.13,14 Similarly, the Brooke Army Medical Center CPR committee established time guidelines of initiation of CPR immediately after determination of cardiac arrest, less than 3 minutes to code team arrival, less than 3 minutes to defibrillation, and less than 10 minutes to first medication.6 Recent studies have confirmed that shorter intervals to code team arrival and defibrillation are associated with increased survival to discharge.7,15 More recent guidelines have focused on the need for 1.5 to 3 minutes of CPR before attempting defibrillation when cardiac arrest is unwitnessed or when CPR is estimated to have been delayed for 5 minutes or more.16 Nonetheless, reduced response times and administration of effective Advanced Cardiac Life Support (ACLS) are critical for survival of cardiac arrest.

ACLS training does not address coordination of team resources or team performance during resuscitation.17 Traditionally, ACLS training takes place in silos where participants rotate through different stations, demonstrating their knowledge and skills in isolation.17 Reports have also documented that ACLS training does not necessarily predict future performance in a real crisis situation.18,19 In most hospitals, Code Blue Teams (CBTs) respond to a patient experiencing cardiac arrest. CBTs are rarely observed, evaluated, or critiqued on performance in close temporal proximity to the crisis event. Furthermore, CBT training typically occurs on an individual basis through an ACLS course, and rarely in the context of multidisciplinary team training in an operational facility. Team composition is often dynamic because of scheduling and rotations and draws on different specialty areas. Thus, providers in a CBT may not have worked together as a team until a code blue event.

To reduce response times and improve the administration of effective ACLS, many healthcare facilities have instituted code team training, including simulation-based training,20 problem-based learning,21 and standard AHA-administered ACLS.12 Although simulation-based training seems to be superior to classic methods,20,21 its effectiveness has been difficult to assess. Use of historical data to compare performance before and after training is confounded by factors such as changes in equipment and personnel, and do-not-attempt-resuscitation rates. However, studies have shown improvement in survival after training.10,22,23 Thus, the use of simulation for resuscitation training may generate positive outcomes for patient safety and provider confidence by improving performance in a resuscitation situation.

An unfamiliar environment, such as a newly constructed hospital facility, may influence the performance of teams responding to a cardiac arrest. Although a program was presented to orient healthcare providers to our newly constructed hospital facility, this orientation took place under ideal conditions in a scheduled, systematic manner. We performed mock code blue scenarios with high-fidelity simulation to orient and assess the performance of our CBTs within our new facility. This is the first reported study using medical simulation as a tool to familiarize, orient, and audit CBT performance in a newly constructed healthcare facility.

MATERIALS AND METHODS

Study Design

We conducted a prospective study using high-fidelity simulation to orient CBTs to critical events in a new hospital facility, the Scott & White Center for Advanced Medicine (CAM). We measured response times, including arrival time of the first responder, arrival of the crash cart at the code site, arrival of the first six CBT responders, time of first chest compression, and time of first electrical shock. The study also served as a method to troubleshoot the new facility for patient safety factors; study investigators used observation and debriefing of participants after each mock code to elucidate factors that may have contributed to responses.

The study was presented to hospital administration and the Resuscitation Committee as an opportunity for hospital patient safety and quality care improvement. Before the study, the Scott & White Memorial Hospital Institutional Review Board granted it exempt status. Mock code blue exercises using simulation were implemented for the study. On initiation of the study, the CBT members and program directors were informed that they would be participating in mock code exercises.

Setting

All simulations took place at the Scott & White CAM, a newly constructed 417-bed hospital facility. A high-fidelity simulator, SimMan (Laerdal® Medical Corp., Stavanger, Norway), was used for the mock codes. The CAM is an eight-floor facility; we performed the mock codes on floors three to eight and in the first floor entrance area (Table 1). Units that respond to their own floor codes (Intensive Care and Emergency Department) were not included in the study. The timing for the mock codes was selected to maximize the number of resident participants and to take into consideration rotations of CBT members. To avoid biasing our results, we selected different days and hours of the day, including six times during the normal day shift, three times at night, and three times on the weekend.

Table 1: CAM Mock Code Schedule

Participants

Study participants included Scott & White CBTs, each of which consists of two intensive care unit (ICU) nurses and a nurse supervisor, an anesthesia resident, three internal medicine (IM) residents, a pharmacist, two respiratory therapists (RTs), and a chaplain. The majority of CBT members participating in this study were ACLS certified and experienced in responding to cardiac arrest. A minimum of three study investigators participated in each mock code.

Mock Code Blue Design

We performed a practice mock code blue before initiating the study. Information obtained in this mock code was used to modify and improve our study design.

Study-specific information, including the days, times, and locations of the codes, was not provided to the participants; however, the directors of each CBT member were informed of the mock code status. The SimMan simulator was strategically placed at different locations in the Scott & White CAM. The SimMan parameters for each scenario included a respiration rate of 0 and a ventricular fibrillation rhythm. Each mock code blue was called by a study investigator using a telephone to the operator, and the code was announced as a real Code Blue. Our patient rooms are equipped with an emergency code blue button that automatically conveys a Code Blue to the operator; to be consistent in measuring calls made from nonpatient rooms, we called all codes using a telephone. The operators were called by dialing the common operator number “0” or the hospital emergency number “4-2000.” If an actual code was called during the mock code, the mock code was immediately aborted.

For the first six mock codes, SimMan was placed in a patient bed (Fig. 1A), and during the second six mock codes, SimMan was placed outside a bed (Fig. 1B), ie, on a bathroom or hallway floor. The scenario of each code was realistically created according to the site of the mock code; eg, in the trauma unit, the patient was a 35-year-old man injured in a motorcycle accident, without a do-not-resuscitate (DNR) order, who became unconscious. The scenarios were unwitnessed events, and responders were informed that the patient was found unresponsive and to proceed with a full code, ie, non-DNR.

FIGURE 1. Staging of SimMan during mock Code Blue exercises. (A) SimMan in a patient bed; (B) SimMan on the bathroom floor of a patient room.

Teams were instructed that, when called to a mock code, they should proceed as in a real event: perform CPR, give first-line ACLS medications, and defibrillate as necessary. Following each mock code, participants were debriefed by a study investigator on their measured performance to assess any barriers to effective response and decision making. For debriefing, we used the Advocacy/Inquiry method of reflective learning24 and asked participants to complete a debriefing/comment form. All participants were notified of the study completion to ensure authentic responses to subsequent code blues.

Data Collection

Data were collected using two study forms: a data collection form was used to document demographics and response times, and a debriefing form was used to document participants' self-reported responses to their experience during the mock codes. All study time measurements were recorded from the time the code was announced by the operator. Participants were asked to score their perceived level of anxiety during the mock code exercise on a Visual Analogue Scale ranging from 0 to 100 mm, anchored at “not at all anxious” and “extremely anxious.” Participants were asked to rate their level of confidence in performing the mock codes on a Likert scale of 1 to 5, ranging from “not at all confident” to “extremely confident.” In addition, we measured the interval between the time the code was called, by dialing “0” (first six codes) or the hospital emergency Code Blue number “4-2000” (last six codes), and the hospital overhead announcement. All training exercises and debriefings were recorded on video to accurately document the responses to the scenario. After each mock code exercise, videotaped team responses were saved without any identifiable information. Data and videotapes were secured in a locked environment and were only accessible to key study personnel.
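To make the timing measurements above concrete, the following sketch shows how elapsed intervals might be derived from the moment the operator announced the code. This is an illustration only (the study used paper forms and video review); it is written in Python, and the field names and timestamps are hypothetical examples, not study data.

from datetime import datetime

# Illustrative sketch only: deriving elapsed response intervals from the moment the
# operator announced the code. Field names and timestamps are hypothetical examples,
# not data from the study's actual collection forms.

def elapsed_seconds(announced: str, event: str) -> float:
    """Seconds between the operator's announcement and a recorded event (same day)."""
    fmt = "%H:%M:%S"
    return (datetime.strptime(event, fmt) - datetime.strptime(announced, fmt)).total_seconds()

mock_code = {
    "code_announced": "14:02:10",
    "first_responder_arrival": "14:02:52",
    "crash_cart_arrival": "14:03:18",
    "first_chest_compression": "14:03:30",
    "first_electrical_shock": "14:07:51",
}

announced = mock_code["code_announced"]
for event, timestamp in mock_code.items():
    if event != "code_announced":
        print(f"{event}: {elapsed_seconds(announced, timestamp):.0f} s after announcement")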

Identification and Classification of Errors

Using group debriefings, individual debriefing forms, and observations, we identified errors that may have influenced CBT responses. The errors were categorized into four domains: facilities, human, organizational, and technical. The errors were then reported to the organization’s Resuscitation Committee for subsequent intervention and rectification.

Statistical Analysis

Baseline characteristics were summarized using medians (first and third quartiles) for continuous variables and frequencies and percentages for categorical variables, as appropriate. The primary variables were time of arrival of the first responder, time of arrival of the first six CBT members, time to arrival of the crash cart, time to initiation of chest compressions, and time to first electrical shock in cardiopulmonary arrest. Mean time differences were assessed using the two-sample t test as appropriate. Data management and statistical analysis were performed using SAS software, version 9.1.3 (SAS Institute Inc., Cary, NC). Microsoft® Office Excel 2002 (Microsoft Corp., Redmond, WA) was used to create graphical representations of the study results.
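As an outline of the summary approach described above, the sketch below computes medians with first and third quartiles and a two-sample t test. The study itself used SAS 9.1.3; this Python version is offered only as an illustrative assumption, and the response times and grouping are hypothetical examples rather than study data.

import numpy as np
from scipy import stats

# Hypothetical response times in seconds, measured from the operator's announcement.
# These are illustrative values, not the study's data.
first_responder = np.array([42, 27, 60, 38, 55, 20, 45, 66, 30, 59, 41, 48])
first_cbt_member = np.array([66, 50, 87, 70, 62, 90, 58, 75, 66, 80, 54, 68])

def summarize(times):
    """Return the median with the first and third quartiles, as reported in the paper."""
    q1, med, q3 = np.percentile(times, [25, 50, 75])
    return med, q1, q3

for label, times in [("first responder", first_responder),
                     ("first CBT member", first_cbt_member)]:
    med, q1, q3 = summarize(times)
    print(f"{label}: median {med:.0f} s (quartiles {q1:.0f}-{q3:.0f})")

# Two-sample t test comparing mean times between two groups (eg, first six vs. last
# six mock codes); the grouping here is again purely illustrative.
t_stat, p_value = stats.ttest_ind(first_responder[:6], first_responder[6:])
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")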

RESULTS

There were a total of 12 completed mock codes conducted at different locations of the CAM (Table 1) during the 3-month study period. A total of 12 different CBTs participated in the study. We collected 172 debriefing reports from the participants for data analysis.

Regarding recognition of the announced mock code site, 115 responders (67%) initially knew where the mock code site was located, and 32% did not. When asked about their familiarity with the new CAM, 54% reported that they were familiar, 41% were somewhat familiar, and 5% were not at all familiar. When asked about their familiarity with the equipment used, 71% reported that they were familiar, 25% were somewhat familiar, and 4% were not at all familiar.

When participants were asked to describe the effect of the mock code exercise on their perceived anxiety, the reported median perceived level of anxiety was 32.5 (quartiles 14.0–53.0). Additionally, when participants were asked to describe their level of confidence during the mock code exercise, 1% were not at all confident, 3% were a little confident, 30% were somewhat confident, 51% were quite confident, and 15% were extremely confident. When participants were asked if the simulation was beneficial, 69% reported yes, 27% reported somewhat, and 4% reported no.

Regarding the effect of the number dialed (“0” vs. “4-2000”) on the time the code was announced, the median interval between the time the code was called and the time the code was announced was 74 (quartiles 61–135) seconds when dialing “0” and 41 (quartiles 27–52) seconds when dialing “4-2000.”

We recorded time points and identified the specialties of first responders and CBT responders during the mock codes. The median arrival time of the first responder was 42 (quartiles 27–60) seconds (Fig. 2). In 7 of the 12 mock codes (58%), the first responder was not a CBT member; of these, 5 (71%) were floor nurses, one was an IM resident, and one was an attending physician (Table 2). In the remaining codes, the first responders were CBT members and included two IM residents, two RTs, and an ICU nurse (Table 2). The median arrival time of a non-CBT first responder was 38 (quartiles 20–59) seconds, and that of a CBT first responder was 60 (quartiles 14–66) seconds.

FIGURE 2. Response times measured from the announcement of the code. There were a total of 12 mock code blues conducted at different locations of the CAM using high-fidelity simulation. Times measured from the announcement of the code included the arrival of the first responder, first CBT member, sixth CBT member, crash cart arrival, initiation of chest compressions, and first electrical shock. The graphical representation of the results is reported as box plots showing the median with the first and third quartiles of the codes.

Table 2: Analysis of Mock Code Responders

The median arrival time of the first CBT responder was 66 (quartiles 50–87) seconds (Fig. 2). Of the first CBT responders, 33% were IM residents, 33% were RTs, 25% were ICU nurses, and 8% were chaplains (Table 2). The median arrival time of the sixth CBT responder was 132 (quartiles 121–158) seconds (Fig. 2). We further assessed the arrival times of the CBT responders over the course of the study and observed an apparent decrease in the arrival time of the first CBT responder; however, the change was not statistically significant (Fig. 3A). No change was observed for CBT responders 2 to 6.

FIGURE 3. (A) Arrival times of the Code Blue Team. There were a total of 12 mock code blues conducted at different locations of the CAM using high-fidelity simulation. Times were measured from the announcement of the code by the operator until the first through sixth CBT members arrived at the site of the code. The graphical representation shows the measured times of each of the 12 codes. (B) Times of crash cart arrival, initiation of chest compressions, and first electrical shock. There were a total of 12 mock code blues conducted at different locations of the CAM using high-fidelity simulation. Times were measured from the announcement of the code by the operator until the arrival of the crash cart, initiation of chest compressions, and first electrical shock.

We recorded time points of CBT treatment responses during the ventricular fibrillation scenarios. The median time to initiation of chest compressions was 80 (quartiles 62–137) seconds (Fig. 2). The median arrival time of the crash cart was 68 (quartiles 62–137) seconds (Fig. 2). The arrival time of the crash cart appeared consistent, except for codes 9 to 11, which deviated by up to an additional 90 seconds (Fig. 3B). We speculate that this variation reflects the particular code sites (public restroom, hallway, and waiting area), which are farther from the patient areas where crash carts are deployed (Table 1). The median time to first electrical shock was 341 (quartiles 294–481) seconds (Fig. 2). The median interval between initiation of chest compressions and first electrical shock was 267 (quartiles 197–427) seconds. We did not observe a statistically significant change in arrival time of the crash cart, initiation of chest compressions, or first electrical shock over the course of the study (Fig. 3B).

A majority of participants believed their responses were timely. Regarding arrival at the site of the code, 79% of the participants reported arriving on time; 95% reported initiation of chest compressions to be on time; and 64% perceived the time to electrical shock to be on time. Regarding the route taken to reach the code site, 51% reported using elevators, 25% reported using stairs, and 24% were on the same floor.

Patient safety factors identified through debriefing and observations were categorized as follows: 16.1% were facilities, 22.0% were human, 41.5% were organizational, and 20.3% were technical (Fig. 4). Topics such as orientation to the new facility, equipment, and team performance were discussed during the debriefing (Table 3). For example, requests and concerns of debriefed staff included adding phones and electrical outlets near elevators, improving code blue announcements to facilitate recognition of the code site, improving maps and signage, standardizing equipment locations, and placing crash carts in nonpatient areas of the CAM. Some convenient stairwell routes were found to have exit doors onto the floors locked for security purposes. A main concern that respondents expressed was the use of elevators, traffic around elevators, and elevators bypassing floors. Participants also commented that the mock code exercise created awareness about their perceived familiarity with the CAM and generated motivation to improve their knowledge of the hospital floor plan. All issues identified during the mock codes were immediately reported to, and resolved by, the hospital’s Resuscitation Committee.

FIGURE 4. Categorized patient safety factors. Mock code participants were debriefed following the simulation experience to elucidate any factors that may have delayed rapid response. Topics addressing orientation to the new facility, equipment, and team performance were discussed during the debriefing. Factors and issues identified through debriefing and observations were categorized as facilities, human, organizational, and/or technical.

Table 3: Summary of Feedback During Debriefing

DISCUSSION

Over the last decade, the growing use of simulation in healthcare has demonstrated significant value.25 New healthcare facilities have high potential to present unanticipated obstacles to rapid and effective response to a critical situation such as a cardiorespiratory arrest. The current study found that mock code simulation exercises provided a unique and useful method for orienting healthcare workers to a new hospital facility while auditing the facility for patient safety factors. Mock codes using low-fidelity simulation in an existing hospital were first described in 1986 by Sullivan and Guyatt.26 The present study advances this strategy by adding high-fidelity clinical simulation and reflective-learning debriefing strategies.

The results of this study suggest that classroom-based facility orientation alone may not be sufficient to ensure that CBT members can rapidly reach a code in a new facility. We found that even though the majority of CBT members thought that they were familiar with the facility, the median arrival time of the first CBT member was 66 seconds (1.1 minutes), and the median arrival time of the sixth member of the team was 132 seconds (2.2 minutes). Kinney et al.6 found that reducing CBT response time from 1.6 to 1.2 minutes increased survival significantly. Our results suggest that, until CBT members are completely familiar with a new facility, first responders, ie, non-CBT members, are left to manage the patient in cardiac arrest alone during the first critical moments. Thus, training non-CBT first responders in effective cardiac life support may lead to more effective management of the patient.

In an effort to increase survival from cardiac arrest in public facilities, automated external defibrillators (AEDs) are being advocated to treat cardiac dysrhythmias before the arrival of EMS personnel.27–31 However, there is a paucity of research on the use of AEDs in the hospital inpatient setting.32 In our facility, AEDs are being considered for use in patient areas, particularly by first responders who are not designated CBT members, to decrease the time to initial countershock.

The importance of early and effective chest compression and prompt defibrillation for inpatient survival is well documented. Kinney et al.6 found increased survival when the first defibrillation shock was delivered at 6.6 rather than 7.8 minutes. Other studies have found that reducing the interval to defibrillation to less than 5 minutes increases the survival rate further.8,9 The median time to first defibrillation attempt in the current study, 341 seconds (5.7 minutes), failed to meet the AHA14 goal of less than 2 minutes and the Brooke Army Medical Center6 goal of less than 3 minutes to defibrillation shock. As reviewed by Ristagno and colleagues,16 the 2005 CPR guidelines recommended chest compressions as the initial intervention before attempted defibrillation. These guidelines recommend 1.5 to 3 minutes of CPR before defibrillation during an unwitnessed event. In this study, the median time from initiation of chest compressions to defibrillation was 267 seconds (4.4 minutes). Human and technical factors may have contributed to the delay in the response of the CBT and the administration of the first electrical shock. In previous studies, many of the patients were in the ICU or monitored at the time of the arrest, and thus were better situated for a faster response, whereas in this study the patient was often located in a medical ward or other non-ICU location.6

This study helped us identify and address facility issues that could adversely affect patient safety. The CAM facility was constructed directly adjacent to the previous hospital. Issues such as floors not directly accessible from one building to the other and nonfunctional speakers were identified and reported to the Resuscitation Committee. During the planning stages of the new facility, efforts were made to integrate the design with the function of the existing hospital to provide safe, quality patient care. However, construction challenges are inevitable when multiple subcontractors build a facility. Thus, using simulation to troubleshoot a newly constructed facility is novel and may improve patient safety.33–35

This study provided a comprehensive needs assessment that was used to design and implement a customized system-wide simulation resuscitation training program. Focused training has been designed for personnel, such as floor nurses, who would respond to code blues as non-CBT members, to ensure the best training and most effective response. Similar needs assessments and initiatives are being conducted at our additional new facilities to prevent problems that could impact patient safety.

Our study did have some limitations. Experimental controls and preintervention data were not practical because of the natural course of time and the main objective of the study; thus, the specific factors contributing to the outcomes are difficult to identify. Several confounding factors inherent in the study design and objectives, such as time effects, dynamic team composition, and code site locations, may have influenced the results. Increased familiarity with the CAM, whether through increased awareness, the training itself, awareness of the exercise objectives from communication among staff, or unintended repetitive participation by staff, may also have contributed to responses. Further, we did not find a statistically significant change in the arrival times of CBT members 1 to 6, arrival of the crash cart, time to initiation of chest compressions, or time to first electrical shock over the course of the study. Additional studies are warranted to assess the effect of simulation-based training on response times.

Several participants indicated during debriefing that the training was not truly beneficial because the simulation was not “realistic,” because there was no clear leader or organization during the code, or because they were called away from clinical duties for an educational activity. We suspect that one factor contributing to some participants not embracing the effectiveness of simulation was our inability to prebrief them. Furthermore, the mock codes might have been more beneficial if the intervention had occurred during the construction phase; however, to maintain the reliability of participants’ responses and minimize data contamination, we performed the intervention as soon as the hospital opened.

The value of the study is multifaceted. A new methodology has been developed that uses simulated patients rather than actual patients to troubleshoot a new facility and identify patient safety factors. The simulation experiences served to orient staff and increase familiarity with (or awareness of) the new facility. The teams engaged in high-fidelity simulation training in a real workplace environment that included hands-on experience in cardiopulmonary resuscitative efforts. Through observation, the investigators evaluated CBT responses, and through debriefing, the scenario became a catalyst for the hospital staff to provide information to the investigators that otherwise may not have been discovered. Without debriefing, simulation is ineffective. By using effective debriefing techniques, such as the Advocacy/Inquiry method,24 the cognitive learning from the simulation, which may otherwise be limited by the technology, was maximized. The simulation-based orientation program produced several systems and facility improvements and identified team performance gaps for focused training to improve patient care. Future studies are warranted to address areas such as role transitions from the first non-CBT responder to a CBT member, perceptions of responses versus actuality, management of the code, leadership, team communication and flexibility, and team performance.

CONCLUSIONS

The results of the current study provide helpful information about the impact of a new facility on CBT performance and about how simulation can orient healthcare workers to a new healthcare facility. We suggest that simulation is a useful strategy for troubleshooting patient safety issues, policies, and procedures in existing facilities as well. The healthcare industry should consider adopting simulation for resuscitation team training in operational facilities because it exposes staff to crisis incidents within their work environment. Testing hospital facilities with simulated patients will allow healthcare leaders to audit the system of crisis care by identifying, addressing, and correcting breaches in patient safety in different domains and may prevent adverse events with real patients.

ACKNOWLEDGMENTS

The authors thank Laerdal Medical Corporation for their generous support in providing the simulator used to conduct this research. The authors also thank Temple College Clinical Simulation Center and the Scott & White leadership and staff who supported and significantly contributed to this project.

REFERENCES

1.Kaluski E, Uriel N, Milo O, Cotter G. Management of cardiac arrest in 2005: an update. Isr Med Assoc J 2005;7:589–594.
2.Dane FC, Russell-Lindgren KS, Parish DC, Durham MD, Brown TD. In-hospital resuscitation: association between ACLS training and survival to discharge. Resuscitation 2000;47:83–87.
3.Ballew K, Philbrick J. Causes of variation in reported in-hospital CPR survival: a critical review. Resuscitation 1995;30:203–215.
4.United States Department of Health and Human Services Agency for Healthcare Research and Quality. National and regional estimates on hospital use for all patients from the HCUP Nationwide Inpatient Sample (NIS). 2007.
5.Chan PS, Krumholz HM, Nichol G, Nallamothu BK. Delayed time to defibrillation after in-hospital cardiac arrest. N Engl J Med 2008;358:9–17.
6.Kinney KG, Boyd SY, Simpson DE. Guidelines for appropriate in-hospital emergency team time management: the Brooke Army Medical Center approach. Resuscitation 2004;60:33–38.
7.Peters R, Boyde M. Improving survival after in-hospital cardiac arrest: the Australian experience. Am J Crit Care 2007;16:240–246.
8.Gilmore CM, Rea TD, Becker LJ, Eisenberg MS. Three-phase model of cardiac arrest: time-dependent benefit of bystander cardiopulmonary resuscitation. Am J Cardiol 2006;98:497–499.
9.Skrifvars MB, Rosenberg PH, Finne P, et al. Evaluation of the in-hospital Utstein template in cardiopulmonary resuscitation in secondary hospitals. Resuscitation 2003;56:275–282.
10.Buist MD, Moore GE, Bernard SA, Waxman BP, Anderson JN, Nguyen TV. Effects of a medical emergency team on reduction of incidence of and mortality from unexpected cardiac arrests in hospital: preliminary study. BMJ 2002;324:387–390.
11.Henderson SO, Ballesteros D. Evaluation of a hospital-wide resuscitation team: does it increase survival for in-hospital cardiopulmonary arrest? Resuscitation 2001;48:111–116.
12.Moretti MA, Cesar LA, Nusbacher A, Kern KB, Timerman S, Ramires JA. Advanced cardiac life support training improves long-term survival from in-hospital cardiac arrest. Resuscitation 2007;72:458–465.
13.Kern K, Paroskos A. 31st Bethesda conference: emergency cardiac care. J Am Coll Cardiol 2000;35:825–880.
14.Guidelines 2000 for Cardiopulmonary Resuscitation and Emergency Cardiovascular Care. Part 6: advanced cardiovascular life support: section 4: devices to assist circulation. The American Heart Association in collaboration with the International Liaison Committee on Resuscitation. Circulation 2000;102(8 suppl):I105–I111.
15.Skrifvars MB, Castren M, Aune S, Thoren AB, Nurmi J, Herlitz J. Variability in survival after in-hospital cardiac arrest depending on the hospital level of care. Resuscitation 2007;73:73–81.
16.Ristagno G, Gullo A, Tang W, Weil MH. New cardiopulmonary resuscitation guidelines 2005: importance of uninterrupted chest compression. Crit Care Clin 2006;22:531–538, x.
17.American Heart Association. Advanced Cardiovascular Life Support Provider Manual. New York: American Heart Association; 2006.
18.David J, Prior-Willeard PF. Resuscitation skills of MRCP candidates. BMJ 1993;306:1578–1579.
19.Iirola T, Lund VE, Katila AJ, Mattila-Vuori A, Palve H. Teaching hospital physicians’ skills and knowledge of resuscitation algorithms are deficient. Acta Anaesthesiol Scand 2002;46:1150–1154.
20.Perkins GD. Simulation in resuscitation training. Resuscitation 2007;73:202–211.
21.Steadman RH, Coates WC, Huang YM, et al. Simulation-based training is superior to problem-based learning for the acquisition of critical assessment and management skills. Crit Care Med 2006;34:151–157.
22.Birnbaum ML, Robinson NE, Kuska BM, Stone HL, Fryback DG, Rose JH. Effect of advanced cardiac life-support training in rural, community hospitals. Crit Care Med 1994;22:741–749.
23.Winters BD, Pham JC, Hunt EA, Guallar E, Berenholtz S, Pronovost PJ. Rapid response systems: a systematic review. Crit Care Med 2007;35:1238–1243.
24.Rudolph J, Simon R, Dufresne R, Raemer D. There’s no such thing as “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthcare 2006;1:49–55.
25.Gaba DM. The future vision of simulation in health care. Qual Saf Health Care 2004;13(suppl 1):i2–i10.
26.Sullivan MJ, Guyatt GH. Simulated cardiac arrests for monitoring quality of in-hospital resuscitation. Lancet 1986;2:618–620.
27.Friedman FD, Dowler K, Link MS. A public access defibrillation programme in non-inpatient hospital areas. Resuscitation 2006;69:407–411.
28.Gombotz H, Weh B, Mitterndorfer W, Rehak P. In-hospital cardiac resuscitation outside the ICU by nursing staff equipped with automated external defibrillators—the first 500 cases. Resuscitation 2006;70:416–422.
29.Martinez-Rubio A, Gusi G, Guillaumet E, et al. The fully automatic external cardioverter defibrillator: reality of a new meaningful scenario for in-hospital cardiac arrests. Expert Rev Med Devices 2005;2:33–39.
30.Weil MH, Fries M. In-hospital cardiac arrest. Crit Care Med 2005;33:2825–2830.
31.Hanefeld C, Lichte C, Mentges-Schroter I, Sirtl C, Mugge A. Hospital-wide first-responder automated external defibrillator programme: 1 year experience. Resuscitation 2005;66:167–170.
32.Kaye W, Mancini ME, Giuliano KK, et al. Strengthening the in-hospital chain of survival with rapid defibrillation by first responders using automated external defibrillators: training and retention issues. Ann Emerg Med 1995;25:163–168.
33.Breen PT, Grinstein GG, Leger JR, Southard DA, Wingfield MA. Virtual design prototyping applied to medical facilities. Stud Health Technol Inform 1996;29:388–399.
34.Kobayashi L, Shapiro MJ, Sucov A, et al. Portable advanced medical simulation for new emergency department testing and orientation. Acad Emerg Med 2006;13:691–695.
35.Rodriguez-Paz J, Mark L, Herzer K, et al. Using in-situ simulation to establish a new intraoperative radiation therapy program: a novel multidisciplinary paradigm to patient safety. Simul Healthcare 2008;(suppl 3):57.
Keywords:

Resuscitation; Simulation; Code Blue; Cardiac arrest; New hospital facility; Critical event

© 2008 Lippincott Williams & Wilkins, Inc.