Empirical Investigations

In Situ Medical Simulation Investigation of Emergency Department Procedural Sedation With Randomized Trial of Experimental Bedside Clinical Process Guidance Intervention

Siegel, Nathan A. MD; Kobayashi, Leo MD; Dunbar-Viveiros, Jennifer A. RN; Devine, Jeffrey RN, NREMT-P; Al-Rasheed, Rakan S. MBBS; Gardiner, Fenwick G. BS; Olsson, Krister MFA; Lai, Stella BFA; Jones, Mark S. BA; Dannecker, Max NREMT-I; Overly, Frank L. MD; Gosbee, John W. MD, MS; Portelli, David C. MD; Jay, Gregory D. MD, PhD

Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare: June 2015 - Volume 10 - Issue 3 - p 146-153
doi: 10.1097/SIH.0000000000000083


Emergency department procedural sedation (EDPS) is an essential element of acute patient care. Commensurate with the increasing severity and complexity of emergency department (ED) patients and their presenting conditions, the directed use of sedatives and analgesics to effectively attain diagnostic and therapeutic end points is expanding in scope and frequency.1,2 Yet, the scientific study of EDPS safety is hampered by a lack of consistent definitions, the relatively low (reported) incidence of adverse events, and the variety of clinical practice models and pharmacologic agents used.3,4 Whereas recent reviews of EDPS patient safety in adult populations report adverse event rates between 0% and 18%3–14 (and pediatric rates between 5% and 31%15–25), criteria for inclusion in the analyses and reporting metrics were widely variable. Furthermore, although EDPS guidelines have been set forth,1,2,26–28 a formal, standardized curriculum to ensure provider preparation for optimal clinical practice of adult EDPS is not available. Given the preexisting demands and work hour limitations placed on busy clinical providers, a complementary approach to traditional training methods may involve on-the-spot, focused, and as-needed assistance at the time of EDPS conduct (as described for other procedural skills29–32).

Comprising presedation, induction, sedation, procedure, and recovery phases that generally progress in a sequential manner, EDPS can be conceptually modeled in a manner that fits well with simulation-based investigation.33 Each phase can be defined in terms of essential action elements, resources required, and metrics of minimum safe performance. Deviations from anticipated progression, errors, and adverse events can be readily overlaid onto a conceptual model to develop an EDPS study methodology for provider assessment (Figures, Supplemental Digital Content 1.1 and 1.2, which provide additional details on the EDPS model). Building on previous pilot sessions, investigators conducted the study phase of the Simulation Learning Initiative in Procedural Sedation Training for Routine Engagement of Anticipatory Maneuvers (SLIPSTREAM) program. This article presents the conduct, results, and analyses of the program’s experimental assessment of senior emergency medicine (EM) residents’ EDPS performance.

This research program explored the following objectives: (1) to assess EDPS provider performance through in situ simulation and (2) to concurrently develop and study the effect of an experimental just-in-time safety system.


The SLIPSTREAM study used a randomized, nonblinded, controlled experimental design. The 2011 pilot phase article by Kobayashi et al34 describes overall programmatic background, rationale, and decisions on investigative methodologies; simulation scenarios and metrics development; and a 10-subject novice-versus-expert comparison trial.

Setting and Sample

The study simulations were conducted in situ at an academic 719-bed regional referral hospital’s ED critical care rooms. Personnel from the hospital-affiliated simulation center and departments of emergency medicine and nursing education developed and implemented the program. All postgraduate year 3 (PGY-3) and PGY-4 residents in the institution’s EM residency were eligible to be in the study cohort on account of their exposure to the critical care room environment and lack of experience in independent adult EDPS performance (all adult live EDPS is conducted by attending physicians at the study site). A convenience sample of 24 subjects was recruited by investigators (N.A.S. and J.A.D.-V.) from the PGY-3 and PGY-4 classes over 3 academic years and assigned to study groups by Web application randomization table. The institutional review board approved the study protocol.


Simulation Scenarios

Investigators developed 2 interchangeable scenarios (A and B) for a SimMan manikin system (Laerdal, Wappingers Falls, NY). Each scenario featured unique EDPS safety-related details and a fully scripted and interactive simulated patient with an isolated shoulder dislocation that required EDPS for reduction (MedEdPORTAL35 and Table and Figures, Supplemental Digital Content 1.2 and 1.3, which provide additional details on full scenarios and supporting materials). Scenarios were constructed for a practice environment with 1 sedation-dedicated MD (subject), an orthopedic proceduralist MD (facilitator), and an EDPS RN (facilitator). Simulated hospital identifiers (bracelet, stickers), medical record, and diagnostic test results were made available. Facilitators, props, and manikin programming advanced the simulations while providing adequate latitude for subjects to independently make clinical decisions and perform critical EDPS actions with impact on patient safety. Scenarios focused on (1) deficiencies noted in EDPS provider chart documentation during prestudy review (as surrogate markers of potentially suboptimal clinical practice), (2) the need to routinely confirm the availability and function of essential EDPS equipment, and (3) adverse events associated with common EDPS medications. Subjects’ medical errors and noncompliance with safety behaviors were set to trigger and/or complicate deterioration of the simulated patient’s condition unless corrective actions were taken. Pilot sessions demonstrated the scenarios’ intersession/intersubject reliability for maximal consistency of simulation experience (see the pilot phase article’s supplemental Web content materials for reliability data).34

Study Metrics

Presimulation Web surveys collected data on subject demographics and clinical experience pertinent to EDPS. Investigators monitored in situ simulation setting parameters and scenario timeline characteristics (eg, time spent in presedation assessment and preparation) with checklists and video records, respectively, for all sessions.

Study scenarios were designed to detect best-practice performance of EDPS within the constraints of in situ simulation, institutional protocols, and facility clinical parameters. By design, subjects completing the required clinical actions and safety behaviors in an optimal manner would rate well on study metrics as they progressed through the scenario. Correct EDPS performance was defined for study purposes as follows: timely and adequate patient assessment, determination of potential difficulty with sedation, requisition of appropriate resources and supplies, replacement of defective equipment, time-out completion, administration of proper medications, avoidance of distractors, optimal management of adverse event, postsedation reassessment, pertinent documentation, and maintenance of situational awareness.

Investigators used the following tools to delineate subject performance of EDPS processes: (1) EDPS critical action research checklists derived from the literature5,8 and expert input, (2) existing institutional procedural sedation forms, (3) binary study probes derived from an EDPS hazard Haddon matrix, and (4) retrospective situational awareness forms (Text and Figures, Supplemental Digital Content 1.4, which display and describe the different study metrics in detail). The experimental element was studied with the Simulation EDPS Composite Patient Safety Score, calculated from completion of select EDPS research checklist critical actions by the subject. This scoring system’s ability to differentiate inexperienced and expert EDPS providers in simulated settings was demonstrated during the pilot study.34 These tools facilitated the consistent acquisition of objective performance data through offline video review by study investigators without conflicts of interest.

Postsimulation Web surveys queried subjects on their perception of simulations; the experimental group completed an additional survey section on the utility and usability of the experimental system.

Study Sessions

Each subject completed a presimulation Web survey, underwent a scripted orientation, and participated in the first simulation. Scenario preparation, progression, and termination followed strict protocols and a closely monitored timeline. The time of EDPS medication administration was used to synchronize programmed simulation events across study sessions, that is, hypoventilation consistently started at 5 minutes after induction, apnea after 6 minutes, and hypoxia to SpO2 73% after 7 minutes. Preprogrammed recovery from EDPS adverse event was triggered by appropriate ventilatory management; predefined time limits specified when hypoxic cardiac arrest would occur in case of inadequate resuscitative and supportive measures. A scenario was terminated when the subject either stated his or her completion of EDPS on the simulated patient or managed the scenario suboptimally, for example, endotracheal intubation with paralytic administration. After a reset of the simulation setting, each subject completed a second simulation, during which experimental subjects had access to the experimental system (with standardized 5-minute orientation). Each individual subject then completed a second-scenario situational awareness assessment, postsimulation Web survey, and debriefing on observed performance. Subjects received $150 gift certificates for participation.
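The programmed deterioration sequence described above (hypoventilation at 5 minutes after induction, apnea at 6 minutes, hypoxia to SpO2 73% at 7 minutes) can be sketched as a simple event timeline keyed to the induction time. This is an illustrative reconstruction, not the actual SimMan scenario program; the function name and structure are assumptions.

```python
# Illustrative sketch (not the study's SimMan program): the adverse-event
# timeline, keyed to the time of EDPS medication administration, expressed
# as minutes after induction per the study protocol.
TIMELINE = {
    5.0: "hypoventilation",
    6.0: "apnea",
    7.0: "hypoxia (SpO2 73%)",
}

def events_due(minutes_since_induction: float) -> list[str]:
    """Return the programmed events that have triggered by a given time."""
    return [event for t, event in sorted(TIMELINE.items())
            if minutes_since_induction >= t]
```

Synchronizing all programmed events to a single anchor (medication administration) is what allowed the investigators to compare sessions on a common timeline.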

Experimental Just-in-Time Patient Safety System

In an attempt to mitigate impaired teamwork and communication, inadequate situational awareness, workflow challenges, and the lack of a shared mental model within ED clinical teams,36–39 investigators developed the Graphic Link Information Distribution/Exchange for Provider Awareness and Teamwork in Healthcare (GLIDEPATH) system. This experimental just-in-time clinical process guidance system was a novel hardware-software hybrid, designed to emphasize preemptive task checklisting, enhance information sharing and team situational awareness, and promote provider vigilance and readiness for adverse events. Integrating off-the-shelf hardware with Web browser-based software, the stand-alone system accomplished these objectives through prominent and intuitive display of patient care status and specific clinical processes. (Of note, one intentional design element was the absence of a comprehensive, preparatory educational or training intervention, to emphasize the just-in-time nature of the system.) Automated guidance with prompting and active provider login verifications at critical decision points were coded into the system. Constructed with a modular architecture, core functions were accessed by Web browser software running on basic PC hardware; full functionality required a large-screen display, USB-based interactive interface (eg, barcode or radiofrequency identification reader), wireless hosting/networking, hardware installation mounts, and a wirelessly linked tablet device with mirroring software.

Study sessions used a Mac Mini (Apple, Cupertino, CA) with SeaMonkey browser (Mozilla, Mountain View, CA), USB touchpad (Ergo Touchpad, New York, NY), MS146 barcode reader (Unitech, Los Angeles, CA) and a 27-in HP2709 display (Hewlett Packard, Palo Alto, CA) as a networked base station, which linked to a 10-in iPad tablet (Apple, Cupertino, CA) via a secure ad hoc wireless network and AirDisplay screen mirroring software (Avatron, Portland, OR) for minimal interface lag. The custom-programmed software was developed for Flash (Adobe, San Jose, CA) by study investigators (K.O. and S.L.). System compatibility with other configurations was tested with Windows (Microsoft, Redmond, WA) stations linked to Android (Google, Mountain View, CA) devices running common Web browsers (Fig. 1; the full system is accessible online at [account, treeaxis; password, glidepath]; type in “0” after start screen and select Adult/Sedation module.)

GLIDEPATH system. A, System hardware installed on site in study ED with touch interface and base station. B, System software that provides clinical process prompts and automated checklisting for EDPS (image courtesy of L. Kobayashi).

The EDPS module prototype used during SLIPSTREAM sessions contained working visual cue process reminders, positive confirmation requirements (eg, patient identification time-out), and interactive checklisting prompts with haptic interface to guide and reinforce safe sedation practice in end users. Specific user interface items were derived from prestudy institutional chart review for documentation lapses (as potential indicators of clinical practice hazards), review of published literature for common and severe EDPS complications, pilot phase findings of resuscitation equipment deficiencies,34 expert input, and EDPS conceptual model.

Data Extraction and Statistical Analysis

Descriptive analyses were performed on subject demographic data by study group. The small sample size precluded robust analysis for failure of randomization during subject assignment to control group or experimental group. To test for a potentially confounding effect of subject experience on observed performance, post hoc Spearman rank-order correlation coefficients (rS) were calculated (1) with all subjects’ postgraduate level and baseline performance, that is, first-scenario Composite Score, as well as (2) with all subjects’ previous EDPS clinical experience and baseline performance. Session monitoring data on simulation settings and timelines were descriptively analyzed.
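The post hoc Spearman rank-order correlation described above can be sketched with a standard-library-only computation (rank the two variables, then take the Pearson correlation of the rank vectors). The data below are hypothetical and illustrative only, not the study's.

```python
from statistics import mean

def ranks(xs):
    """1-based ranks with midrank handling of ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        # Group tied values and assign each the average of their ranks.
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman_rho(x, y):
    """Spearman r_s = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical data (not the study's): postgraduate year vs.
# first-scenario Composite Score for 8 subjects.
rho = spearman_rho([3, 3, 3, 4, 4, 4, 3, 4], [7, 8, 6, 9, 7, 8, 7, 9])
```

A weak |r_s| (as reported in the study, 0.1–0.3) suggests baseline performance was not strongly driven by training level or prior EDPS exposure.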

Data from comprehensive review of research checklists, subjects’ EDPS charting, situational awareness forms, audiovisual records, and SimMan files were extracted and compiled for descriptive characterization of overall in-simulation EDPS performance. Data on subject detections of study probes, completion of research checklist items, and situational awareness were analyzed with Fisher exact and Mann–Whitney U tests.
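For the binary study-probe comparisons, the Fisher exact test named above can be illustrated with a standard-library sketch: the two-sided p-value sums the hypergeometric probabilities of all 2x2 tables (with the observed margins) no more likely than the observed table. The example counts are hypothetical, not the study's data.

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]]:
    sum hypergeometric probabilities of tables as or less likely than
    the observed one, holding the margins fixed."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def table_p(x):
        # Probability of a table with x in the top-left cell.
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)

    p_obs = table_p(a)
    lo = max(0, col1 - (n - row1))
    hi = min(row1, col1)
    return sum(table_p(x) for x in range(lo, hi + 1)
               if table_p(x) <= p_obs + 1e-12)

# Hypothetical example: a probe detected by 8 of 12 control vs.
# 9 of 12 experimental subjects -> table [[8, 4], [9, 3]].
p = fisher_exact_p(8, 4, 9, 3)
```

Exact tests of this kind are the usual choice for the small cell counts that a 24-subject cohort produces, where chi-square approximations are unreliable.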

To manage investigator conflicts of interest, data extraction for the experimental study elements was completed independently by 2 disinterested investigators (N.A.S., J.A.D.-V.) to derive the subjects’ Composite Scores; intraclass correlation coefficients were calculated for their interobserver agreement. Expert statistical input was accessed to analyze all Composite Scores using a binomial generalized linear mixed model, nesting observations within each participant and with a prespecified α level of 0.05 (SAS version 9.3 software; SAS Institute, Cary, NC). Study group, scenario type (A and B), simulation order (first or second), and rater were treated as fixed effects along with all interactions, with the group–by–simulation order interaction treated as the primary hypothesis test for differential change according to study group.

Postsimulation survey data on subjects’ perception of simulation session were examined for differences between study groups (Mann–Whitney U test). Experimental group participants’ responses regarding the experimental system’s utility and usability were compiled.


Twenty-four PGY-3/PGY-4 resident subjects (56% of potential participant pool) were recruited over 3 academic years between 2010 and 2013. Of 48 potential subjects, 5 were excluded because of participation in pilot phase simulations, and 19 of the remaining eligible residents did not participate. There were no significant differences in participant and study group baseline characteristics (see Table 1 for details).

Study Subject Characteristics

Delineation of Simulated EDPS Performance

In-simulation checklists confirmed consistency of monitored in situ simulation parameters for all sessions. During their first/baseline scenarios, subjects spent 13.9 ± 3.0 minutes (range, 8.1–19.3 minutes) in the presedation phase; their sedation induction, maintenance, and recovery phases were completed in 17.4 ± 1.2 minutes (range, 16.0–20.6 minutes). Both groups completed their second scenarios with similar timeline characteristics.

Emergency department procedural sedation medications used during first scenarios were etomidate (16 subjects), ketamine (5, including one dosing error by experimental subject), and midazolam with fentanyl (3); medication regimens were similar across groups and during second scenarios (no errors). During their first scenarios, 5 subjects in each study group were distracted from EDPS by engaging fully in shoulder reduction maneuvers; 6 control and 7 experimental subjects were distracted during second scenarios. There were no significant differences in the performance and timeliness of ventilatory resuscitation between groups for all scenarios: the mean delay until resuscitation was 2.0 ± 1.0 minutes (range, 0.7–5.4 minutes), including 2 laryngeal mask airway deployments and 2 rapid sequence induction endotracheal intubations.

EDPS Critical Action Research Checklists

Emergency department procedural sedation critical action checklists scored control subjects for first-scenario presedation assessment, presedation equipment check, and postsedation assessment at 70% ± 13%, 62% ± 23%, and 34% ± 9%, respectively; the experimental subjects were scored similarly. Control group checklist scores did not change during repeat scenarios; the experimental group exhibited increased performance of only presedation time-outs (P < 0.01) and postsedation assessments (P < 0.01) for their second scenarios (Tables, Supplemental Digital Content 2.1a–c, which provide all EDPS research checklist form completion data).

Institutional Procedural Sedation Forms

The subjects in both study groups completed similar levels of documentation on first-scenario presedation, sedation monitoring, and postsedation institutional form items. Only experimental group subjects increased their documentation of time-outs and postsedation checks during second scenarios (see Tables, Supplemental Digital Content 2.2a–c, which provide all institutional EDPS form completion data).

Binary Study Probes

Detection of the difficult sedation probe was 67% for the control group and 75% for the experimental group at baseline. Two control subjects and 1 experimental subject requested difficult airway management devices during first scenarios; reversal medications were appropriately requisitioned before induction for 25% of all scenarios with reversible EDPS (n = 32) and used on 6 occasions. Adverse event management was scored as optimal for 94% of the scenarios. There were no significant between-group differences or within-group changes across scenarios with respect to the study probes.

Retrospective Situational Awareness Forms

Subjects’ situational awareness for their second scenario was 51% ± 7% (range, 38%–62%) for the control group and 58% ± 12% (range, 38%–85%; P = 0.15) for the experimental group on a 100% score metric; no significant differences were detected on the analysis of each situational awareness element (see Table, Supplemental Digital Content 2.3, which displays all subject situational awareness form data).

Change in Performance Across Scenarios and Effect of Experimental System

Two investigators without conflicts of interest independently reviewed all simulations to test for learning effect in each subject and for the effect of the experimental system across groups and scenarios; their intraclass correlation values were high (0.88–0.97). Control subjects scored 7.7 ± 1.2 (mean of both investigators’ scores) of 10 on their first-simulation Composite Scores. Experimental subjects were similarly scored at 6.9 ± 1.8 (P = 0.40) for their first simulations. Post hoc analyses revealed weak correlations between subjects’ duration of EM residency training or live EDPS experience and their initial Composite Scores (rS = 0.1–0.3, nonsignificant).

Changes (Δ) in the Composite Score from the first to the second scenario were not significant for the control group. Increases in time-out completions (P < 0.01) and postsedation assessment checklist completion (P < 0.01) across scenarios for the experimental group were the only significant changes for both groups for all Composite Score metrics; these accounted for that group’s Composite Score Δ (1.9 ± 1.9) and the between-group performance difference (Δ[Δ]). The Δ[Δ] attained significance on independent analysis using a binomial generalized linear mixed model (P < 0.01) (Table 2).

Comparison of Control Group and Experimental Group Simulation EDPS Composite Scores

Postsimulation Survey on Simulation and Experimental System

On 11-point ordinal scales (0–10), all subjects scored their simulation experience as realistic (median score, 8), relevant (10), and having impact on their clinical practice (7.5). Experimental subjects reported moderate perceived utility of the experimental system for prevention of medical error during EDPS (see Table, Supplemental Digital Content 2.4, which summarizes all postsimulation survey response data).


Procedural sedation in acute care settings is a common, important, and indispensable element of medical care. The tasking of EM resident subjects with the safe and effective administration of appropriate (simulated) procedural sedation enabled the objective delineation of their EDPS-related performance and observance of patient safety behaviors. Study sessions revealed that subjects frequently omitted important elements of EDPS preparation (eg, review of previous sedation/complication history; assessment of injury mechanism/site/side), were easily distracted, and charted incompletely. In-simulation probes that gauged preparedness for adverse events often remained undetected, and investigators recorded suboptimal performance of time-outs and postsedation reassessments. Conversely, subjects consistently used proper EDPS medications and managed adverse events in a timely manner.

Study findings seem to indicate that senior-level EM residents have the medical knowledge base and procedural skill set to perform EDPS but lack some of the nontechnical skills (ie, anticipatory planning and preparing, information exchange/charting, maintaining situational awareness) that pertain to ED microsystem functions and patient safety. Emergency department trainees may therefore benefit from efforts to explicitly instill a broad awareness and greater appreciation for safety measures during routine as well as emergent, clinical situations. Incorporation of this mindset into EM training may be facilitated by simulation methods, wherein the participant’s adherence to (or neglect of) recommended patient safety behaviors can immediately alter scenario progression and effect experiential learning. Accordingly, study findings have been presented at the research site’s EM residency conference and shared with administrative and educational leaders—curricular adjustments for ongoing residency simulation sessions are anticipated. The insight and lessons learned from SLIPSTREAM assessments are also being applied to the institution-wide, simulation-based in-servicing and credentialing of all sedation providers who perform “out of operating room” procedural sedations. These measures are in concordance with recent, successful efforts at other health care facilities to achieve institutional standardization in provider skill sets and to attain minimum safe performance requirements of sedation practice.40–46 With respect to SLIPSTREAM program continuation, developed materials are available through the AAMC MedEdPORTAL for collaborative educational and investigative efforts.35

As for the research program’s experimental system, the bedside clinical informatics prototype exhibited limited impact on safety behaviors during simulated EDPS. Although the Composite Score metric was seen to improve in the experimental group, this resulted primarily from increased time-out completions rather than a comprehensive adjustment of the subjects’ approach to EDPS patient safety. Despite reasonable functionality as a workflow guidance system with graphical interface, checklists, automated prompts, and reminder systems, the experimental prototype did not improve subjects’ performance of critical actions in several areas. Given the appearance of some positive utility (and general noninferiority of GLIDEPATH-assisted subjects’ performance when compared with the control group’s EDPS conduct), continued development with training module and use testing may be indicated.


Study design precluded blinding of subjects and investigators to group assignment. Sample size was limited by small target pools and available funding, such that there may have been problems with group randomization and the detection of true performance differences (type II error). (The weak correlations between subjects’ training background or self-reported live EDPS experience and their first-simulation Composite Scores argue against this.) In terms of the EDPS simulation research methodology and its assessment of expert-level performance levels, the Simulation EDPS Safety Composite Score metric was tested during pilot phase sessions only on novice interns and experienced attending EDPS providers. Its ability to serve as a precise, gradational differentiator of provider performance (and EDPS safety) at intermediate and higher levels had not been examined; the effect of the Composite Score’s relatively low resolution, 10-point spectrum, and the potential ceiling effect on study findings are unclear. In addition, the study’s experimental objective precluded the training of subjects on EDPS practice—this likely contributed toward the program’s inability to elicit meaningful advancement toward a practice level approximating that of more experienced providers.

Testing of the experimental system did not feature formal usability evaluations or a diverse end user sample. It remains unclear why the experimental system’s explicit prompts and checklists were unable to elicit sizeable changes in recognition of high-risk EDPS patients and checking of essential equipment. Problems with the experimental system’s approach (just-in-time exposure without preceding educational component),47 end user difficulties with the novel technology, and research methodology limitations may all have contributed to the underwhelming results.


Advanced simulations were successfully applied in situ to assess EDPS clinical practice and related safety behaviors in senior EM residents. An experimental, just-in-time bedside clinical process guidance system primarily improved time-out completions without overall improvement in subjects’ in-simulation safety behaviors or performance.


The authors acknowledge Anna C. Cousins for her insight and assistance in manuscript preparation and Jason Machan, PhD, for his assistance with the statistical analysis.


1. American College of Emergency Physicians (ACEP). Policy Statement on Sedation in the Emergency Department 2011. Available at: Accessed October 30, 2014.
2. O’Connor RE, Sama A, Burton JH, et al. American College of Emergency Physicians Sedation Task Force. Procedural sedation and analgesia in the emergency department: recommendations for physician credentialing, privileging, and practice. Ann Emerg Med 2011; 58: 365–370.
3. Green SM. Research advances in procedural sedation and analgesia. Ann Emerg Med 2007; 49: 31–36.
4. Miner JR, Krauss B. Procedural sedation and analgesia research: state of the art. Acad Emerg Med 2007; 14: 170–178.
5. Miner JR, Martel ML, Meyer M, Reardon R, Biros MH. Procedural sedation of critically ill patients in the emergency department. Acad Emerg Med 2005; 12: 124–128.
6. Campbell SG, Magee KD, Kovacs GJ, et al. Procedural sedation and analgesia in a Canadian adult tertiary care emergency department: a case series. CJEM 2006; 8: 85–93.
7. Mensour M, Pineau R, Sahai V, Michaud J. Emergency department procedural sedation and analgesia: a Canadian Community Effectiveness and Safety Study (ACCESS). CJEM 2006; 8: 94–99.
8. Symington L, McGugam E, Graham C, Gordon M, Thakore S. Training in conscious sedation techniques: meeting the recommendations of the UK Academy of Medical Royal Colleges. Emerg Med J 2007; 24: 576–578.
9. Adams ST, Woods C, Lyall H, Higson M. Standards of practice in UK emergency departments before, during and after conscious sedation. Emerg Med J 2008; 25: 728–731.
10. Hodkinson PW, James MF, Wallis LA. Emergency department procedural sedation practice in Cape Town, South Africa. Int J Emerg Med 2009; 2: 91–97.
11. Miner JR. Procedural sedation and analgesia research. Methods Mol Biol 2010; 617: 493–503.
12. Harvey M, Cave G, Betham C. Contemporary sedation practice in a large New Zealand emergency department. N Z Med J 2011; 124: 36–45.
13. Smally AJ, Nowicki TA, Simelton BH. Procedural sedation and analgesia in the emergency department. Curr Opin Crit Care 2011; 17: 317–322.
14. Weaver CS, Terrell KM, Bassett R, Swiler W, et al. ED procedural sedation of elderly patients: is it safe? Am J Emerg Med 2011; 29: 541–544.
15. Pena BM, Krauss B. Adverse events of procedural sedation and analgesia in a pediatric emergency department. Ann Emerg Med 1999; 34: 483–491.
16. Coté CJ, Notterman DA, Karl HW, Weinberg JA, McCloskey C. Adverse sedation events in pediatrics: a critical incident analysis of contributing factors. Pediatrics 2000; 105: 805–814.
17. Krauss B, Green SM. Sedation and analgesia for procedures in children. N Engl J Med 2000; 342: 938–945.
18. Malviya S, Voepel-Lewis T, Prochaska G, Tait AR. Prolonged recovery and delayed side effects of sedation for diagnostic imaging studies in children. Pediatrics 2000; 105: E42.
19. Pitetti RD, Singh S, Pierce MC. Safe and efficacious use of procedural sedation and analgesia by nonanesthesiologists in a pediatric emergency department. Arch Pediatr Adolesc Med 2003; 157: 1090–1096.
20. Roback MA, Wathen JE, Bajaj L, Bothner JP. Adverse events associated with procedural sedation and analgesia in a pediatric emergency department: a comparison of common parenteral drugs. Acad Emerg Med 2005; 12: 508–513.
21. Doyle L, Colletti JE. Pediatric procedural sedation and analgesia. Pediatr Clin North Am 2006; 53: 279–292.
22. Sacchetti A, Stander E, Ferguson N, Maniar G, Valko P. Pediatric procedural sedation in the community emergency department: results from the ProSCED registry. Pediatr Emerg Care 2007; 23: 218–222.
23. Misra S, Mahajan PV, Chen X, Kannikeswaran N. Safety of procedural sedation and analgesia in children less than 2 years of age in a pediatric emergency department. Int J Emerg Med 2008; 1: 173–177.
24. Leroy PL, Gorzeman MP, Sury MR. Procedural sedation and analgesia in children by non-anesthesiologists in an emergency department. Minerva Pediatr 2009; 61: 193–215.
25. Couloures KG, Beach M, Cravero JP, Monroe KK, Hertzog JH. Impact of provider specialty on pediatric procedural sedation complication rates. Pediatrics 2011; 127: e1154–e1160.
26. Emergency Nurses Association (ENA). Position Statement: Procedural Sedation and Analgesia in the Emergency Department. 2005. Available at: Accessed March 16, 2015.
27. ENA and ACEP joint position statement: delivery of agents for procedural sedation and analgesia by emergency nurses. Ann Emerg Med 2005; 46: 368.
28. The Royal College of Anaesthetists and the College of Emergency Medicine. Safe Sedation of Adults in the Emergency Department 2012. Available at: Accessed October 30, 2014.
29. Nishisaki A, Donoghue AJ, Colborn S, et al. Effect of just-in-time simulation training on tracheal intubation procedure safety in the pediatric intensive care unit. Anesthesiology 2010; 113 (1): 214–223.
30. Sam J, Pierse M, Al-Qahtani A, Cheng A. Implementation and evaluation of a simulation curriculum for paediatric residency programs including just-in-time in situ mock codes. Paediatr Child Health 2012; 17: e16–e20.
31. Kamdar G, Kessler DO, Tilt L, et al. Qualitative evaluation of just-in-time simulation-based learning: the learners’ perspective. Simul Healthc 2013; 8: 43–48.
32. Scholtz AK, Monachino AM, Nishisaki A, Nadkarni VM, Lengetti E. Central venous catheter dress rehearsals: translating simulation training to patient care and outcomes. Simul Healthc 2013; 8: 341–349.
33. Blike GT, Christoffersen K, Cravero JP, Andeweg SK, Jensen J. A method for measuring system safety and latent errors associated with pediatric procedural sedation. Anesth Analg 2005; 101: 48–58.
34. Kobayashi L, Dunbar-Viveiros J, Devine J, et al. Pilot phase findings from high-fidelity in situ medical simulation investigation of emergency department procedural sedation. Simul Healthc 2012; 7: 81–94.
35. Kobayashi L, Overly F, Gosbee J. Emergency department procedural sedation in situ simulation materials (SLIPSTREAM SD-R study scenarios A + B). MedEdPortal 2012. Available at: Accessed October 30, 2014.
36. Wears R, Leape LL. Human error in emergency medicine. Ann Emerg Med 1999; 34: 370–372.
37. Chisholm CD, Collison EK, Nelson DR, Cordell WH. Emergency department workplace interruptions: are emergency physicians “interrupt-driven” and “multitasking”? Acad Emerg Med 2000; 7: 1239–1243.
38. Schenkel S. Promoting patient safety and preventing medical error in emergency departments. Acad Emerg Med 2000; 7: 1204–1222.
39. Flowerdew L, Brown R, Vincent C, Woloshynowych M. Identifying nontechnical skills associated with safety in the emergency department: a scoping review of the literature. Ann Emerg Med 2012; 59: 386–394.
40. Hoffman GM, Nowakowski R, Troshynski TJ, Berens RJ, Weisman SJ. Risk reduction in pediatric procedural sedation by application of an American Academy of Pediatrics/American Society of Anesthesiologists process model. Pediatrics 2002; 109: 236–243.
41. Babl FE, Priestley S, Krieser D, et al. Development and implementation of an education and credentialing programme to provide safe paediatric procedural sedation in emergency departments. Emerg Med Australas 2006; 18: 489–497.
42. Pershad J, Gilmore B. Successful implementation of radiology sedation service staffed exclusively by pediatric emergency physicians. Pediatrics 2006; 117: e413–e422.
43. Pitetti R, Davis PJ, Redlinger R, White J, Wiener E, Calhoun KH. Effect on hospital-wide sedation practices after implementation of the 2001 JCAHO procedural sedation and analgesia guidelines. Arch Pediatr Adolesc Med 2006; 160: 211–216.
44. Priestley S, Babl FE, Krieser D, et al. Evaluation of the impact of a pediatric procedural sedation credentialing programme on quality of care. Emerg Med Australas 2006; 18: 498–504.
45. Shavit I, Keidan I, Hoffman Y, et al. Enhancing patient safety during pediatric sedation: the impact of simulation-based training of nonanesthesiologists. Arch Pediatr Adolesc Med 2007; 161: 740–743.
46. Shavit I, Steiner IP, Idelman S, et al. Comparison of adverse events during procedural sedation between specially trained pediatric residents and pediatric emergency physicians in Israel. Acad Emerg Med 2008; 15: 617–622.
47. Leape L. The checklist conundrum. N Engl J Med 2014; 370: 1063–1064.

A Haddon matrix with preevent (EDPS preparation), event (EDPS, monitoring, and adverse event), and postevent (EDPS adverse event management and recovery) phases was applied with the following probe-assigned attributes: host factors: patient characteristics impacting EDPS; agent factors: provider EDPS abilities and readiness for complications; environmental factors: EDPS/resuscitation equipment functionality and availability, and institutional EDPS policies/procedures.


Keywords

Adverse effects; Decision support, computerized; Deep sedation; Emergency department; Emergency treatment; Graduate medical education; Health care quality improvement; Moderate sedation; Patient safety; Patient simulation; Safety management

Supplemental Digital Content

© 2015 Society for Simulation in Healthcare