
Empirical Investigations

The Effectiveness of Video-Assisted Debriefing Versus Oral Debriefing Alone at Improving Neonatal Resuscitation Performance

A Randomized Trial

Sawyer, Taylor DO, MEd; Sierocka-Castaneda, Agnes MD; Chan, Debora PharmD; Berg, Benjamin MD; Lustik, Mike MS; Thompson, Mark MD

Author Information
Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare: August 2012 - Volume 7 - Issue 4 - p 213-221
doi: 10.1097/SIH.0b013e3182578eae


With the release of the sixth edition of the Neonatal Resuscitation Program (NRP) course, simulation-based training has become the standard method of teaching neonatal resuscitation.1 The classroom portion of the sixth edition of the NRP course focuses almost exclusively on teaching hands-on skills through simulated scenarios and debriefing.2 Early research on simulation-based training in neonatal resuscitation showed it to be feasible and well received by participants.3 There are limited data, however, on the effectiveness of simulation-based training at improving neonatal resuscitation performance, both in simulated and real clinical scenarios.4 Prior studies reporting outcomes after a single simulation-based training session have failed to demonstrate significant improvements in neonatal resuscitation performance.5–7 A recent report on the use of deliberate practice with simulation demonstrated significant improvements in overall neonatal resuscitation performance, improvements in positive-pressure ventilation skills, and a decrease in time to emergent umbilical access and administration of intravenous (IV) medications.8

Facilitated debriefing is a critical component of effective learning in simulation-based education.9–11 However, the optimal method by which to conduct facilitated debriefing remains to be defined.11–13 Video review has been promoted as a modality to improve the quality of facilitated debriefing, and the new NRP Instructor DVD provides detailed instructions and examples on how to conduct a facilitated debriefing using video.14 A recent review on debriefing as part of the learning process in simulation-based education found that additional research is needed to investigate learning outcomes based on video playback versus no video playback during debriefing.11 Prior experimental outcomes comparing the benefits of video-assisted debriefing to oral debriefing alone have been mixed. Studies of simulation-based training in anesthesia have failed to show a benefit of video-assisted debriefing over oral debriefing, for either technical performance or nontechnical skills.15,16 Chronister and Brown17 found faster response times to initiate cardiopulmonary resuscitation and time to defibrillation in a group of nursing students provided with video-assisted debriefing as compared with a group that received oral debriefing only, but overall performance did not differ between the 2 groups. To the authors’ knowledge, there have been no previous studies comparing the effectiveness of oral debriefing versus video-assisted debriefing at improving neonatal resuscitation performance. We therefore performed this study to evaluate the educational benefit of video-assisted debriefing compared with oral debriefing alone in simulation-based neonatal resuscitation training. Our hypothesis was that video-assisted debriefing would be more effective at improving neonatal resuscitation performance when compared with traditional oral debriefing alone.



METHODS

The study followed a prospective randomized design. The Sim-PICO framework (P = population, I = intervention, C = comparator, O = outcome)11 of the study is provided in Figure 1. The residents were randomized to receive oral debriefing or video-assisted debriefing while completing a series of 3 neonatal resuscitation simulations. Three simulation sessions were chosen to establish a baseline level of performance, provide an opportunity for practice, and evaluate the effect of the practice. Measurements of neonatal resuscitation performance and times to complete critical tasks in resuscitation were compared on the first (pretest), the second, and the third (posttest) simulation sessions.

Figure 1:
Sim-PICO (P = population, I = intervention, C = comparator, O = outcome) format of the study.11 The vertical columns in the matrix represent the generic dimensions of debriefing. In this study, the intervention was video-assisted debriefing, and the comparator was oral debriefing alone. The aim of the study was to answer a question related to the “what” of debriefing. The outcome was the difference in posttest scores on the NRPE and times to complete critical tasks in resuscitation. Note that the outcome crosses multiple dimensions (columns) because the other factors can influence the outcome. However, as noted in the matrix, the “what” column is what was altered and tested in this study.


All 38 pediatric and family medicine residents at Tripler Army Medical Center were invited to be the subjects in the study. Tripler Army Medical Center is a 450-bed tertiary care center in Honolulu, with approximately 3000 deliveries and 300 neonatal intensive care unit admissions per year. The participating subjects were randomly paired into teams of 2 members each. Each subject was teamed with another subject from the same year group and within the same training program if possible. Each team completed a series of 3 standardized neonatal resuscitation simulations. Each simulation was followed by a facilitated debriefing. The teams were randomly assigned to receive either oral debriefing alone or video-assisted debriefing after each simulation session. The team members and debriefing method remained the same throughout all 3 simulations, and all scoring was based on team, not individual, performance. Informed consent was obtained from all subjects. The study protocol was approved by the institutional review board at Tripler Army Medical Center. Investigators adhered to the policies for protection of human subjects as prescribed in Title 45 Code of Federal Regulations, Part 46.

Simulation Environment

All simulations were performed in a realistic simulated delivery room. The subjects received assistance during each simulation from one of the researchers (D.C.) who acted as a labor and delivery nurse. During the simulations, the “nurse” was able to perform some steps of NRP, with explicit direction from the residents, including checking heart rate, providing chest compressions, and drawing up medications. The nurse did not provide direction in NRP, make treatment decisions, or assist with medication dosages.

Simulation Sessions

The simulator used was a SimBaby (Laerdal Medical, Wappingers Falls, NY), which had been specially modified to include an integrated umbilical catheterization task trainer.18 Before each session, the subjects received a brief introduction to the Laerdal SimBaby, including the simulator’s visual (perioral cyanosis and chest wall movement), auditory (breath and heart sounds), and tactile (brachial and femoral pulses) cues along with a brief review of the functionality of the umbilical catheterization task trainer.

Simulation sessions were scheduled on separate days with a goal of no longer than 2 months between simulation sessions to avoid natural decline in skills over time. Simulation sessions were spaced apart in time, rather than repeated on the same day, to take advantage of the “spacing effect,” a well-described phenomenon in educational psychology and medical simulation that, for a given amount of study time, spaced study yields better learning than does massed study.19,20

The initial condition of the infant in all simulations was limp, apneic, cyanotic, and bradycardic (heart rate <60 beats per minute). All simulations required intubation and IV epinephrine. The second and the third simulations also required a normal saline bolus. Pulse oximetry and cardiorespiratory monitors were not used during the simulations because these were not standard practice at all neonatal resuscitations at our institution at the time of the study and the subjects were asked to rely solely on physiologic cues (auscultation of breath sounds, palpation of pulse, etc) from the simulator.

At the start of each simulation, the team was called into the simulated delivery room and given a brief history leading up to the delivery. The teams were then allowed 60 seconds to prepare for the resuscitation and ready their equipment. All simulations were timed, and a maximum time of 10 minutes was allowed for each simulation session, regardless of how far in the resuscitation the team had progressed. Simulation sessions could be stopped earlier than 10 minutes if all appropriate steps in the resuscitation had been completed. At the time of the study, the subjects did not receive simulation training in neonatal resuscitation in any venue other than this research study. No subjects reported prior experience using the Laerdal SimBaby.


Debriefing Sessions

Every simulation was facilitated by a single investigator (T.S.), who also led the debriefing session immediately after the simulation. Debriefings followed the “debriefing with good judgment” model of Rudolph et al.21 Each debriefing consisted of 3 phases: reaction, analysis, and summary.22 In the reaction phase, the facilitator inquired how the subjects felt the simulation went and allowed them to “blow off steam.” During the analysis phase of the debriefing, the facilitator provided a “low” level of facilitation, as defined by Dismukes and Smith,23 wherein he guided the debriefing through the steps of the resuscitation in chronologic order to allow for a discussion of all relevant learning points that arose during each step in the resuscitation. During the analysis discussion, the facilitator attempted to investigate the basis for performance gaps using advocacy statements paired with inquiry statements as described by Rudolph et al.21 The debriefing sessions were intended to provide directed and specific formative assessment to the participants, with an express intent of improving performance in subsequent sessions.22 In the summary phase of the debriefing, the lessons learned during the analysis phase were distilled, and specific measures that could be taken to improve future performance were highlighted.

The subjects in the oral debriefing group relied solely on mental recall of the events during the analysis phase of the debriefing session. The subjects in the video-assisted debriefing group were allowed to watch the entire video of the resuscitation during the analysis phase of the debriefing session, with the facilitator stopping the video to allow discussion of relevant learning points. When needed, the video was rewound and reviewed again to highlight specific performance gaps. As needed, the subjects in both groups received hands-on instruction in technical skills, including airway management, umbilical catheterization, and medication administration, during the debriefing sessions.

To help standardize the debriefings, a debriefing form (Appendix A) was used during the debriefings of both the oral and video-assisted groups. Notes taken on the form during the simulation session were used during the debriefing session to highlight areas of good performance and areas for improvement. To further standardize the debriefings, all debriefing sessions in both the oral and video-assisted groups were timed and limited to 20 minutes in length.

The Neonatal Resuscitation Performance Evaluation Tool

Neonatal resuscitation performance was scored using a previously validated scoring instrument, known as the Neonatal Resuscitation Performance Evaluation (NRPE) tool.8 This instrument was chosen because of its ability to give mean performance scores in the various subdomains of neonatal resuscitation, a feature not found in other published neonatal resuscitation performance scoring instruments.5,24 The NRPE tool was developed by modification of a tool previously used to evaluate performance in actual recorded neonatal resuscitations.25 The validity of the data derived from the NRPE tool was determined using several lines of evidence as described by Downing,26 which included content, consequence, response process, relationship to other variables, and internal structure validity. The NRPE tool shows strong evidence of content and consequence validity.8 The evidence of response process validity, for example, protecting data integrity by controlling for potential sources of error associated with test administration, was provided in our study by standardizing the simulation environment and scenarios, maintaining the same team members throughout the study, and providing good quality videos to a blinded reviewer. The relationship of the NRPE tool to other variables was evident in that the modified NRPE tool measured 12 (80%) of 15 specific performance metrics noted on the NRP “megacode” performance checklist included in the 2006 Neonatal Resuscitation Instructor Manual.24 Internal structure validity of the data derived from the NRPE tool was evident in that the tool is able to reliably differentiate teams with more experience in neonatal resuscitation from those with less experience (P = 0.003).8 Prior reliability testing of the NRPE tool showed a good interrater reliability for the various subdomains of resuscitation, with a mean Cohen κ of 0.63.8

Evaluation of Resuscitation Performance

Evaluation of NRP performance on the first (pretest), the second, and the third (posttest) simulations was conducted by way of video review by a blinded reviewer (A.S.-C.) using the NRPE tool. The reviewer was blinded to the subject year group, simulation session number, and study group assignment. Neonatal Resuscitation Program performance was measured in 7 subdomains: preparation and initial steps, communication of heart rate, positive-pressure ventilation, chest compressions, intubation, medication administration, and umbilical vessel catheterization. Each subdomain included up to 6 task-specific performance metrics. Each metric was given a score of 1 if completed correctly or 0 if not completed or incorrectly performed. The score for each subdomain was determined by dividing the number of points achieved by the total points possible in that subdomain, yielding a score of 0% to 100%. An overall NRP performance score was calculated by dividing the total accrued points in all subdomains by the total points possible for the resuscitation scenario. Times to achieve critical elements of resuscitation were also determined by video review.
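The scoring arithmetic described above can be sketched as follows. This is a minimal illustration in Python, not the authors' actual scoring software; the subdomain names and metric values in the example are invented for demonstration, not taken from the NRPE tool itself.

```python
def subdomain_score(metrics):
    """Percentage score for one subdomain: points achieved / points possible.
    Each metric is 1 (completed correctly) or 0 (not completed or incorrect)."""
    return 100.0 * sum(metrics) / len(metrics)

def overall_score(subdomains):
    """Overall performance: total accrued points across all subdomains
    divided by the total points possible for the scenario."""
    achieved = sum(sum(m) for m in subdomains.values())
    possible = sum(len(m) for m in subdomains.values())
    return 100.0 * achieved / possible

# Hypothetical team performance (values are illustrative only)
team = {
    "preparation_and_initial_steps": [1, 1, 1, 0],
    "positive_pressure_ventilation": [1, 0, 1],
    "medication_administration": [1, 1],
}
print(round(subdomain_score(team["positive_pressure_ventilation"]), 1))  # 66.7
print(round(overall_score(team), 1))  # 7 of 9 points -> 77.8
```

Note that because subdomains contain different numbers of metrics, the overall score is a points-weighted total rather than a simple mean of the subdomain percentages.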

Statistical Analysis

An a priori sample size estimate, based on an estimated initial overall performance of 70% with an SD of 15%, gave an n of 8 teams in each study group to show an educationally meaningful difference in performance between the 2 study groups with a power of 0.8 and an α of 0.05. This sample size estimate was supported by literature in the field of education indicating that an effect size greater than 1 SD is large and acceptable for a given teaching intervention.27 Changes in performance scores from the first to the third simulation within each group were analyzed using the Friedman repeated-measures analysis of variance on ranks. Comparisons of NRP performance scores between the 2 groups on the pretest and posttest were conducted using the Mann-Whitney rank sum test. Changes in times within each group were evaluated using 1-way repeated-measures analysis of variance. Comparisons of demographic data, and of times to complete critical tasks, between the oral debriefing and video-assisted debriefing groups on the pretest and posttest were conducted using an unpaired 2-sided t test or the Mann-Whitney rank sum test, as appropriate. P < 0.05 was considered statistically significant. The educational effect size of the video-assisted debriefing intervention was calculated from the posttest overall performance means and SDs using the Cohen d. By convention, an effect size of approximately 0.2 was considered “small”; 0.5, “medium”; and 0.8 or higher, “large.”27 Data were analyzed using SigmaPlot 11.0 (Systat Software, Inc).
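The two summary calculations in this paragraph, the Cohen d from group means and SDs, and an approximate per-group sample size for a given effect size, power, and α, can be sketched with their standard formulas as follows. This is an illustration under textbook assumptions (pooled-SD d; normal-approximation two-sample sizing), not the authors' actual computation, and the example numbers are invented rather than drawn from the study data.

```python
from math import ceil, sqrt
from statistics import NormalDist

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen d: difference in means divided by the pooled standard deviation."""
    pooled = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled

def n_per_group(d, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sided,
    two-sample comparison of means at standardized effect size d."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)           # ~0.84 for power = 0.80
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# Illustrative numbers only (not the study's data):
print(cohens_d(75, 10, 8, 70, 10, 8))  # 0.5 -> a "medium" effect by convention
print(n_per_group(1.0))                # 16 per group for a 1-SD difference
```

The approximation shows why only a large effect size is detectable with small groups: the required n per group shrinks with the square of the hypothesized d.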


RESULTS

Thirty subjects, divided into 15 teams of 2 members each, participated in the study. Seven teams received oral debriefing, and 8 teams received video-assisted debriefing. The 2 study groups did not differ significantly from one another at baseline. Table 1 provides demographic data of the study subjects.

Table 1:
Demographic Data of Study Participants

The teams completed the series of 3 simulations over a period of 9 months (September 2007 to June 2008). The time between simulation sessions 1 and 2 averaged 123 (SD, 65) days for the oral debriefing group and 80 (SD, 50) days for the video-assisted debriefing group (P = 0.18). The time between simulation sessions 2 and 3 averaged 64 (SD, 38) days for the oral debriefing group and 68 (SD, 21) days for the video-assisted debriefing group (P = 0.81).

Table 2 shows NRP performance scores on the pretest and posttest. Overall neonatal resuscitation performance scores improved in both groups [oral pretest, 83% (SD, 14%) vs. oral posttest, 91% (SD, 6%) (P = 0.009); video pretest, 81% (SD, 16%) vs. video posttest, 93% (SD, 10%) (P < 0.001)]. The video-assisted debriefing group had a larger improvement in overall neonatal resuscitation scores when compared with the oral debriefing group (video, 12% vs. oral, 8%), but the difference was not statistically significant (P = 0.59). There were no differences in performance scores for any of the 7 subdomains of neonatal resuscitation, or for overall performance, between the 2 groups on the pretest or posttest (P ≥ 0.2). The calculated educational effect size of video review during debriefing was small (d = 0.08).

Table 2:
Comparison of Neonatal Resuscitation Performance Scores

Table 3 shows the times to complete critical tasks of neonatal resuscitation on the pretest and posttest. The average times to provide the first positive-pressure breath, achieve successful intubation, achieve vascular access, administer the first IV medication, and give the first IV epinephrine dose decreased on the posttest in both groups. The only statistically significant changes were in time to achieve vascular access and administer the first IV medication in the oral group (IV access, P = 0.039; first IV medication, P = 0.03). There was no significant difference in the time to complete any critical tasks between the 2 study groups on either the pretest or posttest.

Table 3:
Comparison of Times to Complete Critical Steps in Neonatal Resuscitation


DISCUSSION

The purpose of this study was to investigate the additional beneficial effects of video-assisted debriefing, as compared with traditional oral debriefing alone, at improving neonatal resuscitation performance. Simulations were conducted using a sophisticated infant simulator in a realistic delivery room environment providing a “high-fidelity” experience to the learners. The debriefing methods used were based on previously published and widely taught methods. A debriefing form was used to guide the debriefings. The video review was provided as an adjunct to the oral debriefing and included rewinding and review of pertinent learning points on an individualized basis, tailored to the team’s performance. Using this design, we were unable to show a significant benefit of video-assisted debriefing over oral debriefing alone at improving neonatal resuscitation performance or time to complete critical tasks of resuscitation, and the added educational benefit of video review during debriefing was small. To the authors’ knowledge, this is the first study to compare oral debriefing with video-assisted debriefing in neonatal resuscitation simulation-based training.

Despite its widespread use, there is limited empiric evidence to support video review during facilitated debriefing in simulation-based medical education.11,13 A review of the literature revealed only 3 previously published investigational studies examining the benefits of video-assisted debriefing. In a study of anesthesia residents’ crisis management skills, Savoldelli et al15 randomized residents to receive no debriefing, oral debriefing alone, or video-assisted debriefing after participating in a series of 2 intraoperative cardiac arrest simulations. On the second simulation, crisis management skills improved significantly in both the oral debriefing and video-assisted debriefing groups but not in the group that received no debriefing. There was no difference in score improvement between the oral and video-assisted debriefing groups. Byrne et al16 compared the performance of 2 groups of anesthetists managing a simulated anesthetic crisis. One group received a short verbal explanation of the clinical problem, and the other group received a verbal explanation in addition to video review. After 5 simulation sessions, there was no difference between the 2 groups in either chart errors or time to solve the crisis. Most recently, Chronister and Brown17 performed a comparative, crossover study of nursing students’ performance and response times during a cardiopulmonary arrest simulation after receiving either oral debriefing or video-assisted debriefing. Results showed faster response times to initiate cardiopulmonary resuscitation and time to shock in the video-assisted debriefing group; however, there was no difference in performance between the 2 groups. We believe that the current study adds to the existing literature on the use of video review during simulation-based medical education.

The use of video review during feedback and debriefing has been examined in several areas outside medicine including sports, education, music, and the military. In sports, video-assisted feedback has been shown to be superior to self-guided and oral feedback alone at improving psychomotor performance.28 Video feedback has also been rated more favorably than descriptive feedback alone by athletes.29 In education, video feedback has been shown to have a positive effect on teacher interaction skills in a range of contact professions.30 The aggregate effect size of video review in education has been reported as small to medium (0.4).31 In music, real-time video feedback of vocal performance has been associated with positive impacts on teaching behaviors and student experiences.32 In the military, the use of high-technology, multivariant video review has been used for decades as part of the army’s after-action review process.33

Given its generally accepted utility and positive impact in other sectors, it is unclear why video-assisted debriefing has not proven itself to be more beneficial in simulation-based medical education. As described by Fanning and Gaba,13 the benefit of video review during medical simulation debriefing is highly dependent on the skills and techniques of the facilitator, rather than the video footage itself. This makes an analysis of the debriefing methods used in comparative investigations critically important. During the debriefings in this study, we used a debriefing form as a guide during the debriefing discussion. This type of “scripted debriefing” may have influenced our negative results. Further research is needed to examine the impact of scripting during debriefing in medical simulation.11,34 In the current study, the video was watched in its entirety and used as a backdrop to the oral discussion, with rewinding and review of certain segments to highlight areas of good or suboptimal performance on an individualized basis. This was done with the intent to provide maximal exposure of the study intervention (video review) to the subjects. Perhaps watching only snippets of the video would be more educationally beneficial. However, in the study by Savoldelli et al,15 only selected video segments, chosen to illustrate the instructor’s constructive criticisms, were reviewed, with similar negative results and a trend toward greater improvement in the oral debriefing group. The authors of that study postulated that the video review may have interfered with the facilitator’s feedback by changing the content of the feedback, may have resulted in “information overload” for the learners, or may have been a distraction during the debriefing session, causing learners to pay less attention to the instructor’s comments and criticisms.15 Timing is another element of debriefing that requires further study.11 In this study, debriefing followed immediately after the simulation. As such, the events of the simulation were likely still fresh in the minds of the participants. This may have allowed oral debriefing alone to be sufficient to explore the events. We do not know if the video review would have had more impact if the debriefings were held some time later or if making the videos available to students for review later, on their own time, would have proven more educationally beneficial. In addition, providing more time during the debriefing sessions to review the videos (eg, 30 minutes instead of 20 minutes) may have altered our results.

Despite the negative results of the current study, we do not feel that our findings imply that video review during simulation debriefing has no implicit benefits. On the contrary, we believe that video review has many important uses during debriefing sessions, and we commonly use the technique during simulation-based training in neonatal resuscitation at our institution. A specific example of the utility of video review is the ability to clear up controversies of what actions occurred, or did not occur, during the simulation session by viewing the tape. This ability to definitively clear up controversies between the learner and the facilitator can help reduce hindsight bias and may prove invaluable at times during facilitated debriefings. Another advantage of video review is the ability to allow participants to view for themselves personal quirks and mannerisms (eg, laughing under stress) of which they may not be aware. As anyone who has experienced it can tell you, it is one thing to be told you acted in a certain way during a simulation session and quite another thing to see the action on video. Although these benefits are clearly valuable to learners, their effects may not have been directly measurable with the current study design.


Our study has several limitations. Our experimental design did not include a control group that received no debriefing. The decision to omit a control group was based on evidence from prior studies and published commentaries indicating that omitting debriefing from simulation-based training significantly limits the educational benefits to the participants.13,15,22 A second limitation is the higher-than-expected baseline performance. In our institution, all pediatric residents are required to become NRP instructors before graduation, which explains the high number of NRP instructors in our study. This may limit the generalizability of our results. Another potential limitation is the relatively low subject number. The sample size for the study was chosen to provide the ability to measure a large educational effect size of the intervention, defined as a difference between the 2 groups greater than 1 SD from the pretest mean.27 Arguably, this standard may be too strict to apply to the current study, especially given the small educational effect size that we found. A larger sample would be needed to adequately power the study to determine a smaller educational effect size.35 However, the practical significance of such an investigation would need to be established.36 In addition, the scoring instrument that we used combines elements of both cognitive and psychomotor skills. This makes it difficult to determine which of these domains (if either) was affected more by the use of video review. These limitations must be kept in mind when interpreting our results.

The strengths of this study include the use of the same team members for all simulation sessions, the use of standardized scenarios and well-accepted debriefing methods, and the use of a blinded reviewer using a tool that yields reliable scores that enables one to make valid judgments of neonatal resuscitation performance.


CONCLUSIONS

Using our methodology, video-assisted debriefing did not seem to offer significant educational benefits over oral debriefing alone during simulation-based training in neonatal resuscitation. This does not negate the possibility that, under different circumstances, video review may have educational benefits. Clearly, additional research is needed to define the optimal use of video review and to determine what benefits (if any) exist in using video review during simulation debriefing. Research questions that remain unanswered are many and include the following: Does scripting of the debriefing modulate the benefits of video review? What is the best “dose” of video review during debriefing? Does video review have a different impact on technical versus nontechnical skills? What effect does the timing of video review after the simulation have? What are the qualitative effects of video review on simulation participants? All these are important questions, and their answers will provide needed insight to simulation educators on the optimal use of video review during simulation debriefing.


REFERENCES

1. Kattwinkel J, ed. Neonatal Resuscitation Instructor Manual. 6th ed. Elk Grove Village, IL: American Academy of Pediatrics & American Heart Association; 2011.
2. Kattwinkel J, Perlman JM, Aziz K, et al. Special report—neonatal resuscitation: 2010 American Heart Association guidelines for cardiopulmonary resuscitation and emergency cardiovascular care. Pediatrics 2010; 126 (5): e1400–e1413.
3. Halamek L, Kaegi D, Gaba D, et al.. Time for a new paradigm in pediatric medical education: teaching neonatal resuscitation in a simulated delivery room environment. Pediatrics 2000; 106 (4): e45.
4. Halamek LP. The simulated delivery-room environment as the future modality for acquiring and maintaining skills in fetal and neonatal resuscitation. Semin Fetal Neonatal Med 2008; 13 (6): 448–453.
5. van der Heide P, van Toledo-Eppinga L, van der Heide M, van der Lee J. Assessment of neonatal resuscitation skills: a reliable and valid scoring system. Resuscitation 2006; 71: 212–221.
6. Campbell DM, Barozzino T, Farrugia M, Sgro M. High-fidelity simulation in neonatal resuscitation. Paediatr Child Health 2009; 14 (1): 19–23.
7. Thomas EJ, Williams AL, Reichman EF, Lasky RE, Crandell S, Taggart WR. Team training in Neonatal Resuscitation Program for interns: teamwork and quality of resuscitations. Pediatrics 2010; 125 (3): 539–546.
8. Sawyer T, Sierocka-Casteneda A, Chan D, Lustik M, Berg B, Thompson M. Deliberate practice using simulation improves neonatal resuscitation performance. Simul Healthc 2011; 6 (6): 327–336.
9. Issenberg SB, McGaghie WC, Petrusa E, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulation that lead to effective learning: a BEME systematic review. Med Teach 2005; 27 (1): 10–28.
10. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010; 44: 50–63.
11. Raemer D, Anderson M, Cheng A, Fanning R, Nadkarni V, Savoldelli G. Research regarding debriefing as part of the learning process. Simul Healthc 2011; 6 (7): S52–S57.
12. Dismukes RK, Gaba DM, Howard SK. So many roads: facilitated debriefing in healthcare. Simul Healthc 2006; 1: 23–25.
13. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007; 2 (2): 115–125.
14. NRP Instructor DVD: An Interactive Tool for Facilitation of Simulation-Based Learning. Elk Grove Village, IL: American Academy of Pediatrics & American Heart Association; 2010.
15. Savoldelli GL, Naik VN, Park JP, Joo HS, Chow R, Hamstra SJ. Value of debriefing during simulated crisis management. Anesthesiology 2006; 105: 279–285.
16. Byrne AJ, Sellen AJ, Jones JG, et al. Effect of videotape feedback on anaesthetists’ performance while managing a simulated anaesthetic crisis: a multicentre study. Anaesthesia 2002; 57 (2): 176–179.
17. Chronister C, Brown D. Comparison of simulation debriefing methods [published online ahead of print July 22, 2011]. Clin Simul Nurs doi: 10.1016/j.ecns.2010.12.005.
18. Sawyer T, Hara K, Thompson M, Chan D, Berg B. Modification of the Laerdal SimBaby to include an integrated umbilical cannulation task trainer. Simul Healthc 2009; 4 (3): 174–178.
19. Dempster FN. The spacing effect. Am Psychol 1988; 43: 627–634.
20. Moulton C-A, Dubrowski A, MacRae H, Graham B, Grober E, Reznick R. Teaching surgical skills: what kind of practice makes perfect? A randomized, controlled trial. Ann Surg 2006; 244 (3): 400–409.
21. Rudolph J, Simon R, Dufresne R, Raemer D. There’s no such thing as a “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006; 1 (1): 49–55.
22. Rudolph J, Simon R, Raemer D, Eppich W. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med 2008; 15 (11): 1010–1016.
23. Dismukes R, Smith G. Facilitation of Debriefing in Aviation Training and Operations. Aldershot, UK: Ashgate; 2000.
24. Lockyer J, Singhal N, Fidler H, Weiner G, Aziz K, Curran V. The development and testing of a performance checklist to assess neonatal resuscitation megacode skill. Pediatrics 2006; 118 (6): 1739–1744.
25. Carbine DN, Finer N, Knodel E, Rich W. Video recording as a means of evaluating neonatal resuscitation performance. Pediatrics 2000; 106 (4): 654–658.
26. Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ 2003; 37: 830–837.
27. Cohen J. Statistical Power Analysis for the Behavioral Sciences. New York, NY: Academic Press; 1997.
28. Guadagnoli M, Holocomb W, Davis M. The efficacy of video feedback for learning the golf swing. J Sports Sci 2002; 20: 615–622.
29. Reed D, Fleming R. Behavioral coaching to improve offensive line pass-blocking skills of high school football athletes. J Appl Behav Anal 2010; 43: 463–472.
30. Fukkink R, Tavecchio L. Effects of video interaction guidance on early childhood teachers. Teach Teach Educ 2010; 26: 1652–1659.
31. Fukkink R, Trienekens N, Kramer L. Video feedback in education and training: putting learning in the picture. Educ Psychol Rev 2011; 23: 45–63.
32. Welch G, Howard D, Himonides E, Brereton J. Real-time feedback in the singing studio: an innovatory action-research project using a new voice technology. Music Educ Res 2005; 7 (2): 225–249.
33. Rankin W, Genter F, Crissey M. After-action review and debriefing methods: techniques and technology. The Interservice/Industry Training, Simulation & Education Conference (I/ITSEC). 1995.
34. Cheng A, Hunt EA, Donoghue A, et al. EXPRESS—Examining Pediatric Resuscitation Education Using Simulation and Scripting. The birth of an international pediatric simulation research collaborative—from concept to reality. Simul Healthc 2011; 6: 34–41.
35. Fan X. Statistical significance and effect size in education research: two sides of a coin. J Educ Res 2001; 94 (5): 275–282.
36. Kirk R. Practical significance: a concept whose time has come. Educ Psychol Meas 1996; 56: 746–759.


Overall Resuscitation:

Notes: (overall resuscitation)



Reaction Phase:

Lead-off question: “How do you feel about that resuscitation?”

- Address any emotional concerns

Analysis Phase:

Preparation and Initial Steps:




- Does the group think they prepared adequately?

- Address any deficiencies in preparation/initial steps

- What may have been done/improved?

Communication of Heart Rate to the Lead Resuscitator:




- Was the heart rate communicated?

Bag-Valve-Mask (BVM) Ventilation:




- Does the group feel that they administered BVM correctly?

- Address any problems/concerns with BVM ventilation

- What may have been done/improved?

Chest Compressions:




- Does the group feel that they administered chest compressions correctly?

- Address any problems/concerns with chest compressions

- What may have been done/improved?





Endotracheal Intubation:

- How does the group feel about their intubation(s)?

- Address any problems/concerns with the intubation

- What may have been done/improved?





Medications:

- Does the group feel that they administered medications correctly?

- Address any problems/concerns with medications given/omitted

- What may have been done/improved?

Umbilical Vessel Catheterization:




- How does the group feel about their attempt(s) at umbilical catheterization?

- Address any problems/concerns with the UVC

- What may have been done/improved?





Leadership:

- How does the group feel about the leadership during the resuscitation?

- Address any problems/concerns with the leadership

- What may have been done/improved?

Resource Management (Equipment, Personnel, etc.):




- How does the group feel about their resource management during the resuscitation?

- Address any problems/concerns with the resource management

- What may have been done/improved?





Communication:

- How does the group feel about their communication during the resuscitation?

- Address any problems/concerns with the communication

- What may have been done/improved?

Summary Phase:

Closing question: “Any further questions/comments?”

- Address any specific questions/comments

- Indicate points of the resuscitation that went well

- Review any specific problems that occurred during the resuscitation and how performance could be improved

Feedback session time: _____ minutes


Keywords: Simulation; Neonatal resuscitation; Facilitated debriefing; Video review; Video-assisted debriefing

© 2012 Society for Simulation in Healthcare