Health care educators have recognized the essential role of debriefing in simulation learning contexts1–8 to help transform experience into learning through reflection.9–12 Debriefing is a facilitated reflection in the cycle of experiential learning3 to help identify and close gaps in knowledge and skills.13 Debriefing includes the following essential elements14: (a) active participation with more than just the passive receipt of feedback; (b) developmental intent focused on learning and improvement (more than a performance review); (c) discussion of specific events; and (d) input from multiple sources. Whereas debriefing represents a conversation between simulation participants and educator(s), feedback is the specific information about an observed performance compared with a standard.15 Effective debriefings can provide a forum for feedback that is essential for performance improvement14–21 and deliberate practice that promotes expertise.22–27 The notion of performance gaps is important for individuals and teams. A performance gap is the difference between the desired and actual observed performance28 and can form the basis for separate lines of questioning in the debriefing. For this article, we will refer to performance gaps as areas in need of improvement. However, simulation educators should also debrief areas of exceptional performance29 because lessons can be drawn from both successful and failed experiences.30 We use the term learner to indicate all participants irrespective of stage of training or career. Moreover, although debriefing may occur during or after the simulation,31–33 our focus is postsimulation debriefing.
Evidence is emerging about what makes debriefing effective6,34,35 and how to assess its quality.36,37 Wide agreement exists about the importance of a supportive learning environment as a prerequisite for successful simulation-based education and debriefing21,28,31,38 and what contributes to it.6,37–40 How educators facilitate debriefings, however, is highly variable14 and in practice may stray from the ideal.5,34 For example, although simulation participants seem to value an honest, nonthreatening approach,6 educators often hesitate to share critical performance feedback to avoid being seen as harsh4,41 and because of perceived potential negative effects on the learner.42–46 Simulation educators, especially novices, can be overwhelmed by the complexity of facilitating debriefings, and practical guidance is needed. Our initial work on scripted debriefing47 has shown promise in promoting debriefing quality for less experienced educators in the narrow scope of resuscitation training. Indeed, scripted debriefing approaches have been integrated into standardized advanced life support courses.48 Educators, however, need additional support. We seek to fill this gap by presenting a debriefing script paired with a novel blended approach to debriefing called PEARLS [Promoting Excellence And Reflective Learning in Simulation]. In this article, we define a blended approach to debriefing as the selective and deliberate use of more than one debriefing strategy, guided by context and learner need, within a single debriefing event.
The purposes of this article are as follows: (1) to provide a rationale for scripted debriefing; (2) to discuss a rationale for a blended approach to debriefing based on challenges to be addressed and debriefing method; (3) to present a PEARLS debriefing framework and guidance for its application; and (4) to offer early experiences of implementing the framework in simulation educator courses.
A RATIONALE FOR SCRIPTED DEBRIEFING
Despite the critical role of debriefing in experiential learning contexts,2,3,38,41,49–52 simulation educators may struggle to learn and master this essential skill. An area of increasing focus is how debriefing best practice translates into practical, easy-to-implement strategies.8,53–56 Structured and scripted debriefing in clinical contexts53,54 and simulation-based education47 may counter the variability in debriefing style and structure. For example, the EXPRESS [Examining Pediatric Resuscitation Education using Simulation and Scripting] trial aimed to standardize debriefings in the Pediatric Advanced Life Support (PALS) course by assessing the effect of a scripted debriefing tool used by novice instructors on learning and performance outcomes.47 Novice instructors using a debriefing script were more effective at increasing learners' knowledge acquisition and team leader behavioral skills compared with educators who did not use a script. Building on the experiences gained from the EXPRESS study, the authors of this article collaborated with the American Heart Association (AHA) to help develop a new debriefing tool for both the PALS and Advanced Cardiac Life Support (ACLS) courses.48 The AHA debriefing tool used the "Gather, Analyze and Summarize (GAS)" debriefing model48 and was developed to be generalizable to all PALS and ACLS scenarios. The tool provided educators with specific phrases to help facilitate learning and was ultimately incorporated into the 2011 PALS and ACLS instructor materials. Unfortunately, both the EXPRESS and AHA debriefing tools used only one strategy for debriefing, thus providing limited flexibility and guidance for educators struggling to adapt dynamically to learner needs and time constraints.
Of the debriefing tools being developed, some are designed for expediency,53,54,56 some may be suitable only for more experienced simulation educators,55 and some have limitations because they focus on only one debriefing strategy. To address these issues, we have developed a novel debriefing script. The PEARLS debriefing script is a cognitive aid that may promote faculty development efforts and augment debriefing skills, particularly in educators who are still solidifying their debriefing expertise. The use of select video sequences from the simulation scenario, time and technology permitting, may also promote learning57,58 but may not be essential,59,60 so we have not emphasized this aspect. Further research on the use of video during debriefing is required to guide its optimal integration into the PEARLS debriefing framework.
A RATIONALE FOR A BLENDED APPROACH TO DEBRIEFING
Although we have drawn from the education and simulation literature, including empiric evidence where available, we also relied on our own combined debriefing experience and simulation faculty development work. Most expert simulation educators deliberately meld several educational strategies during debriefings based on context or specific debriefing goal rather than adhering rigidly to one particular strategy.5,52 Many options, however, may overwhelm novice debriefers.
Although various strategies exist, we have distilled these into 3 broad categories as follows: (a) learner self-assessment,3,49,52–54 (b) focused facilitation to promote critical reflection and deeper understanding of events,2,4,31,50–52,57,61,62 and (c) providing information through directive performance feedback63,64 and/or focused teaching.5,51 Each category of commonly used approaches has its own potential advantages and disadvantages in the context of health care debriefing (see Table, Supplemental Digital Content 1, https://links.lww.com/SIH/A174, for advantages and disadvantages of commonly used educational strategies).
In merging these 3 broad educational strategies into a blended debriefing framework, we have kept key learning principles in mind, namely, that learning should be active, collaborative, self-directed,65 and learner-centered.66 The framework helps guide practical decision making for targeted selection of an educational strategy during the analysis phase of the debriefing. For example, educators can engage learners and promote self-assessment of their performance by querying what they think went well and what they would change about their performance using a plus/delta technique,3 what went well/not so well and why (eg, SHARP technique),54 or what was "easy" versus "challenging."52 Although self-assessment is prone to inaccuracy,67–69 educators can use learner self-assessment approaches to identify areas for further inquiry that learners find important. Other general facilitation techniques70 or more specific questioning methods4,55,71,72 may lead to high-yield discussion and learning. For example, when using advocacy-inquiry, educators seek to uncover learners' rationale for action or mental models by stating a concrete observation and sharing their point of view or judgment about it before inquiring about the learners' perspective.4,41 Similarly, exploring alternatives to clinical decisions, management options, or other areas of performance, along with their pros and cons, can yield rich discussion and learning.52 Additional methods with great potential to add to educators' debriefing repertoire are emerging,71,72 and Kolbe et al55 provide a comprehensive discussion. These focused facilitation methods share the goal of helping learners surface and explore their mental models and/or thought processes. Once mental models have been made explicit, educators and learners can work together to reframe their thinking or encourage effective cognitive routines.28 Such facilitated discussions can be particularly fruitful when debriefing interprofessional and multidisciplinary teams. Finally, educators often provide information in the form of clear directive performance feedback and/or focused teaching when indicated,57,64 ideally delivered in an honest but nonthreatening manner.6,37 Blending strategies while addressing a given learning objective may be quite appropriate; for example, all educational strategies may serve a role during exploration of complex clinical decision-making processes (global self-assessment first, then focused facilitation about decision making, then providing information based on learning needs).
PEARLS DEBRIEFING SCRIPT
The PEARLS debriefing script assists both novice and experienced simulation educators to effectively implement the PEARLS framework of debriefing. The use of the script assumes that educators have adequately prepared learners to participate in the simulated learning encounter; creating a sense of psychological safety is essential.28,52,73 The PEARLS debriefing script supports simulation educators in 3 main areas as follows: (1) setting the stage for the debriefing; (2) organizing the debriefing to include initial participant reactions followed by a description of relevant case elements, an analysis of positive and suboptimal areas of performance using the PEARLS framework to select a debriefing approach, and finally a summary of lessons learned; and (3) formulating questions that empower educators to share clearly their honest point of view about events. Table 1 provides an overview of the PEARLS debriefing framework with suggested wording for each phase and strategy (see Table, Supplemental Digital Content 2, https://links.lww.com/SIH/A175, which guides the educator through the advocacy-inquiry model of debriefing, for use when selected).
TABLE 1: PEARLS Debriefing Script
PEARLS DEBRIEFING FRAMEWORK
The PEARLS debriefing framework integrates commonly used strategies during debriefings and provides guidance on their implementation, depending on target learner group or debriefing environment. Context-specific factors influence the choice of approach, including the time available, whether learners' rationale for action is clear, and whether the learning objective/performance gap is related to knowledge, skills, or behaviors.
PEARLS outlines 4 distinct phases of the debriefing process,2,28,61 although its novel focus is the blended approach in the analysis phase (Fig. 1). The 4 phases are the reactions, description, analysis, and summary phases. For further details, see Table 1 (PEARLS Debriefing Script).
FIGURE 1: PEARLS debriefing framework.
The reactions phase begins with an open-ended question such as “How are you feeling?” to allow learners to vent and express their initial thoughts and feelings.3,6,8,28,57 When only 1 or 2 learners respond to the initial question, a follow-up question such as “Other initial reactions?” or “How are the rest of you feeling?” followed by silence often prompts additional reactions. This ensures that all participants have a chance to vent if they choose.
In the description phase,2 it can be helpful to invite someone to summarize their perspective of key events or major medical problems faced during the case to make sure that educator(s) and participants are on the same page.61 If team members are not on the same page about major issues or events, this discrepancy can be a useful springboard for later discussion. To avoid a time-consuming and at times inefficient recounting of all events during the case, it can help to focus this portion on the main issues. During these opening phases, astute educators make note of particular learner concerns that may represent important issues to address later in the debriefing.
PEARLS and the Analysis Phase: Specific Decision Support
In applying the PEARLS framework, educators select the strategy suited for each particular aspect of performance in the analysis phase of the debriefing (Fig. 1). Before the start of the debriefing session, educators should reflect on the level of insight and experience of the participants, along with their own debriefing experience, because these may influence which educational strategies to use during the debriefing (Table 2). To determine the ideal strategy for each particular aspect of performance, educators should pose the following questions (Table 2):
TABLE 2: Suggested Indications for 3 Educational Strategies Used During Debriefing
Is the rationale for the performance gap clear (eg, the rationale is clear if a participant states, "I did not know what to do next," signifying an underlying knowledge gap)?
How much time is available?
Does the performance clearly represent cognitive (eg, knowledge, clinical decision making), technical (eg, procedural skills), or behavioral domains (eg, team dynamics, interprofessional collaboration, leadership, communication)?
Using these screening questions (Table 2) and Figure 1 for guidance, educators can choose a strategy for each relevant aspect of performance. Although no prescribed combination of variables best indicates use of one educational strategy over another, we suggest that the more variables that support use of a specific strategy, the greater the likelihood that it will be suitable in that particular context. We have designed a decision support matrix for educators to use while observing a simulation event (Table 3). Educators simply populate the learning objectives and then sequentially consider the 3 screening questions mentioned earlier to help them select the educational strategy best suited for that specific performance gap or objective. This process is not meant to be overly rigid; it becomes more refined with experience implementing and debriefing a given scenario.
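To make the selection logic above concrete, the following sketch encodes the 3 screening questions as a simple tally of which strategy each answer favors. This is an illustrative sketch only: the field names, the 5-minute cutoff, and the voting scheme are our own assumptions for illustration and are not part of the published PEARLS framework, which expects educators to blend strategies flexibly rather than follow a fixed rule.

```python
from dataclasses import dataclass


@dataclass
class PerformanceGap:
    objective: str           # learning objective or observed performance gap
    rationale_clear: bool    # is the reason for the gap already evident?
    time_available_min: int  # minutes available to debrief this topic
    domain: str              # "technical", "cognitive", or "behavioral"


def suggest_strategy(gap: PerformanceGap) -> str:
    """Tally the three PEARLS screening questions and return the strategy
    favored by the most answers; ties default to learner self-assessment,
    the most learner-centered option."""
    votes = {
        "learner self-assessment": 0,
        "focused facilitation": 0,
        "directive feedback/teaching": 0,
    }

    # Q1: Is the rationale for the performance gap clear?
    if gap.rationale_clear:
        votes["directive feedback/teaching"] += 1
    else:
        votes["focused facilitation"] += 1

    # Q2: How much time is available? (the 5-minute cutoff is arbitrary)
    if gap.time_available_min < 5:
        votes["learner self-assessment"] += 1
        votes["directive feedback/teaching"] += 1
    else:
        votes["focused facilitation"] += 1

    # Q3: Which performance domain does the gap fall into?
    if gap.domain == "technical":
        votes["directive feedback/teaching"] += 1
    else:  # cognitive or behavioral gaps often merit deeper exploration
        votes["focused facilitation"] += 1

    return max(votes, key=lambda s: (votes[s], s == "learner self-assessment"))


if __name__ == "__main__":
    gap = PerformanceGap(
        objective="failed intubation",
        rationale_clear=True,
        time_available_min=3,
        domain="technical",
    )
    print(suggest_strategy(gap))  # -> directive feedback/teaching
```

The tally simply mirrors the idea stated above: the more screening questions that point toward a strategy, the more likely it is to be suitable for that objective in the decision support matrix (Table 3).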
TABLE 3: Decision Support Matrix for Educators
Self-assessment strategies (what went well/what would you change?3,53; what went well/did not go well and why?54; what was easy, what was challenging?52) are well suited at the outset of the analysis phase if time is limited or if the participants did not share their thoughts and/or emotions during the reactions phase. Often major issues can be raised in a short period and may provide insight as to what topics are important to participants. Once issues are identified, the educator can selectively use focused facilitation techniques to promote more in-depth discussion or strive to close performance gaps through directive feedback and teaching as appropriate. Self-assessment strategies are more learner-centered; indeed, with sufficient time, high-level groups may debrief themselves to a large extent and make the necessary connections to their future clinical practice, whereas groups with less insight/experience may require more guidance.70
Probing deeper using focused facilitation methods can help explore specific issues. For example, advocacy-inquiry is appropriate when the underlying rationale for action is not obvious to the educator (or other learners)4 and when sufficient time is available. Similarly, taking the time to explore alternatives to decision making, management options, and team behaviors, along with their pros and cons, encourages participant-focused discussion and acts to depersonalize the performance.52 Irrespective of debriefing approach, careful listening and flexibility about debriefing topics help identify and address key issues that are important to trainees.
In a more direct, highly educator-driven approach, educators provide information, that is, the "solution" to the problem. Liberal use of instruction or lectures, especially early in the debriefing, represents a pitfall for novice educators who often simply teach irrespective of the situation ("The educator who does all the talking"). Providing information judiciously in the form of directive feedback64 and/or teaching may be preferred if time is very short and performance gaps are highly technical (eg, holding a laryngoscope in the wrong hand) or the underlying reason for the deficient performance is clear (eg, due to a knowledge gap when a learner says, "I could not remember the steps of the algorithm"). In these instances, educators can switch to teaching mode (eg, "Try holding the laryngoscope in the other hand next time" or "Let us review the algorithm"). Figure 2 provides an example of how the PEARLS framework can be applied to various performance domains with a simulated scenario.
FIGURE 2: Application of the PEARLS debriefing framework to address various types of learning objectives. In this sample debriefing, the educator explores a hypothetical case of an infant with head trauma caused by nonaccidental injury. Performance gaps relate to a medication error, a fixation error, and failed intubation. Here, we see how an educator might select an educational strategy during the analysis phase of the debriefing based on key considerations with each objective/performance gap.
As time permits, educators ideally address critical performance issues fully before moving on to the discussion of the next issue to avoid disjointed or superficial discussions. When there are a large number of issues to address, educators often struggle to decide how to prioritize these topics of discussion. Learners typically bring up issues that are important to them (ie, the learner agenda) during the reactions phase or as part of a self-assessment during the debriefing. Determining overlap between the learner agenda and predefined learning objectives will help the educator identify issues that are important to both the learner and the educator (ie, the common agenda). We generally recommend prioritizing the common agenda as high-yield topics for discussion earlier in the analysis phase, before moving on to discuss topics that are important only to the learner and/or educator.
In helping trainees reflect on performance, simulation educators can either drive the process or facilitate a learner-driven discussion. Once an issue has been adequately addressed, educators should ask, “Have all learning objectives been covered?” If not, then the next aspect of performance should be addressed using an appropriate strategy (see Table 3 and screening questions for guidance). Once essential learning objectives have been addressed, the educator can inquire if any other outstanding issues remain before moving on to the summary phase of the debriefing.
The summary phase of the debriefing may be conducted in 1 of 2 ways. In a learner-guided manner, the learners are asked to state their main take-home message(s) and perhaps even anticipate enablers of and barriers to enacting change in their setting. This step also has the benefit of allowing the educator to confirm whether the learners' take-home messages align with the predetermined learning objectives of the session. Conducting the summary phase in this fashion usually takes more time, and learners occasionally will introduce new topics for discussion while the educator is trying to facilitate a summary. Although we favor the learner-guided approach, the educator can alternatively summarize by providing a succinct review of the main take-home messages (as perceived by the educator). By conducting the summary in this manner, the educator has more control over when the debriefing will end but is unable to determine whether the learners' take-home messages align with the learning objectives of the session. It is best to manage time during a debriefing to provide sufficient opportunity for learners to formulate their own take-home messages.
DEVELOPMENT AND PILOT TESTING OF THE PEARLS DEBRIEFING FRAMEWORK AND DEBRIEFING SCRIPT
The PEARLS debriefing framework and script were developed over a 3-year period via a multistep process involving a comprehensive review of the literature, integration of our own debriefing and faculty development experience, and pilot testing with iterative revisions. Table 4 provides an overview of the development process.
TABLE 4: Development Steps of PEARLS Debriefing Framework and Script
Early anecdotal experiences from teaching the PEARLS approach at multiple debriefing workshops at simulation and education conferences and faculty development courses in North America and Europe are quite positive. Our workshop and course participants note the following:
The debriefing script is easy to follow but requires some preorientation and familiarization for optimal use.
A description of the rationale behind the use of the script supports effective implementation.
It helps novice facilitators to use the scripted phrases verbatim initially; once they become familiar with the flow and content, they become more comfortable adding their own personal touch to the wording of questions/phrases.
Even experienced facilitators still benefit from using the PEARLS framework and script as a guide.
The use of the debriefing script as a faculty development tool during simulation educator training anecdotally seems to accelerate the learning curve for acquisition of debriefing skills.
DISCUSSION
Debriefing plays a central role in experiential learning contexts such as health care simulation. Although some frameworks have been adapted from other arenas such as the US Army after-action review,56 limited evidence guides our practice.13 The PEARLS framework and PEARLS debriefing script represent a novel contribution to the simulation literature. PEARLS fills an important gap by conceptualizing a framework for integrating 3 common educational strategies used during debriefings and providing guidance on their implementation. We realize that, as authors, our debriefing styles and faculty development experiences have informed the development of PEARLS; throughout, we have tried to build on what is known from the literature and expert consensus but acknowledge that both science and art contribute to the complex skill of debriefing. We have articulated and operationalized a blended framework that incorporates what many health care simulation educators already do. As such, we believe the PEARLS framework is adaptable and suitable for various learner groups across professions and disciplines and for different debriefing environments. Finally, we have developed and described a debriefing script that will help educators apply the PEARLS framework to their debriefings.
The debriefing script may provide valuable scaffolding for health care simulation educators who are learning to debrief; it naturally adapts to their needs because they may refer to it at their discretion. In our experience, the PEARLS framework and debriefing script promote faculty development efforts because not only are the specific steps of the debriefing made explicit, but representative phrases are also provided to guide possible wording choices. Specifically, we hope to empower educators to make informed decisions about their debriefing practices until guidance from more rigorous study emerges.
Despite the spread of health care simulation and debriefing, many educators have little or no previous formal training in debriefing and still struggle to facilitate effectively,5 and few, if any, practical guides to improve debriefing skills exist. Obstacles to effective debriefing likely include the relatively high cost of simulation educator training, limited debriefing experience, and a lack of experienced simulation educators to provide the ongoing mentoring that helps improve debriefing skills. Inadequate debriefing expertise may ultimately have a negative impact on knowledge and skill acquisition as well as attitudes in the learners. From the authors' experience, novice simulation educators are challenged by observing and codifying events of the simulation, organizing their thoughts, and meaningfully structuring the debriefing to encourage engaging discussions, promote critical reflection, and provide open and honest performance feedback. Often novice educators struggle to think of their next question, which impedes the careful listening that is so important to effective debriefing. Debriefing scripts are one strategy to reduce an educator's cognitive load,74 provided that educators familiarize themselves with the script before use.
During the development of PEARLS, the authors weighed the advantages and disadvantages of developing a debriefing script that offered structure and helpful sample phrases but might seem prescriptive in its format and suggested language. As with any communication guide or template, rigid adherence to the debriefing script is neither desirable nor the ultimate goal. Ideally, educators follow the framework and the script while increasingly modifying the language as they practice and their experience grows. Indeed, the script only offers structure and guidance. We agree that educators should avoid formulaic speech and tokenisms75 as well as linguistic rituals76 by being curious and authentic; educators need to find and speak with their own voice. The ultimate goal of debriefing is for learners to reflect on and make sense of their simulation experience and generate meaningful learning that translates to clinical practice. We believe that the PEARLS framework and debriefing script can support this ultimate goal and may also promote consistency within simulation programs while allowing flexibility in style and approach. For example, although we identify time as a factor, a skilled and experienced educator may be highly efficient in the use of questions, and our guidance regarding time constraints may be less appropriate. Moreover, some educators may place greater weight on learner self-assessment or prefer facilitating a focused discussion. With increasing experience and expertise, simulation educators develop the flexibility and individuality to facilitate debriefings that are suited to both the context and the learner group.
CONCLUSIONS
The PEARLS framework and debriefing script incorporate what is known about effective debriefing practices: the framework blends existing educational strategies into a new approach to debriefing, and the script helps support its implementation in a variety of settings. Future directions include empiric study of the PEARLS debriefing framework and debriefing script. Areas of focus include the role of PEARLS in debriefing skill acquisition and the development of debriefing expertise, the effect of the framework and script on debriefing quality, and how the framework and script affect faculty development efforts.
ACKNOWLEDGMENT
The authors thank the following:
• Vincent Grant, Traci Robinson, Helen Catena, Wendy Bissett, Kristin Fraser, Gord McNeil, and members of the KidSim Team for their feedback and contributions toward refining the PEARLS method. They also thank Nicola Robertson for drafting the PEARLS flow diagrams.
• Members of the kidSTAR Medical Education Program at Lurie Children’s, especially Mark Adler for his critical review of the manuscript.
• Anonymous reviewers whose comments strengthened the article.
REFERENCES
1. Dreifuerst KT. The essentials of debriefing in simulation learning: a concept analysis. Nurs Educ Perspect 2009; 30(2): 109–114.
2. Steinwachs B. How to facilitate a debriefing. Simul Gaming 1992; 23: 186–195.
3. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007; 2(2): 115–125.
4. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006; 1(1): 49–55.
5. Dieckmann P, Molin Friis S, Lippert A, Ostergaard D. The art and science of debriefing in simulation: ideal and practice. Med Teach 2009; 31(7): e287–e294.
6. Ahmed M, Sevdalis N, Paige J, Paragi-Gururaja R, Nestel D, Arora S. Identifying best practice guidelines for debriefing in surgery: a tri-continental study. Am J Surg 2012; 203(4): 523–529.
7. Arafeh JM, Hansen SS, Nichols A. Debriefing in simulated-based learning: facilitating a reflective discussion. J Perinat Neonatal Nurs 2010; 24(4): 302–309.
8. Zigmont JJ, Kappus LJ, Sudikoff SN. The 3D model of debriefing: defusing, discovering, and deepening. Semin Perinatol 2011; 35(2): 52–58.
9. Yardley S, Teunissen PW, Dornan T. Experiential learning: AMEE guide no. 63. Med Teach 2012; 34(2): e102–e115.
10. Kolb D. Experiential Learning: Experience as a Source of Learning and Development. Upper Saddle River, NJ: Prentice Hall; 1984.
11. Jarvis P. Adult and Continuing Education: Theory and Practice. 2nd ed. New York: Routledge; 1999.
12. Yardley S, Teunissen PW, Dornan T. Experiential learning: transforming theory into practice. Med Teach 2012; 34(2): 161–164.
13. Raemer D, Anderson M, Cheng A, Fanning R, Nadkarni V, Savoldelli G. Research regarding debriefing as part of the learning process. Simul Healthc 2011; 6(suppl): S52–S57.
14. Tannenbaum SI, Cerasoli CP. Do team and individual debriefs enhance performance? A meta-analysis. Hum Factors 2013; 55(1): 231–245.
15. van de Ridder JM, Stokking KM, McGaghie WC, ten Cate OT. What is feedback in clinical education? Med Educ 2008; 42(2): 189–197.
16. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005; 27(1): 10–28.
17. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010; 44(1): 50–63.
18. Edelson DP, Litzinger B, Arora V, et al. Improving in-hospital cardiac arrest process and outcomes with performance debriefing. Arch Intern Med 2008; 168(10): 1063–1069.
19. Edelson DP, Lafond CM. Deconstructing debriefing for simulation-based education. JAMA Pediatr 2013; 167: 586–587.
20. Morrison JE, Meliza LL. Foundations of the After Action Review Process (Special Report 42). Alexandria, VA: US Army Research Institute for Behavioral and Social Sciences; 1999: 42.
21. Salas E, Klein C, King H, et al. Debriefing medical teams: 12 evidence-based best practices and tips. Jt Comm J Qual Patient Saf 2008; 34(9): 518–527.
22. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med 2008; 15(11): 988–994.
23. McGaghie WC. Research opportunities in simulation-based medical education using deliberate practice. Acad Emerg Med 2008; 15(11): 995–1001.
24. Siassakos D, Bristowe K, Draycott T, et al. Clinical efficiency in a simulated emergency and relationship to team behaviours: a multisite cross-sectional study. BJOG 2011; 118(5): 596–607.
25. Hunt EA, Fiedor-Hamilton M, Eppich WJ. Resuscitation education: narrowing the gap between evidence-based resuscitation guidelines and performance using best educational practices. Pediatr Clin North Am 2008; 55(4): 1025–1050, xii.
26. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med 2011; 86(6): 706–711.
27. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med 2004; 79(suppl 10): S70–S81.
28. Rudolph JW, Simon R, Raemer DB, Eppich WJ. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med 2008; 15(11): 1010–1016.
29. Salas E, DiazGranados D, Klein C, et al. Does team training improve team performance? A meta-analysis. Hum Factors 2008; 50(6): 903–933.
30. Ellis S, Davidi I. After-event reviews: drawing lessons from successful and failed experience. J Appl Psychol 2005; 90(5): 857–871.
31. Flanagan B. Debriefing: theory and techniques. In: Riley RH, ed. A Manual of Simulation in Healthcare. New York: Oxford University Press USA; 2008: 155–170.
32. Van Heukelom JN, Begaz T, Treat R. Comparison of postsimulation debriefing versus in-simulation debriefing in medical simulation. Simul Healthc 2010; 5(2): 91–97.
33. Walsh CM, Ling SC, Wang CS, Carnahan H. Concurrent versus terminal feedback: it may be better to wait. Acad Med 2009; 84(suppl 10): S54–S57.
34. Ahmed M, Sevdalis N, Vincent C, Arora S. Actual vs perceived performance debriefing in surgery: practice far from perfect. Am J Surg 2013; 205(4): 434–440.
35. Husebo SE, Dieckmann P, Rystedt H, Soreide E, Friberg F. The relationship between facilitators' questions and the level of reflection in postsimulation debriefing. Simul Healthc 2013; 8(3): 135–142.
36. Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc 2012; 7(5): 288–294.
37. Arora S, Ahmed M, Paige J, et al. Objective structured assessment of debriefing: bringing science to the art of debriefing in surgery. Ann Surg 2012; 256(6): 982–988.
38. Dieckmann P. Simulation settings for learning in acute medical care. In: Dieckmann P, ed. Using Simulations for Education, Training, and Research. Lengerich: Pabst; 2009.
39. Simon R, Raemer DB, Rudolph JW. Debriefing assessment for simulation in healthcare: rater handbook. 2009. Available at: https://harvardmedsim.org/_media/DASH.handbook.2010.Final.Rev.2.pdf. Accessed September 27, 2012.
40. Wickers MP. Establishing a climate for a successful debriefing. Clin Simul Nurs 2010; 6: e83–e86.
41. Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer DB. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin 2007; 25(2): 361–376.
42. Rall M, Manser T, Howard SK. Key elements of debriefing for simulator training. Eur J Anaesthesiol 2000; 17(8): 516–517.
43. Baron RA. Negative effects of destructive criticism: impact on conflict, self-efficacy, and task performance. J Appl Psychol 1988; 73(2): 199–207.
44. Ende J. Feedback in clinical medical education. JAMA 1983; 250(6): 777–781.
45. Eva KW, Armson H, Holmboe E, et al. Factors influencing responsiveness to feedback: on the interplay between fear, confidence, and reasoning processes. Adv Health Sci Educ Theory Pract 2012; 17(1): 15–26.
46. Kluger AN, Van Dijk D. Feedback, the various tasks of the doctor, and the feedforward alternative. Med Educ 2010; 44(12): 1166–1174.
47. Cheng A, Hunt EA, Donoghue A, et al. Examining pediatric resuscitation education using simulation and scripted debriefing: a multicenter randomized trial. JAMA Pediatr 2013; 167: 1–9.
48. Cheng A, Rodgers DL, van der Jagt E, Eppich W, O'Donnell J. Evolution of the Pediatric Advanced Life Support course: enhanced learning with a new debriefing tool and Web-based module for Pediatric Advanced Life Support instructors. Pediatr Crit Care Med 2012; 13(5): 589–595.
49. Gaba DM, Howard SK, Fish KJ, Smith BE, Sowb YA. Simulation-based training in anesthesia crisis resource management: a decade of experience. Simul Gaming 2001; 32(2): 175–193.
50. Lederman LC. Debriefing: toward a systematic assessment of theory and practice. Simul Gaming 1992; 23: 145–160.
51. Dismukes RK, Gaba DM, Howard SK. So many roads: facilitated debriefing in healthcare. Simul Healthc 2006; 1(1): 23–25.
52. Fanning RM, Gaba DM. Debriefing. In: Gaba DM, Fish KJ, Howard SK, Burden AR, eds. Crisis Management in Anesthesiology. 2nd ed. Philadelphia, PA: Elsevier Saunders; 2015: 65–78.
53. Mullan PC, Wuestner E, Kerr TD, Christopher DP, Patel B. Implementation of an in situ qualitative debriefing tool for resuscitations. Resuscitation 2013; 84(7): 946–951.
54. Ahmed M, Arora S, Russ S, Darzi A, Vincent C, Sevdalis N. Operation debrief: a SHARP improvement in performance feedback in the operating room. Ann Surg 2013; 258(6): 958–963.
55. Kolbe M, Weiss M, Grote G, et al. TeamGAINS: a tool for structured debriefings for simulation-based team trainings. BMJ Qual Saf 2013; 22(7): 541–553.
56. Sawyer TL, Deering S. Adaptation of the US Army's After-Action Review for simulation debriefing in healthcare. Simul Healthc 2013; 8(6): 388–397.
57. Dieckmann P, Reddersen S, Zieger J, Rall M. Video-assisted debriefing in simulation-based training of crisis resource management. In: Kyle RR, Murray WB, eds. Clinical Simulation: Operations, Engineering, and Management. Burlington: Academic Press; 2008: 667–676.
58. Hamilton NA, Kieninger AN, Woodhouse J, Freeman BD, Murray D, Klingensmith ME. Video review using a reliable evaluation metric improves team function in high-fidelity simulated trauma resuscitation. J Surg Educ 2012; 69(3): 428–431.
59. Savoldelli GL, Naik VN, Park J, Joo HS, Chow R, Hamstra SJ. Value of debriefing during simulated crisis management: oral versus video-assisted oral feedback. Anesthesiology 2006; 105(2): 279–285.
60. Sawyer T, Sierocka-Castaneda A, Chan D, Berg B, Lustik M, Thompson M. The effectiveness of video-assisted debriefing versus oral debriefing alone at improving neonatal resuscitation performance: a randomized trial. Simul Healthc 2012; 7(4): 213–221.
61. Eppich W, O'Connor L, Adler M. Providing effective simulation activities. In: Forrest K, McKimm J, Edgar S, eds. Essential Simulation in Clinical Education. Chichester: Wiley-Blackwell; 2013: 213–234.
62. Dieckmann P. Debriefing Olympics: a workshop concept to stimulate the adaptation of debriefings to learning contexts. Simul Healthc 2012; 7(3): 176–182.
63. Hewson MG, Little ML. Giving feedback in medical education: verification of recommended techniques. J Gen Intern Med 1998; 13(2): 111–116.
64. Archer JC. State of the science in health professional education: effective feedback. Med Educ 2010; 44(1): 101–108.
65. Dolmans DH, De Grave W, Wolfhagen IH, van der Vleuten CP. Problem-based learning: future challenges for educational practice and research. Med Educ 2005; 39(7): 732–741.
66. Estes C. Promoting student-centered learning in experiential education. J Experiential Educ 2004; 27(2): 141–160.
67. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA 2006; 296(9): 1094–1102.
68. Duffy FD, Holmboe ES. Self-assessment in lifelong learning and improving performance in practice: physician know thyself. JAMA 2006; 296(9): 1137–1139.
69. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med 2005; 80(suppl 10): S46–S54.
70. McDonnell LK, Jobe KK, Dismukes RK. Facilitating LOS Debriefings: A Training Manual. Moffett Field, CA: National Aeronautics and Space Administration; 1997.
71. Smith-Jentsch KA, Cannon-Bowers JA, Tannenbaum SI, Salas E. Guided team self-correction: impacts on team mental models, processes, and effectiveness. Small Group Research 2008; 39(3): 303–327.
72. Kriz WC. A systemic-constructivist approach to facilitation and debriefing of simulations and games. Simul Gaming 2008; 41(5): 663–680.
73. Rudolph JW, Raemer DB, Simon R. Establishing a safe container for learning in simulation: the role of the presimulation briefing. Simul Healthc 2014; 9(6): 339–349.
74. van Merrienboer JJ, Sweller J. Cognitive load theory in health professional education: design principles and strategies. Med Educ 2010; 44(1): 85–93.
75. Bearman M, Ajjawi R. Avoiding tokenism in health professional education. Med Educ 2013; 47(1): 9–11.
76. Molloy E, Borrell-Carrio F, Epstein R. The impact of emotions in feedback. In: Boud D, Molloy E, eds. Feedback in Higher and Professional Education: Understanding It and Doing It Well. London and New York: Routledge; 2013: 50–71.