Structuring Feedback and Debriefing to Achieve Mastery Learning Goals

Eppich, Walter J. MD, MEd; Hunt, Elizabeth A. MD, MPH, PhD; Duval-Arnould, Jordan M. MPH; Siddall, Viva Jo MS; Cheng, Adam MD

doi: 10.1097/ACM.0000000000000934
Articles

Mastery learning is a powerful educational strategy in which learners gain knowledge and skills that are rigorously measured against predetermined mastery standards, with different learners needing variable amounts of time to reach uniform outcomes. Central to mastery learning are repetitive deliberate practice and robust feedback that promote performance improvement. Traditional health care simulation involves a simulation exercise followed by a facilitated postevent debriefing in which learners discuss what went well and what they should do differently next time, usually without additional opportunities to apply the specific new knowledge. Mastery learning approaches instead enable learners to “try again” until they master the skill in question. Despite the growing body of health care simulation literature documenting the efficacy of mastery learning models, to date insufficient detail has been reported on how to design and implement the feedback and debriefing components of deliberate-practice-based educational interventions. Using simulation-based training for adult and pediatric advanced life support as case studies, this article focuses on: how to prepare learners for feedback and debriefing by establishing a supportive yet challenging learning environment; how to implement educational interventions that maximize opportunities for deliberate practice with feedback and for reflection during debriefing; the role of within-event debriefing, or “microdebriefing” (i.e., during a pause in the simulation scenario or during ongoing case management without interruption), as a strategy to promote performance improvement; and directions for future research in feedback and debriefing for mastery learning.

Supplemental Digital Content is available in the text.

W.J. Eppich is associate professor of pediatrics and medical education, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

E.A. Hunt is associate professor of anesthesiology and critical care medicine and of health science informatics and pediatrics, Johns Hopkins University School of Medicine, Baltimore, Maryland.

J.M. Duval-Arnould is instructor of anesthesiology and critical care medicine and of health sciences informatics, Johns Hopkins University School of Medicine, Baltimore, Maryland.

V.J. Siddall is simulation clinical educator and research assistant, Stritch School of Medicine, Loyola University, Maywood, Illinois.

A. Cheng is associate professor of pediatrics, Cumming School of Medicine, University of Calgary, Calgary, Alberta, Canada.

Funding/Support: None reported.

Other disclosures: W.J. Eppich receives per diem honoraria to teach simulation educator courses from PAEDSIM e.V., Germany, and receives salary support, paid to his institution, to teach simulation educator courses for the Center for Medical Simulation, Boston, Massachusetts. E.A. Hunt receives grant support from the Laerdal Foundation for Acute Care Medicine and the Hartwell Foundation. A. Cheng is a simulation educator for the Royal College of Physicians and Surgeons of Canada and a member of the Board of Directors of the Society for Simulation in Healthcare.

Ethical approval: Reported as not applicable.

Supplemental digital content for this article is available at http://links.lww.com/ACADMED/A302.

Correspondence should be addressed to Walter J. Eppich, Ann & Robert H. Lurie Children’s Hospital of Chicago, 225 E. Chicago Ave., Box 62, Chicago, IL 60611; telephone: (312) 227-6080; e-mail: w-eppich@northwestern.edu.

Because of its powerful impact, mastery learning has found increasing applications in medical education,1,2 such as advanced cardiac life support (ACLS)3 and procedural skills training4 as well as intern boot camps.5 This trend will likely expand within competency-based medical education models.6 Mastery learning has several complementary features2,7: baseline assessment; clear learning objectives, organized in educational units of increasing difficulty; engagement in powerful educational activities (e.g., deliberate practice) focused on attaining the objectives; set minimum passing standards (MPSs) (e.g., checklist percentages) for each educational unit; assessment to determine unit completion at the MPS for mastery; progression to the next educational unit if the MPS is achieved; and continued practice or study on an educational unit until the mastery standard is achieved. In mastery learning models all learners achieve all educational objectives with little or no variation in outcome; what varies is the amount of time learners need to reach a unit’s educational objectives.2,7 Although evidence supports the effectiveness of mastery learning in health care simulation-based education,1,2 precisely how educators should integrate feedback and debriefing into this educational model is less clear.
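
The progression logic of a mastery learning unit can be summarized as a simple loop: assess, practice with feedback, and reassess until the MPS is met, then advance. The sketch below is a minimal illustration of that loop, assuming hypothetical unit names, scores, and a fixed per-cycle improvement; none of these values come from a published curriculum.

```python
from dataclasses import dataclass

@dataclass
class Unit:
    name: str
    mps: float  # minimum passing standard, e.g., a checklist percentage

def mastery_course(units, assess, practice):
    """Advance unit by unit; time varies per learner, the outcome does not."""
    for unit in units:                 # units ordered by increasing difficulty
        score = assess(unit)           # baseline assessment
        cycles = 0
        while score < unit.mps:        # continue until the MPS is achieved
            practice(unit)             # deliberate practice with feedback
            score = assess(unit)       # reassess against the mastery standard
            cycles += 1
        print(f"{unit.name}: mastered after {cycles} practice cycle(s)")

# Toy demo: every practice cycle raises the score (purely illustrative).
scores = {}
def assess(unit):
    return scores.setdefault(unit.name, 60.0)
def practice(unit):
    scores[unit.name] += 8.0

mastery_course([Unit("BMV", 90.0), Unit("Defibrillation", 85.0)], assess, practice)
```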

Deliberate practice with performance feedback within mastery learning models helps learners progress toward meeting performance standards in well-defined tasks.8,9 Feedback is “specific information about the comparison between a trainee’s observed performance and a standard, given with the intent to improve the trainee’s performance.”10 Whereas feedback is information, debriefing is an interactive discussion or conversation to reflect on performance.11 Traditionally, debriefing occurs post event with clear phases12–17 during which learners may receive performance feedback. Debriefing also occurs in a “pause and discuss” manner during an activity such as a simulation scenario18 to provide directive feedback19 and/or explore rationale(s) for action.14 Debriefing post event or during “pause and discuss” fosters reflection-on-action after events have occurred,20 whereas reflection-in-action occurs as events unfold.20 In this article we refer to episodes of within-event debriefing (during a pause or during ongoing case management) as “microdebriefing” to highlight their brief and focused nature. See Figure 1 for an overview of feedback and debriefing characteristics. Several debriefing models12,13,15 and recommendations for practice exist,21–23 but the link between debriefing models and learning outcomes is only beginning to emerge.

Figure 1

Although deliberate practice is core to simulation-based mastery learning1,24 and can yield better results than traditional clinical education,25 details about how to design and implement deliberate-practice-based educational interventions are lacking. Several studies have described the positive impact of deliberate practice on resuscitation skills, but word count limits prohibit publication of details needed to replicate the curricula. For example, in a seminal study on simulation-based mastery learning for ACLS skills,3 the description of feedback and debriefing components of deliberate practice is sparse, specifically regarding timing, content, duration, and frequency of feedback, as well as debriefing method. We seek to fill this gap by providing more explicit details specifically about how to structure feedback and debriefing for deliberate-practice-based educational interventions.

We use ACLS and pediatric advanced life support (PALS) as case studies to explore elements essential for mastery learning curricula: supportive learning environments that prepare learners for effortful practice and feedback; integrating feedback and debriefing into deliberate-practice-based educational interventions; and within-event debriefing or “microdebriefing.” We also offer directions for future research.

Designing the Educational Intervention

Effective feedback and debriefing begin with curriculum design.26,27 Central to mastery learning models2 is the use of performance measures28 and determination of MPSs,29 for which standard setting guidelines exist.29 Although curriculum design and performance measures are important, we focus on structuring feedback and debriefing for mastery learning. When designing educational interventions, educators should specifically consider key feedback characteristics, including type, source, and timing30 in planning how and when feedback and debriefing will occur. See Figure 1 for examples of feedback sources. Feedback from multiple sources promotes learning31 and should be planned in advance.
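
As a planning aid, the feedback characteristics named above (type, source, timing) can be tabulated in advance for each educational unit. The sketch below is one hypothetical way to represent such a plan; the categories echo Figure 1, but the specific entries are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PlannedFeedback:
    type: str    # e.g., directive feedback, reflective questions
    source: str  # e.g., educator, peer, simulator, video review
    timing: str  # e.g., within-event (microdebriefing), postevent debriefing

# Hypothetical plan for a BMV practice unit, combining multiple sources.
bmv_feedback_plan = [
    PlannedFeedback("directive", "educator", "within-event (microdebriefing)"),
    PlannedFeedback("task-related", "simulator (chest rise)", "concurrent"),
    PlannedFeedback("reflective questions", "peers", "postevent debriefing"),
]

for item in bmv_feedback_plan:
    print(f"{item.timing}: {item.type} feedback from {item.source}")
```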

Establishing a Supportive Learning Environment

Supportive learning environments are essential prerequisites for simulation-based educational strategies,15,22,32 particularly for deliberate practice approaches. Psychological safety fosters individual risk taking33 and helps learners to accept challenges.34 Although steps educators can take to engender psychological safety, mutual respect, and trust have been outlined elsewhere,32 we provide additional considerations for deliberate-practice-based interventions:

  • Explicitly discuss the role of debriefing, how and when learners will receive feedback,26 and the significance of specific, honest yet nonthreatening feedback,23 much like coaching world-class athletes.35
  • Explain that the goal is not perfection from the start, but being challenged, learning from mistakes, and improving.17 Acknowledge that honest feedback may feel unpleasant and trigger defensiveness but is necessary for improvement.
  • Tell learners that they may be interrupted during the simulation so that they can briefly reflect on their progress and receive feedback before rewinding/resuming the case for more practice.
  • Explain explicitly that learners may be asked to provide feedback to their peers, and describe the value of this process.

Case Studies: ACLS and PALS Resuscitation Training

Although mastery learning has been applied in various domains, here we use simulation-based advanced life support (ALS) training as an exemplar for several key reasons: clear performance guidelines exist for adult and pediatric ALS; resuscitation skills are relevant across health care professions and patient populations, and thus of interest to many readers; resuscitation skills combine cognitive processes and procedural components with team and communication skills; and two coauthors of this article were principal educators in published studies documenting the benefit of deliberate practice for ACLS (V.J.S.)3 and PALS skills (E.A.H.).35 See Table 1 for a summary of the primary studies of ACLS and PALS skills, including key findings. Based on input from these two authors, what follows is a detailed description of how these educational sessions were structured and facilitated, with a focus on the feedback and debriefing components of deliberate practice.

Table 1

Using deliberate practice in mastery learning of ACLS skills

In a simulation-based study of ACLS mastery learning, second-year internal medicine residents participated in educational sessions facilitated by a nonphysician, ACLS-certified instructor (V.J.S.) to improve their ability to lead a resuscitation.3 The first two-hour session focused on procedural skills (not embedded in simulation scenarios), including bag-mask ventilation (BMV), intubation, chest compressions, and defibrillation with emphasis on technical proficiency. During deliberate skills practice, the educator provided hands-on feedback as needed. More important, to augment the learners’ abilities to observe and correct performance in their future roles as resuscitation team leaders, the educator encouraged peer-to-peer feedback (both giving as team leaders and receiving as team members). For example, residents reflected on the following questions to improve their observational skills in preparation for providing team members with performance feedback (e.g., adequacy of BMV or chest compressions):

  • When the skill is performed correctly, what characterizes effective performance?
  • If a skill is not performed correctly, what characterizes ineffective performance? What would need to change to ensure correct performance?

Clinical scenarios for the three subsequent two-hour sessions were grouped as follows: pulseless arrhythmias, tachyarrhythmias, and bradycardias. The primary outcome was the residents’ ability to serve as team leaders, adhere to ACLS guidelines, and make critical decisions. Residents assigned to other roles practiced resuscitation skills, such as BMV, chest compressions, defibrillation, and medication administration, and received performance feedback.

Before each scenario, residents received case information and assigned roles including team leader. During the scenario, the educator stood just behind the team leader, who was positioned at the foot of the bed. The educator quietly posed reflective questions to the team leader to prompt reflection-in-action (microdebriefing) during the active resuscitation—without pause—depending on the team leader and team member performance. For example, if an element of basic life support (BLS) or ALS was inadequate, the educator directed the team leader’s attention to that element through reflective questions such as “Is BMV being performed adequately?” and “If not, what needs to change to improve it?” If BMV needed improvement, the team leader provided performance feedback as the resuscitation proceeded. As another example, if the team leader seemed to struggle with the next management step, the educator posed questions to prompt reflection-in-action (microdebriefing), such as “What are you seeing/hearing right now?” and “What are you thinking?”

In scenarios involving pulseless electrical activity, if the resident was slow to work through the possible reversible causes, the educator prompted the team leader to assess BLS quality, reassured him/her that high-quality BLS provided time to think, and then posed reflective questions/provided feedback to direct management to the desired outcome. The educator focused on empowering the team leader to guide the team to improved performance, to provide peer feedback as needed during ongoing resuscitative efforts, and to understand the rationale for key actions and maneuvers.

After the scenario, the educator facilitated a short bedside postevent debriefing using a plus-delta technique with the team: what went well, what could have gone better, what should change for the next time. The educator encouraged peer-to-peer feedback and also provided clarification and feedback on troublesome aspects of performance herself as needed (e.g., critical decision making) to promote iterative improvement.

During the two-hour sessions, each resident served as team leader for each scenario type. Scenarios and debriefings were brief, with most of the session dedicated to deliberate practice in up to 15 scenarios and each resident completing multiple turns as team leader. The amount of support the educator provided to the team leader in the form of within-event microdebriefing decreased as each team leader’s ability to lead the team improved during the session.

After completing all practice sessions, residents were assessed using six testing scenarios and guideline-specific checklists.3 If a resident did not achieve mastery standards for leading the team in accordance with resuscitation guidelines in any of the testing scenarios, she or he completed additional deliberate practice on those scenarios until mastery was achieved (see Table 1).
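
A minimal sketch of how checklist results from such testing scenarios might be screened against a mastery standard to flag scenarios needing further deliberate practice; the scenario names, item counts, and MPS value below are hypothetical, and the actual study used its own standard-setting procedure.

```python
def checklist_score(items_passed: int, items_total: int) -> float:
    """Checklist performance as a percentage of items done correctly."""
    return 100.0 * items_passed / items_total

def scenarios_needing_practice(results, mps):
    """Return testing scenarios in which performance fell below the MPS."""
    return [name for name, (passed, total) in results.items()
            if checklist_score(passed, total) < mps]

# Hypothetical results for three testing scenarios: (items passed, items total).
results = {
    "VF/pulseless VT": (27, 30),   # 90.0%
    "Asystole":        (20, 28),   # 71.4%
    "PEA":             (25, 26),   # 96.2%
}
print(scenarios_needing_practice(results, mps=89.0))  # -> ['Asystole']
```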

Rapid cycle deliberate practice

Pediatric residents participated in a novel rapid cycle deliberate practice (RCDP) simulation-based curriculum to improve PALS procedural and team skills within the first five minutes of an event.35 Central to RCDP is the selection and sequencing of cases, based on their prevalence in clinical practice, to afford deliberate practice of the core skills required to manage them. In pediatrics, for example, respiratory failure requiring BMV is more frequent than ventricular fibrillation requiring defibrillation, so BMV is integrated into all scenarios. See Figure 2 for case sequencing of scenarios and embedded tasks within a two-hour RCDP session for PALS skills. Unlike other curricula described in the literature, which provide little specific detail about how feedback and debriefing occur in simulation,11 the RCDP methodology provides a context (cases sequenced by increasing difficulty) for rapid cycles of deliberate practice with clear strategies for structuring feedback and debriefing within the individual cases.

Figure 2

During RCDP, the educator observed the performance and paused the learners at various times to give directive feedback19 on exactly what aspect of the performance went well or what needed improvement. During the microdebriefing, the educator focused on the issue that prompted the interruption by linking the observed performance with a brief rationale for why improvement was needed. For example, if the objective was minimizing pauses in compressions immediately prior to defibrillation, the educator stopped the scenario when the pause in compressions exceeded the standard (e.g., 10 seconds). In this case, when compressions were paused for more than 10 seconds, the educator said “time out” or “let’s pause,” highlighted the breached standard, and provided a brief rationale regarding its importance. After succinctly stating this observation, the educator gave explicit directions on what to do differently next time, and why, so that learners could complete the task properly. These brief, focused explanations (i.e., suggested team choreography or expert modeling of skills) were followed by additional practice attempts until learners performed the task correctly. See Box 1 for examples of educator comments during a microdebriefing.

During initial iterations of a scenario (e.g., respiratory failure in which BMV is difficult, requiring advanced maneuvers such as the two-person technique), learners engaged in a period of uninterrupted practice before a microdebriefing with performance feedback. As the teaching session progressed, when learners repeated an error or substandard performance that had been addressed previously (e.g., proper use of the two-person technique if BMV is difficult), the educator paused again to provide specific, direct feedback; “rewound” the scenario 10 seconds; and resumed the scenario to allow repeated practice on the troublesome aspect of the task. The scenario did not progress to the next phase until learners demonstrated proficiency at that task (see Figure 2).

Breaches in performance standards applied to individuals (e.g., the person performing BMV) or to the team as a whole (e.g., team choreography to place a backboard under the patient with minimal interruptions in chest compressions). For instances of inadequate performance despite multiple cycles of directive feedback, the educator spent more time during a microdebriefing exploring the learner’s understanding of the issue at hand. The goal was for learners to experience successful application of key knowledge and essential skills as they worked toward meeting performance standards.
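
The pause-feedback-rewind cycle described above can be summarized as a control loop. The sketch below is a minimal, hypothetical rendering of that loop for a single standard (the preshock compression pause), not an implementation from the RCDP study; the toy scenario controller and the observed pause values are invented for illustration.

```python
MAX_PRESHOCK_PAUSE_S = 10.0  # performance standard cited above
REWIND_S = 10                # how far the scenario is "rewound" after feedback

class ToyScenario:
    """Stand-in for a simulation run; yields one observed pause per attempt."""
    def __init__(self, observed_pauses):
        self._pauses = iter(observed_pauses)
    def run_attempt(self):
        return next(self._pauses)

def rcdp_cycle(scenario):
    """Repeat the task until the standard is met, pausing to give feedback."""
    while True:
        pause = scenario.run_attempt()
        if pause <= MAX_PRESHOCK_PAUSE_S:
            print(f"Pause {pause:.0f}s: standard met; scenario progresses.")
            return
        # "Time out": name the breached standard, give a brief rationale and
        # explicit directions, then rewind and allow another attempt.
        print(f"Pause {pause:.0f}s exceeds {MAX_PRESHOCK_PAUSE_S:.0f}s; "
              f"giving directive feedback and rewinding {REWIND_S}s.")

rcdp_cycle(ToyScenario([15, 12, 8]))  # attempts improve with each cycle
```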

Box 1. Examples of Microdebriefing to Promote Reflection-on-Action in RCDP, Demonstrating the Educator Pausing After a Breached Performance Standard and After an Exemplary Performance

Breached standard

(standard: pause before defibrillation < 10 seconds)

“Okay guys, we just paused compressions for 15 seconds before the defibrillation, and remember the AHA standard is no pause longer than 10 seconds AND Dana Edelson’s paper [referenced by E.A.H. for feedback] demonstrated that each 5-second decrease in preshock pause is associated with an 86% increase in defibrillation success rate … so let me give you some strategies on how to shrink that pause, and then we will rewind and you can try again.”

Exemplary performance

(closed-loop communication)

“Let’s pause. I would like to highlight that beautiful use of closed-loop communication. I noticed our leader caught herself saying, ‘Can “someone” tell me the dose of amiodarone?’ and after the second time, you looked around the room and touched the pharmacist on his shoulder and said, ‘Mike, can you tell me the dose of amiodarone?’ and Mike looked up and said, ‘Yes, amiodarone is 5 mg/kg, I will prepare the drug and let you know when it is ready.’ That was fabulous … do you want to add anything?”

As learners progressed through the scenarios and achieved target performance on component skills, the simulated patient responded positively to correct interventions (e.g., chest rise with BMV), which reinforced desired behaviors through task-related feedback. RCDP appears particularly well suited for simulation-based sessions devoted to individual and team-based resuscitation skills with established evidence-based guidelines or expert opinion that forms the basis for clear performance standards.35 See Supplemental Digital List 1 at http://links.lww.com/ACADMED/A302 for examples of essential resuscitation skills amenable to a simulation-based RCDP approach.

Using performance data as sources of feedback for deliberate practice

The sources and types of data available determine whether learner achievement of performance standards can be measured accurately. These data need to be objective, quantifiable, and presented so that learners can understand them. Educators can use data from simulations to set expectations, assess performance, and provide meaningful feedback that helps learners monitor and improve their performance. For example, patient simulators log important details about the quality of chest compressions, such as depth, rate, and recoil.35–38
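
To illustrate, the sketch below summarizes a hypothetical simulator log of chest compressions into the metrics named above (depth, rate, recoil). The record format and the depth target are illustrative assumptions, not a real simulator API; actual targets should come from current resuscitation guidelines.

```python
def compression_report(compressions, duration_s, target_depth_cm=5.0):
    """Summarize depth, rate, and recoil from logged compressions."""
    n = len(compressions)
    return {
        "rate_per_min": round(60.0 * n / duration_s, 1),
        "pct_adequate_depth": round(
            100.0 * sum(c["depth_cm"] >= target_depth_cm for c in compressions) / n, 1),
        "pct_full_recoil": round(
            100.0 * sum(c["full_recoil"] for c in compressions) / n, 1),
    }

# Hypothetical log: three compressions over 1.8 seconds (rate = 100/min).
log = [
    {"depth_cm": 5.2, "full_recoil": True},
    {"depth_cm": 4.1, "full_recoil": False},
    {"depth_cm": 5.6, "full_recoil": True},
]
print(compression_report(log, duration_s=1.8))
```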

Other sources of data include peer observation and video performance review. Peer observation using checklists and video performance review can capture key tasks and promote collaborative learning. Checklist-based measurement, however, requires skilled observers; it can be error prone, which may compromise the accuracy of the resulting feedback. For summative assessment of performance, rater training and calibration are essential.39,40 The use of these strategies for mastery learning should be balanced against the time and rater training they require. In mastery learning models, outcome measurement is an essential component for determining whether learners have achieved mastery standards or require additional practice.2

Discussion: Key Lessons

Deliberate practice promotes performance improvement.3,35 This article fills a gap in the literature by describing in detail how various feedback and debriefing strategies support mastery learning and deliberate practice in health care simulation. Insights offered here also raise additional questions that can guide future inquiry.

We glean several key lessons for structuring feedback and debriefing from the description of the deliberate practice interventions for ALS skills by Wayne et al3 and Hunt et al35 given the demonstrated learning benefits in terms of resuscitation skills (see Table 1). The first lesson relates to the notion of training for success. In both studies, resident physicians achieved proficiency in the predefined tasks. According to self-determination theory, promoting feelings of competence augments internal motivation41; during deliberate practice, learners see themselves improve and welcome the feedback that makes this possible. This stands in stark contrast to more traditional models of simulation-based education in which a simulation is followed by a debriefing; learners may discuss how to improve in the postevent debriefing, but then either the session ends or learners move on to a different case. Even a single opportunity to repeat the scenario shows benefit.42 The interventions described by Wayne et al and Hunt et al afforded learners many opportunities to engage in deliberate practice.

The second lesson relates to the role of reflection, both in-action and on-action.20 The “pause and discuss” or microdebriefing strategy used in RCDP35 relies on the principle of reflection-on-action,20 like more traditional debriefing strategies.12,13 Recent work has highlighted the advantage of terminal (after the event) over concurrent (immediately during the event) feedback,11,31 although further clarification would be helpful because in the RCDP model, microdebriefings with immediate feedback for repeated errors were a key strategy. The facilitation approach exemplified by V.J.S. in coaching the resident team leaders through a mixture of reflective questions (“What are you seeing?” “What are you thinking?”) and targeted feedback during ongoing resuscitation efforts leverages reflection-in-action.20 To our knowledge, this notion of a “coach” supporting team leaders during simulated resuscitations is not widespread and represents a novel and potentially powerful educational intervention. While coaching the team leader during the scenario, V.J.S. offered scaffolding that was slowly withdrawn over successive rounds of deliberate practice. The relative impacts of reflection-in-action prompted by facilitative within-event microdebriefing and of scaffolded learning during simulation scenarios demand further study, for instance: the frequency, duration, timing, and content of microdebriefings; adaptations based on learner training level; the learning objectives suited to this approach; educator training; and the transfer of these strategies to the supervision of trainees in clinical practice.

A third key lesson is the role of peer-assisted learning in the ACLS study, with team leaders providing performance feedback to other residents in scenarios when appropriate. Notably, in both the ACLS3 and PALS35 studies, all learners were resident physicians. Evidence suggests that matched peers or “status equals” are necessary for effective peer-assisted learning,43 although within-team debriefings after simulations for interprofessional teams show potential as well.44 Social comparison among peers is an added factor.45,46 Particularly in the study of ACLS skills,3 all residents took turns as leaders and team members; they experienced both giving and receiving feedback and likely felt pressure to “measure up” in comparison with peers, which may have facilitated learning.

Finally, identifying outcome measures and reliable data sources, such as time-to-event analysis, provides objective performance data, ensuring that feedback, microdebriefing, and postevent debriefing are anchored in accurate observations. Recent work demonstrates the power of objective performance feedback after actual resuscitations to improve patient outcomes.47 When appropriate, time-to-event analysis provides performance indicators that assess individual and team functioning for complex skills. Educators should survey the breadth of data sources relevant to the predetermined learning objectives and identify those best suited for integration into deliberate practice curricula.
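
As one concrete form of such analysis, the sketch below computes simple time-to-event indicators from a timestamped event log; the log format, event labels, and target times are hypothetical illustrations rather than published benchmarks.

```python
def time_to_event(log, label):
    """Seconds from scenario start to the first occurrence of `label`."""
    return next((t for t, event in log if event == label), None)

# Hypothetical event log: (seconds from scenario start, event label).
log = [
    (0, "scenario start"),
    (18, "pulselessness recognized"),
    (25, "compressions started"),
    (140, "first defibrillation"),
]

for label, target_s in [("compressions started", 60),
                        ("first defibrillation", 180)]:
    t = time_to_event(log, label)
    status = "within" if t is not None and t <= target_s else "outside"
    print(f"{label}: {t}s ({status} the {target_s}s target)")
```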

Although we explicate much needed detail on how to design and implement educational interventions based on deliberate practice using ACLS and PALS as case studies, we realize that other feedback and debriefing approaches may be more suitable for other domains of clinical performance. For example, postevent debriefing12,13,16 may have clear advantages when intended learning outcomes involve complex clinical decision making or social interactions. This may be particularly true when no clear performance standards exist and multiple strategies are appropriate48; more traditional forms of postevent debriefing are likely still of great benefit. The relative merits of various approaches to feedback, microdebriefing, and debriefing speak to a need for educators to strive for a blended approach.49 See Table 2 for a comparison of giving feedback, microdebriefing, and postevent debriefing.

Table 2

Although mastery learning may have limited application in clinical environments because of patient safety concerns, some components of the structured feedback and debriefing presented here may apply to clinical practice. In particular, microdebriefing during active clinical management, much as a supervising physician might stand behind a trainee who is learning to lead a resuscitation, may provide useful scaffolding. Further research should characterize the success factors that help clinical supervisors manage their dual roles as educators and patient care providers, as well as the benefit of such a microdebriefing strategy for trainees.

Although Wayne et al3 and Hunt et al35 ascribe the benefit of their educational interventions to deliberate practice, future research needs to clarify which elements of deliberate practice are most important and which feedback characteristics (e.g., source, timing) and debriefing approaches have the most impact for specific learner groups and learning objectives. To what extent reflection-in-action strategies such as within-event microdebriefing and peer-assisted learning augment deliberate practice is not clear. What is clear is that the deliberate-practice-based mastery learning intervention for ACLS skills by Wayne et al3 was powerful, leading to long-term skill retention50 and enhanced performance when trained residents led code teams during actual cardiac arrest medical team responses.51,52

Effortful practice has been shown to promote learning,53 making the frequency, content, and timing of feedback for particular skills within deliberate practice models an interesting area of inquiry. Also, most published work on mastery learning and the deliberate practice approach involves resident physicians. Studies are ongoing to explore whether the RCDP approach is effective with other learner groups (e.g., practicing nurses, medical students).

We hope this article offers health care educators the stimulus to reevaluate their educational practice and consider the full spectrum of debriefing strategies, including both microdebriefing and postevent debriefing as tools to reach educational objectives. Dissemination of innovation in health care is challenging,54 and our hope is that this report will contribute to the adoption of deliberate practice and mastery learning models in health care education.

References

1. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: A systematic review and meta-analysis. Acad Med. 2013;88:1178–1186
2. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48:375–385
3. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251–256
4. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4:397–403
5. Cohen ER, Barsuk JH, Moazed F, et al. Making July safer: Simulation-based mastery learning during intern boot camp. Acad Med. 2013;88:233–239
6. Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645
7. McGaghie WC, Siddall VJ, Mazmanian PE, Myers J; American College of Chest Physicians Health and Science Policy Committee. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education: Effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest. 2009;135(3 suppl):S62–S68
8. Ericsson KA, Krampe RT, Tesch-Romer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363–406
9. Ericsson KA. Deliberate practice and acquisition of expert performance: A general overview. Acad Emerg Med. 2008;15:988–994
10. van de Ridder JM, Stokking KM, McGaghie WC, ten Cate OT. What is feedback in clinical education? Med Educ. 2008;42:189–197
11. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: A systematic review and meta-analysis. Med Educ. 2014;48:657–666
12. Steinwachs B. How to facilitate a debriefing. Simul Gaming. 1992;23:186–195
13. Zigmont JJ, Kappus LJ, Sudikoff SN. The 3D model of debriefing: Defusing, discovering, and deepening. Semin Perinatol. 2011;35:52–58
14. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as “nonjudgmental” debriefing: A theory and method for debriefing with good judgment. Simul Healthc. 2006;1:49–55
15. Rudolph JW, Simon R, Raemer DB, Eppich WJ. Debriefing as formative assessment: Closing performance gaps in medical education. Acad Emerg Med. 2008;15:1010–1016
16. Kolbe M, Weiss M, Grote G, et al. TeamGAINS: A tool for structured debriefings for simulation-based team trainings. BMJ Qual Saf. 2013;22:541–553
17. Eppich WJ, O’Connor L, Adler MD. Providing effective simulation activities. In: Forrest K, McKimm J, Edgar S, eds. Essential Simulation in Clinical Education. Chichester, UK: Wiley-Blackwell; 2013:213–234
18. Flanagan B. Debriefing: Theory and techniques. In: Riley RH, ed. A Manual of Simulation in Healthcare. New York, NY: Oxford University Press USA; 2008:155–170
19. Archer JC. State of the science in health professional education: Effective feedback. Med Educ. 2010;44:101–108
20. Schön D. The Reflective Practitioner. New York, NY: Basic Books; 1983
21. Ahmed M, Sevdalis N, Paige J, Paragi-Gururaja R, Nestel D, Arora S. Identifying best practice guidelines for debriefing in surgery: A tri-continental study. Am J Surg. 2012;203:523–529
22. Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: Development and psychometric properties. Simul Healthc. 2012;7:288–294
23. Arora S, Ahmed M, Paige J, et al. Objective structured assessment of debriefing: Bringing science to the art of debriefing in surgery. Ann Surg. 2012;256:982–988
24. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63
25. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–711
26. Molloy E, Boud D. Changing conceptions of feedback. In: Boud D, Molloy E, eds. Feedback in Higher and Professional Education: Understanding It and Doing It Well. London, UK: Routledge; 2013:11–33
27. Kern DE, Thomas PA, Hughes MT. Curriculum Development for Medical Education: A Six-Step Approach. 2nd ed. Baltimore, Md: Johns Hopkins University Press; 2009
28. Schmutz J, Eppich WJ, Hoffmann F, Heimberg E, Manser T. Five steps to develop checklists for evaluating clinical performance: An integrative approach. Acad Med. 2014;89:996–1005
29. Wayne DB, Fudala MJ, Butter J, et al. Comparison of two standard-setting methods for advanced cardiac life support training. Acad Med. 2005;80(10 suppl):S63–S66
30. Chiniara G, Cole G, Brisbin K, et al; Canadian Network for Simulation in Healthcare, Guidelines Working Group. Simulation in healthcare: A taxonomy and a conceptual framework for instructional design and media selection. Med Teach. 2013;35:e1380–e1395
31. Hatala R, Cook DA, Zendejas B, Hamstra SJ, Brydges R. Feedback for simulation-based procedural skills training: A meta-analysis and critical narrative synthesis. Adv Health Sci Educ Theory Pract. 2014;19:251–272
32. Rudolph JW, Raemer DB, Simon R. Establishing a safe container for learning in simulation: The role of the presimulation briefing. Simul Healthc. 2014;9:339–349
33. Edmondson AC. Psychological safety and learning behavior in work teams. Adm Sci Q. 1999;44:350–383
34. Edmondson AC. The competitive imperative of learning. Harv Bus Rev. 2008;86:60–67, 160
35. Hunt EA, Duval-Arnould JM, Nelson-McMillan KL, et al. Pediatric resident resuscitation skills improve after “rapid cycle deliberate practice” training. Resuscitation. 2014;85:945–951
36. Berg RA, Hemphill R, Abella BS, et al. Part 5: Adult basic life support: 2010 American Heart Association Guidelines for Cardiopulmonary Resuscitation and Emergency Cardiovascular Care. Circulation. 2010;122(18 suppl 3):S685–S705
37. Kleinman ME, Chameides L, Schexnayder SM, et al. Part 14: Pediatric advanced life support: 2010 American Heart Association guidelines for cardiopulmonary resuscitation and emergency cardiovascular care. Circulation. 2010;122(18 suppl 3):S876–S908
38. Mancini ME, Soar J, Bhanji F, et al; Education, Implementation, and Teams Chapter Collaborators. Part 12: Education, implementation, and teams: 2010 international consensus on cardiopulmonary resuscitation and emergency cardiovascular care science with treatment recommendations. Circulation. 2010;122(16 suppl 2):S539–S581
39. Feldman M, Lazzara EH, Vanderbilt AA, DiazGranados D. Rater training to support high-stakes simulation-based assessments. J Contin Educ Health Prof. 2012;32:279–286
40. Eppich W, Nannicelli AP, Seivert NP, et al. A rater training protocol to assess team performance. J Contin Educ Health Prof. 2015;35:83–90
41. Ten Cate TJ, Kusurkar RA, Williams GC. How self-determination theory can assist our understanding of the teaching and learning processes in medical education. AMEE guide no. 59. Med Teach. 2011;33:961–973
42. Auerbach M, Kessler D, Foltin JC. Repetitive pediatric simulation resuscitation training. Pediatr Emerg Care. 2011;27:29–31
43. Topping KJ. Trends in peer learning. Educ Psychol. 2005;25:631–645
44. Boet S, Bould MD, Sharma B, et al. Within-team debriefing versus instructor-led debriefing for simulation-based education: A randomized controlled trial. Ann Surg. 2013;258:53–58
45. Raat J, Kuks J, Cohen-Schotanus J. Learning in clinical practice: Stimulating and discouraging response to social comparison. Med Teach. 2010;32:899–904
46. Bleakley A. Social comparison, peer learning and democracy in medical education. Med Teach. 2010;32:878–879
47. Wolfe H, Zebuhr C, Topjian AA, et al. Interdisciplinary ICU cardiac arrest debriefing improves survival outcomes. Crit Care Med. 2014;42:1688–1695
48. Fanning RM, Gaba DM. Debriefing. In: Gaba DM, Fish KJ, Howard SK, Burden AR, eds. Crisis Management in Anesthesiology. 2nd ed. Philadelphia, Pa: Elsevier Saunders; 2015:65–78
49. Eppich W, Cheng A. Promoting excellence and reflective learning in simulation (PEARLS): Development and rationale for a blended approach to health care simulation debriefing. Simul Healthc. 2015;10:106–115
50. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents’ retention of advanced cardiac life support skills. Acad Med. 2006;81(10 suppl):S9–S12
51. Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case–control study. Chest. 2008;133:56–61
52. Didwania A, McGaghie WC, Cohen ER, et al. Progress toward improving the quality of cardiac arrest medical team responses at an academic teaching hospital. J Grad Med Educ. 2011;3:211–216
53. Bjork RA. Memory and metamemory considerations in the training of human beings. In: Metcalfe J, Shimamura A, eds. Metacognition: Knowing About Knowing. Cambridge, Mass: MIT Press; 1994:185–205
54. Berwick DM. Disseminating innovations in health care. JAMA. 2003;289:1969–1975

Supplemental Digital Content

© 2015 by the Association of American Medical Colleges