Faculty Development for Simulation Programs: Five Issues for the Future of Debriefing Training

Cheng, Adam MD, FRCPC, FAAP; Grant, Vincent MD, FRCPC; Dieckmann, Peter PhD, Dipl-Psych; Arora, Sonal PhD, MRCS, MBBS; Robinson, Traci RN; Eppich, Walter MD, MEd

Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare: August 2015 - Volume 10 - Issue 4 - p 217–222
doi: 10.1097/SIH.0000000000000090
Concepts and Commentary

Summary Statement: Debriefing is widely recognized as a critically important element of simulation-based education. Simulation educators obtain and/or seek debriefing training from various sources, including workshops at conferences, simulation educator courses, formal fellowship training, or advanced degrees in simulation. Although there are many options available for debriefing training, little is known about how faculty development opportunities should be structured to maintain and enhance the quality of debriefing within simulation programs. In this article, we discuss 5 key issues to help shape the future of debriefing training for simulation educators, specifically the following: (1) Are we teaching the appropriate debriefing methods? (2) Are we using the appropriate methods to teach debriefing skills? (3) How can we best assess debriefing effectiveness? (4) How can peer feedback of debriefing be used to improve debriefing quality within programs? (5) How can we individualize debriefing training opportunities to the learning needs of our educators?

From the KidSIM Simulation Program (A.C., V.G., T.R.), Department of Pediatrics, Alberta Children’s Hospital at the University of Calgary, Calgary, Alberta, Canada; Danish Institute for Medical Simulation (P.D.), Herlev University Hospital, Herlev, Denmark; Department of Surgery and Cancer (S.A.), Imperial College, London, United Kingdom; and Center for Education in Medicine (W.E.), Northwestern Feinberg School of Medicine, Ann & Robert H. Lurie Children’s Hospital of Chicago, Chicago, IL.

Reprints: Adam Cheng, MD, FRCPC, FAAP, KidSIM Simulation Program, Department of Pediatrics, Alberta Children’s Hospital, University of Calgary, 2888 Shaganappi Trail NW, Calgary, Alberta, Canada T3B 6A8 (e-mail: chenger@me.com).

The authors declare no conflict of interest.

The growth of simulation-based education (SBE) as a key modality for experiential learning in health care has drawn attention to the importance of debriefing.1–5 Debriefing, defined as a “discussion between two or more individuals in which aspects of a performance are explored and analyzed with the aim of gaining insight that impacts the quality of future clinical practice,”5 provides an important venue for groups to engage in reflective learning.2,5 Recent reviews of the simulation literature have highlighted the importance and benefits of debriefing as part of SBE.6–9 Individual studies have described and evaluated various instructional design features of debriefings, although few features have been studied in more than one context.5,10–27 Despite the recognized importance and widespread use of debriefing as part of SBE, there has been little work published on how faculty development opportunities should be structured to maintain and enhance the quality of debriefing within simulation programs.2,3,5

Training in debriefing can be obtained by attending simulation educator training courses offered by various simulation programs, by attending workshops at conferences, and by seeking fellowship training or advanced degrees in simulation. Simulation educators often “learn by doing” and, in some cases, may obtain unstructured feedback from colleagues and/or learners. These options provide a spectrum of opportunities for simulation educators, but little is known about the actual effectiveness of these faculty development opportunities.

Faculty Development in Health Care Education

Faculty development has been defined as a “range of activities that institutions use to renew or assist faculty in their roles,”28 which includes programs designed to improve teaching performance.29 Faculty development programs have been found to improve the practice of education, enhance individual strengths (eg, knowledge and teaching skills), and positively influence cultural change within organizations.30–32 Recent systematic reviews of faculty development initiatives in medical education describe a broad spectrum of activities, including workshops, seminar series, courses, longitudinal programs (eg, fellowships), and individualized feedback.31,32 The use of experiential learning, provision of feedback, effective peer relationships, and diversity of educational methods represent some of the key features of effective faculty development activities.31 In this article, we explore some of these features as they relate to SBE. Although there is a paucity of literature related to faculty development for SBE, many of the lessons learned in health care education can be applied to enhance faculty development methods for SBE.

Various elements of SBE contribute to the overall educational value of the session (eg, prebriefing, conduct of the simulation event). We have narrowed the focus of this article to debriefing because it represents the component of SBE that allows students to reflect on previous behaviors, which in turn leads to the development and solidification of critical concepts.6–8 In this article, we discuss 5 issues for the future of debriefing training to foster a conversation in the simulation community. Specifically, we ask the following questions.

  1. Are we teaching the appropriate debriefing methods?
  2. Are we using the appropriate methods to teach debriefing skills?
  3. How can we best assess debriefing effectiveness?
  4. How can peer feedback of debriefing be used to improve debriefing quality within programs?
  5. How can we individualize debriefing training opportunities to the learning needs of our educators?

We hope that discussion of these issues and their related tradeoffs will help to stimulate new debriefing research and program development that will shape the future of debriefing training.

Five Issues for the Future of Debriefing Training

Are We Teaching the Appropriate Debriefing Methods?

There is a growing body of evidence evaluating various instructional design features of debriefing for SBE.5,10–23 Scripted debriefing22 and expert modeling10,11 have demonstrated improved learning outcomes in limited contexts, whereas other methods, such as video debriefing12–14 and concurrent debriefing17–19 (vs. terminal debriefing), have demonstrated more mixed results.5 To date, only 2 studies have evaluated different debriefing styles or methods, with one study favoring reflective discussion over performance critique20 and the other favoring discussion with a focus on technical aspects over knowledge deficits.21

Several methods of debriefing have been popularized and subscribed to by educators within certain programs that deliver standardized simulation courses.5,9 Some approaches promote reflection and seek to uncover learners’ rationale for action,6–9,33,34 whereas others seek to engage learners in self-assessment of their performance as a means to identify areas for further inquiry.2,23,24 Educators may choose to emphasize a more didactic approach where feedback on key performance deficits is provided to learners.25,26 The adoption of specific debriefing methods by simulation educators is typically driven by various factors, including personal preference and past experiences with one particular method, as opposed to empiric evidence supporting one debriefing method over another. Table 1 compares various methods used during debriefing.

TABLE 1

There are potential downsides when simulation educators focus on selectively mastering one specific debriefing method, rather than exploring the relative benefits of various methods. As the context of debriefing varies, so too may the relative educational value of different debriefing methods.27 Specific methods may be more effective in different contexts, depending on the learning objectives (eg, technical vs. nontechnical skills), learner type (eg, level of training) and attributes (eg, learning style), dynamics of the learner group, and the amount of time available.27 For example, an in situ debriefing limited to 10 minutes, conducted for emergency medicine practitioners pulled from their clinical duties, may be more amenable to learner self-assessment2,23 or directive feedback.25,26 This would allow for more topics to be covered in a shorter period. Alternatively, a 30-minute debriefing of a team training simulation event may permit facilitated discussion, thus allowing the educator to engage learners in meaningful self-reflection. A better understanding of the relative benefits and indications for each method of debriefing allows educators to blend debriefing methods in a dynamic fashion and adjust the tone, style, and pace of debriefing to the needs of the learners.27 Blending methods of debriefing requires the educator to actively assess the learning environment (eg, topic, learner type, time available) and then select a method of debriefing best suited to that specific situation.27 By doing so, debriefings can be tailored to learner need, learner type, and time available to best promote effective learning outcomes. Future research in this area should delineate when and how specific methods should be used, explore how blending of methods should occur to enhance learning, and describe the impact of various debriefing methods on educational (and/or clinical) outcomes.5

Are We Using the Appropriate Methods to Teach Debriefing Skills?

Although much work has been done evaluating how to best design simulation educational interventions to optimize learning outcomes,1,4 little is known about how to optimally deliver courses to teach debriefing skills. Simulation educator training courses typically follow a format where debriefing knowledge and content are delivered in a didactic manner, with role-play exercises giving learners the opportunity to practice debriefing with peers and/or actors. The challenges faced by course directors include but are not limited to the following: (a) providing substantial opportunities for each learner to practice the various phases and/or methods of debriefing; (b) providing all learners the opportunity to practice transitions between debriefing phases and, when possible, to conduct a full-length debriefing with structured feedback from instructors; and (c) defining proficiency in debriefing and determining whether learners have achieved proficiency by course end.

To address some of these challenges, simulation educators should consider applying to debriefing courses the instructional design methods that are effective in other educational contexts. For example, providing the opportunity to apply what was learned in an experiential manner and coupling this with expert feedback has been shown to be effective in other educational contexts.31,35,36 Immediate relevance and practicality have been noted as important elements of faculty development,37 along with the value of peers to promote exchange of information and ideas.38,39 The application of a repetitive practice model40 to debriefing training would provide educators with multiple opportunities to practice specific debriefing skills with feedback in a supportive, peer environment. Repetitive practice, in combination with constructive feedback, will allow educators to hone their debriefing skill set in a situational manner. Titrating the amount of time spent on different instructional methods (eg, didactic teaching vs. repetitive practice vs. receiving feedback) will influence outcomes and should be dependent on the complexity of learning objectives and the relative experience of learners and educators. A combination of these and/or other methods may help to accelerate the acquisition of specific debriefing skills.

Not all candidates in a simulation educator course may have the opportunity to conduct a full-length debriefing with participants after a simulation event. When this opportunity is not provided, educators for these courses have little chance to provide feedback on the more dynamic aspects of debriefing, such as prioritization of topics, transition from one topic to another, redirection of discussion when it strays from the desired path, and dealing with potentially difficult situations. In the absence of the opportunity to conduct a full-length debriefing, educators should at the very least be provided the opportunity to engage in repetitive practice of the various aspects of debriefing described earlier. The integration of repetitive practice and/or full-length debriefing practice may have resource implications for programs offering training in debriefing skills (eg, more educators required to provide feedback) and may potentially lengthen the duration of the typical simulation educator training course. Although these changes may be challenging and potentially more costly to implement, we believe exploring new instructional methods is important to help ensure we are optimizing the way we teach debriefing skills. Future research should explore the relative benefit of various methods for teaching debriefing skills and assess the short- and long-term retention of these skills over time.

How Can We Best Assess Debriefing Effectiveness?

The development of debriefing assessment tools provides simulation programs with a formal way to assess the quality of debriefing. Two different debriefing assessment tools have recently been published. The Debriefing Assessment for Simulation in Healthcare (DASH) tool is a 6-element, criterion-referenced behaviorally anchored rating scale.41 Element ratings are based on a 7-point effectiveness scale, ranging from 1 (extremely ineffective/detrimental) to 7 (extremely effective/outstanding). A rater handbook is provided to support rating of each element and the various dimensions that contribute to the 6 elements. In a study using 3 standardized debriefings, the DASH showed evidence of good reliability and preliminary evidence of validity.41 The Objective Structured Assessment of Debriefing (OSAD) has 8 categories mirroring the core components of effective debriefing identified from the literature and end-user opinion.42,43 Each category is rated on a scale of 1 to 5, with descriptive anchors for scores of 1, 3, and 5. The OSAD showed good interrater reliability and content/concurrent validity when assessed in the context of surgical simulation debriefings.42,43 Table 2 compares the DASH and OSAD tools.

TABLE 2

Although both of these tools still need to be studied in other contexts, they do provide 2 options for either measuring debriefing quality for summative assessment or providing feedback via formative assessment. The implementation of summative assessment for debriefing would provide a quantitative quality assurance measure and shed light on the rate of retention and/or decay of debriefing skills over time. Furthermore, tracking debriefing performance for a group of educators within a simulation program could help to establish benchmarks for different levels of debriefing proficiency. Establishing benchmarks and tracking debriefing proficiency may allow programs to selectively match debriefers to specific groups of learners and/or pair them with mentors. For example, a debriefer who consistently scores at a level viewed as “novice” within a program may be selectively matched to novice learners and paired up with a more experienced debriefer so that they have the opportunity to receive feedback from their expert colleague.

Both of the aforementioned tools were initially studied in the context of trained raters with a background in education and/or simulation who viewed video recordings of debriefings after high-fidelity simulation. The widespread implementation of debriefing assessment for quality assurance will require further research to ascertain how these tools can be deployed in an efficient and effective manner. For example, assessing the performance of these tools when used by learners,44 simulation educators (peer assessment),45–48 or the debriefers themselves (self-assessment)44,49 will provide options with less of a resource burden on programs. Furthermore, identifying how these various sources of ratings (ie, learner, educator, self) correlate with learning outcomes will determine how they can best be used to enhance debriefing performance. Lastly, both the DASH and OSAD focus on what is happening during the debriefing and how the facilitator contributes to the dynamics of the debriefing. More validity evidence is required for these tools if they are going to be used for summative assessment. Little is known about how these ratings relate to the attainment of specific learning outcomes, such as improved knowledge, skills, and behaviors. Although it would be difficult to demonstrate a direct correlation between “effective” debriefing (as rated by a debriefing assessment tool) and improved performance in clinical environments, future research should at least aim to establish correlation with knowledge acquisition, improved skills, and/or behavioral performance in the simulated environment. This research would help simulation educators better understand what truly constitutes an effective debriefing.

Because the items on both the DASH and the OSAD reflect core components of effective debriefing, these tools may also serve as useful guides for reflecting on debriefing performance in a formative manner. For example, program directors could use the DASH or OSAD as a guide when providing feedback to their educators on the quality of their debriefings. The use of these tools in a longitudinal and structured manner would allow educators to track areas of potential improvement over time and serve to reinforce positive behaviors. Naturally, using either tool for formative assessment purposes would require a sound working knowledge of the components within each tool (and their associated descriptors) and appropriate human resources to support the integration of structured expert feedback.

How Can Peer Feedback of Debriefing Be Used to Improve Debriefing Quality Within Programs?

Provision of feedback is a key feature of effective faculty development in medical education, contributing to improved teaching effectiveness.31,32 Formative assessment and feedback can be provided by various sources, including peers who have a sound understanding of educational principles.31,32,44–49 Peer observation of teaching with associated feedback has been used as an effective strategy to improve the quality of teaching for clinician educators from various backgrounds.31,32,45–47 Other associated benefits of peer feedback include the ability to inspire confidence and enthusiasm among educators,48 along with an additional element of quality assurance within programs.45 Collegial peer interactions and relationships permit ongoing exchange of ideas and help to promote cultural change within programs.38,39

The integration of peer feedback for debriefing performance within simulation programs has the potential to build a culture of feedback that may translate positively to SBE. If done on a regular basis, peer feedback may also help with the maintenance and acquisition of core debriefing skills. That being said, the sensitive nature of delivering feedback on suboptimal performance to a peer should be considered when developing such a program. To help prevent conflict, the aims of a faculty development program with a peer feedback component should be communicated in a clear and focused manner before implementation.45 Specific details related to timing, format, and confidentiality of peer feedback should be discussed among educators. The implementation of peer feedback for debriefing skills within a simulation program should be supported by faculty development opportunities designed to teach educators how to provide constructive feedback to their peers. Such opportunities might include teaching educators how to use the DASH and/or OSAD tools to help guide feedback on key content areas for debriefing and provide a chance for repetitive practice in delivering peer feedback on debriefing performance using these tools.

How Can We Individualize Debriefing Training Opportunities to the Learning Needs of Our Educators?

Individualized learning describes training activities that are responsive and tailored to individual learner needs.4 In the individualized learning model, learners are active participants in educational experiences who take responsibility for their own educational progress within the context of their curriculum.1 Delivering education targeted to the defined needs of individual learners has been shown to be highly effective at improving learning outcomes.1,50

When applied to debriefing, individualized learning would involve identifying the learning need via one or more methods of assessment (ie, learner, peer, expert, and/or self-assessment) and delivering educational opportunities to address the deficits in debriefing skills. Although this can be done via regularly scheduled debriefing courses focusing on a variety of different debriefing-related issues, a more tailored offering may help to correct undesirable debriefing behaviors. Previous studies in medical education demonstrate that educators who view videotapes of their teaching, combined with self-assessment and individualized feedback, improve their teaching skills.31,44 Self-assessment promotes personal reflection and, when coupled with feedback, provides an effective model of faculty development.31,44 Applying this model to debriefing could involve small group sessions in which simulation educators review video segments of individual debriefing performances, reflect on them via self-assessment, and discuss them so that peers can provide constructive feedback.51 When possible, these sessions can be moderated by debriefing experts, who help to conduct debriefing role play exercises targeted to address issues identified during the video review process. Although these suggestions have cost and resource implications, a dedicated effort by the simulation community to research and evaluate various methods of delivering individualized learning will help to enhance our collective understanding of debriefing and subsequently improve the value of SBE.

A Proposed Model for Faculty Debriefing Training

The 5 issues discussed in this article offer possible strategies and considerations for the future of faculty development for debriefing. In considering which strategies to implement, simulation programs should carefully weigh the increased time and commitment required of their educators to participate in these new programs against the need to maintain a sufficient pool of adequate debriefers to meet staffing needs. Many of the best educators are also busy clinicians, and implementing a robust faculty development program with various mandatory activities may run the risk of losing talented educators. Instead of trying to train all simulation educators to be experts at debriefing, perhaps a tiered approach with varying “levels” of educators could be taken, while matching debriefer level and proficiency to learner group and/or session type. In this tiered model, faculty development opportunities could be offered at each successive level of educator, with advancement to the next level determined by achieving a predefined measure of proficiency.

Debriefing is a complex, dynamic skill that typically requires hours of practice and thoughtful reflection to achieve proficiency. A comprehensive faculty development program with multisource, longitudinal feedback, combined with structured courses to address predefined learning needs, would likely provide the best opportunity to enhance and maintain debriefing skills. Building on the issues described in this article, a proposed model for debriefing training (in a resource-rich program) could include the following: (1) a course to teach various methods of debriefing in a blended fashion that incorporates opportunity for deliberate practice, feedback, and full-length debriefing; (2) summative assessment of debriefing performance using established tools; (3) formative assessment of debriefing performance with feedback by experts; (4) peer feedback of debriefing performance; and (5) structured opportunity for self-assessment of debriefing performance with group discussion and/or feedback.

Many questions still remain. Although the components of the proposed model are loosely supported by evidence from the faculty development literature, little is known about the frequency, timing, or exact structure required for optimal outcomes. These faculty development strategies for debriefing still require empiric study, and future research will dictate which concepts are most important for improving debriefing skills for simulation faculty. We hope that the 5 issues discussed in this article will help to guide the direction for future work related to faculty development for debriefing and, in turn, advance the field of simulation for health care education.

REFERENCES

1. Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005; 27: 10–28.
2. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007; 2: 115–125.
3. Raemer D, Anderson M, Cheng A, Fanning R, Nadkarni V, Savoldelli G. Research regarding debriefing as part of the learning process. Simul Healthc 2011; 6: S52–S57.
4. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Med Teach 2013; 35: e867–e898.
5. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas-Mummert B, Cook D. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ 2014; 48: 657–666.
6. Rudolph JW, Simon R, Raemer D, Eppich WJ. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med 2008; 15: 1–7.
7. Rudolph JW, Simon R, Rivard P, Dufresne RL, Raemer DB. Debriefing with good judgment: combining rigorous feedback with genuine inquiry. Anesthesiol Clin 2007; 25: 361–376.
8. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as a “nonjudgmental” debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006; 1: 49–55.
9. Cheng A, Rodgers D, Van Der Jagt E, Eppich W, O’Donnell J. Evolution of the Pediatric Advanced Life Support course: enhanced learning with a new debriefing tool and Web-based module for Pediatric Advanced Life Support instructors. Pediatr Crit Care Med 2012; 13 (5): 589–595.
10. LeFlore JL, Anderson M, Michael JL, Engle WD, Anderson J. Comparison of self-directed learning versus instructor-modeled learning during a simulated clinical experience. Simul Healthc 2007; 2: 170–177.
11. LeFlore JL, Anderson M. Alternative educational models for interdisciplinary student teams. Simul Healthc 2009; 4: 135–142.
12. Savoldelli GL, Naik VN, Park J, Joo HS, Chow R, Hamstra SJ. Value of debriefing during simulated crisis management: oral versus video-assisted oral feedback. Anesthesiology 2006; 105: 279–285.
13. Grant JS, Moss J, Epps C, Watts P. Using video-facilitated feedback to improve student performance following high-fidelity simulation. Clin Sim Nurs 2010; 6: e177–e184.
14. Byrne AJ, Sellen AJ, Jones JG, et al. Effect of videotape feedback on anaesthetists’ performance while managing simulated anaesthetic crises: a multicenter study. Anaesthesia 2002; 57: 176–179.
15. Sawyer T, Sierocka-Castaneda A, Chan D, Berg B, Lustik M, Thompson M. The effectiveness of video-assisted debriefing versus oral debriefing alone at improving neonatal resuscitation performance: a randomized trial. Simul Healthc 2012; 7: 213–221.
16. LeFlore JL, Anderson M. Effectiveness of 2 methods to teach and evaluate new content to neonatal transport personnel using high-fidelity simulation. J Perinat Neonatal Nurs 2008; 22: 319–328.
17. Van Heukelom JN, Begaz T, Treat R. Comparison of postsimulation debriefing versus in-simulation debriefing in medical simulation. Simul Healthc 2010; 5: 91–97.
18. Xeroulis GJ, Park J, Moulton CA, Reznick RK, LeBlanc V, Dubrowski A. Teaching suturing and knot-tying skills to medical students: a randomized controlled study comparing computer-based video instruction and (concurrent and summary) expert feedback. Surgery 2007; 141: 442–449.
19. Walsh CM, Ling SC, Wang CS, Carnahan H. Concurrent versus terminal feedback: it may be better to wait. Acad Med 2009; 84: S54–S57.
20. Dreifuerst KT. Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. J Nurs Educ 2012; 51: 326–333.
21. Bond WF, Deitrick LM, Eberhardt M, et al. Cognitive versus technical debriefing after simulation training. Acad Emerg Med 2006; 13: 276–283.
22. Cheng A, Hunt EA, Donoghue A, et al.; EXPRESS Investigators. Examining pediatric resuscitation education using simulation and scripted debriefing: a multicenter randomized trial. JAMA Pediatr 2013; 167: 528–536.
23. Sawyer TL, Deering S. Adaptation of the US Army’s after-action review for simulation debriefing in healthcare. Simul Healthc 2013; 8: 388–397.
24. Ahmed M, Arora S, Russ S, Darzi A, Vincent C, Sevdalis N. Operation debrief: a SHARP improvement in performance feedback in the operating room. Ann Surg 2013; 258: 958–963.
25. Archer JC. State of the science in health professional education: effective feedback. Med Educ 2010; 44 (1): 101–108.
26. Hatala R, Cook DA, Zendejas B, Hamstra SJ, Brydges R. Feedback for simulation-based procedural skills training: a meta-analysis and critical narrative synthesis. Adv Health Sci Educ Theory Pract 2014; 19 (2): 251–272.
27. Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc 2015; 10 (2): 106–115.
28. Centra JA. Types of faculty development programs. J Higher Educ 1978; 49: 151–162.
29. Sheets KJ, Schwenk TJ. Faculty development for family medicine education: an agenda for future activities. Teach Learn Med 1990; 2: 141–148.
30. Bligh J. Faculty development. Med Educ 2005; 39: 120–121.
31. Steinert Y, Mann K, Centeno A, et al. A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach 2006; 28: 497–526.
32. Leslie K, Baker L, Egan-Lee E, Esdaile M, Reeves S. Advancing faculty development in medical education: a systematic review. Acad Med 2013; 88: 1038–1045.
33. Kolbe M, Weiss M, Grote G, et al. TeamGAINS: a tool for structured debriefings for simulation-based team trainings. BMJ Qual Saf 2013; 22: 541–553.
34. Fanning RM, Gaba DM. Debriefing. In: Gaba DM, Fish KJ, Howard SK, Burden AR, eds. Crisis Management in Anesthesiology. 2nd ed. Philadelphia, PA: Elsevier Saunders; 2015: 65–78.
35. Coles CR, Tomlinson JM. Teaching student-centered educational approaches to general practice teachers. Med Educ 1994; 28: 234–238.
36. Hewson MG. A theory-based faculty development program for clinician-educators. Acad Med 2000; 75: 498–501.
37. Sheets KJ, Henry RC. Evaluation of a faculty development program for family physicians. Med Teach 1988; 10: 75–83.
38. Dewitt TG, Goldberg RL, Roberts KB. Developing community faculty: principles, practice, and evaluation. Am J Dis Child 1993; 147: 49–53.
39. Elliot DL, Skeff KM, Stratos GA. How do you get to the improvement of teaching? A longitudinal faculty development program for medical educators. Teach Learn Med 1999; 11: 52–57.
40. Motola I, Devine AL, Chung HS, Sullivan JE, Issenberg B. Simulation in healthcare: a best evidence practical guide. AMEE Guide No. 82. Med Teach 2013; 35: e1511–e1530.
41. Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc 2012; 7: 288–294.
42. Arora S, Ahmed M, Paige J, et al. Objective structured assessment of debriefing: bringing science to the art of debriefing in surgery. Ann Surg 2012; 256: 982–988.
43. Runnacles J, Thomas L, Sevdalis N, Kneebone R, Arora S. Development of a tool to improve performance debriefing and learning: the paediatric Objective Structured Assessment of Debriefing (OSAD) tool. Postgrad Med J 2014; 90: 613–621.
44. Skeff KM. Evaluation of a method for improving the teaching performance of attending physicians. Am J Med 1983; 75: 465–470.
45. Sullivan PB, Buckle A, Nicky G, Atkinson SH. Peer observation of teaching as a faculty development tool. BMC Med Educ 2012; 12: 26.
46. Finn K, Chiappa V, Puig A, Hunt DP. How to become a better clinical teacher: a collaborative peer observation process. Med Teach 2011; 33: 151–155.
47. Adshead L, White PT, Stephenson A. Introducing peer observation of teaching to GP teachers: a questionnaire study. Med Teach 2006; 28: e68–e73.
48. Cosh J. Peer observation in higher education—a reflective approach. Innovat Educ Teach Int 1998; 35: 171–176.
49. Marvel MK. Improving clinical teaching skills using the parallel process model. Fam Med 1991; 23: 279–284.
50. Litzelman DK, Stratos GA, Marriott DJ, Lazaridis EN, Skeff KM. Beneficial and harmful effects of augmented feedback on physicians’ clinical teaching performances. Acad Med 1998; 73: 324–332.
51. Rudolph JW, Foldy EG, Robinson T, Kendall S, Taylor SS, Simon R. Helping without harming: the instructor’s feedback dilemma in debriefing—a case study. Simul Healthc 2013; 8: 304–316.
Keywords: Debriefing; Faculty Development; Simulation; Education; Training

© 2015 Society for Simulation in Healthcare