Coaching From the Sidelines

Examining the Impact of Teledebriefing in Simulation-Based Training

Ahmed, Rami A. DO, MHPE; Atkinson, Steven Scott NREMT-P; Gable, Brad MD, MS; Yee, Jennifer DO; Gardner, Aimee K. PhD

doi: 10.1097/SIH.0000000000000177
Empirical Investigations

Introduction Although simulation facilities are available at most teaching institutions, the number of qualified instructors and content experts who can facilitate postsimulation debriefing is inadequate at many of them. There remains a paucity of evidence-based data regarding several aspects of debriefing, including debriefing with a facilitator present versus teledebriefing, in which a facilitator observes the simulation in real time from an off-site location and then provides instruction and direction to participants during the debriefing. We conducted this study to identify the effectiveness and feasibility of teledebriefing as an alternative form of instruction.

Methods This study was conducted with emergency medicine residents randomized into either a teledebriefing or an on-site debriefing group during 11 simulation training sessions implemented over a 9-month period. The primary outcome of interest was resident perception of debriefing effectiveness, as measured by the Debriefing Assessment for Simulation in Healthcare-Student Version (see Appendix, Supplemental Digital Content 1), completed at the end of every simulation session.

Results A total of 44 debriefings occurred during the study period, with 246 Debriefing Assessment for Simulation in Healthcare-Student Version forms completed. The data revealed a statistically significant difference between the effectiveness of on-site debriefing [mean (SD), 6.64 (0.45)] and teledebriefing [6.08 (0.57); P < 0.001]. Residents regularly evaluated both traditional debriefing and teledebriefing as “consistently effective/very good.”

Conclusions Teledebriefing was found to be rated lower than in-person debriefing but was still consistently effective. Further research is necessary to evaluate the effectiveness of teledebriefing in comparison with other alternatives. Teledebriefing potentially provides an alternative form of instruction within simulation environments for programs lacking access to expert faculty.

From the Northeast Ohio Medical University (R.A.A.), Rootstown; Department of Medical Education (R.A.A.), Virtual Care Simulation Lab (S.S.A.), Summa Akron City Hospital, Akron; Ohio University Heritage College of Osteopathic Medicine (B.G.), Athens; Medical Simulation (B.G.), Riverside Hospital, Columbus; Northeast Ohio Medical University (J.Y.), Rootstown; Western Reserve Hospital (J.Y.), Cuyahoga Falls, OH; and Department of Surgery (A.K.G.), UT Southwestern Medical Center, Dallas, TX.

Reprints: Rami A. Ahmed, DO, MHPE, FACEP, Department of Medical Education, Summa Akron City Hospital, 525 E Market St, Akron, OH 44304 (e-mail:

The authors declare no conflict of interest.

Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal’s Web site (

Debriefing after a simulation scenario is believed to be the single most important component of simulation-based education.1–4 The objective of the debriefer is to establish a safe learning environment that facilitates a meaningful dialog and allows for reflective self-discovery of the learners’ performance, with the ultimate goal of enhanced performance in the clinical environment.4,5 The ability of the debriefer to successfully facilitate this reflection during the debriefing process is critical for learners to gain a deeper understanding of the learning objectives, which impacts retention of newly learned skills.5–9 Traditionally, debriefing is conducted on-site by a facilitator immediately after the training episode. To be most effective, this individual needs not only subject matter expertise but also knowledge of best practices in debriefing. When faculty members possess both qualities, participants are provided a consistent, high-quality learning experience. However, debriefing after simulations can also be effective when a seasoned debriefer is paired with a content expert; together they can provide a rich, engaging, and reflective debriefing session.

As simulation assumes a more integral role in medical education, many institutions are organizing their efforts to meet national mandates that require simulation for trainees.10–14 This increased demand for simulation instruction has placed greater burdens on the available pool of qualified instructors able to facilitate effective postsimulation debriefing.

As a result, institutions may be limited in their ability to offer training opportunities for learners interested in simulation-based education or may be forced to offer subpar instruction by untrained staff. To overcome some of these limitations, institutions are seeking alternative teaching methodologies. One such alternative is the implementation of teleconferencing technologies to meet the growing demands of learners across institutions. Such a session would involve a faculty member in a remote location watching a live simulation and providing real-time feedback through videoconferencing technology, referred to by previous scholars as teledebriefing.15 With this strategy, learners and faculty have the opportunity to reap the benefits of a facilitator’s expertise without the burdens of travel, geographic distance, cost, and time away from other obligations. This method allows institutions to maximize training sessions with learners even when content experts are not on-site. Ultimately, a network of individuals with a variety of domains of expertise could be created to optimize educational endeavors around the globe and on demand. Such synchronous learning environments have previously demonstrated effectiveness in the training of medical students, residents, and subspecialists.16–20

Currently, there remains a paucity of evidence-based data regarding several aspects of debriefing,21 including debriefing with a facilitator on-site versus teledebriefing. Evaluating the effectiveness and feasibility of teledebriefing may help expand the reach of simulation-based education across specialties and institutions. The goals of this study are to examine the feasibility of teledebriefing and to explore differences between teledebriefing and on-site debriefing over a 9-month period for a cohort of emergency medicine (EM) residents.

Methods



This study was conducted with EM residents randomized into either a teledebriefing or an on-site debriefing group during 11 simulation training sessions implemented over a 9-month period. The primary outcome of interest was resident perception of debriefing effectiveness, as measured by the Debriefing Assessment for Simulation in Healthcare-Student Version (DASH-SV; see Appendix, Supplemental Digital Content 1), completed at the end of every simulation session. The Summa Health System Institutional Review Board approved the study through expedited review procedures because it was deemed minimal risk.


Setting and Sample

The study was conducted within the hospital simulation laboratory over a 9-month period. Thirty EM residents at a level 1 trauma center were randomized to a traditional debriefing group or a teledebriefing group by drawing names from a box for each postgraduate year (PGY) and assigning them to the two groups in alternating fashion (10 residents per year). Residents remained in their assigned group for the entirety of the study. Not all 30 residents were available for all simulations throughout the year because of off-service rotation requirements, duty hour restrictions, or vacation. During each simulation session, residents were divided into four groups of four to six residents, with all three postgraduate levels represented in each group. Faculty attempted to evenly distribute first-, second-, and third-year residents within each group throughout the study period. Every session had a different combination of four to six residents (Fig. 1, CONSORT diagram). All groups participated in a high-fidelity simulation and associated debriefing. The two faculty employed a similar approach to debriefing, combining the Gather-Analyze-Summarize22 and Advocacy-Inquiry8 models. Common themes of the debriefings centered on closed-loop communication, crisis resource management principles, and the development of time-sensitive treatment plans for high-risk, low-frequency presentations. Each simulation lasted 15 minutes and each debriefing 30 minutes. In the first half of the academic year (August–October), the residents participated in only one scenario per session; in the second half (January–May), they participated in two scenarios per session. The monthly simulation sessions were conducted as part of the EM curriculum. Didactic education before the simulation sessions was typically not related to the objectives of the simulation sessions.




Study Protocol

Medical simulation faculty (MSF) consisted of two board-certified EM attending physicians who had also completed 1-year fellowships in medical simulation. These individuals provided expert-level debriefing and content expertise for each of the 11 high-fidelity EM simulation scenarios (Table 1) over the 9-month period. At the conclusion of each scenario, residents were seated and debriefed either in person by an MSF or via a large projection screen in the simulation laboratory (Fig. 2) by an MSF who was not physically present (the remote MSF observed the simulation live and conducted the debriefing live). After each debriefing session, residents completed the DASH-SV.23 Teledebriefing and traditional debriefing sessions were conducted by both faculty in an alternating fashion; residents therefore received 50% of their debriefings from one MSF (R.A.A.) and 50% from the second MSF (B.D.G.), minimizing the effects of delivery style and years of experience between the two MSF.





iPhones (placed on a camera stand) were used as cameras within the hospital simulation center to facilitate the teledebriefing. iPads were used by the faculty in remote locations of the hospital to visualize the simulations live and to provide live debriefing using FaceTime. At the conclusion of the simulation, the simulation technician physically moved the iPhone camera from the simulation center to an adjacent debriefing area. The iPhone was then plugged into a large mobile television or a computer projecting to an overhead screen, using an audiovisual connection, to maximize visualization of the teledebriefer, as described by previous scholars (Fig. 1).15



The date, form of debriefing (in person vs teledebriefing), name of faculty debriefer, and PGY of training were collected from participants with the DASH-SV for each debriefing. The DASH-SV uses an ordinal scale ranging from 1 (extremely ineffective/abysmal) to 7 (extremely effective/outstanding) for all survey measures. This tool has demonstrated preliminary evidence of reliability and validity,24,25 although its full psychometric properties are still under investigation. The residents evaluated the MSF on 23 behaviors across six elements of evaluation (see Appendix, Supplemental Digital Content 1). The DASH scorers’ names were blinded to the debriefers, and the collection of data and subsequent analysis were performed by other faculty members.


Statistical Analysis

Descriptive statistics were analyzed with SPSS (Version 19; SPSS Inc, Chicago, IL). A significance level of P < 0.05 was chosen. A two-tailed t test was used to compare the two forms of debriefing: the scores from each DASH form were averaged to a single score per evaluation, and the means of those scores were compared. Analysis of variance was used for comparisons of each component of the DASH scale by month. With α set at 0.05 (two-tailed), a power analysis based on the expected means indicated that approximately 31 individuals were needed to achieve a power of 0.80 for a between-groups comparison.
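The magnitude of the reported group difference can be checked from the summary statistics alone. The sketch below is illustrative only (it is not the authors’ SPSS analysis, and it assumes Welch’s unequal-variance form of the t test); it uses the means, SDs, and per-group form counts reported in the Results [in-person: 6.64 (0.45), n = 114; teledebriefing: 6.08 (0.57), n = 132]:

```python
import math

def welch_t(m1, sd1, n1, m2, sd2, n2):
    """Welch's two-sample t statistic computed from summary statistics."""
    # Standard error of the difference in means (unequal variances)
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (m1 - m2) / se

# Summary statistics reported in the Results section:
# in-person debriefing 6.64 (SD 0.45), 114 forms;
# teledebriefing 6.08 (SD 0.57), 132 forms.
t = welch_t(6.64, 0.45, 114, 6.08, 0.57, 132)
print(round(t, 2))  # ≈ 8.6, consistent with the reported P < 0.001
```

With this many observations per group, a t statistic near 8.6 lies far in the tail of the t distribution, which is consistent with the P < 0.001 the study reports.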

Results


Thirty EM residents participated in these training sessions.

A total of 44 debriefings (22 teledebriefing and 22 in-person; four groups across 11 scenarios) occurred during the study period, with 246 DASH-SVs completed (132 teledebriefing, 114 in-person). Ten DASH-SVs that did not identify the faculty or the form of debriefing were excluded from the study.

Overall, the data revealed a difference between the effectiveness of traditional in-person debriefing [6.64 (0.45)] and teledebriefing [6.08 (0.57), P < 0.001]. These differences were also analyzed across PGY. No effect of PGY was found. Residents regularly evaluated both traditional debriefing and teledebriefing as “consistently effective/very good.”

We further analyzed the data by month to examine any trends over time or with certain scenarios. Interestingly, the first 3 months demonstrated no significant differences between conditions (August: P = 0.21; September: P = 0.34; October: P = 0.10). Starting in January, however, differences between conditions began to emerge. Table 2 displays the values for each component of the DASH scale by group over the course of the study period. The number of completed DASH-SVs for each month and group was as follows: August: 12 in-person, 13 teledebriefing; September: 13 in-person, 14 teledebriefing; October: 10 in-person, 13 teledebriefing; January: 17 in-person, 29 teledebriefing; February: 28 in-person, 26 teledebriefing; March: 26 in-person, 21 teledebriefing; April: 8 in-person, 16 teledebriefing.



Discussion


Because the demands on simulation centers, faculty, and staff continue to increase, educators must find creative ways to provide learners the simulation-based training opportunities they need and desire. Some institutions have attempted to address the demand or shortage of trained debriefers and content experts by offering faculty development programs for nonsimulation faculty in the form of “train-the-trainer” curricula. The aim of these programs is to provide greater independence of nonsimulation faculty in the use of their respective simulation centers. However, despite these faculty development programs, many institutions rely on their simulation faculty and staff to provide most educational content and debriefing for simulation sessions. This study was undertaken to explore the perceived effectiveness of faculty teledebriefers as a potential solution for programs unable to meet the increased demands of simulation-based medical education.

Our results indicate that learners felt that in-person debriefing was more effective than teledebriefing. This finding is typical across specialties in various other studies.26,27 We postulate several reasons for this. First, most of the residents had previous experience with the same faculty and simulation center and had always had in-person debriefings before this study. Although we would have expected this difference to present itself across all months of data collection, novelty may have accounted for the absence of a difference between the debriefing approaches in the first months; in subsequent months, the novelty seems to have worn off and in-person debriefing was rated higher. Next, it should be noted that although debriefing focuses on the group’s discussion of the events, much can be conveyed by the debriefer through nonverbal cues. Our learners’ perceived effectiveness may therefore be attributed in part to nonverbal communication provided by the debriefer, and many of these cues may have been lost during teledebriefing because the debriefer was viewed only from the shoulders up. In addition, the telepresent faculty member does not have the opportunity to demonstrate appropriate maneuvers or procedures required during the EM simulations that a physically present instructor could demonstrate at the bedside (eg, cricothyrotomy, fasciotomy). Finally, even minute technical issues can affect the residents’ perception of the effectiveness of the debriefing. Some of these issues include the telepresent faculty member being unable to visualize or hear subtle forms of communication during the actual simulations (which might have been identified by a physically present faculty member), the rare technology anomaly (eg, volume issues, frozen screen, loss of signal), or, as suggested by some authors, camera placement that does not capture all the movements and actions of the learners, limiting the ability to fully discuss their actions during the debriefing.26,28

Importantly, however, the higher rating for in-person debriefing, although statistically significant, should be interpreted with caution. The rating of teledebriefing was still very high overall [6.08 vs 6.64 for in-person debriefing], representing a score of “consistently effective/very good” on the DASH 1 to 7 rating scale, and this effective mean score was sustained across almost an entire academic year. Despite being rated lower than traditional debriefing, teledebriefing does seem to be an effective and practical solution for providing education when other resources or faculty are not physically present. Recent studies have demonstrated effective telepresence training and guidance for EM and surgical procedures.16,18,29,30 Teledebriefing offers a convenient solution for simulation programs struggling to keep up with the growing demands of learners, and for off-site or rural locations, especially when the alternative is not being able to provide any education at all. This is especially true given that most of the telecommunication equipment used in this study is widely available and likely already present in most simulation centers.31,32

A score of 4 or higher is considered acceptable for the DASH.23 The debriefing scores in both categories were higher than 6.0 for the study period. These atypically high scores may be the result of several factors. Both debriefers were well-respected EM faculty members, and both had completed fellowships in medical simulation; this may have contributed to a halo effect on the basis of previously positive interactions in the training environment. In addition, no training on use of the DASH tool was provided other than adequate time to become familiar with the instrument. The developers of the tool, however, report that once raters are “thoroughly familiar with the elements,” they can score reliably and validly without the need for training.23 Another reason for the high scores may be that the training environment was set up to minimize factors that contribute to lower debriefing scores (abbreviated time to debrief/self-reflect, restricted space in the training environment, and clinical distractions/responsibilities during protected time).23 Assessment of the debriefing sessions by outside faculty members unfamiliar with the debriefers and formally trained in debriefing evaluation might yield lower scores. A further evaluation of teledebriefing could also use faculty who are unknown to the resident evaluators, potentially eliminating the previously mentioned limitation of familiarity with the debriefer.

The simple use of an iPhone, an iPad with FaceTime, and projection equipment provided a viable teleconferencing solution that was both cost-effective and practical. Skype has previously been used effectively for telementoring surgeons across a variety of subspecialties in similar fashion.16,33,34 We used our own Wi-Fi hotspot to overcome issues with the hospital’s firewalls and network connectivity; this provided a reliable and inexpensive solution ($80 per month) to connectivity issues identified during initial pilot testing of the program. The equipment did not require any special expertise and was easy to set up and test. There were instances when a resident unintentionally stood in front of a camera, blocking the faculty’s view of the simulation. This was usually easily addressed by sending a text message to the on-site simulation technician, who, as discreetly as possible, reminded the resident of the position of the iPad or iPhone camera. In addition, we had only one camera view of the entire simulated environment (positioned approximately 10 feet back from the foot of the bed and approximately 7 feet high, angled downward), limiting our ability to visualize subtle nonverbal communication and any actions that occurred outside the camera angle.

This study had several limitations. It included EM residents at a single institution, limiting generalizability. The outcome measures are limited to the T1 level (learners’ perception only) and do not evaluate the effectiveness of the debriefing methods in affecting knowledge and skill. The DASH-SV has not undergone the same psychometric scrutiny as the DASH-Rater Version, and the scores in this study suggest that they may be inflated. We did not collect qualitative data from the participants, potentially missing an opportunity to further illuminate perceived barriers or limitations in either form of debriefing. Use of only one system of teleconferencing (equipment, interface) also poses a limitation. In addition, most residents had extensive experience with traditional in-person debriefing before this study, and previous experience with one type of debriefing may have influenced the learners’ perception of a novel form of debriefing. One final limitation was that both expert debriefers in this study were well-known to the residents as established members of the EM faculty; however, the faculty in this study were not responsible for supervising or grading the residents.

There are a number of areas in which future research could elucidate the efficacy of the strategies described here. For example, studying the role of learner disposition and experience may be a fruitful endeavor; specifically, examining the effect on learners at various stages of training and on various types of skills (individual skills, communication, faculty’s perceptions of other faculty’s ability to effectively debrief) might be beneficial. In addition, well-accepted learning theories in simulation, such as experiential learning,5 deliberate practice,35 and mastery learning,36 should be evaluated in a similar side-by-side comparison. Deliberate practice environments, although frequently discussed, are infrequently available to learners because they are resource (faculty)-intensive and difficult to set up given the increasing demands on faculty. Further exploration should determine whether these environments can be more easily established using these novel technologies.

Conclusions


Teledebriefing was rated lower than on-site debriefing but was still consistently effective. It potentially provides a practical alternative for simulation-based medical education when faculty or staff are unavailable to provide traditional in-person debriefing. Future studies should further explore this application in a variety of settings and under a variety of learning frameworks.

Acknowledgments


The authors thank Dr Jenny Rudolph and Dr Robert Simon for their guidance in establishing the methods and refinement of the study design.

References


1. Levine AI, DeMaria S, Schwartz AD, Sim AJ. The Comprehensive Textbook of Healthcare Simulation. New York: Springer; 2013.
2. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27(1):10–28.
3. Marks M, Sabella MJ, Burke CS, Zaccaro SJ. The impact of cross-training on team effectiveness. J Appl Psychol 2002;87:3–13.
4. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc 2007;2(2):115–125.
5. Kolb DA. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall; 1984.
6. Schon D. The Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions. San Francisco: Jossey-Bass; 1987.
7. Cannon-Bowers JA, Salas E, Tannenbaum SI, Mathieu JE. Toward theoretically-based principles of training effectiveness: A model and initial empirical investigation. Military Psychol 1995;7(3):141–164.
8. Rudolph JW, Simon R, Dufresne RL, Raemer DB. There’s no such thing as a non-judgmental debriefing: a theory and method for debriefing with good judgment. Simul Healthc 2006;1:49–55.
9. Rudolph JW, Simon R, Raemer DB, Eppich WJ. Debriefing as formative assessment: closing the performance gaps in medical education. Acad Emerg Med 2008;15:1010–1016.
10. Accreditation Council for Graduate Medical Education 2014. Surgery Frequently Asked Questions. Available at: Accessed December 25, 2014
11. Liaison Committee on Medical Education 2015. Available at: Accessed January 1, 2015.
12. Association of American Medical Colleges 2011. Medical Simulation in Medical Education: Results of an AAMC survey. Available at: Accessed December 25, 2014.
13. The American Society of Anesthesiologists 2015. Simulation education. Available at: and Accessed January 1, 2015.
14. Accreditation Council for Graduate Medical Education 2014. Emergency Medicine Frequently Asked Questions. Available at: Accessed December 25, 2014
15. Ahmed R, King Gardner A, Atkinson SS, Gable B. Teledebriefing: connecting learners to faculty members. Clin Teach 2014;11(4):270–273.
16. Ponsky TA, Bobanga ID, Schwacter M, et al. Transcontinental telementoring with pediatric surgeons: proof of concept and technical considerations. J Laparoendosc Adv Surg Tech A 2014;24(12):892–896.
17. Walker RB, Underwood PK, Bernhagen M, Markin N, Boedeker BH. Virtual intubation training at remote military site. Stud Health Technol Inform 2012;173:540–542.
18. Mikrogianakis A, Kam A, Silver S, et al. Telesimulation: an innovative and effective tool for teaching intraosseous insertion techniques in developing countries. Acad Emerg Med 2011;18(4):420–427.
19. Hortos K, Sefcik D, Wilson SG, McDaniel JT, Zemper E. Synchronous videoconferencing: impact on achievement of medical students. Teach Learn Med 2013;25(3):211–215.
20. Markova T, Roth LM, Monsur J. Synchronous distance learning as an effective and feasible method for delivering residency didactics. Fam Med 2005;37(8):570–575.
21. Raemer D, Anderson M, Cheng A, Fanning R, Nadkarni V, Savoldelli G. Research regarding debriefing as part of the learning process. Simul Healthc 2011;6:S52–S57.
22. American Heart Association 2015. Structured and Supported Debriefing Course. Available at: TrainingCenters/InstructorResources/Structured-and-Supported-Debriefing-Course UCM 304285 Article.jsp
23. Center for Medical Simulation 2015. Debriefing Assessment for Simulation in Healthcare, © 2015. Available at: Accessed January 14, 2015.
24. Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc 2012;7(5):288–294.
25. Dreifuerst KT. Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. J Nurs Educ 2012;51(6):326–333.
26. Hayden EM, Navedo DD, Gordon JA. Web-conferenced simulation sessions: a satisfaction survey of clinical simulation encounters via remote supervision. Telemed J E Health 2012;18(7):525–529.
27. Kramer NM, Demaerschalk BM. A novel application of teleneurology: robotic telepresence in supervision of neurology trainees. Telemed J E Health 2014;20(12):1087–1092.
28. Ikeyama T, Shimizu N, Ohta K. Low-cost and ready-to-go remote-facilitated simulation-based learning. Simul Healthc 2012;7(1):35–39.
29. Prescher H, Grover E, Mosier J, et al. Telepresent intubation supervision is as effective as in-person supervision of procedurally naïve operators. Telemed J E Health 2015;21(3):170–175.
30. Ali J, Sorvari A, Camera S, Kinach M, Mohammed S, Pandya A. Telemedicine as a potential medium for teaching the advanced trauma life support (ATLS) course. J Surg Educ 2013;70(2):258–264.
31. Sclafani J, Tirrell TF, Franko OI. Mobile tablet use among academic physicians and trainees. J Med Syst 2013;37(1):9903–9906.
32. Franko OI, Tirrell TF. Smartphone app use among medical providers in ACGME training programs. J Med Syst 2012;36(5):3135–3139.
33. Miller JA, Kwon DS, Dkeidek A, et al. Safe introduction of a new surgical technique: remote telementoring for posterior retroperitoneoscopic adrenalectomy. ANZ J Surg 2012;82(11):813–816.
34. Choy I, Fecso A, Kwong J, Jackson T, Okrainec A. Remote evaluation of laparoscopic performance using the global operative assessment of laparoscopic skills. Surg Endosc 2013;27(2):378–383.
35. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med 2004;79(10):70–81.
36. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med 2009;37:2697–2701.

Key Words: Teledebriefing; Debriefing; Telepresence; DASH; Telementoring

© 2016 Society for Simulation in Healthcare