Innovation Reports

Developing the Virtual Resus Room: Fidelity, Usability, Acceptability, and Applicability of a Virtual Simulation for Teaching and Learning

Foohey, Sarah MD, CCFP-EM1; Nagji, Alim BHSc, MD, CCFP-EM2; Yilmaz, Yusuf PhD3; Sibbald, Matthew MD, PhD4; Monteiro, Sandra PhD5; Chan, Teresa M. MD, MHPE6

doi: 10.1097/ACM.0000000000004364



Physical distancing restrictions during the COVID-19 pandemic forced many medical educators globally to transition from in-person to online teaching sessions. 1 Simulation is a powerful educational tool, allowing learners to develop critical communication skills, practice technical skills, and manage high-risk scenarios without threatening patient safety. 2 Distance simulation is the practice of implementing a simulation at a physical distance from the participants. 3 While a variety of simulation software programs exist 4 and have been shown to be beneficial in medical education, 5,6 they are often expensive or designed for single-learner use. One method of delivering simulation remotely involves filming a simulation technician who performs actions in a simulation lab as instructed by learners observing virtually, 7 which allows learners to mentally rehearse running a case but does not allow any hands-on participation. Other programs have hosted virtual simulation sessions in which learners talk through cases by verbalizing the management steps as prompted by questions from their facilitators 8 or in which learners watch a video and subsequently debrief. 9 However, neither of these methods allows learners to work as a team to practice their nontechnical, crisis resource management skills, nor does either meet the definition of an online simulation: an interactive simulation experience offered through a platform that connects participants with other learners. 3

This report describes the Virtual Resus Room (VRR)—a free, novel, open-access resource for running collaborative online simulations—that allows participants to interact with the virtual environment contained within an online slide set (Google Slides) while simultaneously interacting with each other using online videoconferencing (Zoom).


The lead author (S.F.) created the VRR in May 2020 to give learners the opportunity to rehearse their crisis resource management skills by working as a team to complete virtual tasks. It also allows facilitators to observe and update the case progression in response to the learners’ decisions, creating a responsive learning environment.

Overview of the innovation

The VRR uses 2 simultaneous communication tools: one to link participants to the virtual environment (or room) and a second to link participants to each other (see below). The room is a shared series of Google Slides. The first slide consists of a patient silhouette surrounded by space for participants to record notes and to move images of equipment and basic monitors (e.g., heart rate, oxygen flow rate; see Figure 1). Subsequent slides show a medications tray, airway supplies, investigatory tests, and other content (e.g., defibrillator machine). Each participant has the same slide deck open on their individual laptops, giving each participant the freedom to view and interact with different slides simultaneously as if they were in different parts of a simulated room. As they complete tasks in the shared room, like selecting and dragging images of equipment, monitors, and medications onto the patient silhouette, these changes are viewable by all nearly instantaneously. The synchronous, collaborative editing of this online document allows multiple participants to complete and observe actions and to work together to manage the case. Facilitators interact with the same slide set, typing in the vital signs after the participants have placed equipment, monitors, and medications on the patient silhouette.

Figure 1:
Screenshot showing the layout of the first slide of the Virtual Resus Room (VRR) midsimulation (ventricular tachycardia [VTach] case), when participants have moved images onto the patient silhouette. The VRR was created in May 2020 by the lead author (S.F., University of Toronto, Toronto, Ontario, Canada) to give learners the opportunity to rehearse their crisis resource management skills by working as a team to complete virtual tasks and was integrated into the emergency medicine clerkship at McMaster University, Hamilton, Ontario, Canada, from June to August 2020. Abbreviations: ASA, acetylsalicylic acid; BP, blood pressure; BVM, bag-valve-mask; CPR, cardiopulmonary resuscitation; Defib, defibrillation; Epi, epinephrine; HR, heart rate; IV, intravenous; J, joule; L, liter; mg, milligram; min, minute; NP, nasal prongs; NRB, non-rebreather; O2, oxygen; RR, respiratory rate; T, temperature.

Participants and facilitators communicate with each other using Zoom, an online videoconferencing program. This combination of platforms allows learners to see each other performing actions, such as miming compressions or gesturing the "all clear" with hands up before they perform defibrillation.

After the first version of the VRR was created by S.F. in May 2020, the usability was improved through multiple phases of peer review. Informal feedback was first elicited from a medical student reviewer, colleagues, and a small group of senior emergency medicine residents at the University of Toronto. Then, a group of 10 emergency medicine residents and staff at McMaster University provided further feedback after their introduction to the VRR. The VRR was then integrated into the emergency medicine clerkship at McMaster University from June to August 2020.

Table 1 provides details on the VRR training and teaching sessions for facilitators and participants, respectively. Figure 1 shows the layout of the first slide of the VRR midsimulation, when participants have moved images onto the patient silhouette (see Supplemental Digital Appendix 1 at for additional depictions of how the VRR works). The VRR is available online at

Table 1:
Details on the Virtual Resus Room (VRR) Training and Teaching Sessions for Facilitators and Participants, Respectively

Program evaluation

Program evaluation study context.

The program evaluation study was conducted at McMaster University from June to August 2020. During this time, virtual teaching sessions were offered to medical students completing their emergency medicine clerkship rotation. As part of their virtual clerkship curriculum, these students had previously participated in a virtual simulation alternative that involved talking through cases like an oral exam, with a preceptor asking questions and verbally directing them through how a case might progress. Two cases (supraventricular tachycardia and ventricular tachycardia arrest) were developed for the VRR to complement this curriculum. Facilitators completed a training session that included an introduction to the interface, a trial run of one of the cases during which they acted as a participant, and the opportunity to provide suggestions for improving the VRR before their teaching sessions (see Table 1).

Data collection tools.

Student data collection was completed using pre- and postsession Google Forms surveys (see Supplemental Digital Appendix 2 at ). These 2-part surveys were developed collaboratively by a group of physicians within our research team. The first part of the presession survey gathered demographic data. The first part of the postsession survey assessed usability and acceptability (7 questions), applicability for learning (8 questions), and fidelity (4 questions) using yes/no, 7-point Likert scale, and open-ended questions (anchors were customized for each question and can be found in Supplemental Digital Appendix 2 at ); it also included additional questions about students’ experiences with the VRR. The second part of both the pre- and postsession surveys was a knowledge test of 12 multiple-choice questions based on the content of the 2 cases, with a total possible score of 12 that was subsequently converted to a percentage-based score for analysis. These questions were reviewed by content experts on our authorship team (M.S., T.M.C.), and their content validity was checked through multiple rounds of peer review by a group of physicians from both emergency medicine and cardiology. The surveys were piloted by a group of 12 nonparticipating emergency physicians; no updates were made after the pilot.

Facilitator data collection was completed using a postsession Google Forms survey (see Supplemental Digital Appendix 2 at ). The facilitator survey gathered demographic data; assessed usability and acceptability (17 questions), applicability for teaching (7 questions), and fidelity (4 questions); and included open-ended questions eliciting the facilitators’ feedback.


As a program evaluation study, this work was deemed exempt by the Hamilton Integrated Research Ethics Board.


Descriptive statistics were used to analyze the first part of the student postsession survey. We report pre- and postsession mean scores for the knowledge test separately, but the within-group comparison was performed after missing data were removed pairwise, using a paired-samples t test with the significance level set at P < .05. Two authors (T.M.C. and S.F.) reviewed responses to open-ended questions, independently identifying themes using an open coding approach and then refining their understanding through a consensus-based approach informed by directed content analysis. This analysis enabled us to obtain more detailed information on participants’ experiences of the VRR. Student quotations were selected to augment findings from the quantitative work. The Mann-Whitney U test was used to compare facilitators’ total time spent on in-person and virtual simulations.
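As a rough illustration of the quantitative analysis described above (pairwise deletion of missing responses followed by a paired-samples t test), the following Python sketch uses toy placeholder scores, not the study data:

```python
import math
import statistics

# Toy placeholder scores (percent); None marks a missing survey response.
pre = [70, 65, None, 80, 75]
post = [85, 90, 88, None, 92]

# Pairwise deletion: keep only students who completed both surveys.
pairs = [(a, b) for a, b in zip(pre, post) if a is not None and b is not None]
diffs = [b - a for a, b in pairs]

# Paired-samples t statistic: mean difference over its standard error.
n = len(diffs)
t_stat = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
```

With real data, the resulting t statistic would be compared against the t distribution with n - 1 degrees of freedom at the .05 significance level.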


Between June and August 2020, 67 student participants and 16 facilitators used the VRR. Our postsession survey response rate was 69% for both students (46) and facilitators (11).

Usability and acceptability

Of the students, 44 (96%) rated the usability of the VRR highly and 42 (91%) felt that it was easy to work with their fellow learners using this platform (i.e., Likert scale responses ≥ 5). All facilitators (11, 100%) found the platform easy to use and to explain to the learners (i.e., Likert scale responses ≥ 5).

Facilitators reported spending significantly less time (in minutes) leading this virtual simulation (mean = 119, standard deviation [SD] = 36) than leading prior in-person simulation sessions (mean = 181, SD = 58; U = 20.50, P < .008). On average, facilitators estimated spending less time on the VRR session than on in-person sessions at every stage (in-person vs VRR): getting to the location (33 vs 0 minutes), preparing (45 vs 40 minutes), running (82 vs 78 minutes), and cleaning up (22 vs 1 minute).
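The Mann-Whitney U statistic used for this time comparison can be computed without a statistics library; this minimal sketch uses hypothetical minute counts, not the study's data:

```python
def mann_whitney_u(x, y):
    # Count, over all cross-sample pairs, how often an x value falls
    # below a y value (ties count as half), then report the smaller of
    # the two complementary counts as the U statistic.
    u_x = sum((xi < yi) + 0.5 * (xi == yi) for xi in x for yi in y)
    u_y = len(x) * len(y) - u_x
    return min(u_x, u_y)

# Hypothetical facilitator times in minutes (illustrative only).
virtual = [90, 100, 119, 125, 150]
in_person = [140, 160, 181, 200, 230]
u = mann_whitney_u(virtual, in_person)  # small U indicates well-separated groups
```

For context, if both samples contained the 11 responding facilitators, the expected U under the null hypothesis would be 11 × 11 / 2 = 60.5, so the reported U = 20.50 sits well below it.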

Applicability for learning and teaching

Almost every student reported that they learned something during their VRR session (45, 98%) and that their experience might influence their future training (44, 96%). Most students felt that it offered a different learning experience than other online teaching sessions (45, 98%), felt that it was useful as a teaching tool (43, 93%), and stated that they preferred their VRR simulation session to other forms of online learning (41, 89%). Of the students, 42 (91%) rated the VRR simulation session as “mostly” or “completely” useful (i.e., Likert scale responses = 6–7), while only 18 (39%) students gave similar high ratings to their prior virtual simulation experience. One student commented: “Compared to the virtual sim sessions offered during [the virtual longitudinal clerkship curriculum], the VRR offered a much more interactive and collaborative experience, and was more realistic.”

All facilitators (11, 100%) felt that the VRR was a useful teaching tool, that it offered a different learning experience than other online teaching sessions, and that the participants learned something during the session. Additionally, all of the facilitators (11, 100%) indicated that they would be interested in teaching another VRR session in the future.


Most students (43, 93%) and all facilitators (11, 100%) felt the cases were reflective of patients they see in the emergency department. The majority also felt that the sequence of events that occurred in the VRR session reflected what might actually happen in a real case (students = 42, 91%; facilitators = 9, 82%). One student commented: “It felt to be much more real than other sims I have done in the past (even real-life ones), particularly with the incorporation of the virtual defibrillator and ability to drag and drop equipment and medications onto the patient.”

Learning outcomes

Students who took the presession survey (47) scored an average of 72.32 (SD = 17.98) on the knowledge test, and students who took the postsession survey (46) scored an average of 89.35 (SD = 9.50). After removing students who did not reply to both surveys, we analyzed the mean difference within the group. Students who replied to both surveys showed a significant improvement in their postsession knowledge test scores (mean = 89.06, SD = 9.56) compared with their presession scores (mean = 71.17, SD = 15.77; t(34) = 7.28, P < .001), with a large effect size (Cohen’s d = 1.23).
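The reported effect size is consistent with the paired-samples (within-subject) form of Cohen's d, which for a paired t test equals the t statistic divided by the square root of the number of pairs; a quick check in Python (assuming this d_z formulation, which the article does not state explicitly):

```python
import math

# Reported statistics: t(34) = 7.28 from the n = 35 students who
# completed both the pre- and postsession knowledge tests.
t_statistic = 7.28
n = 35

# Paired-samples effect size: d_z = mean(diff) / sd(diff) = t / sqrt(n).
d_z = t_statistic / math.sqrt(n)
print(round(d_z, 2))  # → 1.23, matching the reported Cohen's d
```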

Thematic analysis

Two perceived learning outcomes were identified from students’ responses: content learning and communication skills development. Many students remarked on learning medical content, including medications and mnemonics (e.g., the Hs and Ts, a mnemonic used to recall the causes of pulseless electrical activity), during the VRR sessions. The students also identified this interface as a way to learn teamwork skills (e.g., closed-loop communication, role clarification).


This program evaluation included a convenience sample of a single level of trainee completing their emergency medicine clerkship rotation, limiting its generalizability. Quantitative outcomes may be overstated owing to a testing effect from the pre–post test design. In addition, the interpretation of effectiveness is limited by the absence of validity evidence for the questions included in the knowledge test; however, the overall effect of the VRR was supported by the qualitative data. All students and facilitators must have access to computers with web cameras and high-quality internet to use the VRR, potentially limiting its use. Finally, although this innovation may benefit educators seeking to offer collaborative simulation experiences during the current pandemic, it is unclear what role the VRR might play once in-person simulation is again widely available.

Next Steps

The data presented here suggest that the VRR has utility as an educational tool at the clerkship level. For subsequent iterations of the VRR with other groups of medical students, we had students conduct “toilet paper CPR” 10 to enhance fidelity. We have expanded the use of the VRR to residents and staff and have developed more cases. We also developed subsequent versions of the VRR with input from nurse and respiratory therapist colleagues, with plans to formally study its utility for interprofessional education, which is particularly relevant when compliance with physical distancing is paramount. Next steps will include expanding the evaluation of the VRR to include participants from additional learner levels, from varying sites (e.g., community centers, outpatient clinics), and from other health professions.

Learning how to lead a simulation session using the VRR might pose a hurdle for some educators who are less comfortable with technology. Another potential barrier to its widespread use is finding methods to inform other educators that it is available.

While this study compared the VRR with other forms of online learning, future studies could compare the VRR with in-person simulations to help determine how it might be used postpandemic. The VRR might be used as an alternative to traditional didactic teaching or to offer simulation to groups with limited access to in-person sessions. For example, it could be used to facilitate online education to remote regions globally, particularly in settings where the cost of running high-fidelity, in-person simulation sessions might be prohibitive or where the availability of skilled facilitators is limited.

This report suggests that the VRR—a free, novel, open-access resource that can be adapted for use with a multitude of clinical cases, learners (e.g., of different levels or from other disciplines), and environments (e.g., a family doctor’s office or a prehospital environment)—can be an effective complement to existing simulations.


The authors would like to thank Dr. Paul Koblic for his feedback as an early peer reviewer of the Virtual Resus Room interface.


1. Rose S. Medical student education in the time of COVID-19. JAMA. 2020;323:2131–2132.
2. McLaughlin S, Fitch MT, Goyal DG, et al.; SAEM Technology in Medical Education Committee and the Simulation Interest Group. Simulation in graduate medical education 2008: A review for emergency medicine. Acad Emerg Med. 2008;15:1117–1129.
3. Lioce L, Lopreiato J, Downing D, et al, eds. Distance simulation. In: Healthcare Simulation Dictionary. 2nd ed. Rockville, MD: Agency for Healthcare Research and Quality; 2020:14.
4. McGrath JL, Taekman JM, Dev P, et al. Using virtual reality simulation environments to assess competence for emergency medicine learners. Acad Emerg Med. 2018;25:186–195.
5. Duff E, Miller L, Bruce J. Online virtual simulation and diagnostic reasoning: A scoping review. Clin Simul Nurs. 2016;12:377–384.
6. Youngblood P, Harter PM, Srivastava S, Moffett S, Heinrichs WL, Dev P. Design, development, and evaluation of an online virtual emergency department for training trauma teams. Simul Healthc. 2008;3:146–153.
7. Torres A, Domańska-Glonek E, Dzikowski W, Korulczyk J, Torres K. Transition to online is possible: Solution for simulation-based teaching during the COVID-19 pandemic. Med Educ. 2020;54:858–859.
8. Mok G, Schouela N, Thurgur L, et al. Resident learning during a pandemic: Recommendations for training programs. Can J Emerg Med. 2020;22:617–621.
9. Hanel E, Bilic M, Hassall K, et al. Virtual application of in situ simulation during a pandemic. Can J Emerg Med. 2020;22:563–566.
10. Wanner GK, Osborne A, Greene CH. Brief compression-only cardiopulmonary resuscitation training video and simulation with homemade mannequin improves CPR skills. BMC Emerg Med. 2016;16:45.


Copyright © 2021 by the Association of American Medical Colleges