The challenge of providing nursing students with adequate quality clinical placement experiences (Kim, Park, & Shin, 2016) has led to an increasing use of human patient simulation (HPS) with high-tech manikins as a teaching strategy in health care education and in nursing programs in particular (Hayden, Smiley, Alexander, Kardong-Edgren, & Jeffries, 2014). Simulation provides a learner-centered, experiential environment where students can safely apply theory to practice (Palmer & Ham, 2017). It offers exposure to common situations and to time-sensitive situations that students may experience infrequently in the clinical setting.
Although findings from a study by the National Council of State Boards of Nursing support greater use of simulation in undergraduate nursing education, the financial and human resource costs are significant and can be a barrier for institutions (Haerling, 2018; Zendejas, Wang, Brydges, Hamstra, & Cook, 2013). Thus, many educators search for alternatives. This was the case at our large, urban educational facility, where undergraduate nursing students participate in 12 to 14 hours of HPS during their program. This equates to providing approximately 800 hours of simulation per academic year, and even with this amount, students request more simulation time. The HPS program requires a simulationist to facilitate the prebriefing, scenario, and debriefing; a technologist to manage manikins and recording equipment; and a high-fidelity simulation suite, along with the costs of manikins and supplies. The challenge we faced was to find ways to increase quality simulation time for students in a cost-effective manner.
THE INTERACTIVE DIGITAL SIMULATOR INNOVATION
Simulation educators at our institution had the opportunity to beta test Body Interact™, a new technology described as a virtual interactive digital simulator (IDS; Padilha, Machado, Ribeiro, & Ramos, 2018). Rather than learning on a manikin, students work with the image of a patient that appears on an interactive, computerized, touch-table patient simulator. Instead of physically making an assessment or taking action, students use a drop-down menu to choose an assessment or intervention. Their choice is then carried out by the IDS, and an appropriate response is provided, allowing students to continue with their assessment or treatment plan or to reevaluate their decisions and change direction in how they provide care. Students can choose from a bank of scenarios created by the IDS developer, such as patients with respiratory distress, and can set the duration of the scenario (5, 10, 15, or 20 minutes).
The use of a peak flow meter illustrates how a virtual IDS differs from HPS. When students using HPS conduct a peak flow, they are required to obtain the peak flow meter from the anteroom and then show the “patient” how to use it and take a measurement. Students using the IDS simply choose a peak flow measurement under the “tests” button, and the results appear.
The touch table is 47 inches long and waist high. We placed the table in one of our patient simulation rooms where students work in groups of eight around the table (six observers/two students taking action), as with HPS. The IDS provides a briefing, scenario, and debrief, just as with the HPS. An understanding of the technology in action may be gained by visiting https://www.youtube.com/watch?v=B76pU0lc5xo.
For the cost of a student license, paid by the institution or the student, students can use the IDS offline at the interactive table or access it on any web-based device at any time. Compared to HPS, we estimate that the IDS could cost less to purchase, with lower consumable costs over time. Students can run the scenario without a technologist; however, a comprehensive cost analysis is needed for a true comparison.
STUDENT OUTCOMES AND LESSONS LEARNED
We decided to evaluate both the HPS and the IDS learning experience with Year 2 practical nursing students. Students in both the HPS and IDS groups (n = 22) were given the same asthmatic patient scenario created by the IDS developer and modified by our nursing faculty, along with an algorithm for decision-making points, learning objectives, and prebrief and debrief sessions. The scenario design for both modalities expected the students to follow the nursing process, aligned with all four stages and learning styles of Kolb’s theory of experiential learning (Kolb, 1984).
We measured satisfaction and self-confidence using the Satisfaction and Self-Confidence in Learning Questionnaire (National League for Nursing, 2005). Competency was measured using the Asthma Scenario Performance Checklist (ASPC), a tool we developed based on established protocols. The mean satisfaction score for the IDS group was 16.2/20 (81.4 percent) compared to 18.6/20 (93.1 percent) for the HPS students. The mean self-confidence score for the IDS group was 26.5/35 (75.9 percent) compared to 29.3/35 (84.6 percent) for HPS.
Behavioral research software from Noldus, Viso Software Suite, was used to make an audio/video recording of the simulation sessions with a subset of students. The videos were then imported and analyzed using Observer XT from Noldus to determine completion rates for the 26 priority actions identified on the ASPC. Priority actions ranged from assessing lung sounds and vital signs to identifying specific diagnostic results and administering medications. IDS students completed 58.3 percent of the top priority action items; students in the HPS group completed 43.9 percent.
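The completion-rate comparison above reduces to a simple calculation over the coded observations: the share of the 26 checklist actions each group performed. A minimal sketch of that arithmetic is below; the action names and group data are hypothetical illustrations (the actual coding was performed in Observer XT), and only a four-item excerpt of the checklist is shown.

```python
# Hypothetical sketch of the ASPC completion-rate calculation.
# Action names and group data are illustrative only; the real
# analysis used 26 priority actions coded in Noldus Observer XT.

def completion_rate(completed_actions, checklist):
    """Percentage of checklist priority actions a group completed."""
    done = sum(1 for action in checklist if action in completed_actions)
    return round(100 * done / len(checklist), 1)

# Illustrative four-item excerpt of the checklist
checklist = [
    "assess lung sounds",
    "assess vital signs",
    "review diagnostic results",
    "administer bronchodilator",
]

# Hypothetical sets of actions observed for each group
ids_group = {"assess lung sounds", "assess vital signs",
             "administer bronchodilator"}
hps_group = {"assess lung sounds", "review diagnostic results"}

print(completion_rate(ids_group, checklist))  # 75.0
print(completion_rate(hps_group, checklist))  # 50.0
```

The reported figures (58.3 percent vs. 43.9 percent) come from applying the same proportion across all 26 coded actions.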
We were not surprised that the satisfaction and confidence scores were higher for the HPS, as students were familiar with this learning modality and the IDS was new to them. A surprising finding was the higher performance of priority actions on the ASPC by students in the IDS group. That said, this was a small group selected through convenience sampling, and we have to be mindful that it is easier to take action using the IDS than with the HPS. It is as simple as pressing a button, which may have influenced results.
We found that the IDS was easy to use once students and faculty oriented themselves. We learned a number of other lessons:
- Orientation to the software takes about 30 minutes, less if users know how to use an iPad.
- One table with five student licenses was sufficient for our needs; that may vary, depending on how the IDS is used.
- The IDS could be used in numerous courses, such as pathotherapeutics and health assessment.
- The online version can be used in class, enabling the entire class to participate in decision-making, or students can use it independently, if they have their own license.
- A hard copy of the debriefing log can be obtained and used by students to compare outcomes for the same scenario.
- A facilitator may or may not be needed, depending on whether the web-based or table format is used, student experience, level of case complexity, and whether students have used the IDS before.
A scenario editor tool is in beta testing; it will allow faculty to create and input their own scenarios. This would expand the application of the IDS because scenarios could be tailored to a specific level of learning and to curricular outcomes.
Both IDS and HPS offer important, although different, types of simulation learning opportunities. Our initial experience leads us to think that each has a time and place when it would best contribute to student learning. Nursing students need to be able to evaluate their interventions by both physiological parameters and verbal cues. With HPS, the patient’s verbal response to a question can be ad hoc or preprogrammed; with IDS, the verbal response is always preprogrammed, and answers to questions are embedded in the software. Although students may miss the more “real” or tangible experience of HPS, students using IDS can engage in simulation practice at any time, from any location.
We think it is worth adding the IDS to the suite of available simulation options to give students more access to simulation and potentially help them better prepare for the HPS. However, research is needed to support that hypothesis. We need to keep exploring systems like the IDS that support student learning through simulation within the budget constraints of educational institutions.
Hayden, J. K., Smiley, R. A., Alexander, M., Kardong-Edgren, S., & Jeffries, P. R. (2014). The NCSBN national simulation study: A longitudinal, randomized controlled study replacing clinical hours with simulation in prelicensure nursing education. Journal of Nursing Regulation, 5(2), S1–S64.
Haerling, K. A. (2018). Cost–utility analysis of virtual and mannequin-based simulation. Simulation in Healthcare, 13(1), 33–40. doi:10.1097/SIH.0000000000000280
Kim, J., Park, J.-H., & Shin, S. (2016). Effectiveness of simulation-based nursing education depending on fidelity: A meta-analysis. BMC Medical Education, 16, 152. doi:10.1186/s12909-016-0672-7
Kolb, D. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall.
Padilha, J. M., Machado, P. P., Ribeiro, A. L., & Ramos, J. L. (2018). Clinical virtual simulation in nursing education. Clinical Simulation in Nursing, 15(C), 13–18. doi:10.1016/j.ecns.2017.09.005
Palmer, B. J., & Ham, K. (2017). Collaborative simulation: Enhancing the transition to clinical practice. Nursing Education Perspectives, 38(5), 281–282. doi:10.1097/01.NEP.0000000000000166
Zendejas, B., Wang, A. T., Brydges, R., Hamstra, S. J., & Cook, D. A. (2013). Cost: The missing outcome in simulation-based medical education research: A systematic review. Surgery, 153(2), 160–176. doi:10.1016/j.surg.2012.06.025