Innovative methodologies are needed that provide realistic, immersive learning experiences for nursing students. Virtual environments may offer a way to overcome barriers to clinical training while still providing immersive experiences. This technology is a powerful medium for training and offers distinct benefits that complement the use of mannequin-based simulators or standardized patients.
Many terms related to virtual environments are used interchangeably in the literature, including game, virtual world, virtual patient, and virtual reality, and these terms are often not explicitly defined. They can be used generically, or they may refer to something specific, as when the term virtual world is used to refer to Second Life1–3 by Linden Research, Inc.4
Prensky5(p119) has defined games as “organized play.” Games can be single or multiplayer,6,7 are rule- and goal-oriented,6,8,9 and motivate players to play.5,7,10,11 Games may include consequences and feedback5,8,12 that encourage learning.5 Single-player games usually contain nonplayer characters (NPCs),6,11 which are animated persons with scripted behaviors.11,13 In games, learning may occur as a by-product, but the primary purpose is entertainment.13
Virtual worlds are internet-based, three-dimensional, simulated environments where users work through a graphical representation known as an avatar.2,14 In virtual worlds, there is usually a player behind each avatar. Although a virtual world may contain some game components, most virtual worlds do not include the NPCs that are common in games.
Virtual patients are typically animated NPCs with which the user, who assumes the role of a healthcare provider, interacts.15,16 An essential feature of any virtual patient is the interactive interface that enables the user to query the patient and receive a patient response that is supplied by the computer.15,16
Games, virtual worlds, and virtual patients share several characteristics: they are 3D and immersive, incorporate virtual reality by recreating real environments, and are used to make abstract concepts more understandable.1 All three provide visual and auditory feedback1,11,17 through a graphic interface. Virtual reality itself encompasses many broad ideas and has been described as an immersive computer-generated environment.18
Virtual worlds provide various educational possibilities including an extension of the classroom and a conduit for clinical experiences.1 Virtual reality research has suggested that participation in a 3D environment also supports the constructivist paradigm of instruction and may bridge the gap between information representation (knowledge acquisition) and experiential learning (knowledge application).19 Research regarding games, virtual patients, and virtual worlds in healthcare education has demonstrated their efficacy in teaching didactic information to promote knowledge acquisition,10,15,16,20,21 critical thinking/reasoning,15,16,22 and skilled communication.23,24
Although there is some evidence suggesting that new technologies may be useful to support nursing education, research has not kept pace with their rapid development.16,25 This article describes the development and testing of a virtual patient trainer: a 3D, game-based simulator with NPC virtual patients and staff (see Figs. 1 and 2) and auditory and visual feedback. Although the virtual patient trainer includes some game components, the focus of its development was education, not entertainment. The objective was to compare the achievement of learning outcomes of undergraduate nursing students when a virtual patient trainer or a traditional lecture was used to teach pediatric respiratory system content.
We constructed a virtual patient trainer using Virtual Pediatric Patients (VPPs) and a Virtual Pediatric Unit (VPU). The trainer was developed for undergraduate registered nursing students enrolled in a pediatric nursing course. The phases of development were as follows:
- Phase 1: Develop a plan for displaying the pediatric respiratory content.
- Phase 2: Develop the VPPs, VPU, and game components.
- Phase 3: Compare learning outcomes between traditional lecture and the virtual patient trainer.
Phase 1: Planning
The pediatric respiratory content was reviewed by two nursing faculty with expertise in pediatric nursing. The learning objectives for the pediatric respiratory content included the following: (1) Describe major clinical manifestations, assessment/history parameters, and nursing interventions related to respiratory failure in children; (2) Identify the main causes, clinical manifestations, disease characteristics, and treatments for pediatric patients with respiratory system dysfunction; (3) Use the nursing process to summarize and prioritize nursing care goals and interventions for children with respiratory system dysfunction and their families; and (4) Identify the main causes, clinical manifestations, disease characteristics, and treatments for high-risk infants with disturbed respiratory system function.
A plan for delivery of the pediatric respiratory content in a virtual environment was developed. A virtual patient trainer would allow the student to be immersed in a pediatric hospital while assuming the nursing care of four VPPs with the following respiratory diseases: bronchiolitis, cystic fibrosis, pneumonia, and asthma. These four disease processes were chosen because they linked with the didactic content and case studies discussed in lecture and were frequent diagnoses in the surrounding pediatric hospitals. A flowchart was used to display the process the nurse should follow while providing nursing care to the four patients. The flowchart showed the progression of actions based on whether the nurse made a correct or incorrect decision, with each action followed by the appropriate consequence. The simulated scenarios were developed using the assessment-intervention-reassessment format that has been used in scenarios developed for use with high-fidelity human patient simulators.26 Situation, Background, Assessment, and Recommendation (SBAR27) would be used to organize patient data during the verbal handoff and in the written format in the patient’s medical record, which would be available throughout the experience. Students had received instruction about SBAR earlier in the pediatric nursing course. Each scenario was developed to guide the learner through the nursing process: assess the patient’s condition, make clinical judgments about the patient’s need(s) for nursing care, implement care decisions and procedures in a timely manner, reassess the patient following interventions, and call for help when indicated and in a timely manner.
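As a concrete illustration, the correct/incorrect branching that the flowchart captured can be sketched as a small state walk. The scenario steps, action names, and consequences below are hypothetical stand-ins, not the study’s actual scripts.

```python
# Hypothetical sketch of the assess-intervene-reassess decision flow.
# Step names, correct actions, and consequences are illustrative only.

def run_scenario(steps, actions):
    """Walk a scenario flowchart: each correct action improves the
    patient; an incorrect (or missing) action triggers a consequence."""
    log = []
    for step, action in zip(steps, actions):
        if action == step["correct_action"]:
            log.append((step["name"], "improves"))
        else:
            log.append((step["name"], step["consequence"]))
    return log

# Example: a bronchiolitis encounter with three flowchart nodes.
bronchiolitis = [
    {"name": "assess",    "correct_action": "count_respirations",
     "consequence": "missed worsening retractions"},
    {"name": "intervene", "correct_action": "suction_and_elevate_hob",
     "consequence": "oxygen saturation falls"},
    {"name": "reassess",  "correct_action": "recheck_breath_sounds",
     "consequence": "deterioration goes unnoticed"},
]

log = run_scenario(
    bronchiolitis,
    ["count_respirations", "give_feeding", "recheck_breath_sounds"],
)
```

In this sketch, an incorrect choice at the intervention step produces the scripted consequence while the surrounding steps still credit correct actions, mirroring the action-consequence pairing the flowchart specified.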
Phase 2: Development
The VPPs were developed to represent infants and children of different ages, races, and physiological variations (eg, retractions and nasal flaring). The VPPs were in a private hospital room on the VPU, which was built to look and feel like a unit in a pediatric hospital. Consent was obtained from the administration and security of a local hospital to allow the game design team to take pictures of a new unit that was not yet occupied. Routine sounds, such as beeping monitors, clacking trays, and overhead announcements, were recorded throughout the hospital. Sounds that could not be recorded in the hospital due to patient privacy were captured in the simulated hospital on the nursing school campus.
The virtual patient trainer was built using Unreal Engine 3. The components included (1) orientation, (2) mini-games, (3) virtual patient experience, and (4) debriefing/feedback. Two VPPs that included coaching from a virtual charge nurse were developed for round one, and two VPPs without coaching were developed for round two. These components and learning techniques were chosen with the lecture content/objectives in mind. A maximum of 3 hours was planned for each student to complete the virtual trainer experience because that was the same duration as the traditional lecture intervention planned for students in the control group.
In the orientation, the student would be introduced to the virtual charge nurse, the VPU, and the desired behaviors of the nursing process. The virtual charge nurse was developed to provide oral and written directions. Written instructions would appear automatically on the computer screen as the charge nurse spoke. The student would then be directed to play three mini-games after completion of the orientation (see Fig. 1).
The goal of the mini-games would be to familiarize the student with different types of coughs, retractions, and infectious agents associated with pediatric respiratory diseases. In the first mini-game, the student would view retractions displayed by the VPP and then choose from a list the type of retraction the VPP was displaying. In the second mini-game, the student would hear different types of coughs and then choose from a list the type of cough being demonstrated. In the third mini-game, information would be given about respiratory diseases that affect pediatric patients, and the student would be asked to match infectious agents with their associated respiratory diseases (ie, bronchiolitis is caused by a virus). After completion of all three mini-games, the student would then enter the virtual patient experience.
Virtual Patient Experience (Round One)
The first two VPPs (round one) planned were an asthma patient and a bronchiolitis patient. During these two VPP encounters, a virtual charge nurse would be present to orient the student to the environment (room and patient specific) and to the “dashboard.” Interventions within several categories would be available on the dashboard, marked with appropriate icons: assessment, interventions, calling the respiratory therapist, calling the virtual charge nurse, and viewing the patient’s medical record/physician’s orders (see Fig. 2). Throughout the virtual training experience, the patient’s monitor would be visible to the student, and it would include standard physiologic readings (such as heart rate and respiratory rate) complemented by typical monitor sounds. The student would also be introduced to the time clock (time kept throughout the experience) and the indicator for confidence level (a smiley face in the corner of the screen). The student would be told that the smiley face would change colors depending on whether the student performed the correct interventions. Green would indicate appropriate performance, yellow would indicate caution and the need for the student to reassess actions, and red would indicate incorrect or delayed actions. The smiley face was planned to serve as a formative assessment of the student’s performance while caring for the patients. The change in color from green to yellow would serve as feedback so that the student could reassess nursing interventions and take corrective actions. If the smiley face turned red, the virtual charge nurse would enter the room to “coach” the student toward the correct next step. The student could make mistakes simply by doing nothing or by not responding quickly enough to events. Following round one, the student would progress to round two.
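The green/yellow/red confidence indicator amounts to a small feedback rule. A minimal sketch of that rule follows; the specific error counts and delay thresholds are assumptions for illustration, since the article does not state how the colors were computed internally.

```python
# Minimal sketch of the "smiley face" formative-assessment indicator:
# green = appropriate performance, yellow = caution/reassess,
# red = incorrect or delayed actions (charge nurse enters to coach).
# The error and idle-time thresholds below are assumed, not from the study.

def confidence_color(errors, seconds_idle):
    """Map performance to a traffic-light color. Doing nothing
    (a long idle time) counts against the student, consistent with
    the description that inaction or slow responses are mistakes."""
    if errors == 0 and seconds_idle < 60:
        return "green"
    if errors <= 1 and seconds_idle < 120:
        return "yellow"   # cue the student to reassess interventions
    return "red"          # virtual charge nurse coaches the next step

def charge_nurse_coaches(color):
    return color == "red"
```

Tying the indicator to elapsed time as well as to errors reflects the design point that a student "could make mistakes simply by doing nothing or by not responding quickly enough."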
Virtual Patient Experience (Round Two)
During round two, the student would care for the final two patients (pneumonia and cystic fibrosis exacerbation) independently. The purpose of the second round would be to assess the student’s ability to take care of the patient independently. If the student called the virtual charge nurse spontaneously or if the smiley face turned red, the student would be given detailed information about the patient’s disease process to cue the student to move through the experience. When the student’s experience in the VPU was finished, the student would give a hand-off report using the SBAR format27 to inform the on-coming nurse. The student would then receive information again about the four respiratory diseases and other pediatric respiratory diseases through questioning from the on-coming nurse; the student would be shown a multiple-choice list from which to select answers.
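The SBAR hand-off organizes patient data into four fixed fields, which makes it easy to represent as a simple structure. The patient details below are invented examples, not content from the trainer.

```python
# Illustrative sketch of organizing hand-off data in the SBAR format
# used in the trainer; all patient details are hypothetical.
from dataclasses import dataclass

@dataclass
class SBARReport:
    situation: str
    background: str
    assessment: str
    recommendation: str

    def handoff(self):
        """Render the report in the order the oncoming nurse expects."""
        return (f"S: {self.situation}\nB: {self.background}\n"
                f"A: {self.assessment}\nR: {self.recommendation}")

report = SBARReport(
    situation="2-year-old admitted with pneumonia, O2 sat 91% on room air",
    background="3 days of fever and cough; no chronic conditions",
    assessment="Mild retractions; breath sounds diminished at right base",
    recommendation="Continue O2 at 1 L/min; reassess respiratory status q1h",
)
```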
At the conclusion of the virtual patient experiences, the student would receive an in-depth assessment of care decisions, including whether interventions were correct and timely. If those decisions did not yield improvement in the patient’s condition, an analysis of the care events that contributed to the worsening of the patient’s condition would be given. No numerical scores were to be provided to the students.
Phase 3: Comparison
This was a randomized, controlled study which included a posttest with control group design. Institutional Review Board approval was obtained at the university before the start of the study.
A priori power analysis using a 0.05 level of significance, a medium effect size, and a power of 0.8 revealed that a total of 90 participants would be required for the study (45 in each group). Senior BSN students (n = 106) enrolled in a pediatric nursing course at one university in the southern United States were offered the opportunity to participate in the study. At the beginning of the semester, students were invited to enroll in the study, and informed consent was obtained. The 16-week semester was divided into two 8-week blocks during which students alternated between pediatric and obstetric courses. Students in each block were randomly assigned to one of two groups using a random numbers table. The control group received the traditional 3-hour pediatric respiratory lecture by faculty, and the experimental group participated in the virtual patient trainer experience. A few students (fewer than 10) elected to participate in the virtual training experience more than once. Students in both groups were assigned textbook readings related to respiratory diseases, and it was assumed that all students would complete the readings before their group activities, ie, lecture or an experience with the virtual patient trainer. Students in both groups also received a standard simulation laboratory experience using medium- and high-fidelity mannequins and standardized patients.
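The power analysis can be sketched with the standard normal-approximation formula for a two-sample t test. The exact effect-size value the authors used is not reported; d = 0.6 below is an assumption that approximately reproduces the stated 45 per group (the conventional "medium" value of d = 0.5 would instead call for roughly 64 per group).

```python
# Normal-approximation sample size per group for a two-sample t test,
# alpha = .05 two-sided, power = .80. The effect sizes passed in below
# are assumptions; the article does not report the exact value used.
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """n per group = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2, rounded up."""
    z = NormalDist().inv_cdf
    return ceil(2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2)

n_assumed = n_per_group(0.6)       # 44; the t correction nudges this to ~45
n_conventional = n_per_group(0.5)  # 63 for Cohen's conventional medium d
```

The normal approximation slightly understates the exact t-based answer (by about one participant per group at these sizes), which is consistent with the 45 per group the study scheduled.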
Virtual Patient Trainer Experience
All students randomized to the experimental group were scheduled to participate in the virtual patient trainer experience during the same 3 hours the lecture was being delivered, although students typically completed the experience in less than 2 hours. Students were seated at individual computers on which the virtual trainer had been loaded, and headphones were available for all students. Before starting the virtual patient experience, students were shown a short introductory video and were given an instruction sheet with the intervention icons. Technology experts were available for computer/software issues. Students in the experimental group were offered class lecture notes and a live faculty lecture after study-related measurements were completed but before their final course examination.
Students’ demographic data, including age, gender, ethnicity, marital status, and education, were collected to describe the study sample and allow for analysis of preexisting differences between the groups. Both knowledge acquisition and knowledge application were measured to determine students’ achievement of the course learning objectives after a virtual patient trainer or a traditional lecture was used to teach principles and concepts of respiratory distress in pediatric patients. Knowledge application was measured using Objective Structured Clinical Examinations (OSCEs), and knowledge acquisition was measured using a written test.
Measurement of Knowledge Application
To measure students’ knowledge application during the research study, students from both groups participated in two OSCEs that occurred 1 week after they received the instructional intervention. The OSCE scenarios were based on course content and were validated by the undergraduate pediatric faculty (face validity). The OSCEs occurred on the same day and used high-fidelity infant simulators and standardized patients (actors). Students were asked in the OSCEs to recognize and treat respiratory distress during two different infant scenarios. Both scenarios took place in a simulated hospital on the university campus, each in a separate room, designed to resemble an inpatient hospital room with equipment needed for each scenario. The scenarios were planned to exemplify the signs and symptoms of respiratory diseases. A “patient” medical record developed for each scenario was available to students.
SimBaby (SW version 1.4.1; Laerdal Medical, Stavanger, Norway) was used to portray an infant in respiratory distress in each OSCE. Both simulators were run by trained operators who were familiar with the scenarios. A charge nurse and mother, played by standardized patients, were available for both scenarios. All roles were scripted so that students would receive the same information during the OSCEs. The charge nurse oriented students to the chart and environment.
Two reviewers, one for each scenario, who were blinded to students’ group assignment observed students during the OSCEs. The reviewers assessed students’ knowledge application using scenario-specific checklists developed from the critical elements students should perform during each scenario. One scenario checklist had 10 items; the other had 11 items. One scenario required students to review physician orders and administer albuterol. In the second scenario, the patient recovered with simple nursing interventions: bulb suctioning, elevating the head of the bed, and making sure the nasal cannula was positioned securely in the nares. Checklists were scored dichotomously as “Yes,” the student performed the correct action, or “No,” the student did not. The same reviewer scored all students for a given scenario. Event logs (performance data and timing of interventions) from the infant simulator were used to score each checklist and to record times for interventions, which were recorded in minutes and seconds.
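The dichotomous checklist scoring and the minutes-and-seconds timing described above can be sketched as follows; the checklist element names are hypothetical, loosely based on the second scenario's interventions.

```python
# Sketch of dichotomous (Yes/No) checklist scoring from simulator event
# logs, with intervention times kept as minutes:seconds. The element
# names below are hypothetical examples, not the study's checklists.

def score_checklist(performed, checklist):
    """Return (critical elements performed, total elements) for one student."""
    correct = sum(1 for item in checklist if performed.get(item) == "Yes")
    return correct, len(checklist)

def mmss(total_seconds):
    """Format an event-log timestamp as minutes:seconds."""
    return f"{total_seconds // 60}:{total_seconds % 60:02d}"

checklist = ["bulb_suction", "elevate_head_of_bed", "check_nasal_cannula"]
performed = {"bulb_suction": "Yes", "elevate_head_of_bed": "No",
             "check_nasal_cannula": "Yes"}

score = score_checklist(performed, checklist)  # (2, 3)
```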
Measurement of Knowledge Acquisition
Knowledge acquisition was assessed using a 10-item multiple-choice test. The items were different from those planned for the course final examination but were based on course objectives and were developed to reflect general nursing knowledge of the care of pediatric patients with respiratory diseases. Total scores on the knowledge acquisition test could range from 0 (least knowledge) to 100 (most knowledge).
Statistical analysis was performed using SPSS (Statistical Package for the Social Sciences, version 19; SPSS Inc, Chicago, IL). Descriptive statistics were used to describe the sample. A χ2 test was used to assess preexisting demographic differences between the groups of students. A Student’s t test was used to analyze differences between the groups on knowledge acquisition and knowledge application.
Of the 106 students in the class, 93 participants (86%) were enrolled in the study (control group, n = 47; experimental group, n = 46). No differences were found between the groups in age, gender, ethnicity, marital status, or education; therefore, all data from the two blocks were analyzed together in one control group and one experimental group.
There was a significant difference in knowledge acquisition between the control and experimental groups (mean scores 75 ± 12 vs. 83.9 ± 15, respectively, P = 0.004). Internal consistency reliability for the knowledge test was 0.90 using Cronbach’s α. On the checklists for the two OSCEs measuring knowledge application, there were significant differences in times between the groups for all critical elements, with the experimental group demonstrating more timely performance of critical nursing tasks in the OSCEs (P = 0.001 for each of the two scenarios; Table 1). Internal consistency reliability for the two checklists was 0.70 and 0.80 using Cronbach’s α.
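As a check on the knowledge-acquisition result, a pooled two-sample t statistic can be recomputed from the published group summaries (75 ± 12, n = 47 vs. 83.9 ± 15, n = 46). This is arithmetic on reported values only, not a reanalysis of the raw data.

```python
# Pooled two-sample t statistic recomputed from the reported summary
# statistics; group sizes come from the enrollment figures in the text.
from math import sqrt

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Student's t with pooled variance for two independent groups."""
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m2 - m1) / sqrt(sp2 * (1 / n1 + 1 / n2))

t = pooled_t(75, 12, 47, 83.9, 15, 46)  # ≈ 3.16 on 91 df
```

A t of about 3.16 on 91 degrees of freedom corresponds to a two-sided P of roughly 0.002, in the same range as the reported P = 0.004 given rounding of the published means and SDs.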
There is no single terminology used consistently in the literature on games, serious games, virtual worlds, virtual patients, and virtual reality. The technology tested in this study falls most in line with a virtual patient trainer; however, some game components were included, such as rules, feedback, consequences, and goals.5,6,8,12 Unlike a game, in which the purpose is primarily to entertain, the purpose of this trainer was to teach.13 This study compared a virtual patient trainer with traditional lecture for undergraduate nursing students’ achievement of learning outcomes related to pediatric respiratory content. Successfully completing an experience falls short of indicating that learning occurred.28 It is important to also determine whether the student learned the material or just learned how to complete the experience. The outcomes compared in this study were knowledge acquisition, as measured by a paper and pencil test, and knowledge application, as measured using OSCEs.
Students who participated in the virtual patient trainer experience achieved higher scores on average on the knowledge acquisition test (84 = B compared with 76 = C, P = 0.004) and also demonstrated more timely performance of critical nursing tasks in the OSCEs (P = 0.001 for each of the two scenarios) than students who received a traditional lecture. Although the total completion time for each of the OSCE scenarios was statistically significantly better for students who experienced the virtual patient trainer, a 1-minute difference in response times may not be considered clinically significant. There may, however, be cases in which a response that is 1 minute quicker could make a real difference in patient outcomes.
The precise mechanism that led to improved achievement of learning outcomes in this study is uncertain, but the active learning process may have been one factor. Because the virtual patient trainer required students to interact with the material, participating in the experience may have stimulated more active learning compared with the relatively passive learning that occurs during lectures in a typical classroom setting. Bhoopathi29 reported similar improvement in learning outcomes in undergraduate mental health nursing students who participated in an educational game that included active involvement of the students.
Formative assessment with instant feedback during the learning process has been shown to be highly effective for increasing learning30,31 and may also have an impact on long-term retention of information.21,32 Investigators have reported the positive influence of the “testing effect” on long-term learning after training in cardiopulmonary resuscitation.31 In this study, the formative assessments students received at various points of the virtual trainer experience may have been a factor in those students’ improved information retention. The mini-games during the virtual trainer experience included real-time feedback when students answered multiple-choice questions and then immediately received the correct answers and explanations. This provided multiple opportunities to both correct and reinforce learning of the subject content. Students received continuous and timely feedback1,8 with a visual cue8 (confidence level) during the training as well as individual debriefing and evaluation1,8 at the end of the training. They were also given specific instructions to assess, intervene, and reassess the patient, and the virtual charge nurse was available for coaching and keeping students on task.26 Although students demonstrated improved short-term retention of information after the virtual training, it is unclear how long this retention might have persisted. Only 1 week elapsed between the intervention and the measurement of learning objective achievement. Follow-up OSCEs 6 to 9 months later would have evaluated long-term efficacy of the intervention and perhaps established translation of skills and knowledge from a virtual to a physical environment.
The main strength of this study was that it tested and provided support for a novel approach to teaching pediatric respiratory content using a virtual patient trainer. Although the start-up cost for a virtual patient trainer may be almost twice that of a high-fidelity simulator, after it is developed, the virtual patient trainer can potentially be used by multiple students without limitations of schedule, space, or personnel.1,8,19 In a time of scarce resources, the virtual patient trainer may use fewer resources than high-fidelity simulators, especially for maintenance. Basic maintenance of the virtual patient trainer does require technical support, but it does not include processes such as cleaning or replacement of parts.
Several limitations of this study decrease generalizability of the findings. Because the measurement tools were developed for this study, they have had limited psychometric testing. Additional evaluation by external experts would strengthen conclusions regarding validity of the instruments. Although internal consistency reliability was high for both measurement tools, additional testing, including evaluation of test-retest reliability without intervening instruction/experiences, would provide support for instrument reliability.
Another limitation of the study was the lack of a knowledge pretest. A knowledge pretest would have been helpful to establish equivalence between the groups before the intervention. This would provide support for concluding that higher scores in the experimental group could be attributed to the virtual patient trainer rather than to preexisting differences in the groups.
Lack of consistency between the groups related to the duration of the interventions may also limit study validity. Although students received a 3-hour intervention in the traditional lecture group, students who participated in the virtual trainer experience took varying amounts of time to complete the intervention, and most completed the experience in less than 2 hours. This inconsistency could have influenced the magnitude of the difference in the outcomes. An exploration of the relationship between the duration of the interventions and the achievement of learning outcomes would be helpful to identify the optimal intervention duration.
Virtual environments offer opportunities to present information about a variety of disease states in patients with different levels of acuity11 that can be accessed at anytime, anywhere. The virtual patient trainer can allow large numbers of students to practice requisite nursing skills as often as needed. Research has suggested that participation in a 3D environment, such as in virtual patient trainers, may bridge the gap between experiential learning (learning by doing) and information representation (lecture format) to achieve a variety of learning outcomes.10,16–22 Evidence supporting the effectiveness of a virtual patient trainer to facilitate learning is emerging but currently is inconclusive. Further research is needed to provide additional evidence to support more widespread use of this new educational technology.
1. Kilmon CA, Brown L, Ghosh S, et al. Immersive virtual reality simulations in nursing education. Nurs Educ Perspect. 2010; 31: 314–317.
2. Hansen MM. Versatile, immersive, creative and dynamic virtual 3-D healthcare learning environments: a review of the literature. J Med Internet Res. 2008; 10: e26.
3. Wiecha J, Heyden R, Sternthal E, et al. Learning in a virtual world: experience with using Second Life for medical education. J Med Internet Res. 2010; 12: e1.
7. Breslin P, McGowan C, Pecheux B, et al. Serious gaming. Health Manage Technol. 2007; 28: 14–17.
9. Bauman E. Preparing learners for future experiences using game-based learning. Presented at the International Nursing Association for Clinical Simulation and Learning, Orlando, FL, 2011.
10. Sward KA, Richardson S, Kendrick J, et al. Use of a Web-based game to teach pediatric content to medical students. Ambul Pediatr. 2008; 8: 354–359.
11. Zielke MA, Evans MJ, Dufour F, et al. Serious games for immersive cultural training: creating a living world. IEEE Comput Graph Appl. 2009; 29: 49–60.
12. Medland MB, Stachnik TJ. Good-behavior game: a replication and systematic analysis. J Appl Behav Anal. 1972; 5: 45–51.
13. Bohannon J. Smarts for serious games. Science. 2010; 330: 31.
14. Andrade AD, Bagri A, Zaw K, et al. Avatar-mediated training in the delivery of bad news in a virtual world. J Palliat Med. 2010; 13: 1415–1419.
15. Cook DA, Erwin PJ, Triola MM. Computerized virtual patients in health professions education: systematic review and meta-analysis. Acad Med. 2010; 85: 1589–1602.
16. Cook DA, Triola MM. Virtual patients: a critical literature review and proposed next steps. Med Educ. 2009; 43: 303–311.
17. Bordnick PS, Carter BL, Traylor AC. What virtual reality research in addictions can tell us about the future of obesity assessment and treatment. J Diabetes Sci Technol. 2011; 5: 265–271.
18. Aukstakalnis S, Blatner D. Silicon Mirage—The Art and Science of Virtual Reality. Berkeley, CA: Peachpit Press; 1992.
19. Dede C. The evolution of constructivist learning environments: immersion in distributed, virtual worlds. Educ Technol. 1995; 35: 46–52.
20. Harless WG, Drennon GG, Marxer JJ, et al. CASE: a Computer-Aided Simulation of the Clinical Encounter. J Med Educ. 1971; 46: 443–448.
21. Blakely G, Skirton H, Cooper S, et al. Educational gaming in the health sciences: systematic review. J Adv Nurs. 2009; 65: 259–269.
22. Kamin C, O’Sullivan P, Deterding R, et al. A comparison of critical thinking in groups of third-year medical students in text, video, and virtual PBL case modalities. Acad Med. 2003; 78: 204–211.
23. Vash JH, Yunesian M, Shariati M, et al. Virtual patients in undergraduate surgery education: a randomized controlled study. ANZ J Surg. 2007; 77: 54–59.
25. Gibson D, Aldrich C, Prensky M. Games and Simulations in Online Learning: Research and Development Frameworks. Hershey, PA: Information Science Publishing; 2007.
26. Childs JC, Sepples SB, Chambers K. Designing simulations for nursing education. In: Jeffries PR, ed. Simulation in Nursing Education: From Conceptualization to Evaluation. New York, NY: National League for Nursing; 2007: 35–38.
27. Pillow M. Improving Hand-Off Communication. Oakbrook Terrace, IL: Joint Commission Resources; 2007.
29. Bhoopathi PS. Educational games for mental health professionals: a Cochrane review. Int J Psychiatr Nurs Res. 2007; 12: 1497–1502.
30. Roediger HL, Karpicke JD. The power of testing memory: basic research and implications for educational practice. Perspect Psychol Sci. 2006; 1: 181–210.
31. Nicol DJ, Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud Higher Educ. 2006; 31: 199–218.
32. Kromann CB, Bohnstedt C, Jensen ML, et al. The testing effect on skills learning might last 6 months. Adv Health Sci Educ. 2010; 15: 395–401.