
Preparing Residents Effectively in Emergency Skills Training With a Serious Game

Dankbaar, Mary E.W. PhD; Roozeboom, Maartje Bakhuys MSc; Oprins, Esther A.P. B. PhD; Rutten, Frans MD; van Merrienboer, Jeroen J.G. PhD; van Saase, Jan L.C.M. PhD, MD; Schuit, Stephanie C.E. PhD, MD

Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare: February 2017 - Volume 12 - Issue 1 - p 9–16
doi: 10.1097/SIH.0000000000000194
Empirical Investigations

Introduction: Training emergency care skills is critical for patient safety but cost intensive. Serious games have been proposed as an engaging self-directed learning tool for complex skills. The objective of this study was to compare the cognitive skills and motivation of medical residents who only used a course manual as preparation for classroom training on emergency care with residents who used an additional serious game.

Methods: This was a quasi-experimental study with residents preparing for a rotation in the emergency department. The “reading” group received a course manual before classroom training; the “reading and game” group received this manual plus the game as preparation for the same training. Emergency skills were assessed before training (with residents who agreed to participate in an extra pretraining assessment), using validated competency scales and a global performance scale. We also measured motivation.

Results: The groups were comparable on important characteristics (eg, experience with acute care). Before training, the reading and game group felt motivated to play the game and spent more self-study time (+2.5 hours) than the reading group. Game-playing residents showed higher scores on objectively measured and self-assessed clinical competencies but equal scores on the global performance scale, and they were equally motivated for training, compared with the reading group. After the 2-week training, no differences between groups existed.

Conclusions: After preparing for training with an additional serious game, residents showed improved clinical competencies, compared with residents who only studied the course material. After the 2-week training, this advantage had disappeared. Future research should study the retention of game effects in blended designs.

From the Departments of Work, Health and Care (M.B.R.) and Training and Performance Innovations (E.A.P.B.O.), TNO; Training Institution for the Professional Education of General Practitioners SBOH (F.R.); Health, Medicine and Life Sciences, Educational Development and Research, Maastricht University (J.J.G.V.M); and Departments of Internal Medicine (J.L.C.M.V.S.) and Emergency Care and Internal Medicine (S.C.E.S.), Erasmus University Medical Center, Rotterdam, The Netherlands.

Reprints: Mary E. W. Dankbaar, PhD, Institute for Medical Education Research, Erasmus University Medical Center Rotterdam, PO Box 2040, 3000 CA Rotterdam, Room Ae 234, The Netherlands (e-mail:

The authors declare no conflict of interest.

AbcdeSIM B.V. is a spin-off company founded by Erasmus University Medical Center Rotterdam (Erasmus MC) and Stichting SBOH (a Dutch foundation for the professional education of general practitioners) as a joint venture to aid in the further development of, and sell licenses for, the abcdeSIM serious game to other hospitals and institutions. By virtue of standard policies at Erasmus MC, S.C.E.S. (author) and Erasmus MC have a financial interest in abcdeSIM B.V.: Erasmus MC Holding B.V. and Stichting SBOH, as founders of abcdeSIM B.V., own most shares, and S.C.E.S., as an inventor of the abcdeSIM serious game, received certified shares (without voting rights) from Erasmus MC. To mitigate this financial interest in the research, the authors asked TNO, an independent research institute with expertise in research on game effectiveness, to conduct the data collection and data analysis. S.C.E.S. and Erasmus MC were in no way directly involved in this data collection and data analysis.

Supported by the Dutch Ministry of Economic Affairs and Stichting SBOH. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the article.

Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal’s Web site (

This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CCBY-NC-ND), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal.

Health care, with its exponential growth in knowledge and increasing demands on the competencies required of doctors, has a need for new and more cost-effective training models.1 Online learning, whether combined with an instructor-led course or delivered as a fully online course, can be used to improve the efficiency and flexibility of medical training.1,2 Technology-enhanced simulation programs provide learning opportunities for controlled skills practice, without risk to the patient.3 The effectiveness of simulation programs is well established: their use in medical education is consistently associated with large improvements in knowledge, skills, and attitudes when compared with no intervention.3–5

However, full-scale computer-based simulators are often expensive, both in terms of initial purchase price and running costs.6 Furthermore, attention to student motivation is often neglected in simulation programs.7 As a result, learners tend to use them to reach certain learning outcomes but avoid continuous practice.8 Serious games offer a challenging learning environment in which video game characteristics are attuned to educational goals.9–13 They offer “situated meaning and learning,” whereby players interact with the game world through probing, seeing, and experiencing things in a specific context.14 An important and common type of serious game is the simulation game, which offers learning tasks in a realistic, engaging online environment, where learners directly experience the consequences of their decisions.9,12,15,16 More specifically, simulation games offer a mix of pedagogical (eg, cases, multimodal representations, scaffolding), game (eg, competition, context, multiple levels), and simulation (representations of situations or objects, user interaction, and feedback)17 elements. Experiential or task-centered learning in games can be associated with guided discovery learning.9 The rationale for making real-world tasks the basis of a learning environment is to promote the application of knowledge and transfer to practice.18 Because of their scalability, simulation games have the potential to teach knowledge and skills that are typically acquired in a simulation center, at a fraction of the cost.6 The integration of fun and challenge in games can reduce stress and enhance motivation and effectiveness.19,20 In addition, self-directed learning seems to enhance intrinsic motivation, because it affords a greater sense of autonomy.21 If students are intrinsically motivated to learn, they will likely spend more time learning and feel more positive about what they learn.22 Furthermore, the element of competition in games is expected to stimulate sustained play and learning.

Several studies have examined the effects of serious games on learning outcomes and motivation. Most reviews of these studies, however, find that they have mixed and ambiguous outcomes, largely because of methodological flaws.11,15,16,23,24 Examples of such flaws are the use of pre-to-post comparisons leading to overestimation of the game effect, internal validity threats (history, selection),15 and a lack of studies with a suitable control condition or randomized controlled design.16 In short, serious games offer promising learning tools for engaged, complex learning, but more research is needed to evaluate their efficacy in health care.

Training in emergency care skills is critical for patient safety and is an essential part of undergraduate and postgraduate medical education. Each year, more than 1.5 million health care professionals worldwide attend a variety of emergency care courses that use the internationally standardized “ABCDE” approach.25 This method prioritizes the initial resuscitation of critically ill patients; the mnemonic stands for Airway, Breathing, Circulation, Disability, and Exposure. Substantial resources are invested in teaching these complex cognitive skills,26 with the aim of integrating knowledge, numerous skills, and a professional attitude within tight time constraints.27 Considering the training challenges and tight budgets in health care organizations, more efficient training models are required. In 2010, the International Liaison Committee on Resuscitation recommended combining online self-directed training with instructor-led skills training.28

As a preparation for instructor-led emergency skills training, we developed a serious game (abcdeSIM), in which medical residents can stabilize patients in a virtual emergency department. In a previous study of fourth-year medical students, we found that abcdeSIM and text-based cases were no better than an e-module (used by the control group) at improving cognitive emergency care skills. However, the game-based group was more motivated than the text-based group.29 Because these students were inexperienced in emergency care, we suspect that working on such a limited number of simulated cases, although engaging, was too complex for them. In the current study, we investigate the effectiveness of the same game for residents (i.e., the original target group) in a blended design. Should game-based learning prove to be effective as a preparation for face-to-face training of complex acute care skills, then training time may be reduced. Once simulation games have been developed, they can be used for skills training for large numbers of trainees, with no extra costs for instructors or simulated patients, in contrast to simulation centers. Blended learning, combining e-learning with classroom training, has been shown to make emergency care training more efficient for different learning goals, while maintaining learning outcomes.25,30

The research questions of this study are:

(1) Do residents who use the abcdeSIM game in addition to a course manual as a preparation for classroom training show better emergency care skills (before training) than residents who exclusively use a course manual? (2) Are they motivated to play the game? (3) Are game-playing residents more motivated to learn the course content than residents who only used the course material? (4) Is there a difference in skills level between groups after 2 weeks of training?

Our hypothesis is that residents who use the game will show improved skills at the start of face-to-face training compared with those who exclusively used the course manual (1). We expect residents to be motivated to play the game (2) and also expect game-playing residents to be more motivated for the course content than residents who only used the course manual (3). We have no specific expectations of the skills level after 2 weeks of training (4).



Participants, Setting, and Study Design

We performed this study with second-year family-practice residents. All family-practice residents in The Netherlands are required to do a 6-month traineeship in the emergency department of a hospital; before the start, residents must complete a 2-week general emergency care course. After passing this course, they are allowed to start their traineeship under the supervision of certified attending physicians. This course covers emergency care subjects based on the ABCDE approach to emergency resuscitation. Each year, 500 to 600 Dutch family-practice residents are trained and assessed in this 2-week certified course. They are assigned to training groups of 70 to 90 residents by the academic training organization; there are no systematic differences between groups.

Most residents have limited experience with emergency care. Because the residents of a particular training group usually know each other (they follow the same courses), random assignment to the two experimental groups was not feasible; game accounts could easily have been exchanged between groups. In addition, the different resident groups are quite homogeneous in age and experience. Therefore, we used a quasi-experimental design with a historical control: residents from the December training group served as the control (reading) group and received (only) the course manual 6 weeks before the 2-week classroom training. Residents from the subsequent March and September groups served as the intervention (reading and game) group and additionally received an account for the abcdeSIM game 6 weeks before training. They were advised to first study the course manual and then play the game. The game group completed an online evaluation questionnaire after playing the game. Emergency care skills were assessed before the training and on the last training day. Participants completed a pretraining questionnaire during the first training day, and their data on the posttraining assessment were recorded.


Selection of Participants

Six weeks before the start of the face-to-face training, residents were asked to participate in the extra pretraining assessment; this took place 2 hours before the start of the face-to-face training. This assessment was not part of the regular program and had no consequences for the participants. On the first training day, all residents were additionally asked to participate in the rest of the study (questionnaires, posttraining assessment). Participants signed a consent form for both study parts; the study was approved by the Dutch ethical board for research in medical education (NVMO, Ned. Vereniging voor Medisch Onderwijs, no. 210).



Preparatory Course Manual

All residents received a course manual on emergency care skills as a preparation for classroom training. The manual contained instructional material (no cases) on the ABCDE approach and the essentials of medical emergency and trauma care. Subjects covered in the manual included impaired critical functions and disturbances of consciousness (see Table 1).


Preparatory Serious Game

The abcdeSIM simulation game provides an online-simulated emergency department, where residents can apply and exercise their emergency care skills on virtual patients.31 Its design was based on an analysis of the task demands of the ABCDE approach for stabilizing acutely ill patients. The learning objectives resulting from this analysis were the basis for the choice and development of the cases, including the feedback. The contents of the cases were validated by an expert panel. The game was designed by an experienced game design company and pilot tested with the target group before implementation. All relevant tools for the assessment and stabilization of acutely ill patients are virtually available (stethoscope, laboratory tests, infusion fluids, blood, medication). A high-fidelity mathematical model of human physiology for respiration, circulation, and neurological functioning was implemented for the virtual patients to create realism and give immediate feedback on the patients' condition. Players started with a tutorial explaining the game interface. The game contained 6 regular adult patient cases, presented as animated photos (cases in Table 1). The game was primarily aimed at training clinical emergency care skills; communication skills were not addressed in the game. Each case had to be solved within 15 minutes (a timer was presented) and could be played as often as desired. While playing, players received direct feedback on their actions through a monitor with data on the patient's condition and from the assisting virtual nurse. After playing a case, residents received a score and narrative feedback. The game score depended on how many correct decisions were made according to the ABCDE method and how efficiently this was done (fewer minutes resulted in more points). Peers' scores per case and a high-score list were presented to stimulate competition between players. The simulation game contained only cases, no additional instructions (see Fig. 1 for screenshots of the game; see document, Supplemental Digital Content 1, for a more detailed game description). The reading and game group studied the course material as a preparation for game play.


Classroom Training

The 2-week training course was aimed at training emergency care skills, using the ABCDE approach and basic and advanced life support techniques. The design and assessment of this course are comparable with Advanced Trauma Life Support and Advanced Life Support courses. The training provided a combination of lectures, scenario training in small groups with standardized patients and mannequins, and skills training. Each participant acted as the treating physician in 3 scenarios with a simulated patient and in 1 scenario with a pediatric mannequin, and assisted as a nurse in 6 other scenarios. In addition, each participant observed and critiqued peers in 15 scenarios. Each 20-minute scenario play was followed by a 20-minute feedback session. In summary, during training, residents played an active role in 10 scenarios and reviewed another 15 scenarios.


Assessment and Evaluation Instruments

Skill Assessment

Posttraining assessment was part of the regular certified emergency care course. Fifteen assessors with different medical specialty backgrounds, all qualified instructors in internationally certified emergency medicine courses and trained according to international standards (a standardized Generic Instructor Course), assessed the residents, 1 rater per candidate. Assessments consisted of a 1-case scenario test (15 minutes) with a standardized, trained simulation patient. Because of the design, the assessors could not be blinded to the condition; they may have heard about the game as a preparation for training, but they were not involved in the study, nor did they have any interest in a specific outcome. Before the assessments, raters were briefed on the scenarios and the instrument. During the assessments, a course director was available for questions. This is a high-stakes assessment for the residents; if they fail, 1 resit is offered with another scenario and 2 raters (including the course director). If residents fail again, they are not allowed to start the emergency department traineeship. Participants for the pretraining and posttraining assessments were assigned at random to different assessment scenarios. The same scenarios were used for the reading group and the reading and game group. The skill assessment scenarios were different from the scenarios in the game. The pretraining assessment followed the same procedure (1-case scenario test with a standardized simulation patient).

The assessment instrument aimed to measure the ability to assess and treat seriously ill or injured patients and consisted of a clinical competency scale (6 items on the ABCDE method and diagnostics, eg, “uses ABCDE approach on initial assessment”) and a communication competency scale (3 items on communication, eg, “communicates with patient effectively”), both rated on a 7-point scale (7 = excellent). In addition, the assessor judged the candidate on a global performance scale, using a single 10-point scale to rate “independent functioning in caring for acutely ill patients in the emergency department” (10 = perfect) (see document, Supplemental Digital Content 2, assessment instrument). The pass/fail cut point was based on the global performance scale (fail if <6). The assessment instrument was validated in a separate study; the competency scales showed good construct validity and internal consistency. The clinical competency scale and global performance scale showed moderate interrater reliability (intraclass correlation = 0.49 and 0.50, respectively); the communication competency scale had poor interrater reliability (intraclass correlation = 0.27).26 Although communication skills were not addressed in the game, we do report them, because they were part of the regular course and its assessment. These assessment results also provide insight into the competencies on which the game does and does not have an impact.


Motivation Questionnaire on the Game

After working on all cases in the game, residents who participated in the reading and game group completed an online questionnaire evaluating the game and the motivation to play. The questionnaire consisted of 9 statements, including items such as “I felt actively involved with the patient cases,” to be scored on a 5-point Likert scale (5 = fully agree) (see document, Supplemental Digital Content 3, motivation questionnaire).


Pretraining Questionnaire on Task Value, Self-efficacy, and Self-assessment

During the first training day, all participants completed a questionnaire on motivation toward the course, self-efficacy, and self-assessment. The Motivated Strategies for Learning Questionnaire has been used extensively in educational research to measure students' motivational orientations.32 We used 2 subscales: task value (9 items, eg, “I think that what I am learning in this course is useful for me to know”) and self-efficacy (9 items, eg, “I'm certain I can understand the ideas taught in this course”). All items were rated on a 7-point Likert scale (7 = very true of me). In addition, participants completed a self-assessment instrument based on the raters' assessment instrument.


Self-study Time

On the last training day (before assessment), all residents completed an evaluation form on the course, including a question on the number of hours spent preparing for the course. The questionnaire was anonymous; data on self-study time were available at the group level only.


Statistical Analysis

Reliability (Cronbach α) was calculated for the questionnaires and assessment instruments. Independent t tests were performed to compare group characteristics, assessment data, and motivation data between the reading group and the reading and game group. Unless the distribution of scores is severely skewed, data from rating scales can be analyzed as if they were interval data without introducing bias.33 Effect sizes (ESs) were calculated using Glass's δ. The practical significance of research results can be quantified as small (ES ≈ 0.20), moderate (ES ≈ 0.50), or large (ES ≈ 0.80).34 Associations between game data were calculated using the Pearson coefficient. We treated missing data with pairwise deletion and used SPSS for the statistical analyses.
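For readers unfamiliar with these two statistics, a minimal sketch in Python illustrates how they are computed; this is an illustration only (the study itself used SPSS), and the sample data below are invented.

```python
import statistics

def glass_delta(treatment, control):
    """Glass's delta: the difference in group means, scaled by the
    control group's sample standard deviation."""
    return ((statistics.mean(treatment) - statistics.mean(control))
            / statistics.stdev(control))

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.
    item_scores: one list of respondents' scores per item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(item_scores)
    item_var = sum(statistics.variance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))

# Invented example data for two groups on some rating scale.
control = [1, 2, 3, 4, 5]
treatment = [2, 3, 4, 5, 6]
print(round(glass_delta(treatment, control), 2))  # 0.63
```

With these fabricated data, the resulting δ of about 0.63 would count as a medium-large effect on the scale cited above; perfectly correlated items would yield an α of 1.0.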



Characteristics of Participants

Of 210 eligible residents (60 from the reading group, 150 from the reading and game group), 159 (76%) consented to participate in the study: 52 in the reading group and 107 in the reading and game group. A subgroup of 18 (reading) and 24 (reading and game) residents agreed to participate in the extra pretraining skill assessment. There were no differences between the reading group and the reading and game group on main characteristics, such as experience with acute patients and scores on the national test for family-practice residents (Table 2). The pretraining assessment group (n = 42) did not differ from the rest of the research participants (n = 117, data not presented).


Assessment Results Emergency Care Skills

Reliability (Cronbach α) was 0.92 for the 6-item clinical competency scale and 0.81 for the 3-item communication competency scale. Before training (after the game), the reading and game group performed better on clinical competency skills than the reading group (P = 0.03, Table 3), with a medium-large effect size (Glass's δ = 0.62). Improvements occurred particularly in the items on initial assessment (δ = 0.82), treatment (δ = 0.72), and requests for additional diagnostics (δ = 0.50). The reading and game group also showed less variability in competency levels (more homogeneity, measured as smaller standard deviations; P = 0.02). There were no differences in communication competency skills or in global performance scores between groups before training.

There was an association between assessment scores on the global performance scale with the clinical competencies scale (r = 0.74, P < 0.001) and with the communication competencies scale (r = 0.42, P = 0.006).

At the end of the 2-week training, scores on the competency and global performance scales were similar for both groups (Table 3). There was also an association between assessment scores on the global performance scale with the clinical competencies scale (r = 0.80, P < 0.001) and with the communication competencies scale (r = 0.60, P < 0.001).


Self-study Time and Game Data

Mean self-reported self-study time before the course was 9.9 hours (SD = 5.9 hours) for the reading group and 12.4 hours (SD = 5.7 hours) for the reading and game group; hence, the game group spent 2.5 hours more on self-study (P = 0.007). Mean game-playing time (logged) for the game group was 2.2 hours per resident (SD = 0.95 hours), which is consistent with the self-reported study time. The mean playing time per case (excluding the tutorial) was 17 minutes. The longest playing time was spent on the first case (internal bleeding with shock, mean = 28 minutes) and on what seemed to be the most difficult case (subarachnoid hemorrhage leading to seizures, mean = 27 minutes). Because the maximum playing time in the game was 15 minutes per case, this indicates that (on average) cases were played more than once. There was an association between playing time and game score (r = 0.49, P < 0.001), indicating a learning effect within the game.


Evaluation of the Game

Reliability (Cronbach α) of the 9-item game motivation scale was 0.80 (n = 90). The mean score on the 5-point scale was 3.9 (SD = 0.41). Items scoring above this mean were “I felt actively involved with the cases” (mean = 4.2, SD = 0.53), “My attention was completely drawn to the cases” (mean = 4.2, SD = 0.63), and “I liked playing the game” (mean = 4.2, SD = 0.69). Scoring below the mean was “I regularly felt stressed during playing the cases” (mean = 3.3, SD = 0.87) (see document, Supplemental Digital Content 3, for detailed evaluation data). Responses from residents indicating strong points of the game included the following: “it's a very good way to develop experience with this approach, as a preparation for seeing patients”; “it feels very ‘real’”; “it is very instructive”; and “I could feel the stress.” Points for improvement of the game were: “it took me some time to figure out how to select and put back the tools in the game”; “I would have liked more feedback on my actions”; and “I would like more information on how scores are composed and may be improved.” A few remarks were made on limitations in game play because of a slow computer or slow internet access.


Task Value, Self-efficacy, and Self-assessment

Reliability (α) of the task value scale was 0.76 (n = 159); for the self-efficacy scale, α was 0.83 (n = 159); and for the self-assessment scales on clinical and communication competencies, α was 0.88 and 0.71 (n = 150), respectively. Both groups showed the same scores on task value (motivation for the course; mean = 6.2/6.2, SD = 0.43/0.39 for the reading and the reading and game group, respectively, on the 7-point scale) and on self-efficacy before training (mean = 4.8/4.8; SD = 0.66/0.61, respectively). The self-assessment scores on clinical competencies before training were higher for the reading and game group (n = 104, mean = 4.7, SD = 0.54) than for the reading group (n = 46, mean = 4.4, SD = 0.74; P = 0.01). There were no differences in self-assessed communication competencies or global performance before training. Compared with the assessors' ratings, both groups overestimated their skills.



This study compared the cognitive skills and motivation of residents who only used a standard course manual as a preparation for classroom training on emergency care (the “reading group”) with residents who used an additional serious game (the “reading and game group”). In terms of our research questions, we found that residents who combined a course manual and a simulation game as a preparation for classroom training improved their emergency care skills, compared with residents who only used a course manual (1), in line with our hypothesis. The game had no effect on communication competencies or on global performance in caring for acutely ill patients. As expected, residents were motivated to use the game (2). Contrary to our expectation, motivation for the training content before training did not differ between groups (3). After 2 weeks of face-to-face training, there was no longer a difference between groups in skills level (4).

With limited extra self-study time (+2.5 hours, compared with 10 hours spent on the course manual by both groups), the serious game had a positive effect on clinical competencies at the start of the training, with a moderate-large effect size, and also reduced the variability in clinical performance. For teachers, a more homogeneous student group is easier to train. The same communication competency levels were measured in both groups, which can be explained by the fact that the game did not address these competencies. Nor was there an effect of the game on global performance. Apparently, this more holistic, perhaps personal, notion of global performance captures more for assessors than clinical competencies alone. That assessors do not perceive the two as identical was supported by the only partial correlation between the global performance and clinical competency scores. Moreover, the self-assessment of the reading and game group improved only for the clinical competencies (not for global performance), compared with the reading group.

Anecdotal reports from teachers indicated that the extended (game-based) training preparation had a positive impact on the level of the training. Previously, the group had shown considerable heterogeneity; some trainees had experience with scenario-based ABCDE training, whereas most started with only the knowledge from the course manual.

Our study essentially compared reading with reading combined with game play. Because the reading and game group spent more time on self-study, it remains unclear how important the game characteristics themselves were for skills learning. The simulated cases, being at the center of the game, provided an opportunity to practice the skill with a variety of patient problems. The finding that this had a positive impact on residents' performance is consistent with research on task-centered learning, showing that learning with a variety of real-world tasks facilitates skills development and transfer to clinical practice.18,27 In addition, reviews on technology-enhanced simulation in health care show that, in comparison with no intervention, simulation programs are associated with large effects on knowledge and skills.35,36 Future design-based research that controls for time on task should determine which game features enhance performance and motivation. These may include, for example, the narrative line, the scoring system, multiple sources of feedback,37 and animated cases.38

Considering the issue of games and motivation, we found that residents were motivated to learn with the game and felt actively involved and immersed in the game cases (as illustrated by remarks such as "I could feel the stress" and "It's a very good way to develop experience with this approach"). This was supported by the fact that they played the patient cases several times and spent an extra 2.5 hours of self-study time on the game. In a previous study with the same game, used by medical students, the group working on the game felt more motivated than a group working on the same cases in a text-based format.29 These are important results, because in self-guided, online training programs, motivated trainees will put more effort into learning. It also suggests that the simulation game could be used as a skills maintenance tool after training. How do games motivate learners to spend extra time on task? Choice and the opportunity for self-directed learning seem to enhance intrinsic motivation, because they afford a greater sense of autonomy.21 The opportunities for self-directed learning and interest in the subject of the game probably created intrinsic motivation. More research is needed on the specific features that make games engaging for learning, compared with simulations and interactive cases.

Residents who used the game as additional preparation were just as motivated for the course content as residents who used only the course material. A possible explanation is the high level of motivation for the course among all residents, as shown by the high task value scores. When residents know they are going to need certain skills in practice, they are usually quite motivated for the course. Our results show that motivation to engage with an instructional format should be distinguished from motivation to learn a specific task.

Both groups had the same (general) self-efficacy levels, but the reading and game group correctly self-assessed their clinical skills as superior to those of the reading group (although both groups slightly overestimated their skills relative to the assessors' ratings). Apparently, the game does not easily change residents' general sense of self-efficacy during the course, whereas it did make them more aware of their improved emergency skills.

After 2 weeks of training, we no longer found a positive effect of the preparatory game. An obvious explanation is learning time: the effect of the 2.5 hours of game play was overshadowed by the 2-week classroom training. In addition, in terms of the number of different scenarios discussed, classroom training offered many more opportunities for learning than the game (25 vs. 6 cases).39 Given the relatively high assessment scores at the end of training, there was probably also a ceiling effect in residents' competencies after the 2-week training.

Given the higher starting level of clinical competencies, a relevant question is whether classroom training can be shortened in combination with the game while maintaining learning outcomes. This would make the blended training design more cost-effective, because online games are scalable to large numbers of health care professionals without extra costs (in contrast to simulation centers). A study on Advanced Life Support training, which compared assessment results at the end of a conventional 2-day course with those of a 1-day course supplemented with online interactive simulation cases, showed similar knowledge and skills.25 Instead of the course manual, we currently use an e-module, including exercises on the emergency approach and a demonstration video with a simulated patient. More worked cases of patient problems could be added to this e-module. To further enhance the preparatory skills level of residents, a number of cases could be moved from the training to the game. This would enable residents to practice with a larger variety of virtual patients and focus on their personal learning needs. We know that for training complex cognitive skills, offering a high variety of learning tasks is important to allow transfer to new tasks.39 Training could possibly be shortened, reducing direct and indirect costs. Future research should confirm the effectiveness of this new blended design.

In the current study, we found a positive effect of the game on residents' emergency care skills. In a previous study with medical students, however, the game group did not benefit from this game, compared with a group working on text-based cases or a control group working on only an e-module.29 This indicates an "expertise reversal effect," in which a rich learning environment benefits experts but is ineffective or even counterproductive for novice learners.40 The effects of game design choices at different levels of user expertise are an interesting field for further study.

One limitation of our study is that the groups were not randomized, so confounders may have played a role. However, our research groups were very homogeneous (second-year family-practice residents), and we believe we measured the most important potential confounders, such as experience with acute patients and general knowledge on national tests, and found no differences between groups.

Another limitation is the relatively small number of participants in the pretraining assessment (n = 42). Although the number of residents was limited, they did not differ from the total group on important characteristics and may thus be considered representative of the groups they came from. In addition, self-assessment data from the total group (n = 150) support the measured improvement in clinical skills in the smaller group. Despite the small sample, we found practically significant differences in skills between groups. Nevertheless, the small numbers limited our possibilities to analyze the relationships between game time, performance, and motivation in more detail. It would be interesting to replicate the study with larger groups of residents and a shorter (blended) training design.

Third, we assessed residents' competencies in a single patient scenario. Extensive evidence in assessment research shows that content specificity is the main cause of unreliability and outweighs other sources of bias.41 However, we analyzed an internationally representative emergency care assessment situation in which single-scenario assessment is commonly used,42,43 and our assessment instruments were validated in a separate study.26 Furthermore, we did not assess effects on patient outcomes. Few emergency care courses have had patient outcomes as an end point,44 but it would be worthwhile to investigate the transfer to clinical practice.

Finally, this study cannot resolve whether the learning time or the game format was responsible for the improved skills, because the game group spent more time on task. Many studies on serious games discuss elements proposed as important for motivation and learning, but relatively few provide empirical evidence.45–48 Comparative design-based research will have to show which features of simulation cases enhance learning outcomes.49,50 This "value-added" research approach (Mayer) will be an important next step in the young field of game-based learning.51

In summary, this study showed that serious games can be used as an effective, motivating preparation for instructor-led emergency care courses that teach medical residents clinical competencies. Learning by doing, with a variety of realistic, interactive patient cases, through error and without harming patients, is an important potential benefit of games and simulation programs. Future research is needed to show how this effect can be sustained and whether training time can be reduced with online simulated cases while maintaining learning outcomes. More research is also needed on which game features are critical for engaged skills learning.



The authors thank the training institution for family practitioners (Stichting SBOH, The Netherlands) for facilitating this research study.



1. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of E-learning in medical education. Acad Med 2006;81(3):207–212.
2. Bates T. Managing Technological Change. San Francisco, CA: Jossey-Bass; 2000.
3. Issenberg SB, McGaghie WC, Petrusa ER, Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27(4):10–28.
4. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010;44(1):50–63.
5. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011;306(9):978–988.
6. Kalkman CJ. Serious play in the virtual world: can we use games to train young doctors? J Grad Med Educ 2012;4(1):11–13.
7. Kneebone R. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med 2005;80(6):549–553.
8. Jalink MB, Goris J, Heineman E, Pierie JP, ten Cate Hoedemaker HO. Construct and concurrent validity of a Nintendo Wii video game made for training basic laparoscopic skills. Surg Endosc 2014;28(2):537–542.
9. Kiili K. Digital game-based learning: towards an experiential gaming model. Internet High Educ 2005;8:13–24.
10. van Staalduinen JP, de Freitas S. A first step towards integrating educational theory and game design. In: Felicia P, ed. Research on Improving Learning and Motivation Through Educational Games: Multidisciplinary Approaches. Hershey, PA: IGI Global; 2010:28.
11. Akl EA, Pretorius RW, Sackett K, et al. The effect of educational games on medical students' learning outcomes: a systematic review: BEME Guide No 14. Med Teach 2010;32(1):16–27.
12. Clark RC, Mayer RE. E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning. San Francisco: Pfeiffer; 2008:476.
13. Huang WH, Huang WY, Tschopp J. Sustaining iterative game playing processes in DGBL: the relationship between motivational processing and outcome processing. Comput Educ 2010;55(2):789–797.
14. Gee J. What Video Games Have to Teach Us About Learning and Literacy. New York: Palgrave Macmillan; 2003.
15. Sitzmann T. A meta-analytic examination of the instructional effectiveness of computer-based simulation games. Pers Psychol 2011;64(2):489–528.
16. Connolly TM, Boyle EA, MacArthur E, Hainey T, Boyle JM. A systematic literature review of empirical evidence on computer games and serious games. Comput Educ 2012;59(2):661–686.
17. Aldrich C. Learning by Doing. San Francisco, CA: Pfeiffer; 2005.
18. Francom GM, Gardner J. What is task-centered learning? TechTrends 2014;58(5):27.
19. Allery LA. Educational games and structured experiences. Med Teach 2004;26(6):504–505.
20. Prensky M. The Digital Game-Based Learning Revolution. New York: McGraw Hill; 2001:1–19.
21. Ryan RM, Deci EL. Intrinsic and extrinsic motivations: classic definitions and new directions. Contemp Educ Psychol 2000;25(1):54–67.
22. Malone TW. Towards a theory of intrinsically motivating instruction. Cogn Sci 1981;4:333–369.
23. Graafland M, Schraagen JM, Schijven MP. Systematic review of serious games for medical education and surgical skills training. Br J Surg 2012;99(10):1322–1330.
24. Vogel J, Vogel D, Canon-Bowers J, Bowers C, Muse K, Wright M. Computer gaming and interactive simulations for learning: a meta-analysis. J Educ Comput Res 2006;34:229–243.
25. Perkins GD, Kimani PK, Bullock I, et al. Improving the efficiency of advanced life support training: a randomized, controlled trial. Ann Intern Med 2012;157(1):19–28.
26. Dankbaar ME, Stegers-Jager KM, Baarveld F, et al. Assessing the assessment in emergency care training. PLoS One 2014;9(12):e114663.
27. Van Merrienboer JJ, Kirschner PA. Ten Steps to Complex Learning. A Systematic Approach to Four-Component Instructional Design. 2nd edn. London: Routledge; 2012.
28. Mancini ME, Soar J, Bhanji F, et al. Part 12: Education, implementation, and teams: 2010 International Consensus on Cardiopulmonary Resuscitation and Emergency Cardiovascular Care Science With Treatment Recommendations. Circulation 2010;122(16 Suppl 2):S539–S581.
29. Dankbaar M, Alsma J, Jansen E, van Merrienboer J, van Saase L, Schuit S. An experimental study on the effects of a simulation game on students' clinical cognitive skills and motivation. Adv Health Sci Educ 2015;21:505–521.
30. Dankbaar ME, Storm DJ, Teeuwen IC, Schuit SC. A blended design in acute care training: similar learning results, less training costs compared with a traditional format. Perspect Med Educ 2014;3(4):289–299.
31. Den Blijker J. Een game die levens redt [A game that saves lives]. Trouw (The Netherlands). June 29, 2012.
32. Pintrich PR, de Groot E. Motivational and self-regulated learning components of classroom academic performance. J Educ Psychol 1990;82(1):33–40.
33. Streiner D, Norman G. Health Measurement Scales: A Practical Guide to Their Development and Use. 4th ed. Oxford: Oxford University Press; 2008.
34. Hojat M, Xu G. A visitor's guide to effect sizes: statistical significance versus practical (clinical) importance of research findings. Adv Health Sci Educ Theory Pract 2004;9(3):241–249.
35. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Acad Med 2013;88(6):872–883.
36. Ilgen JS, Sherbino J, Cook DA. Technology-enhanced simulation in emergency medicine: a systematic review and meta-analysis. Acad Emerg Med 2013;20(2):117–127.
37. Squire KD. Games, learning, and society: building a field. Educ Technol. 2007;4:51–54.
38. Kamin C, O'Sullivan P, Deterding R, Younger M. A comparison of critical thinking in groups of third-year medical students in text, video, and virtual PBL case modalities. Acad Med 2003;78:204–211.
39. Kulasegaram K, Min C, Ames K, Howey E, Neville A, Norman G. The effect of conceptual and contextual familiarity on transfer performance. Adv Health Sci Educ Theory Pract 2012;17(4):489–499.
40. Kalyuga S, Ayres PL, Chandler P, Sweller J. The expertise reversal effect. Educ Psychol 2003;38(1):23–31.
41. Swanson DB, van der Vleuten CP. Assessment of clinical skills with standardized patients: state of the art revisited. Teach Learn Med 2013;25(Suppl 1):S17–S25.
42. Preston JL, Currey J, Eastwood GM. Assessing advanced life support (ALS) competence: Victorian practices. Aust Crit Care 2009;22(4):164–171.
43. Ringsted C, Lippert F, Hesselfeldt R, et al. Assessment of Advanced Life Support competence when combining different test methods—reliability and validity. Resuscitation 2007;75(1):153–160.
44. Teteris E, Fraser K, Wright B, McLaughlin K. Does training learners on simulators benefit real patients? Adv Health Sci Educ Theory Pract 2012;17(1):137–144.
45. Garris R, Ahlers R, Driskell JE. Games, motivation, and learning: a research and practice model. Simul Gaming 2002;33(4):441–467.
46. Huang WH. Evaluating learners' motivational and cognitive processing in an online game-based learning environment. Comput Human Behav 2011;27(2):694–704.
47. Bedwell WL, Pavlas D, Heyne K, Lazzara EH, Salas E. Toward a taxonomy linking game attributes to learning: an empirical study. Simul Gaming 2012;43(6):729–760.
48. Young MF, Slota S, Cutter AB, et al. Our princess is in another castle: a review of trends in serious gaming for education. Rev Educ Res 2012;82(1):61–89.
49. Norman G. Simulation comes of age. Adv Health Sci Educ Theory Pract 2014;19(2):143–146.
50. Wouters P, van Oostendorp H. A meta-analytic review of the role of instructional support in game-based learning. Comput Educ 2013;60(1):412–425.
51. Mayer RE. Computer Games for Learning: An Evidence-Based Approach. Cambridge, MA: MIT Press; 2014:264.

Keywords: serious game; game-based simulation; emergency care training; clinical skills training; motivation

Supplemental Digital Content

© 2017 Society for Simulation in Healthcare