Before certification for independent practice, internal medicine residents must demonstrate effective and appropriate use of advanced cardiac life support (ACLS) skills including the recognition and management of various life-threatening conditions, such as cardiac arrest.1 Residents acquire these skills through ACLS provider courses and by participating as a member of a team that responds to relevant clinical events. Unfortunately, physician knowledge and skill retention after ACLS provider courses may not be optimal,2 with decay occurring as early as 3 to 6 months after training.3,4 In addition, the frequency of in-hospital events requiring the use of ACLS skills has decreased.5 These waning opportunities likely leave residents unprepared to become competent ACLS response team leaders,6 and many internal medicine residents are not able to effectively demonstrate the use of ACLS skills at the completion of their residency training.7
Previous studies have shown that mastery learning is an effective instructional method8 and can be applied to developing residents’ ACLS skills.9 Mastery learning is a focused approach to competency-based education that has been defined as requiring the following: (1) assessments for which a minimum passing standard has been established, (2) alignment between the passing standard and learning objectives, (3) baseline assessment, (4) education aimed at fulfilling the learning objectives, (5) reassessment after instruction, (6) permitting a “pass” only after the standard is achieved, and (7) continued practice if the standard is not achieved.10–12 The time needed to achieve the mastery standard will vary among learners; each will have his or her own “learning curve.”10
Simulation-based education (SBE) can be used to enable residents to engage in mastery learning because it allows for repetitive practice with feedback and does not risk harm to patients.13 A recent meta-analysis demonstrated that although mastery learning using SBE is superior to nonmastery instruction, it also takes more time.11 Increased time often translates to increased cost, a major barrier to implementing an SBE mastery learning program. Costs include not only the initial costs of acquiring the simulators but also, often more significantly, the recurring costs to pay for facility operation, administrators, technicians, and faculty instructors.14,15
Instructors’ time commitments might be reduced if their role were shifted from direct supervisor to instructional designer. One potential way of improving the cost effectiveness of SBE is by designing effective experiences in which residents and others can learn independently. To help address the issue of instructor cost, organizations such as the American Heart Association and American Academy of Pediatrics have endorsed the use of independent learning using computerized programs as part of their resuscitation training programs.16,17
Because simulation centers have become commonplace, residents have increased opportunities to use these environments for self-regulated learning (SRL). In SRL, learners are active agents who make study decisions and manage their cognition and motivation to achieve identified learning goals.18,19 Recent evidence suggests that because trainees self-regulate their learning both with and without direct instructor supervision, optimal SRL requires explicit and evidence-based scaffolding and other supports, especially early in skill acquisition. A recent systematic review of the SBE literature has shown that the “usual” approach to instructor-led SBE training does not provide adequate supports for trainees’ SRL.20 Indeed, a recent comparative trial showed that unsupervised learners who had explicit training supports available to them better maintained their long-term educational outcomes, as compared with an instructor-led group.21 Allowing learners full control over the educational process is not without risk, however, as research has shown that self-regulated learners can develop incorrect beliefs about learning or “bad habits” that may lead to suboptimal learning.22
Rather than relying on learners to regulate the entire learning experience without guidance, the concept of directed SRL (DSRL) suggests that an instructor can apply educational principles to design and structure the SRL training experience.18 Within this structured context, learners are supported in their efforts to regulate relevant aspects of their learning (eg, how to set effective task-specific goals). Thus, even in unsupervised settings, learners can benefit from the scaffolding necessary to support their learning, while maintaining enough independence to be motivated to learn.18 The effectiveness of DSRL in teaching procedural skills such as lumbar puncture and suturing has previously been demonstrated but has not yet been studied or optimized for nonprocedural skills.21,23
To date, we are not aware of any study that examines the efficacy of applying a DSRL mastery learning model to SBE of residents acquiring ACLS skills. In addition to the potential educational benefits of DSRL, its use in a mastery learning curriculum may enhance cost effectiveness of this educational approach. Hence, we compared the educational and cost effectiveness of DSRL with instructor-regulated learning (IRL), using a simulation-based mastery learning model. We hypothesized that residents in the DSRL mastery learning intervention would achieve the same level of performance as those in the IRL intervention and that DSRL would be less expensive.
The study was conducted at the University of Toronto and used the SimMan patient simulator (Laerdal Medical, Stavanger, Norway). The simulator displays physiologic parameters, which change in response to interventions applied by participants, and can be preprogrammed to respond automatically in real time to interventions such as defibrillation.
Eligible study participants were all postgraduate year 1 (PGY1) internal medicine residents at the University of Toronto (N = 64). These PGY1 residents participate as members of the ACLS response team and will become the response team leaders in their PGY2 year. All residents are required to maintain their ACLS provider status and generally complete an ACLS provider course at the beginning of their residency.
We conducted the study as part of a mandatory PGY1 education session. Those who chose to participate in the research protocol were randomized and received either the IRL or DSRL educational intervention, including a pretest and posttest. All residents who volunteered for the study provided informed consent and were entered in a raffle for a $350 gift certificate. This study was approved by the University of Toronto Health Sciences Research Ethics Board.
We conducted this randomized controlled trial of a simulation-based mastery learning curriculum between January and June 2013. The study procedure is outlined in Figure 1. Using a computer-based algorithm, participants were randomly assigned, in blocks of four, to the DSRL or IRL intervention. At the beginning of the study, each group watched a video orienting them to the simulator and to the educational session. Each participant then completed a baseline performance test (pretest) on a scenario requiring the application of ACLS skills.
The assessment instrument (Appendix A, http://links.lww.com/SIH/A221) and minimum passing mastery score were provided to all residents after they had completed the pretest. The residents then completed an educational intervention based on the intervention arm to which they were randomized. After the completion of the intervention, we conducted a posttest of each participant’s ACLS skills and requested participants to complete a course evaluation survey. The residents were also asked to complete a retention test 5 months later, at the end of their PGY1 year.
The intervention employed a mastery learning model using simulation to teach groups of 4 residents ACLS skills. The intervention was not an ACLS provider course but was designed to refresh, reinforce, and build on the ACLS skills the residents had learned when previously completing the ACLS provider course. Based on scenarios created by Wayne et al,24 4 standardized case scenarios for each of the 4 pulseless ACLS conditions (ventricular fibrillation, pulseless ventricular tachycardia, asystole, and pulseless electrical activity) were developed jointly by 2 ACLS instructors to reflect clinical situations commonly encountered. These 16 scenarios were used for the pretest, posttest, and retention test and for simulated practice of ACLS skills.
Within each group of 4 residents, one of the 16 scenarios was randomly selected for the first resident to perform as a team leader during the pretest. The scenarios for the subsequent pretests, practice scenarios, and posttests were then randomly selected from the remaining scenarios. During the pretest, practice scenarios, and posttests, 1 resident in each group was randomly assigned to lead the resuscitation efforts for each scenario, whereas the other residents participated as members of the resuscitation team.
In both the DSRL and IRL interventions, each resident in a group sequentially rotated through the role of team leader. Groups were encouraged to ensure that each resident received the opportunity to lead during at least 2 practice scenarios, although the actual time spent on each practice scenario and debriefing was left to the discretion of the groups in the DSRL intervention and to the instructor in the IRL intervention.
During the pretest, practice scenarios, posttest, and retention test, group members were permitted to have access to written ACLS algorithms but not to the assessment instrument. Access to the assessment instrument and the minimum passing mastery score was provided to all participants during each debriefing. During the pretest, practice scenarios, posttest, and retention test, the team leader directed the resuscitation and made the management decisions. During the practice scenarios, but not the tests, the leader was allowed to request ideas from team members regarding the resuscitation efforts. This interaction between team members was permitted because this is an important component of responding to an ACLS event.25 The session lasted for 3.5 hours, with 2 hours dedicated to practice scenarios. Retention tests were performed 5 months later, using the same scenario that the resident had completed as the leader during their posttest.
The DSRL Intervention
ACLS skills in the hospital context are not performed individually or in isolation; instead, the ACLS response team leader is responsible for coordinating and directing the team’s activities.25 To correspond to this reality, we proposed that the concept of DSRL could be expanded to include groups of learners with a shared identity and similar set of goals, who participate in a team-based response to ACLS situations.
After the completion of all pretests and after each practice scenario, the groups participated in peer-regulated, self-regulated, and structured debriefing using a debriefing template (Appendix B, http://links.lww.com/SIH/A222), which outlined the GAS (gather, analyze, summarize) debriefing method endorsed by the American Heart Association.26 The resident who led the resuscitation also led the debriefing within their group. During the debriefing, the residents had the opportunity to seek feedback, review ACLS algorithms, review case-specific tips and teaching points developed in advance by an ACLS instructor, access any relevant online materials, and/or watch a provided video of an expertly conducted resuscitation. Groups were encouraged to use the assessment instrument during debriefing to identify performance gaps and develop a plan to address them. Debriefing occurred after all pretests had been completed but not after individual pretests or posttests. Residents had not previously received training in debriefing during residency, and most of their exposure to debriefing would have occurred during their previous ACLS course experience.
The IRL Intervention
All instructors were ACLS provider course instructors with current Heart and Stroke Foundation of Canada instructor cards. The instructors were given 45 minutes to familiarize themselves with the scenarios and assessment instruments. In the IRL intervention, the same procedures were followed as in the DSRL intervention, with the exception of the debriefing. After completion of all pretests and after each practice scenario, the instructor debriefed the group in the manner they deemed most appropriate to help the group achieve the mastery standard (ie, they did not receive specific instructions on whether and how to help trainees self-regulate their learning). Similar to the DSRL intervention, there was no debriefing during or after each posttest. The instructors and participants in the IRL intervention had access to all of the same educational and debriefing resources as in the DSRL intervention and all were familiar with, and regularly used, the GAS debriefing method. Instructors were encouraged to use the assessment instrument to provide information to their group regarding performance gaps and to develop strategies for improvement.
Residents were asked to provide baseline demographic data and information regarding previous exposure to situations requiring the use of ACLS skills. The pretest, posttest, and retention test were scored using assessment instruments (Appendix A) that were checklists modified from those initially developed by Wayne et al.24 The checklists were modified by consensus between 2 ACLS instructors, including a Medical Director for a large center that delivers formal ACLS provider courses, to reflect changes in the 2010 ACLS guidelines.27 Modifications included adding items to each checklist to ensure that high-quality cardiopulmonary resuscitation (CPR) was performed and that CPR was resumed immediately after defibrillation attempts, and removing atropine administration from the asystole and pulseless electrical activity checklists. The checklists focus on the knowledge and application of resuscitation algorithms by the team leader and use a dichotomous scoring scale, with a score of 0 if the item is not done or done incorrectly and a score of 1 if the item is completed correctly. Each checklist contained a different number of items (range, 24–29); therefore, scores were reported as percent correct. Rigorous procedures to determine the minimum passing mastery score for each checklist have been previously performed.28 These standards were used in this study, with a score of greater than 77% required for pulseless electrical activity scenarios and a score of greater than 75% required for all other scenarios.
All scenarios were video recorded and later scored by a trained ACLS instructor. A random sample of 25% of the recorded scenarios was scored by a second trained ACLS instructor. Raters were blinded to intervention allocation. The primary rater was blinded to pretest and posttest status but, due to practical limitations, was not blinded to retention test status. The second rater was blinded to pretest, posttest, and retention test status. Before rating, the raters reviewed the checklists and discussed what actions would be considered as “correct” for each checklist item.
Upon determining that participants’ scores were not statistically significantly dependent [see intraclass correlation coefficient (ICC) results below], we analyzed all posttest performance data using a univariate analysis of covariance (ANCOVA), with pretest scores as the covariate. In addition, we analyzed the performance data for the residents who also completed a retention test using a 2-group by 2-test (posttest and retention test) mixed effects ANCOVA, with pretest scores as the covariate. Post hoc comparisons to assess for changes between posttest and retention test and to examine for differences between the interventions were planned. We calculated the ICC to measure interrater reliability for the 25% sample rated by 2 instructors. For continuous variables, we used independent samples t tests to compare demographic information, previous exposure to real-life ACLS situations, pretest scores, and postcourse evaluations between intervention arms. For categorical variables, we used χ2 tests to compare baseline data. We also recorded all costs incurred in preparing and delivering the interventions. We specified the α level at P < 0.05 and conducted all statistical tests using SPSS version 19 (SPSS Inc, Chicago, Ill).
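The logic of an ANCOVA with pretest scores as the covariate can be sketched as a regression of posttest scores on the pretest plus a group indicator, followed by a t test on the group coefficient. The sketch below uses synthetic data; all variable names and numeric values are illustrative assumptions, not the study's data or analysis code.

```python
import numpy as np
from scipy import stats

# Synthetic data: pretest scores, a group indicator, and posttest scores
# simulated with NO true group effect (purely illustrative values).
rng = np.random.default_rng(1)
n = 40
pre = rng.normal(71, 6, n)                    # pretest scores (covariate)
grp = np.repeat([0.0, 1.0], n // 2)           # 0 = IRL, 1 = DSRL
post = 50 + 0.6 * pre + rng.normal(0, 4, n)   # posttest scores

# Design matrix: intercept, covariate, group indicator.
X = np.column_stack([np.ones(n), pre, grp])
beta, rss, _, _ = np.linalg.lstsq(X, post, rcond=None)

dof = n - X.shape[1]
sigma2 = rss[0] / dof                          # residual variance estimate
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X).diagonal())
t_group = beta[2] / se[2]                      # covariate-adjusted group difference
p_group = 2 * stats.t.sf(abs(t_group), dof)
print(beta[2], p_group)
```

Because the covariate absorbs baseline variation, the test on the group coefficient is more sensitive than a plain t test on posttest scores.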
Given that we assessed participants’ performance in groups, each participant’s scores may not have been independent from each other. For instance, observing the first team leader may remind others of the components of the ACLS protocols, allowing them to perform better than if they had led the team first. If this resulted in each participant’s performance scores not being independent, then this would violate the assumptions of an ANCOVA. Thus, to demonstrate that there was no significant dependence of scores within groups, we calculated the ICC using the ANOVA estimator method for each test (ie, pretest, posttest, and retention test).29 To further analyze the effect of order of being team leader on performance, we conducted three 1-way ANOVAs (ie, one for each test) with each participant’s sequence of being team leader (first through fourth) as the independent variable.
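The one-way ANOVA estimator of the ICC mentioned above can be computed from the between-group and within-group mean squares. The function below is an illustrative re-implementation of that standard estimator, not the study's analysis code; the example scores are hypothetical.

```python
import numpy as np

def icc_anova(groups):
    """One-way ANOVA estimator of the intraclass correlation, ICC(1).

    `groups` is a list of equal-sized score arrays, one per team.
    ICC(1) = (MSB - MSW) / (MSB + (k - 1) * MSW), where k is the team size.
    Values near zero (or negative) suggest scores within teams are
    effectively independent.
    """
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups[0])                   # members per team (4 in the study)
    n = len(groups)                      # number of teams
    grand = np.concatenate(groups).mean()
    # Between-team and within-team mean squares
    msb = k * sum((g.mean() - grand) ** 2 for g in groups) / (n - 1)
    msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Perfect within-team agreement yields ICC(1) = 1.
print(icc_anova([[80, 80, 80, 80], [90, 90, 90, 90]]))
```

Note that this estimator can legitimately return negative values when within-team variability exceeds between-team variability, which is one way an ICC such as −0.31 can arise.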
Initially, we analyzed the data testing for superiority. Given that we developed the assessment tools specifically for this study, we could not perform an a priori power calculation. After collecting the data, we determined that analyzing the data to test for equivalence would be more appropriate given our hypothesis. Therefore, we used our data to perform an a posteriori sample size calculation for equivalence testing using a specified educationally significant difference of 6.9% to 8.3% (representing 2 points on the checklist, depending on the scenario), power of 0.80, and an α of 0.05.
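One common normal-approximation formula for the per-group sample size of a two one-sided tests (TOST) equivalence trial, assuming the true difference is zero, is n = 2·sd²·(z₁₋α + z₁₋β/₂)²/δ². The sketch below uses that textbook formula with hypothetical inputs (the sd and margin values are illustrative assumptions, not the study's data).

```python
import math
from scipy import stats

def n_per_group_equivalence(sd, delta, alpha=0.05, power=0.80):
    """Approximate sample size per group for a TOST equivalence trial.

    Assumes a true between-group difference of zero, a common standard
    deviation `sd`, and an equivalence margin `delta` (same units as sd).
    """
    beta = 1.0 - power
    z = stats.norm.ppf(1.0 - alpha) + stats.norm.ppf(1.0 - beta / 2.0)
    return math.ceil(2.0 * sd ** 2 * z ** 2 / delta ** 2)

# Hypothetical example: sd of 5 checklist percentage points,
# equivalence margin of 7 percentage points.
print(n_per_group_equivalence(5.0, 7.0))  # → 9
```

As the formula shows, the required n grows with the square of sd/δ, which is why tight equivalence margins demand much larger samples.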
In the IRL intervention, 20 residents completed the precourse survey and 15 completed the course evaluation survey, whereas in the DSRL intervention, 19 residents completed the precourse survey and 15 completed the course evaluation survey. Twenty residents were unable to attend the initial session because of vacation or duty hour limitations (they were “postcall”), and data from 1 group were not obtained because 1 resident in the group did not provide consent to participate. Demographic and baseline data for each intervention arm are presented in Table 1. The 2 intervention arms did not differ significantly in measured baseline demographics or in previous simulation-based and “real-life” ACLS experience. There were no significant differences in measured demographics between the residents who completed the retention test and those who did not, nor were there differences between the residents in the DSRL and IRL interventions who completed the retention test.
According to the ANOVA estimator method, the ICC for each separate test was 0.16 (pretest), 0.01 (posttest), and −0.31 (retention test). Although these ICC values are low (less than 0.20) or negative, suggesting limited clustering, the anomalous value of −0.31 on the retention test means the data do not completely rule out dependence among participants’ scores. However, according to the three 1-way ANOVAs we conducted, we found no statistically significant influence of team-leader order on participants’ scores on the pretest [F(3,36) = 0.67, P = 0.58], posttest [F(3,36) = 1.01, P = 0.40], or retention test [F(3,16) = 0.67, P = 0.58]. Cumulatively, these analyses suggest that we can proceed with the assumption that participants’ scores are sufficiently independent for our remaining analyses.
Our power calculation suggested that we required 7 to 11 residents in each intervention arm to conduct an equivalence trial. Surpassing this sample size, in each intervention arm, 20 residents completed the pretest and posttest, whereas 10 completed the retention test.
The ACLS Performance Data
The ICC for the instructor ratings of videotaped performances was 0.84 for single measures, with an averaged measure ICC of 0.91, indicating strong interrater reliability and that ratings from a single instructor were sufficiently reliable for analysis.30,31
All scores are reported as mean (SE). At pretest, there was no difference between the mean score in the IRL intervention [71.46 (1.96%)] and the DSRL intervention [71.23 (1.31%), P = 0.92]. At pretest, 4 participants in the IRL intervention and 5 in the DSRL intervention had surpassed the mastery standard. Notably, all residents exceeded the mastery standard on the posttest. Finally, 9 of 10 participants in each intervention arm continued to surpass the standard on the retention test.
According to the univariate ANCOVA of all participants’ posttest scores, the IRL intervention [93.11 (1.26%)] and the DSRL intervention [92.98 (1.12%)] did not differ significantly [F(1,37) = 0.01, P = 0.94].
Figure 2 depicts the results for the mixed effects ANCOVA including the data for participants who also completed the retention test (n = 20). According to the ANCOVA test of within-subjects effects, we found no main effect for test [F(1,17) = 1.34, P = 0.26]; however, post hoc analysis did show a significant decline from posttest [93.03 (1.27%)] to retention test [82.43 (1.48%), P < 0.001]. We found no significant group difference between the IRL [86.23 (1.76%)] and DSRL [89.23 (1.76%)] interventions [F(1,17) = 1.43, P = 0.25] and no significant interaction between test and intervention arm [F(1,17) = 0.82, P = 0.30].
The overall costs (Canadian dollars) associated with developing and implementing the IRL and DSRL interventions are outlined in Table 2. The cost of simulation laboratory rental, including the use of the necessary simulators, monitors, defibrillators, airway equipment, and disposables (intravenous [IV] lines, defibrillation pads, etc), was identical in each intervention. One-time costs were higher in the DSRL intervention because the materials to support and direct the residents had to be created. Although all materials were available to both interventions, many items were prepared specifically for the DSRL intervention, including the debriefing tool, scenario-specific tips and teaching points, and videos outlining the session, demonstrating the use of the simulator, and showing an example of a well-conducted resuscitation; therefore, these items were only included in the cost of the DSRL intervention. It took 10 hours of instructor time to prepare these materials and to preprogram the simulation scenarios, at a cost of $1200. Instructors in the IRL intervention received 45 minutes of preparation time to familiarize themselves with the simulation scenarios and assessment instruments, at a total cost of $375. By design, the IRL intervention required the presence of an instructor and 1 simulation technician. The DSRL intervention did not require any instructors; however, 2 simulation laboratory technicians were employed to troubleshoot any technical issues. The total cost of the IRL intervention was $3600 ($720 per group), whereas the DSRL intervention cost was $3200 ($640 per group). Recurring costs to conduct the intervention for 20 residents (5 groups) were higher in the IRL intervention at $3225 compared with $2000 in the DSRL intervention. Cost savings were realized in the DSRL intervention after the fourth group (16 residents) had completed the DSRL intervention.
Course Evaluation Data
A postcourse evaluation was completed using a 5-point Likert scale, ranging from 1 (strongly disagree) to 5 (strongly agree) (Table 3). Residents felt that the standard ACLS provider course was insufficient training to lead a cardiac arrest team. They felt that simulation-based training was valuable in increasing their comfort level in leading an ACLS team and wished to participate in more simulation-based training in residency. Residents in the DSRL intervention expressed a significantly stronger desire to have trained in the IRL intervention (mean = 3.5) than residents in the IRL intervention expressed for training in the DSRL intervention (mean = 2.2, P = 0.001). Although participants in both interventions agreed that participation within their assigned intervention was an effective way to learn ACLS skills, residents in the IRL intervention agreed with this more strongly (mean = 4.40 vs. 3.75, P = 0.016).
This study demonstrates that both DSRL and IRL significantly improve resident ACLS skills when responding to simulated pulseless scenarios, with an average improvement of 21.7% between the pretest and posttest. All residents surpassed the mastery standard on the posttest, whereas only 22.5% had done so on the pretest. Participants’ scores did drop from posttest to retention test, yet 18 of 20 continued to surpass the mastery standard. Neither IRL nor DSRL was found to be superior to the other. Based on our post hoc sample size calculation for an equivalence trial, we believe the data are consistent with our primary hypothesis, demonstrating equivalence in ACLS skill acquisition and retention for the DSRL intervention compared with the IRL intervention.
Our participants’ improvement may be less than that demonstrated by PGY2 and PGY3 residents in the study of Wayne et al32 for several reasons. Residents in the current study had higher pretest scores, limiting the degree of possible improvement. They were also PGY1 residents who were not yet leading resuscitations outside of the simulated environment, which may have impacted motivation and retention. Similar to the studies conducted by Wayne et al,9,32 this study demonstrates that simulation-based mastery learning of ACLS skills is an effective educational intervention that can be implemented in either an IRL or DSRL format to achieve impressive results.
Conceptually, we argued that small-group peer learning is a form of DSRL intervention. We acknowledge that this results in a potential lack of conceptual clarity, considering that the scenario leader largely self-regulated the ACLS performance and subsequent debrief yet did so in a context where peers could coregulate each other and also regulate aspects of performance together (ie, socially shared regulation of learning). Although we used the DSRL concept to design our intervention successfully, future work will need to consult the educational psychology literature as we define the boundaries in health professions education between SRL, coregulated learning, and “self and socially regulated learning.”33
After accounting for all costs associated with developing and administering the interventions, DSRL was more cost-effective than IRL. The number and type of simulators used and the simulation laboratory time used were the same in each intervention. Although the DSRL intervention had additional initial start-up costs associated with developing the materials necessary to support the learning of the DSRL participants, the instructor costs in the IRL intervention surpassed these. After 16 residents (4 groups) had completed the DSRL intervention, the initial development costs (ie, the 1-time costs) were offset. The DSRL intervention cost $80 less per resident ($320 per group) than IRL, on an ongoing basis. Although the current DSRL intervention required 2 technicians, technicians may no longer need to be present when residents become more familiar with the simulation technology, further increasing the cost savings to $100 per resident. By limiting the instructor’s direct involvement, simulation-based mastery learning of ACLS skills using a DSRL model may represent a more feasible and cost-effective means to train larger groups of learners over time.
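The break-even arithmetic can be reconstructed from the reported figures: the extra one-time DSRL development cost is recovered through lower recurring per-group costs. This is an illustrative reconstruction (the per-group recurring costs below are derived from the reported 5-group totals), not a budget model.

```python
import math

# Reported one-time costs (Canadian dollars)
dsrl_one_time = 1200                 # 10 h of instructor time to build DSRL materials
irl_one_time = 375                   # 45 min of IRL instructor preparation
extra_one_time = dsrl_one_time - irl_one_time

# Recurring per-group costs, derived from the reported 5-group totals
irl_recurring_per_group = 3225 / 5   # instructor + 1 technician
dsrl_recurring_per_group = 2000 / 5  # 2 technicians, no instructor
saving_per_group = irl_recurring_per_group - dsrl_recurring_per_group

# Number of groups before the extra one-time cost is recouped
groups_to_break_even = math.ceil(extra_one_time / saving_per_group)
print(groups_to_break_even)  # → 4
```

The result is consistent with the statement that savings were realized after the fourth group (16 residents).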
Consistent with findings in other studies, most residents did not feel that the standard ACLS provider course was sufficient preparation before leading a cardiac arrest team.6 Despite the equivalent performance gains in the 2 groups, residents were significantly more likely to express a desire to learn in the instructor-led format. Although residents in both interventions felt that participation in their assigned intervention arm was an effective way to learn ACLS skills, those in the IRL intervention more strongly agreed with this statement. Our findings do not allow us to determine why residents in this and other studies prefer IRL.34 It may be that the residents are more comfortable with the traditional model of didactic and instructor-led learning because of increased familiarity with this learning method.19 Research is needed to discern why residents prefer IRL experiences, despite evidence that DSRL is at least as effective and, in some cases, superior to IRL.21
This study has several limitations. First, we initially designed the study as a superiority trial and adjusted to an equivalence trial in a post hoc manner. Researchers must recognize that when analyzing data to detect superiority, the failure to show a true difference between groups does not mean that equivalence between groups can be inferred. Instead, an equivalence trial, with an appropriate sample size, must be conducted for this purpose.35 Post hoc analysis demonstrated that the study was indeed sufficiently powered as an equivalence trial for the posttest analyses. Although often overlooked in SBE research and medical education research in general, strong equivalence and/or noninferiority designs are essential to the comparative research necessary to advance the field and to ensure that equivalence or noninferiority is not assumed simply when superiority was not found.36 Second, in an effort to avoid test-retest bias, we used different scenarios for pretest and posttests, which we accounted for by normalizing scores and by using pretest scores as a covariate in analyses. Third, we varied the pretest scenarios both within and across groups, which introduced random variation that may have reduced our statistical power. That said, we introduced this variation equally to both intervention groups. Nonetheless, we would recommend using the same scenarios in the same order for each group in future studies to remove such random variation from the study design. Fourth, although our setting is similar to that used in previous studies, the checklists and minimum passing mastery score used in this study were modified and used in a different context than those in which they were initially developed.24 Fifth, there was a 50% dropout rate between posttest and retention test. 
This dropout rate could threaten the validity of retention results; however, there were no differences in demographic data or pretest and posttest scores between residents who completed the retention test and those who did not. Our choice of a 5-month retention period likely contributed to this attrition, yet we chose this timeline because the residents would be entering their PGY2 year shortly after this time and there would be large variations in the clinical experiences of residents depending on when they were ACLS response team leaders and when they completed intensive care unit rotations. Finally, although a debriefing tool was provided to each intervention, the actual debriefing technique and previous experience with the GAS debriefing method were not controlled for.
Future work should be conducted to determine how to optimize the DSRL intervention and to examine the impact of changing practice conditions or feedback and debriefing mechanisms. Qualitative studies should be conducted to examine the mechanisms that make DSRL effective and to determine why residents prefer IRL over DSRL, the implications of this preference, and how to increase the acceptance of DSRL. It is necessary to determine whether the DSRL model can be applied effectively to learners at different levels of training and to activities that are less structured than ACLS skills and procedural skills. Hybrid methods that incorporate DSRL and IRL in varying doses at various times may also be more effective than either method alone and should be studied, to inform decisions as to how to most effectively use one of the scarcest resources in simulation—instructors. Determining how skills learned using a DSRL model in the simulation laboratory transfer to improvements in actual clinical practice will also be necessary.
Directed self-regulated learning represents a cost-effective way to teach ACLS skills in simulation-based courses. When residents are given a choice about their method of learning, they show a clear preference for IRL over DSRL. Therefore, if educators are to engage learners in DSRL, it may be important for them to acknowledge learner preferences, discuss the effectiveness of DSRL, and outline its potential advantages, such as increased flexibility in meeting learning needs and in scheduling training time. These findings have potentially broad implications for the design of ACLS interventions, which are offered to many disciplines within medicine and to other health professions, including nursing, respiratory therapy, and pharmacy.
The authors would like to acknowledge Dr George Tomlinson for advice on statistical analysis and Finch Taylor and SimSinai at Mount Sinai Hospital, Toronto, for technical support.
2. Kaye W, Mancini ME, Rallis SF. Advanced cardiac life support refresher course using standardized objective-based mega code testing. Crit Care Med 1987; 15: 55–60.
3. Semeraro F, Signore L, Cerchiari EL. Retention of CPR performance in anaesthetists. Resuscitation 2006; 68: 101–108.
4. Smith KK, Gilcreast D, Pierce K. Evaluation of staff’s retention of ACLS and BLS skills. Resuscitation 2008; 78: 59–65.
5. Mickelsen S, McNeil R, Parikh P, Persoff J. Reduced resident “code blue” experience in the era of quality improvement: new challenges in physician training. Acad Med 2011; 86: 726–730.
6. Hayes CW, Rhee A, Detsky ME, Leblanc VR, Wax RS. Residents feel unprepared and unsupervised as leaders of cardiac arrest teams in teaching hospitals: a survey of internal medicine residents. Crit Care Med 2007; 35: 1668–1672.
7. Wayne DB, Butter J, Siddall VJ, et al. Graduating internal medicine residents’ self-assessment and performance of advanced cardiac life support skills. Med Teach 2006; 28: 365–369.
8. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: systematic review and meta-analysis. Med Teach 2013; 35: e867–e898.
9. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med 2006; 21: 251–256.
10. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ 2010; 44: 50–63.
11. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med 2013; 88: 1178–1186.
12. McGaghie WC. Research opportunities in simulation-based medical education using deliberate practice. Acad Emerg Med 2008; 15: 995–1001.
13. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: a best evidence practical guide. AMEE Guide No. 82. Med Teach 2013; 35: e1511–e1530.
14. Issenberg SB, Scalese RJ. Simulation in healthcare education. Perspect Biol Med 2008; 51: 31–46.
15. Okuda Y, Bond W, Bonfante G, et al. National growth in simulation training within emergency medicine residency programs, 2003–2008. Acad Emerg Med 2008; 15: 1113–1116.
16. American Heart Association. HeartCode. AHA Web site. Available at: www.heart.org/heartcode. Accessed October 4, 2014.
17. American Academy of Pediatrics. Neonatal Resuscitation Program. Available at: www2.aap.org/nrp. Accessed October 4, 2014.
18. Brydges R, Dubrowski A, Regehr G. A new concept of unsupervised learning: directed self-guided learning in the health professions. Acad Med 2010; 85: S49–S55.
19. Kornell N, Bjork RA. The promise and perils of self-regulated study. Psychon Bull Rev 2007; 14: 219–224.
20. Brydges R, Manzone J, Shanks D, et al. Self-regulated learning in simulation-based training: a systematic review and meta-analysis. Med Educ 2015; 49: 368–378.
21. Brydges R, Nair P, Ma I, Shanks D, Hatala R. Directed self-regulated learning versus instructor-regulated learning in simulation training. Med Educ 2012; 46: 648–656.
22. Kornell N, Bjork RA. Optimising self-regulated study: the benefits - and costs - of dropping flashcards. Memory 2008; 16: 125–136.
23. Jowett N, LeBlanc V, Xeroulis G, MacRae H, Dubrowski A. Surgical skill acquisition with self-directed practice using computer-based video training. Am J Surg 2007; 193: 237–242.
24. Wayne DB, Butter J, Didwania A, Siddall V, McGaghie WC. Advanced Cardiac Life Support checklists for simulation-based education. MedEdPORTAL; 2009. Available at: www.mededportal.org/publication/1773. Accessed June 16, 2014.
25. Bhanji F, Mancini ME, Sinz E, et al. Part 16: education, implementation, and teams: 2010 American Heart Association Guidelines for Cardiopulmonary Resuscitation and Emergency Cardiovascular Care. Circulation 2010; 122: S920–S933.
26. Cheng A, Rodgers DL, van der Jagt E, Eppich W, O’Donnell J. Evolution of the Pediatric Advanced Life Support course: enhanced learning with a new debriefing tool and Web-based module for Pediatric Advanced Life Support instructors. Pediatr Crit Care Med 2012; 13: 589–595.
27. Neumar RW, Otto CW, Link MS, et al. Part 8: adult advanced cardiovascular life support: 2010 American Heart Association Guidelines for Cardiopulmonary Resuscitation and Emergency Cardiovascular Care. Circulation 2010; 122: S729–S767.
28. Wayne DB, Fudala MJ, Butter J, et al. Comparison of two standard-setting methods for advanced cardiac life support training. Acad Med 2005; 80: S63–S66.
30. Roberts J, Norman G. Reliability and learning from the objective structured clinical examination. Med Educ 1990; 24: 219–223.
31. Streiner DL, Norman GR. Health Measurement Scales: A Practical Guide to Their Development and Use. New York, NY: Oxford University Press; 2008.
32. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents’ retention of advanced cardiac life support skills. Acad Med 2006; 81: S9–S12.
33. Molenaar I, Järvelä S. Sequential and temporal characteristics of self and socially regulated learning. Metacogn Learn 2014; 9: 75–85.
34. Jensen AR, Wright AS, Levy AE, et al. Acquiring basic surgical skills: is a faculty mentor really needed? Am J Surg 2009; 197: 82–88.
35. Lineberry M, Walwanis M, Reni J. Comparative research on training simulators in emergency medicine: a methodological review. Simul Healthc 2013; 8: 253–261.
36. Tolsgaard MG, Ringsted C. Using equivalence designs to improve methodological rigor in medical education trials. Med Educ 2014; 48: 220–221.