Beyond the Simulation Laboratory: A Realist Synthesis Review of Clinical Outcomes of Simulation-Based Mastery Learning

Griswold-Theodorson, Sharon MD, MPH; Ponnuru, Srikala MD; Dong, Chaoyan PhD; Szyld, Demian MD, EdM; Reed, Trent DO; McGaghie, William C. PhD

doi: 10.1097/ACM.0000000000000938
Reviews

Purpose: Translational educational outcomes have been defined as starting in simulation laboratories (T1) and moving downstream to improved patient care practices (T2), patient outcomes (T3), and cost/other value outcomes (T4). The authors conducted a realist synthesis review of the literature to evaluate the translational effect of simulation-based mastery learning (SBML) principles beyond the laboratory. They also sought to address future directions in SBML to improve patient care processes and outcomes and, thus, the quality of health care delivery.

Method: The authors searched multiple databases for simulation-based medical education (SBME) studies published through April 2013. They screened articles using the PICO method—population (P), intervention (I), control (C), outcome (O)—to answer the research question: For (P) any health care providers, does the (I) implementation of SBML training, compared with (C) other training methodologies or no extra training, result in (O) a change in patient care practices or T2–T4 outcomes? Studies implementing SBME interventions with training methodologies that met all SBML principles and reporting T2–T4 outcomes were identified.

Results: The 14 included studies used pre/post or cohort study designs; the majority were limited to individual performance and procedural competency. They reported improvement after SBML training in procedure performance, task success, patient discomfort, procedure time, complication rates, or T4 impacts (e.g., cost reduction).

Conclusions: Findings suggest health professions education conducted using SBML methodology can improve patient care processes and outcomes. Further research is needed to understand the translational impact of SBML for nontechnical skills, including teamwork, and skill retention.

S. Griswold-Theodorson is director, Master of Science in Medical and Healthcare Simulation Program, director, Division of Simulation, Department of Emergency Medicine, and professor of emergency medicine, Drexel University College of Medicine, Philadelphia, Pennsylvania.

S. Ponnuru is fellowship director, Division of Simulation, Department of Emergency Medicine, and assistant professor of emergency medicine, Drexel University College of Medicine, Philadelphia, Pennsylvania.

C. Dong is assistant director of medical education, National University of Singapore Yong Loo Lin School of Medicine, Singapore.

D. Szyld is medical director, New York Simulation Center for the Health Sciences, and assistant professor of emergency medicine, New York University School of Medicine, New York, New York.

T. Reed is assistant dean and director of clinical simulation and associate professor, Department of Emergency Medicine, Loyola University Chicago Stritch School of Medicine, Maywood, Illinois.

W.C. McGaghie is professor of medical education, Department of Medical Education, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Funding/Support: Dr. McGaghie was supported by the Ralph P. Leischner, Jr. MD Institute for Medical Education of the Loyola University Chicago Stritch School of Medicine.

Other disclosures: None reported.

Ethical approval: Reported as not applicable.

Correspondence should be addressed to Sharon Griswold-Theodorson, Drexel University College of Medicine, 245 N. 15th St., NCB 2nd Floor, Room 2108, MS 1011, Philadelphia, PA 19102; e-mail: sgriswol@drexelmed.edu.

Nearly two decades ago, the Accreditation Council for Graduate Medical Education (ACGME) began the ACGME Outcome Project initiative1 to replace the traditional curriculum-based, apprenticeship model of graduate medical education with an outcome-based model. This paradigm shift from a process-based, structured curriculum to evaluation of outcomes via a competency-based curriculum is amongst the most profound changes in medical education.2 This monumental shift has refueled the conversation concerning competence. Although the definition and operationalization of competence in medicine are high-stakes activities, the meaning of competency and the formation of educational programs for its attainment have challenged medical educators for decades.3 Defining, measuring, and ensuring the competency of health care providers remains a key but elusive goal for health care educators.4

In 2008, Dougherty and Conway5 proposed a “3Ts” translational science classification model with the intent to accelerate the rate at which innovations in health care deliverables are implemented in the U.S. health care system. McGaghie6 modified the 3Ts road map to apply the model in educational terms as the desired consequences of educational interventions measured at graduated levels beginning in a classroom or simulation laboratory (T1), moving downstream to improved and safer patient care practices and processes (T2), and ultimately to improved patient outcomes (T3). McGaghie et al7,8 later added a fourth impact level to describe outcomes such as cost savings, skill retention, systemic educational value, and health care system improvements (T4).

Simulation-Based Medical Education as Translational Science

The early simulation-based medical education (SBME) literature documented health care training outcomes that were predominantly measured within simulation laboratories (T1). Several comprehensive reviews8–16 and meta-analyses17–23 have documented the more recent SBME literature, providing examples of medical education translational outcomes not only in the T1 environment but also beyond it to the patient care environment (T2–T4).24 These reviews have demonstrated that SBME uses many different technology and education modalities25 that can improve patient care. As the translational science of SBME continues to mature, the conversation has evolved from considering whether SBME is an effective way to train health care providers to exploring which SBME techniques are most effective, for whom, and under what circumstances.

Simulation-Based Mastery Learning

Simulation-based mastery learning (SBML) in medical education has a history in the educational literature dating to the 1970s.26–29 The mastery learning model holds that, given the necessary time, under appropriate learning conditions, most students can “master” or reach a high level of achievement.27 The goal of SBML as applied to health care education is to ensure that all learners accomplish all educational objectives or reach competency standards beyond proficiency levels with little or no variation in outcome. The SBML model implies that most learners, with deliberate practice,9,30–33 formative assessment, and appropriate feedback, can and will meet acceptable achievement standards. An important paradigm shift in SBML as compared with other learning methodologies is that the amount of time needed to reach mastery standards for educational objectives varies among learners. A recent report suggests that SBML interventions are more effective than non-SBML interventions.21
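
To make the progression concrete, the following minimal sketch (written in Python, illustrative only; the unit names, score scale, and thresholds are hypothetical and not drawn from any of the reviewed curricula) shows the core logic of the mastery model described above: a baseline assessment, deliberate practice with feedback and formative retesting, and advancement only once a preset minimum passing standard is met, so that time to completion varies while the outcome standard does not.

from dataclasses import dataclass
from typing import Callable, List
import random


@dataclass
class Unit:
    name: str
    minimum_passing_standard: float  # mastery threshold on a hypothetical 0-100 checklist score


def mastery_progression(
    units: List[Unit],
    baseline_test: Callable[[Unit], float],
    practice_and_retest: Callable[[Unit, float], float],
) -> None:
    """Advance through sequenced units; the learner repeats deliberate practice
    and formative testing on each unit until the mastery standard is reached."""
    for unit in units:
        score = baseline_test(unit)  # baseline (diagnostic) assessment
        attempts = 0
        while score < unit.minimum_passing_standard:  # continue practice until mastery
            score = practice_and_retest(unit, score)  # practice, feedback, formative retest
            attempts += 1
        # advancement occurs only at or above the mastery standard; attempts (time) vary by learner
        print(f"{unit.name}: standard met after {attempts} practice cycle(s), score {score:.0f}")


if __name__ == "__main__":
    # Hypothetical two-unit curriculum and a simulated learner whose score
    # improves by a variable amount with each practice cycle.
    curriculum = [Unit("ultrasound landmark identification", 80), Unit("catheter insertion", 90)]
    mastery_progression(
        curriculum,
        baseline_test=lambda u: random.uniform(40, 70),
        practice_and_retest=lambda u, s: s + random.uniform(5, 15),
    )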

In this report, we describe the patient care processes, outcomes, and other variables reported after successful implementation of SBML curricula. Our research had two purposes: (1) to conduct a realist synthesis review of the literature to evaluate the translational impact of SBML principles beyond the simulation laboratory and (2) to address future directions in SBML curriculum planning and implementation to understand how SBML may be useful in improving patient care processes and outcomes and, thus, the quality of health care delivery.

Method

We conducted our review according to the reporting standards set by the RAMESES (Realist and Meta-narrative Evidence Syntheses: Evolving Standards) collaboration.34,35 Realist synthesis is a theory-driven method that is firmly rooted in a realist philosophy of science and places particular emphasis on understanding causation and how causal mechanisms are shaped and constrained by social context. The realist synthesis method examines the question, What works, for whom, under what circumstances, how, and why?36,37

This makes the RAMESES style particularly suitable for reviews of simulation-based research.35 Reports of SBME interventions commonly fail to clearly describe every aspect of their research methods. There is also significant heterogeneity among study designs, participants, and outcomes in the SBME literature. In a quantitative meta-analysis, these differences could introduce biases. Thus, traditional systematic review and meta-analysis techniques may be less applicable in reviews of the SBME literature.36 Colliver et al37 suggest that “the medical education field might be better served in most instances by systematic narrative reviews that describe and critically evaluate individual studies and their results, rather than obscure biases and confounds by averaging.”

Search strategy and inclusion criteria

Original research reports that evaluated an SBME intervention with a patient care process or outcome measure in a clinical environment were reviewed for inclusion. To identify SBME studies with T2–T4 outcomes, two of the authors (S.G.T. and S.P.) conducted an initial search of the peer-reviewed, English-language literature published through April 2013 using three databases: MEDLINE (via OVID), CINAHL, and Web of Science. The search included terms for the intervention (e.g., simulat*, manikin*, virtual*, simman*, Harvey), topic (e.g., education, health sciences, teaching, experiential learning), and outcome (e.g., patient safety, quality of health care, risk management, evaluation, adverse event).
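
For illustration, the three concept groups would typically be combined so that at least one term from each group must appear; a schematic version of such a query, reconstructed from the example terms listed above (not the exact strings, field tags, or full term lists used in each database), is:

    (simulat* OR manikin* OR virtual* OR simman* OR Harvey)
    AND (education OR "health sciences" OR teaching OR "experiential learning")
    AND ("patient safety" OR "quality of health care" OR "risk management" OR evaluation OR "adverse event")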

Studies were screened for inclusion using the PICO38 method: population (P), intervention (I), control (C), and outcome (O). Our research question was: For (P) any health care providers, does the (I) implementation of SBML training, compared with (C) other training methodologies or no extra training, result in (O) a change in patient care practices or T2–T4 outcomes? References from systematic reviews, review bibliographies, and articles in key journals were also reviewed to identify additional studies.

Exclusion criteria

Studies were excluded from consideration if the methods of training described did not adhere to all seven SBML principles (see List 1), outcome data were self-reported, the article did not represent original research, or the evaluation did not have an observed T2, T3, or T4 patient care process or measurable outcome.

List 1 Mastery Learning Criteria: The Seven Key Principles of Simulation-Based Mastery Learninga

1. Baseline or diagnostic testing

2. Clear learning objectives, sequenced as units usually in increasing difficulty

3. Engagement in educational activities focused on reaching the objectives

4. A set minimum passing standard for each educational unit

5. Formative testing to gauge unit completion at a preset minimum passing standard for mastery

6. Advancement to the next educational unit given measured achievement at or above the mastery standard

7. Continued practice or study on an educational unit until the mastery standard is reached

aAdapted from McGaghie et al.33

Study selection and data extraction

Two of the authors (S.P. and S.G.T.) reviewed the titles and abstracts of the articles identified in the initial search. Both authors reviewed the full text of original research articles with an SBME intervention with a T2, T3, or T4 patient care process or outcome measured in the clinical environment. Two authors (S.P. and T.R.) then independently reviewed the remaining articles to identify articles that met all SBML principles (see List 1).

Three authors (S.P., S.G.T., and T.R.) independently reviewed each article selected for inclusion to conduct the realist review. They extracted and assimilated the information provided, including characteristics of learners, study design, reported outcomes, and study funding. The results were shared among all the researchers. For any disagreement, the entire research team reviewed the article and discussed it until consensus was reached.

Results

A total of 11,905 articles were screened for inclusion. Ninety-three of these articles reported an SBME intervention with a T2, T3, or T4 patient care process, patient care outcome, or health care system outcome (e.g., cost). After critical review for use of all seven SBML principles, 14 articles remained (Figure 1).

The 14 included articles with T2, T3, or T4 translational outcomes of SBML are summarized in Table 1.39–52 These articles are described below in categorized groups based on the specific patient or health care system impact reported: (a) procedural performance, task success, and decreased patient discomfort; (b) procedure time; (c) complication rates; and (d) T4 impact such as skill retention or health care cost reduction. The majority of SBML interventions identified by this review were limited to individuals’ performance; most involved postgraduate trainees and evaluated procedural competency. All included studies used a pre/post or cohort study design to compare the outcomes of SBML interventions with those of interventions that employed traditional teaching approaches. For this review, we defined “traditional” education as training delivered with or without a defined study plan known to all participants.

Procedural performance, task success, and decreased patient discomfort

SBML resulted in improvement of bedside procedural performance and procedure success rates in several studies. Downstream translational practice and process (T2) outcomes of SBML included improved performance of skills (including hemodialysis catheter insertion,39 cardiac auscultation,40 and adherence to advanced cardiac life support [ACLS] guidelines41) and improved performance of procedures (including transurethral resection of the prostate [TURP],42 laparoscopic fascial closure,43 colonoscopy,44 and laparoscopic surgery45,46). SBML also demonstrated improved patient care outcomes/procedure success rates (T3) as seen in the studies evaluating the use of SBML for colonoscopy44,47 and TURP procedures.42 Studies on skill acquisition in colonoscopy44,47 also reported decreased patient discomfort during the procedure after SBML training.

Procedure time

Several studies demonstrated decreased procedural or operative time after SBML curricula were implemented. Ahlberg et al47 reported a significant difference in procedure time to reach the cecum during colonoscopy, with the SBML group requiring a median of 30 minutes as compared with 40 minutes for the control group. Yi et al44 reported a similar reduction in time to successful colonoscopy completion, with the SBML group requiring 31 minutes versus 41.5 minutes for the control group.

Larsen et al45 reported a 50% reduction in the operating room (OR) time required in the intervention group in a laparoscopic virtual reality training trial. Zendejas et al46 reported a reduction in procedure time during total extraperitoneal (TEP) inguinal hernia repair: SBML-trained residents were able to complete the surgery with a mean time of 34 ± 8 minutes compared with 48 ± 14 minutes for the traditional training group. However, in contrast, Hogle et al48 reported a statistically significant increase in total operative time in the simulation-trained group compared with the control group.

Complication rates

Reduction of complication rates is an important translational outcome for both health care costs and patient well-being. Barsuk et al49 demonstrated an 85% decline in central line-associated bloodstream infections (CLABSIs) among medical intensive care unit (MICU) patients whose central venous catheter (CVC) placements were performed by residents who completed the SBML intervention compared with patients whose CVCs were placed by traditionally trained residents. This decline in CLABSI rates was replicated in a second study at another institution, with Barsuk et al50 reporting a 74% reduction after SBML training. Duncan et al51 demonstrated a reduction in pneumothorax rates following SBML training in thoracentesis; however, this study also incorporated the use of ultrasound guidance, which is known to independently improve the safety of the procedure. Zendejas et al46 demonstrated decreased intraoperative and postoperative complication rates during laparoscopic TEP hernia repair among surgical trainees who completed an SBML curriculum as compared with trainees who completed a traditional curriculum.

T4 impact

In a cost analysis of Barsuk and colleagues’49 2009 SBML CVC study, Cohen et al52 estimated approximately $700,000 in direct cost savings associated with the simulation-based intervention, yielding a 7-to-1 return on investment. The incremental costs attributed to each central line infection were approximately $82,000 (in 2008 dollars) and 14 additional hospital days (including 12 in the MICU). Zendejas et al46 also demonstrated collateral effects from SBML, reporting a significant decrease in overnight hospital days after implementing SBML training for laparoscopic inguinal hernia repair. These results not only represent advantages for the health and safety of individual patients but also have notable financial implications for hospitals and health care systems.
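
As a back-of-envelope illustration of how these reported figures relate (not a calculation taken from the cost analysis itself), a 7-to-1 return on approximately $700,000 in savings implies a training program cost on the order of $100,000, and at roughly $82,000 per infection the savings correspond to roughly eight to nine infections averted:

\[
\text{implied program cost} \approx \frac{\$700{,}000}{7} = \$100{,}000,
\qquad
\frac{\$700{,}000}{\$82{,}000 \text{ per infection}} \approx 8.5 \text{ infections averted}.
\]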

Discussion

This review presents a realist synthesis of translational outcomes beyond the simulation laboratory after the implementation of SBML curricula to train health care providers. Our review identified 14 studies describing the implementation of SBML curricula that improved patient care practices or outcomes or demonstrated added health care value.

Several studies44–47 reported decreased procedural and operative time. In contrast, Hogle et al48 demonstrated increased operative time. However, they noted that several confounding variables not documented in the medical record (e.g., variation in hands-on operating time between the resident and the attending physician, the procedure’s level of difficulty, intraoperative and perioperative complications) were not accounted for in the evaluation of operative time, and that operative time was not isolated as an independent variable.

Although we do not present a formal cost analysis in this review to further evaluate the impact of decreased OR or procedure time, we suggest that reductions in surgical or procedural time may translate into improved OR efficiency and, therefore, financial savings for the health care system. Decreasing time under anesthesia may also benefit patients. Time savings in the OR are likely a valuable outcome of SBML for residents as well: given that U.S. duty hours restrictions limit trainee time in the OR, improved efficiency after SBML training may increase the overall number of cases residents perform. The issue of operative time and improved surgical efficiency as a result of SBML warrants further study.

An interesting finding in this review is that the articles we identified predominantly reported on individual providers performing a procedural skill. Almost all of these studies involved postgraduate trainees; the exceptions were one study with practicing staff physicians51 and one study of medical students.40 Two studies tested nonprocedural skills (cardiac auscultation40 and adherence to ACLS protocols41). Only one study43 attempted to address cognitive retention rates associated with acquisition of knowledge and multitasking. Evidence to support translational outcomes of SBML for nontechnical skills—including communication, teamwork, and complex cognitive skills—and for skill retention remains scarce. This scarcity reflects challenges such as measuring complex clinical outcomes, defining competence and competencies in teamwork and communication skills, and following health care providers over meaningful periods of time.

SBML and the definition of competence

Health care educators have begun to use SBML to define, measure, and confirm health care providers’ abilities to ensure a more effective health care workforce. In fact, SBML has been identified as a “best practice” of SBME.10 The studies discussed in this review have identified SBML strategies that may be better understood in terms of the Dreyfus/Benner model of skill acquisition.53–56 In that model, the path from novice to expert typically includes development of foundational knowledge, integration of pieces of information, application of information into problem solving, and transfer of information to different contexts.53–56 The SBML model proceeds through baseline assessment, defined learning objectives, engagement in the educational activity, accomplishment of a minimum passing standard, and advancement to the next educational unit.

The following definition of a “competent” health care provider, based on our review and assimilation of the literature, highlights the implications of SBML principles: a provider who has attained the educational outcomes or competencies at the mastery learning level and has achieved an acceptable level of performance to begin to safely care for patients autonomously (see Figure 2). At the cusp between the advanced beginner level and the competent and proficient levels of the Dreyfus/Benner model, health care practitioners have a better working knowledge of practice areas, become more autonomous, may require less supervision, and are able to complete more complex tasks using their own judgment.

Health care providers educated to mastery standards also recognize when they are exceeding their own comfort levels and when they can safely proceed independently. Practice and preparation in the simulation laboratory have demonstrated superiority to traditional, apprenticeship methods of clinical education,9 characterized by the “see one, do one, teach one” adage. SBML prepares health care trainees to enter the clinical environment at a level of competence beyond that of the Dreyfus/Benner novice or advanced beginner. Adaptive expertise,57–59 or the ability to understand and navigate the intricacies of varied patients and care environments, is essential to progress to more advanced levels of performance.

SBML and variability of time

It is important to understand that variability in time to mastery is a component of SBML that differs in many ways from the traditional apprenticeship model. In 1971, Bloom27 reported that if teachers could provide the necessary time and appropriate learning conditions, nearly all students could reach a high level of achievement. The SBML model has been applied in educational environments to differentiate and individualize instruction and feedback to ensure that all learners accomplish all educational goals or achieve competencies with little or no variation in outcome. The SBML model suggests that learners commit to continuous practice with appropriate feedback until set standards are reached. We believe that SBML combined with intentional deliberate practice31 will play a significant role in the future of health care education. Once learning outcomes and assessment modalities are identified, most health professions students should be allowed to learn at their own pace without penalty while receiving continuous formative assessment of their performance.

Cook et al21 summarized quantitative outcomes of SBML and found that SBML occasionally required more time but was “associated with large benefit in skills.” Within reason, the extra time may be justified if we are training providers to a higher level of performance without risk to patients. In the same systematic review, Cook et al noted that limited evidence suggests the effect of SBML is optimized when a flipped classroom or pretraining is involved. The flipped classroom60 approach allows learners to acquire and master foundational knowledge on their own schedule ahead of simulation laboratory time, reserving face-to-face time in the laboratory for experiential skill acquisition.

Limitations

Reproducing the results of an SBME intervention is an ongoing challenge because numerous heterogeneous factors contribute to a successful intervention. Owing to the large variation in interventions, learner populations, controls (i.e., no intervention or traditional education), and patient care outcomes reported, we considered a realist synthesis34,35 to be the most appropriate method to review and report the effects of SBML.

It is important to note that 79 of the SBME articles we identified were excluded from this review. Many of the excluded studies incorporated at least some of the SBML principles, and the majority demonstrated a positive association between SBME and T2, T3, or T4 outcomes. This highlights the importance of understanding when SBML may or may not be a superior training modality compared with traditional educational interventions or other SBME modalities. Although the studies included in this review met all seven SBML principles, we observed variation in how closely their simulation interventions followed these principles. Only two of the studies clearly stated that mastery learning was used as the simulation intervention.40,50 This was partly because SBML has attracted attention in the simulation field only in recent years.

Finally, publication bias may have influenced the results. Negative trials are less common than positive trials in the general scientific literature.61 However, given that educators spend significant energy implementing educational programs as performance and quality improvement efforts, rather than studying them with the intent to publish their outcomes, it is also possible that successful SBML interventions at all four translational levels may not be submitted for publication and therefore will remain unknown.

Conclusions

Although the number of articles included in our review is small, the findings of these studies suggest that health care education conducted using SBML methodology can improve patient care processes and outcomes. SBML has been shown to affect performance level, procedural success rate, patient discomfort, procedure time, error rate, and health care costs.

Ensuring that the health care workforce is well trained and competent is likely to have additional far-ranging benefits, including better patient care practices and improved patient outcomes.7 This requires further study.62 The application of SBML principles to health care educational curricula may help educators define translational outcomes and further understand the meaning of competency. However, translational health professions education outcomes cannot be achieved from single, isolated studies. Rather, in health professions education, translational science results derive from educational and health services research programs that are thematic, sustained, and cumulative.7 Such translational education research programs must be carefully designed and executed to capture and measure downstream results to aid in the creation of a patient-focused health care system that reliably delivers high-quality care.

Acknowledgments: The authors would like to thank Gary Childs, liaison librarian for health sciences at Drexel University, for assistance with the literature review; Michele Spotts, instructional designer in the Drexel University Master of Science in Medical and Healthcare Simulation Program, for the creation of the figures; and Diana Winters for editing the manuscript.

References

1. Accreditation Council for Graduate Medical Education. ACGME Outcome Project Core Competencies. 2001 http://www.acgme.org. Accessed March 12, 2015 [no longer available]
2. Anderson MB. Foreword. In: Hodges BD, Lingard L, eds. The Question of Competence: Reconsidering Medical Education in the Twenty-First Century. 2012 Ithaca, NY ILR Press
3. McGaghie WC, Miller GE, Sajid AW, Telder TV. Competency-Based Curriculum Development in Medical Education. 1978 Geneva, Switzerland World Health Organization Public health paper no. 68.
4. Sklar DP. Competencies, milestones, and entrustable professional activities: What they are, what they could be. Acad Med. 2015;90:395–397
5. Dougherty D, Conway PH. The “3T’s” road map to transform US health care: The “how” of high-quality care. JAMA. 2008;299:2319–2321
6. McGaghie WC. Medical education research as translational science. Sci Transl Med. 2010;2:19cm8
7. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Translational educational research: A necessity for effective health-care improvement. Chest. 2012;142:1097–1103
8. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48:375–385
9. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–711
10. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44:50–63
11. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. Effect of practice on standardised learning outcomes in simulation-based medical education. Med Educ. 2006;40:792–797
12. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Med Teach. 2005;27:10–28
13. Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861–866
14. Zendejas B, Brydges R, Wang AT, Cook DA. Patient outcomes in simulation-based medical education: A systematic review. J Gen Intern Med. 2013;28:1078–1089
15. Zendejas B, Wang AT, Brydges R, Hamstra SJ, Cook DA. Cost: The missing outcome in simulation-based medical education research: A systematic review. Surgery. 2013;153:160–176
16. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(suppl):S42–S47
17. Salas E, DiazGranados D, Klein C, et al. Does team training improve team performance? A meta-analysis. Hum Factors. 2008;50:903–933
18. Ilgen JS, Sherbino J, Cook DA. Technology-enhanced simulation in emergency medicine: A systematic review and meta-analysis. Acad Emerg Med. 2013;20:117–127
19. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: A systematic review and meta-analysis. JAMA. 2011;306:978–988
20. Mundell WC, Kennedy CC, Szostek JH, Cook DA. Simulation technology for resuscitation training: A systematic review and meta-analysis. Resuscitation. 2013;84:1174–1183
21. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: A systematic review and meta-analysis. Acad Med. 2013;88:1178–1186
22. Cook DA, Erwin PJ, Triola MM. Computerized virtual patients in health professions education: A systematic review and meta-analysis. Acad Med. 2010;85:1589–1602
23. Kennedy CC, Maldonado F, Cook DA. Simulation-based bronchoscopy training: Systematic review and meta-analysis. Chest. 2013;144:183–192
24. McGaghie WC. Implementation science: Addressing complexity in medical education. Med Teach. 2011;33:97–98
25. Chiniara G, Cole G, Brisbin K, et al; Canadian Network for Simulation in Healthcare, Guidelines Working Group. Simulation in healthcare: A taxonomy and a conceptual framework for instructional design and media selection. Med Teach. 2013;35:e1380–e1395
26. Block JH, Burns RB. Mastery learning. Rev Res Educ. 1976;4:3–49
27. Bloom BS. Mastery learning. In: Block JH, ed. Mastery Learning: Theory and Practice. 1971 New York, NY Holt, Rinehart & Winston:47–63
28. Bloom BS. Recent developments in mastery learning. Educ Psychol. 1973;10:53–57
29. Bloom BS. Time and learning. Am Psychol. 1974;29:682
30. Ericsson KA. An expert-performance perspective of research on medical expertise: The study of clinical performance. Med Educ. 2007;41:1124–1130
31. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81
32. Ericsson KA. Deliberate practice and acquisition of expert performance: A general overview. Acad Emerg Med. 2008;15:988–994
33. McGaghie WC, Siddall VJ, Mazmanian PE, Myers J; American College of Chest Physicians Health and Science Policy Committee. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education: Effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest. 2009;135(3 suppl):62S–68S
34. Greenhalgh T, Wong G, Westhorp G, Pawson R. Protocol–realist and meta-narrative evidence synthesis: Evolving standards (RAMESES). BMC Med Res Methodol. 2011;11:115
35. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: Realist syntheses. J Adv Nurs. 2013;69:1005–1022
36. Eva KW. On the limits of systematicity. Med Educ. 2008;42:852–853
37. Colliver JA, Kucera K, Verhulst SJ. Meta-analysis of quasi-experimental research: Are systematic narrative reviews indicated? Med Educ. 2008;42:858–865
38. Huang X, Lin J, Demner-Fushman D. Evaluation of PICO as a knowledge representation for clinical questions. Paper presented at: American Medical Informatics Association Annual Symposium; November 11–15, 2006; Washington, DC
39. Ahya SN, Barsuk JH, Cohen ER, Tuazon J, McGaghie WC, Wayne DB. Clinical performance and skill retention after simulation-based education for nephrology fellows. Semin Dial. 2012;25:470–473
40. Butter J, McGaghie WC, Cohen ER, Kaye M, Wayne DB. Simulation-based mastery learning improves cardiac auscultation skills in medical students. J Gen Intern Med. 2010;25:780–785
41. Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case–control study. Chest. 2008;133:56–61
42. Källström R, Hjertberg H, Svanvik J. Impact of virtual reality-simulated training on urology residents’ performance of transurethral resection of the prostate. J Endourol. 2010;24:1521–1528
43. Palter VN, Grantcharov T, Harvey A, Macrae HM. Ex vivo technical skills training transfers to the operating room and enhances cognitive learning: A randomized controlled trial. Ann Surg. 2011;253:886–889
44. Yi SY, Ryu KH, Na YJ, et al. Improvement of colonoscopy skills through simulation-based training. Stud Health Technol Inform. 2008;132:565–567
45. Larsen CR, Soerensen JL, Grantcharov TP, et al. Effect of virtual reality training on laparoscopic surgery: Randomised controlled trial. BMJ. 2009;338:b1802
46. Zendejas B, Cook DA, Bingener J, et al. Simulation-based mastery learning improves patient outcomes in laparoscopic inguinal hernia repair: A randomized controlled trial. Ann Surg. 2011;254:502–509
47. Ahlberg G, Hultcrantz R, Jaramillo E, Lindblom A, Arvidsson D. Virtual reality colonoscopy simulation: A compulsory practice for the future colonoscopist? Endoscopy. 2005;37:1198–1204
48. Hogle NJ, Chang L, Strong VE, et al. Validation of laparoscopic surgical skills training outside the operating room: A long road. Surg Endosc. 2009;23:1476–1482
49. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169:1420–1423
50. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23:749–756
51. Duncan DR, Morgenthaler TI, Ryu JH, Daniels CE. Reducing iatrogenic risk in thoracentesis: Establishing best practice via experiential training in a zero-risk environment. Chest. 2009;135:1315–1320
52. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5:98–102
53. Dreyfus SE, Dreyfus HL. A Five-Stage Model of the Mental Activities Involved in Directed Skill Acquisition. 1980 Fort Belvoir, Va Defense Technical Information Center. www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA084551. Accessed July 16, 2015
54. Dreyfus HL, Dreyfus SE, Athanasiou T. Mind Over Machine: The Power of Human Intuition and Expertise in the Era of the Computer. 1986 New York, NY Free Press
55. Lester S. Novice to expert: The Dreyfus model of skill acquisition. Stan Lester Developments. 2005 http://www.fes-seguridadregional.org/images/stories/docs/eve2por.pdf. Accessed July 28, 2015
56. Benner P. From novice to expert. Am J Nurs. 1982;82:402–407
57. Ericsson KA. Adaptive expertise and cognitive readiness: A perspective from the expert-performance approach. In: O'Neil HF, Perez RS, Baker EL, eds. Teaching and Measuring Cognitive Readiness. 2014 New York, NY Springer:179–197
58. Bohle Carbonell K, Stalmeijer RE, Könings KD, Segers M, van Merriënboer JJG. How experts deal with novel situations: A review of adaptive expertise. Educ Res Rev. 2014;12:14–29
59. Chen G, Thomas B, Wallace JC. A multilevel examination of the relationships among training outcomes, mediating regulatory processes, and adaptive performance. J Appl Psychol. 2005;90:827–841
60. Prober CG, Khan S. Medical education reimagined: A call to action. Acad Med. 2013;88:1407–1410
61. Chopra V, Davis M. In search of equipoise. JAMA. 2011;305:1234–1235
62. Issenberg SB, McGaghie WC. Looking to the future. In: McGaghie WC, ed. International Best Practices for Evaluation in the Health Professions. 2013 London, UK Radcliffe Publishing, Ltd.:341–359
© 2015 by the Association of American Medical Colleges