A common expression of wishful thinking is to believe that a medical education innovation that works in one setting can be transplanted wholesale elsewhere and achieve identical results. The idea is that medical education settings are uniform, that their parts are interchangeable, and that successful programs can be transferred to other sites without regard to local history, culture, habits, aspirations, receptivity to change, or financial conditions. To illustrate, a recent national survey conducted under the auspices of the Association of American Medical Colleges reports, “Simulation is arguably the most prominent innovation in medical education over the past 15 years.”1 The survey quotes anesthesiologist David Gaba,2 who wrote, “Simulation has the potential to revolutionize health care and address the patient safety issues if appropriately used and integrated into the educational and organizational improvement process.” However, decades of research and practical experience have demonstrated that the implementation, financing, management, and integration of medical education simulation into medical school curricula and other organizational structures are very difficult, uneven across medical schools, and rarely evaluated rigorously.3,4 Scholars also point out that the operation of productive simulation research programs in diverse medical education environments is challenging.5 Another innovation, team-based learning, has also traveled a rocky road in medical education because its introduction at multiple schools has achieved mixed results.6
Mastery learning, the theme of this Academic Medicine cluster, is an innovative strategy in medical education. Mastery learning is a hybrid approach to competency-based education7 that expects “excellence for all,” or the acquisition of knowledge, skills, and professionalism competencies to uniformly high achievement standards with little or no variation among learners and without restricting learning time to fixed intervals. The conceptual foundation of mastery learning was expressed over 50 years ago.8 The mastery learning model has been used to help medical learners acquire a variety of procedural skills, clinical judgment abilities, and competencies in communicating with patients and their families.9 This research has been summarized in several integrative reviews.10,11 Research also shows that mastery learning in medical education can translate to improved patient care practices and better patient outcomes.11–13
Disseminating Medical Education Innovations
Advocates of medical education innovation argue that medical schools are eager to embrace change and adopt new practices and that change can be enacted on grounds of reason, evidence, and best practices.14 We present a different case in this article, that program change in medical education—especially when change originates from external sources—is not easy. Educational inertia—that is, the maintenance of the status quo—is a powerful force in most medical schools. Implementing successful educational programs across settings takes time, hard work, and financial resources. Berwick15 noted in a 2003 essay, “In health care, invention is hard, but dissemination is even harder.” Rigorous evaluation of educational innovations is also needed to rule out the appearance of impact—that is, “reform without change”16—when the results are equivocal.
Why is it so difficult to transplant a highly successful medical education program from one medical education setting to another? What are the barriers that frustrate attempts to introduce new and successful ideas like mastery learning into established medical education curricula?
We argue in this article that the slow adoption and use of the mastery learning model in medical education is a case study of implementation science. Implementation science is a relatively new discipline and the title of a journal now in its 10th volume (2015). The journal’s stated research focus is
the scientific study of methods to promote the uptake of research findings into routine healthcare in clinical, organisational or policy contexts … it includes the study of influences on patient, healthcare professional, and organisational behavior in either healthcare or population settings.… [This] is scientifically important because it identifies the behaviour of healthcare professionals and healthcare organisations as key sources of variance requiring improved empirical and theoretical understanding before effective uptake can be reliably achieved.17
The rest of this article has three sections that address implementation science issues. First, we briefly review implementation science principles to frame and set boundaries for the discussion. Second, we present an educational case study about the difficult yet successful transfer and maintenance of a medical education simulation-based mastery learning (SBML) curriculum on central venous catheter (CVC) insertion from a tertiary care academic medical center to an academic community hospital setting. Third, we present lessons learned from the educational case study to inform other medical educators about the challenges and pitfalls they may face when implementing new ideas into established programs.
Implementation Science
Implementation science addresses the mechanisms of medical education and health care delivery.18,19 The goal of implementation science is to
[study] and [seek] to overcome health-care organizational silos and barriers, pockets of cultural inertia, professional hierarchies, and financial disincentives that reduce health-care efficiency and effectiveness. In hospitals and clinics, [implementation science] addresses the science of health-care delivery.13
A key article in the journal Implementation Science authored by Damschroder and colleagues20 presents a consolidated framework for implementation research that has five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation.
Scholarship by UK behavioral scientist Ray Pawson and colleagues21,22 sheds practical light on implementation science principles. These scholars argue that widespread adoption of an educational innovation such as mastery learning is a complex service intervention (CSI) that operates in a heterogeneous social system (e.g., academic medical center, primary care residency program). Pawson and colleagues21 teach that CSIs have seven defining features that shape their implementation:
1. CSIs are theories, grounded “on a hypothesis that postulates: if we deliver a programme in this way or we manage services like so, then this will bring about some improved outcome.”
2. CSIs “are active … they achieve their effects via the active input of individuals (clinicians, educators, managers, patients).” Passive bystanders do not contribute to implementation science, but their inaction needs to be studied.
3. CSIs “have a long journey.” Their success depends on a cascaded sequence of events and the “integrity of the implementation chain.”
4. The CSI implementation chain is often “non-linear and can even go in reverse.”
5. CSIs are “fragile … embedded in multiple social systems.” Mastery learning programs are not equally effective in all circumstances because of the influence of context.
6. CSIs are “leaky and prone to be borrowed.” The “same intervention will be delivered in a mutating fashion shaped by refinement, reinvention, and adaptation to local circumstances.”
7. CSIs are “open systems that feed back on themselves. As interventions are implemented, they change the conditions that made them work in the first place.”
Implementation science scholars and practitioners argue that the introduction or dissemination of novel practices into established health care organizations requires much effort and needs to be “informed by an assessment of the likely barriers and facilitators.”23 Some of the methods used by implementation scientists to find ways to embed novel interventions in educational and clinical programs include stakeholder engagement, effectiveness studies, and research syntheses.24 Mastery learning interventions in medical education programs have been informed by several of these methods.9–11 However, widespread dissemination and adoption of new advances such as the mastery learning model occur at a slow pace in medical education.
Mastery Learning Case Study
A clinical and educational research team from Northwestern University Feinberg School of Medicine (hereafter, simply Northwestern) created an innovative SBML curriculum on CVC insertion for postgraduate internal medicine and emergency medicine residents. The SBML CVC curriculum described in this article was introduced in two phases: (1) curriculum development, implementation, and evaluation at a tertiary care academic medical center; and (2) curriculum dissemination to a nearby academic community hospital. This dissemination research study was grounded in implementation science principles and addressed the question “Will mastery learning educational and clinical outcomes achieved in a tertiary care academic medical center transfer to an academic community hospital setting?”
Phase 1: Curriculum development, implementation, and evaluation at a tertiary care academic medical center
The SBML curriculum on CVC insertion was created in 2007–2008 with attention to clinical detail and seven mastery learning principles articulated in an earlier article.25 The seven mastery learning principles are:
1. Baseline or diagnostic testing;
2. Clear learning objectives, sequenced as units in increasing difficulty;
3. Engagement in educational activities (e.g., deliberate practice, data interpretation, reading) focused on reaching the objectives;
4. A set minimum passing standard (MPS) (e.g., test score) for each educational unit;
5. Formative testing to gauge unit completion at a preset MPS for mastery;
6. Advancement to the next educational unit given measured achievement at or above the mastery standard; and
7. Continued practice or study on an educational unit until the mastery standard is reached.
The goal of mastery learning is to certify that all medical learners achieve all educational objectives with little or no variation in outcome. The time needed to reach the mastery standard for a unit’s educational objectives may vary among the learners.
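The seven principles above amount to a simple control loop: assess, practice with feedback, reassess, and advance only once the mastery standard is met, with time as the free variable. A minimal sketch in Python may make that logic concrete; the names here (`Unit`, `mastery_train`) and the toy scoring rule are our own illustrations, not part of the published curriculum.

```python
# Illustrative sketch of the mastery learning loop; all names and the
# score-improvement rule are hypothetical, for demonstration only.

class Unit:
    """A toy educational unit whose assessment score rises with practice."""
    def __init__(self, name, start_score):
        self.name = name
        self.score = start_score

    def assess(self, learner):
        return self.score

    def deliberate_practice(self, learner):
        # Each focused practice session improves performance a little.
        self.score = min(1.0, self.score + 0.1)

def mastery_train(learner, units, mps=0.79):
    """Advance through units only once the minimum passing standard (MPS)
    is met; practice time varies by learner, the outcome standard does not."""
    sessions = 0
    for unit in units:
        score = unit.assess(learner)            # baseline/diagnostic test
        while score < mps:                      # continue until mastery
            unit.deliberate_practice(learner)   # deliberate practice + feedback
            sessions += 1
            score = unit.assess(learner)        # formative retest
        # advancement occurs only at or above the mastery standard
    return sessions

units = [Unit("internal jugular", 0.5), Unit("subclavian", 0.48)]
print(mastery_train("resident", units))  # 7 practice sessions in this toy example
```

The key design point mirrors the model: the exit condition is a fixed score, not a fixed number of sessions, so achievement is uniform and time is what varies.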
The SBML CVC curriculum has origins in clinical practice. The Northwestern team noted from a needs assessment that internal medicine residents rotating in the medical intensive care unit (MICU) at Northwestern Memorial Hospital (NMH), a tertiary care academic medical center, had varied experience and uneven CVC insertion skills. Rates of mechanical complications such as arterial punctures were high, and the rate of central-line-associated bloodstream infections (CLABSIs) attributable to CVCs inserted by internal medicine residents was higher than expected. Therefore, the team’s initial goal was to train all internal medicine residents to perform a CVC procedure to a mastery standard.
An SBML curriculum was developed for CVC insertion skills among second- and third-year internal medicine residents.26,27 Two 27-item skills checklists were designed for internal jugular insertion and subclavian insertion using evidence-based guidelines and expert opinion. Pilot testing was performed on 10 learners using a CVC simulator and the skills checklist. Faculty raters were calibrated, and a high level of skills checklist agreement was achieved (κn = 0.94).26 An expert panel of 11 physicians set the MPS at 79% of items correct using the Angoff and Hofstee methods.26 The curriculum was funded by NMH based on the expectation that CVC complications and the CLABSI rate would be reduced after rigorous education of internal medicine residents.
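For readers unfamiliar with standard setting, the arithmetic behind an Angoff-derived MPS such as 79% is straightforward: each judge estimates, item by item, the probability that a minimally competent examinee would perform the item correctly, and these estimates are averaged across items and judges. The sketch below illustrates only that averaging step (the function name is ours, and the numbers are invented); it is not a reconstruction of the panel's actual procedure, which also used the Hofstee method.

```python
from statistics import mean

def angoff_mps(judge_estimates):
    """Angoff method sketch: judge_estimates[j][i] is judge j's estimated
    probability that a borderline examinee performs checklist item i
    correctly. The MPS is each judge's mean across items, averaged
    across judges (assuming every judge rates every item)."""
    return mean(mean(items) for items in judge_estimates)

# Two hypothetical judges rating a 2-item checklist:
print(round(angoff_mps([[0.8, 0.8], [0.8, 0.78]]), 3))  # 0.795
```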
Once the SBML CVC curriculum was developed and pilot tested, second- and third-year internal medicine and emergency medicine residents at NMH underwent baseline testing and received feedback using a CVC simulator and the skills checklist one month before rotating through the MICU. The residents watched a recorded lecture about sterile technique, indications, contraindications, interpretation, and complications of CVC insertion and experienced two 2-hour education sessions where they received instruction about ultrasound techniques for CVC insertion. Residents received time for deliberate practice28 on the CVC simulator with coaching and focused feedback from faculty and were expected to perform an entire CVC procedure. After simulation training, residents took a posttest skills exam using the skills checklist. Residents were expected to meet or exceed the MPS on the 27-item internal jugular and subclavian skills checklists before rotating through the MICU. Residents who did not reach the MPS engaged in additional deliberate practice until the MPS was reached. The amount of additional deliberate practice time needed by residents who did not reach mastery in the standard 4-hour curriculum never exceeded 1 hour.26,27
CVC skills scores improved significantly from baseline (pretest) to posttest: from 50.6% (standard deviation [SD] = 23.4) of skills checklist items correct to 93.9% (SD = 10.2) correct for internal jugular insertion, and from 48.4% (SD = 26.8) correct to 91.5% (SD = 17.1) correct for subclavian insertion (both P < .0005).27 Decay of trainees’ CVC skills over time was also evaluated by retesting residents at 6 and 12 months after the SBML CVC curriculum.29 Over 82% of the trainees passed the retests and maintained their high performance up to one year after training.29
Several quality metrics were tracked daily for one year on all CVCs placed in the NMH MICU to further evaluate the SBML CVC curriculum effects.27 Patient complications from SBML-trained residents were compared with patient complications from residents who had been traditionally trained instead of SBML trained. SBML-trained residents had significantly fewer arterial punctures (P < .0005), less need for CVC adjustments (P = .002), and fewer insertion failures (P = .005) compared with traditionally trained residents.27
In a separate study, the impact of the SBML training on CLABSI rates in the NMH MICU was evaluated.30 CLABSIs decreased dramatically for 19 months after medical residents who had undergone SBML training started to rotate in the MICU, compared with the 19 months before the intervention (85% reduction, P < .0001). CLABSI rate outcomes were also compared with those of an NMH surgical intensive care unit, where no SBML-trained residents rotated. During the study period, the MICU had a significantly lower CLABSI rate after SBML-trained residents started rotating in, compared with the surgical intensive care unit (P = .003).30
An analysis done to evaluate cost savings due to the CLABSI rate reduction revealed a 7-to-1 financial return on the SBML training investment.31 Another study shows that the SBML curriculum on CVC insertion had unexpected yet welcome collateral effects, expressed as steadily increasing pretest passing scores, on new residents in the same clinical and educational setting.32 This made it necessary to “raise the bar”—that is, to boost the MPS from 79% of skills checklist items correct to 88% correct, a nearly flawless performance level.33 In addition to demonstrating improvements in safe patient care practices and patient outcomes, these research studies demonstrate a shift in clinical culture toward recognizing and valuing the contribution of rigorous education to patient care.
Phase 2: Curriculum dissemination to a nearby academic community hospital
The Northwestern SBML CVC curriculum was effective at NMH because it improved trainee skills, reduced mechanical complications during actual CVC insertions, and significantly reduced CLABSI rates in the MICU. We did not know, however, if the SBML CVC curriculum would work in a different medical education setting.
Table 1 presents Pawson and colleagues’21 seven defining features of CSIs as applied to the SBML CVC curriculum dissemination program, using implementation science principles,34 that introduced the curriculum at Mercy Hospital and Medical Center (MHMC), an academic community hospital in Chicago. The following narrative amplifies the information presented in the table.
MHMC is located four miles from NMH34 and has Accreditation Council for Graduate Medical Education–accredited postgraduate education programs in internal medicine, emergency medicine, surgery, and three other medical specialties. Two Chicago Tribune articles published in May 2010 described the burden of CLABSI rates in Illinois hospitals. In one of these articles, MHMC was identified as a hospital with a higher-than-expected CLABSI rate,35 and in the other article, NMH was cited for a very low CLABSI rate.36 The MHMC internal medicine residency program director, who had educational experience and an academic affiliation at Northwestern, reasoned that rigorous resident education would mitigate the CLABSI rate problem. Historically, MHMC residents learned CVC insertion from lectures or vicariously by observing other residents inserting CVCs on patients in the MICU. There was no reliable assessment of resident CVC insertion competency at MHMC. Despite these conditions, internal medicine residents staffed the MHMC MICU routinely and inserted almost all CVCs with supervision by second- and third-year residents. The MHMC internal medicine residency program director contacted Northwestern colleagues to request the dissemination of the SBML CVC curriculum at MHMC and to propose a collaborative evaluation of the curriculum’s effects on trainee skills and CLABSI rates.
Active involvement by a variety of persons was needed to implement, maintain, and evaluate the SBML CVC curriculum dissemination. Key personnel included (1) the MHMC internal medicine residency program director who sought the educational intervention; (2) MHMC hospital leaders who were persuaded by NMH training26,27 and outcome30 data that CLABSI rate reduction was feasible and necessary; (3) two MHMC medicine chief residents who completed the SBML CVC insertion course at NMH, observed NMH training sessions, and became primary clinical teachers and evaluators; and (4) two pro bono Northwestern educational consultants (J.H.B., D.B.W.). Of note, involvement by MHMC internal medicine faculty and fellows in resident procedural training was neither part of local educational custom nor part of the SBML CVC curriculum dissemination.
The initial setup for the SBML CVC curriculum dissemination at MHMC was intense, involving a sequence of planning, educational, and logistical events over several months. Starting in August 2010, the Northwestern educational consultants spent three half-days at MHMC to become acquainted with the hospital and its MICU, discern SBML CVC curriculum dissemination barriers, and equip and prepare the setting for rigorous SBML training. The MHMC chief residents received at least 10 hours of faculty development as instruction, observation, practice, feedback, and supervision from the educational consultants. The educational consultants also provided remote consultation for 10 months through monthly reviews of the training progress at MHMC via meetings, phone calls, and e-mail communication. MHMC provided CVC insertion supplies and purchased a central line simulator with replacement skins for the SBML training. Recurring costs, including each trainee’s gowns, caps, masks, sterile gloves, and dressings, are approximately $2,000 annually. The monthly number of CLABSIs per 1,000 catheter days was measured routinely by MHMC infection control personnel using protocols published by the National Healthcare Safety Network.37 Acute Physiology and Chronic Health Evaluation III38 mortality scores were used to compare the severity of illness of MICU patients before and after the SBML CVC curriculum dissemination, and mean monthly Acute Physiology and Chronic Health Evaluation III scores were used to determine whether changes in severity of illness affected CLABSI rates.
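The CLABSI metric tracked at both hospitals is a rate normalized to central-line exposure, which is what allows comparison across months and units of different size. A sketch of the arithmetic (the function name is ours, and the numbers below are invented for illustration):

```python
def clabsi_rate(infections, catheter_days):
    """CLABSIs per 1,000 central-line (catheter) days, the standard
    NHSN-style normalization: rate = 1000 * infections / catheter_days."""
    if catheter_days <= 0:
        raise ValueError("catheter_days must be positive")
    return 1000.0 * infections / catheter_days

# Hypothetical month: 2 infections over 600 catheter days
print(round(clabsi_rate(2, 600), 2))  # 3.33 per 1,000 catheter days
```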
The SBML CVC curriculum was implemented at MHMC to match the previous experience at NMH, with all eligible internal medicine and emergency medicine residents trained in CVC insertion to the mastery standard. The MHMC training outcomes were nearly identical to the NMH outcomes (see Figure 1):
Mercy residents performed similar to Northwestern residents on a CVC insertion clinical skills examination, including the number of residents meeting the MPS at initial post-test and the time required [less than 1 hour] to remediate residents who did not initially meet the MPS.34
The SBML CVC curriculum dissemination also produced a 74% reduction in the incidence of CLABSI in the MHMC intensive care unit following the training, essentially replicating the NMH results.34 MHMC residents were not aware of the nature of the dissemination research study, and MHMC infection control personnel were aware of neither its nature nor its timing.
The SBML CVC curriculum dissemination from NMH to MHMC was not seamless. When the original study ended in May 2012, there was a six-month hiatus in SBML training due to the startup of a new and untrained chief resident, which produced a lapse in educational continuity. Figure 2 shows that the MHMC CLABSI rate rose during this six-month interval (June–December 2012), a rise detected by MHMC infection control personnel. However, after the new chief resident was oriented to the role and educated in CVC insertion to the mastery standard, the SBML CVC curriculum was reintroduced for resident education and the CLABSI rate dropped to zero. This unplanned “natural” experiment39 provides added evidence of the power of SBML to affect patient outcomes.
Lessons Learned
This curriculum dissemination underscores the idea that CSIs are fragile, shaped by the social conditions in which they are embedded, and can even go in reverse. Evidence from this study also shows that rigorous education is a key feature of clinical quality improvement.
We also see from this work that the curriculum dissemination CSI mutated slightly during the journey from its NMH origins to its MHMC destination. We learned that, despite our intentions, clinical curricula at different sites are never identical. Curriculum installation, maintenance, and integrity depend on deliberate organizational attention to planning, engagement, execution, and evaluation. Unexpected disruptions, such as the short-term lapse in SBML training after the succession of chief residents, cannot be detected and fixed without continuous process monitoring and outcome measurement.
The dissemination of the SBML CVC curriculum from NMH to MHMC was neither easy nor perfect. The dissemination was adapted to the new clinical and educational academic community hospital setting and achieved good results for at least four reasons. The SBML CVC curriculum included (1) a powerful, sustained educational intervention grounded in mastery learning principles with deliberate practice and an expectation of excellence for all learners; (2) early and sustained attention to implementation science principles, including intense faculty training, frequent communication, tight program management, and engaged clinical leadership; (3) commitment to a unified patient care culture; and (4) established mechanisms to rigorously measure resident learning and skills acquisition and patient care outcomes expressed as CVC complications and CLABSI rates.
The successful dissemination of the SBML CVC curriculum also underscores the importance of local champions—that is, educationally influential physicians15,40—and organizational context to achieve important educational and patient care goals.41,42 Local champions are critical because in medical education organizations the curve of innovation adoption is distributed normally with few faculty innovators (2.5%) and early adopters (13.5%)15; most medical education faculty are characterized as early majority (34%), late majority (34%), or laggards (16%).15 Faculty in the innovator, early adopter, and early majority groups are the most likely source of innovation dissemination and maintenance in medical education organizations.
The clinical and educational leadership at MHMC was composed of early adopters, meaning the setting was receptive to change and improvement. In addition, the internal medicine residency program director served on the hospital’s continuous quality improvement committee that was charged with improving patient care. Despite these conditions, the “spike” in the CLABSI rate in June–December 2012 (shown in Figure 2) is a reminder that CSIs are fragile, and there is a need to ensure that an educational intervention is sustained over time with continuous outcome measurement to gain lasting effects.
The organizational context is important for the dissemination of an innovation like mastery learning because the novel educational approach will survive only if it is received with praise, reinforcement, resources, and security.15,41,42 The successful dissemination of an innovation in medical education depends on how an organization treats and responds to innovators, early adopters, and early majority faculty groups. Educational inertia is very hard to overcome because, as Berwick15 argues, “Medical communities are primarily local in their orientation, are dominated numerically by early and late majority groups, and do not trust remote and personally unfamiliar sources of authority.”
The chief lesson we have learned from the largely successful dissemination of the SBML CVC curriculum is that such work takes time, much effort, and a common commitment to education and patient care goals from both the parent and the receiving organizations. These findings are consistent with large-scale quality improvement interventions such as the Keystone project in Michigan, which was aimed at reducing MICU infection rates in over 100 hospitals.43 The Michigan Keystone project was successful because the health care intervention involved not only clinical practice changes but also strong doses of medical and nursing education, peer and public pressure, accountability, professional sanctions, and other social variables.43 Knowledge about an innovation like mastery learning derived from customary educational programs such as conferences and grand rounds is insufficient for its successful implementation in educational practice. Successful dissemination of new programs and ideas in medical education takes active educational leadership, personal contacts, dedication, hard work, rigorous measurement, and attention to implementation science principles.
1. Passiment M, Sacks H, Huang G. Medical Simulation in Medical Education: Results of an AAMC Survey. Washington, DC: Association of American Medical Colleges; September 2011. https://www.aamc.org/download/259760/data. Accessed June 25, 2015
2. Gaba DM. The future vision of simulation in health care. Qual Saf Health Care. 2004;13(suppl 1):i2–i10
3. Loyd GE, Lake CL, Greenberg RB, eds. Practical Health Care Simulations. Philadelphia, Pa: Elsevier Mosby; 2004
4. Kyle RR Jr, Murray WB, eds. Clinical Simulation: Operations, Engineering, and Management. Burlington, Mass: Academic Press; 2008
5. Monographs from the First Research Consensus Summit of the Society for Simulation in Healthcare. Simul Healthc. 2011;6(suppl):S1–S67
6. Thompson BM, Schneider VF, Haidet P, et al. Team-based learning at ten medical schools: Two years later. Med Educ. 2007;41:250–257
7. McGaghie WC, Miller GE, Sajid AW, Telder TV. Competency-based curriculum development in medical education: An introduction. Public Health Pap. 1978;68:11–91
8. Issenberg SB, McGaghie WC. Looking to the future. In: McGaghie WC, ed. International Best Practices for Evaluation in the Health Professions. London, England: Radcliffe Publishing; 2013
9. Cohen ER, Barsuk JH, Moazed F, et al. Making July safer: Simulation-based mastery learning during intern boot camp. Acad Med. 2013;88:233–239
10. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: A systematic review and meta-analysis. Acad Med. 2013;88:1178–1186
11. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB. A critical review of simulation-based mastery learning with translational outcomes. Med Educ. 2014;48:375–385
12. McGaghie WC, Draycott TJ, Dunn WF, Lopez CM, Stefanidis D. Evaluating the impact of simulation on translational patient outcomes. Simul Healthc. 2011;6(suppl):S42–S47
13. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Translational educational research: A necessity for effective health-care improvement. Chest. 2012;142:1097–1103
14. Harden R, Grant J, Buckley G, Hart IR. BEME guide no. 1: Best evidence medical education. Med Teach. 1999;21:553–562
15. Berwick DM. Disseminating innovations in health care. JAMA. 2003;289:1969–1975
16. Bloom SW. Structure and ideology in medical education: An analysis of resistance to change. J Health Soc Behav. 1988;29:294–306
18. Bonham AC, Solomon MZ. Moving comparative effectiveness research into practice: Implementation science and the role of academic medicine. Health Aff (Millwood). 2010;29:1901–1905
19. McGaghie WC. Implementation science: Addressing complexity in medical education. Med Teach. 2011;33:97–98
20. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009;4:50
21. Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review—A new method of systematic review designed for complex policy interventions. J Health Serv Res Policy. 2005;10(suppl 1):21–34
22. Pawson R. Evidence-Based Policy: A Realist Perspective. Thousand Oaks, Calif: SAGE; 2006
23. Grimshaw JM, Eccles MP, Lavis JN, Hill SJ, Squires JE. Knowledge translation of research findings. Implement Sci. 2012;7:50
24. Lobb R, Colditz GA. Implementation science and its application to population health. Annu Rev Public Health. 2013;34:235–251
25. McGaghie WC, Siddall VJ, Mazmanian PE, Myers J; American College of Chest Physicians Health and Science Policy Committee. Lessons for continuing medical education from simulation research in undergraduate and graduate medical education: Effectiveness of continuing medical education: American College of Chest Physicians Evidence-Based Educational Guidelines. Chest. 2009;135(3 suppl):62S–68S
26. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4:397–403
27. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37:2697–2701
28. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81
29. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10 suppl):S9–S12
30. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169:1420–1423
31. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5:98–102
32. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Unexpected collateral effects of simulation-based medical education. Acad Med. 2011;86:1513–1517
33. Cohen ER, Barsuk JH, McGaghie WC, Wayne DB. Raising the bar: Reassessing standards for procedural competence. Teach Learn Med. 2013;25:6–9
34. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23:749–756
37. Horan TC, Andrus M, Dudeck MA. CDC/NHSN surveillance definition of health care-associated infection and criteria for specific types of infections in the acute care setting. Am J Infect Control. 2008;36:309–332
38. Knaus WA, Wagner DP, Draper EA, et al. The APACHE III prognostic system. Risk prediction of hospital mortality for critically ill hospitalized adults. Chest. 1991;100:1619–1636
39. Shadish WR, Cook TD, Campbell DT. Experimental and Quasi-experimental Designs for Generalized Causal Inference. Boston, Mass: Houghton Mifflin; 2002
40. Stross JK. The educationally influential physician. J Cont Educ Health Prof. 1996;16:167–172
41. Rogers EM. Diffusion of Innovations. 5th ed. New York, NY: Free Press; 2003
42. Krein SL, Damschroder LJ, Kowalski CP, Forman J, Hofer TP, Saint S. The influence of organizational context on quality improvement and patient safety efforts in infection prevention: A multi-center qualitative study. Soc Sci Med. 2010;71:1692–1701
43. Dixon-Woods M, Bosk CL, Aveling EL, Goeschel CA, Pronovost PJ. Explaining Michigan: Developing an ex post theory of a quality improvement program. Milbank Q. 2011;89:167–205