Medical educators have used simulation-based education (SBE) to improve the quality of patient care in a variety of medical disciplines.1–9 Simulation promotes trainees' acquisition of important clinical skills,10,11 supports their achievement of rigorous standards in a safe, forgiving environment,10,11 and allows them to receive feedback about their performance from trained faculty supervisors. The American Board of Internal Medicine recommends simulation training for internal medicine residents before the performance of invasive procedures on patients because research has shown a link between simulation training and improved patient safety.12
Residents frequently insert central venous catheters (CVCs) to provide resuscitation for critically ill patients in the intensive care unit (ICU). Medical trainees without adequate preparation and training often perform this procedure13–15 even though it is associated with serious complications including pneumothorax, arterial puncture, and catheter-associated bloodstream infection.16
McGaghie17 has described medical education research, such as that which examines the link between SBE and patient outcomes, as translational science that follows the 3Ts roadmap. T1 educational research outcomes show trainee skill and knowledge improvement in controlled laboratory settings. T2 education research outcomes demonstrate better patient care practices in hospitals and clinics. T3 education research outcomes demonstrate downstream improvements in patient or public health.17 Research shows that second- and third-year (senior) internal medicine residents who complete SBE to mastery standards demonstrate improved CVC insertion skills2,3 (T1) that are largely retained across one year.18 In addition, this training results in fewer CVC insertion-associated complications3 (T2) and catheter-related bloodstream infections4 (T3) among medical ICU patients.
At our institution, first-year (junior) residents participate in CVC insertion only under the direct supervision of a second- or third-year resident who has completed CVC SBE to mastery standards. Bandura's19 social learning theory addresses the importance of observation and vicarious learning in education. This theory suggests that junior physicians learn at the bedside from direct observation of senior physicians during the performance of procedures such as CVC insertion. However, the effect of vicarious learning on skill acquisition among residents who observe at the bedside is unknown.
We decided, therefore, to study the collateral effect of SBE by analyzing the baseline (pretest) scores on the CVC insertion clinical skills examination during the first three years of our CVC mastery learning curriculum. Because the pretest occurs before SBE, scores on this test reflect the CVC insertion skills of residents acquired vicariously in clinical care. Thus, the aim of this study was to assess changes in resident performance of simulated CVC insertions in an effort to determine whether the skills acquired by senior residents during SBE “trickle down” to junior residents who have not yet experienced SBE.
This is a retrospective, observational study of 102 second- and third-year internal medicine residents at Northwestern University (Chicago, Illinois) from July 2007 to June 2010 (three academic years). The Northwestern University institutional review board approved the study, and all participants provided informed consent before participating.
Internal medicine residents rotate through the medical ICU at Northwestern Memorial Hospital, a 792-bed tertiary care hospital in Chicago, Illinois, for a dedicated medical ICU experience during the first year of training and again as second- and/or third-year residents. Department of Medicine policy dictates that only simulator-trained second- or third-year residents may independently insert CVCs. First-year residents may participate in CVC insertion only under the direct supervision of a simulator-trained second- or third-year resident, a pulmonary and critical care fellow, or an ICU attending physician.
The details of our CVC training intervention are described elsewhere.2,3 In brief, residents complete SBE in CVC insertion to mastery standards one month before their first ICU rotation as a second- or third-year resident. Before the educational intervention, residents undergo baseline testing (pretest). A senior faculty member (J.H.B.) uses a 27-item checklist to assess their internal jugular and subclavian CVC insertion skills on a central line simulator. Subsequently, the residents receive at least two education sessions (each two hours) with standardized lectures, deliberate practice, and feedback using the simulator. After training, residents complete a posttest skills examination (involving the same checklist) on which they are expected to achieve a minimum passing score (MPS). Residents who do not meet the MPS at posttest return to the simulation laboratory for additional training. Meeting the MPS before completing training is the key feature of mastery learning.20,21 In mastery learning, educational outcomes are uniform, whereas training time varies.
An expert panel previously set the MPS during a standard setting exercise22 using the Angoff and Hofstee methods. All pre- and posttests were video recorded and graded by a single unblinded instructor (J.H.B.).
After they successfully complete the SBE training, second- and third-year residents are allowed to insert CVCs during actual patient care in the medical ICU. The education program includes no expectations or requirements for teaching CVC insertion skills to first-year residents. Figure 1 is a graphic representation of internal medicine CVC insertion education at Northwestern Memorial Hospital.
During the study period, first-year residents had progressively more contact with simulator-trained second- and third-year residents. In academic year (July through June) 2007, 57% (46 of 81) of the second- and third-year residents had successfully completed the SBE. In academic year 2008, the percentage increased to 76% (61 of 80 second- and third-year residents), and in academic year 2009, the percentage of simulator-trained residents rose to 96% (77 of 80).
At baseline testing (which occurred one month before the first medical ICU rotation), we obtained the following data about each second- or third-year internal medicine resident: age, gender, United States Medical Licensing Examination (USMLE) Step 1 and 2 scores, year of training, rating of CVC insertion confidence (scale 0 = not confident, 100 = very confident), and rating of clinical experience (number of internal jugular or subclavian central lines inserted). We evaluated differences in baseline (pretest) checklist scores by academic year (2007, 2008, and 2009) by comparing mean scores and the percentage of trainees who met or exceeded the MPS before training. A blinded faculty instructor with expertise in scoring clinical skills examinations (D.B.W.) rescored a random 25% sample of the videotaped pretests to assess interrater reliability.
We compared demographic data from each of the three academic years using analysis of variance (ANOVA) or the chi-square statistic. We also used ANOVA to evaluate differences among the three academic years in baseline checklist mean scores, and we used the chi-square statistic to evaluate differences from year to year in the percentage of residents who passed the pretest. We used multiple linear and logistic regression to evaluate the effects of age, gender, year of training, confidence, and CVC clinical experience on baseline test performance; we excluded USMLE Step 1 and 2 scores from the regression analysis because of incomplete data. We used the Cohen kappa coefficient to assess interrater reliability. We performed all analyses with IBM SPSS (version 19.0; Chicago, Illinois). All statistical tests were two-sided, and we set significance at alpha = .05.
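The item-level checklist ratings are not reproduced in this report; the following minimal, pure-Python sketch (with hypothetical rater marks) illustrates how the Cohen kappa coefficient used to assess interrater reliability is computed, as chance-corrected agreement between two raters scoring the same items:

```python
def cohen_kappa(ratings_a, ratings_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e the agreement expected by chance.
    """
    n = len(ratings_a)
    # Observed agreement: proportion of items on which the raters concur.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each rater's marginal category rates.
    categories = set(ratings_a) | set(ratings_b)
    p_e = sum(
        (ratings_a.count(c) / n) * (ratings_b.count(c) / n)
        for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical checklist marks (1 = item performed correctly) from two raters;
# these are illustrative values, not the study's actual data.
rater1 = [1, 1, 1, 0, 1, 0, 1, 1]
rater2 = [1, 1, 1, 0, 1, 1, 1, 1]
print(round(cohen_kappa(rater1, rater2), 2))  # -> 0.6
```

In this toy example the raters agree on 7 of 8 items (p_o = 0.875), but because both raters mark most items "1," chance agreement is high (p_e = 0.6875), yielding kappa = 0.6; the study's observed kappa of 0.92 indicates substantially stronger agreement.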
In 2007, 46 second- and third-year residents completed CVC SBE; in 2008, 32 residents completed the training; and in 2009, 24 residents completed the CVC SBE. As shown in Table 1, the three groups of residents were very similar demographically; however, a higher percentage of second-year residents (compared with third-year residents) completed the training in 2008 and 2009 than in 2007.
Mean internal jugular pretest scores improved from 46.7% (standard deviation [SD] = 20.8%) in 2007 to 55.7% (SD = 22.5%) in 2008 and to 70.8% (SD = 22.4%) in 2009 (P < .001). Mean subclavian pretest scores dropped slightly from 48.3% (SD = 25.5%) in 2007 to 45.6% (SD = 31.0%) in 2008 but rose significantly to 63.6% (SD = 27.3%) in 2009 (P = .04). Multiple linear regression demonstrated that the improvement in internal jugular (P < .001) and subclavian (P = .01) scores persisted after adjustment for covariates (age, gender, year of training, confidence rating, and clinical experience rating).
Figure 2 displays the percentage of trainees meeting or exceeding the MPS on the baseline pretest. For internal jugular CVC insertion, the passing rate was 7% (3/46) in 2007, 16% (5/32) in 2008, and 38% (9/24) in 2009 (P = .004; Figure 2). For subclavian CVC insertion, the passing rate was 11% (5/46) in 2007, 19% (6/32) in 2008, and 38% (9/24) in 2009 (P = .028). Similarly, multiple logistic regression demonstrated that the improvement in passing rates persisted after adjustment for covariates for both internal jugular (P = .006) and subclavian (P = .003) insertion.
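The year-to-year comparisons of passing rates above can be reproduced from the reported counts alone. The sketch below computes the Pearson chi-square statistic for each 2 × 3 (pass/fail by academic year) table in pure Python; it relies on the fact that for 2 degrees of freedom the chi-square survival function has the closed form exp(−x/2). Exact P values may differ slightly from the published ones depending on the test variant used:

```python
from math import exp

def chi2_2xk(pass_counts, group_totals):
    """Pearson chi-square for a 2 x k table of pass/fail counts by group.

    Returns (statistic, p_value). The closed-form p-value exp(-x/2)
    is valid only for df = 2, i.e., k = 3 groups.
    """
    fail_counts = [n - p for p, n in zip(pass_counts, group_totals)]
    grand_total = sum(group_totals)
    total_pass, total_fail = sum(pass_counts), sum(fail_counts)
    stat = 0.0
    for p, f, n in zip(pass_counts, fail_counts, group_totals):
        expected_pass = n * total_pass / grand_total
        expected_fail = n * total_fail / grand_total
        stat += (p - expected_pass) ** 2 / expected_pass
        stat += (f - expected_fail) ** 2 / expected_fail
    assert len(group_totals) == 3, "closed-form p-value assumes df = 2"
    return stat, exp(-stat / 2)

# Passing counts by academic year (2007, 2008, 2009), from the text.
stat_ij, p_ij = chi2_2xk([3, 5, 9], [46, 32, 24])   # internal jugular
stat_sc, p_sc = chi2_2xk([5, 6, 9], [46, 32, 24])   # subclavian
print(round(p_ij, 3))  # -> 0.004, matching the reported P value
print(round(p_sc, 3))  # close to the reported P = .028
```

This agreement between the hand computation and the published P values is a useful check that an uncorrected Pearson chi-square test underlies the reported comparisons.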
Interobserver agreement for the baseline skills checklist was high: κ = 0.92.
This study demonstrates that the baseline internal jugular and subclavian CVC insertion skills of second- and third-year internal medicine residents improved across three consecutive years. We believe this improvement occurred because other residents who had previously completed simulation training taught first-year residents (who had not completed SBE) how to insert CVCs via bedside demonstration. Observing the senior residents in the hospital produced annual improvements in baseline scores in the simulation laboratory. Our results suggest that residents who achieve a high level of skill during SBE vicariously transfer those skills and that knowledge by teaching junior trainees at the bedside. By contrast, residents who have not experienced rigorous training and skill assessment may pass poor techniques and errors down to the next generation of residents.
These findings have several implications. First, rigorous SBE for one group of trainees may have a collateral effect on the skills of subsequent trainees. Further study is needed to evaluate the optimal design of SBE interventions to maximize this outcome. Research methods such as direct observation, focus groups, and interviews (especially of junior residents, the collateral learners) may help clarify interactions among trainees at the bedside. Second, the collateral effect of SBE is modest compared with the powerful direct effect of SBE on individual trainees.2,3,23–25 Although 38% of residents in 2009 were able to demonstrate CVC insertion skills that exceeded the MPS without SBE, the majority of residents still required the full CVC SBE curriculum to reach this skill level. These data should remind educators that clinical experience alone is not only insufficient for residents to acquire invasive procedural skill proficiency but also potentially dangerous for patients.
Implementation science recognizes that a complex intervention, such as the SBE program for medical residents at Northwestern, has many moving parts.26 A recent review by Damschroder and colleagues27 presents a consolidated implementation science framework with five domains: (1) the intervention, (2) the inner setting (e.g., a teaching hospital), (3) the outer setting (the medical education enterprise), (4) the individuals involved, and (5) the process by which the implementation is accomplished. Successful educational interventions attend to these domains singly and in combination. The CVC SBE program focuses on these domains—through such elements as mastery learning (Domain 5) and deliberate practice (Domain 1) for junior residents (Domain 4) in a safe environment (Domain 2)—which may help explain how or why it effects improvement in the skills of trainees who experience it directly, as well as those who experience it collaterally.
Further, Pawson and colleagues28 teach that complex interventions have a variety of elements including a long implementation chain and features that mutate as a result of refinement and adaptation to local circumstances. They assert that complex interventions represent open systems continually feeding back on themselves: “As interventions are implemented, they change the conditions that made them work in the first place.”28 The SBE program has intended effects on procedural skill acquisition among residents and unintended (yet welcome) collateral effects on the educational environment. A similar systemic outcome has been achieved concerning resident mastery of advanced cardiac life support skills.23,29
This study has several limitations. First, it occurred in one institution with a limited number of residents. Further study is required in other programs using the mastery learning model for CVC insertion to determine generalizability to other settings and other skills. Second, because of the complex nature of interactions that occur in a health care system, we cannot rule out unknown factors contributing to the enhanced baseline skills of residents in CVC insertion in the third academic year (2009) compared with the first (2007). We do not believe, however, that skill improvement derived from residents' learning CVC insertion in clinical settings beyond the ICU. Such learning rarely occurs in our program. The training intervention may have become more effective during the study period, but this is also unlikely because the duration, content, and primary instructor did not change. Other unknown factors may have affected our results including health care professional training, equipment changes, case-mix differences, hospital policies and procedures, or other patient-safety initiatives; however, to our knowledge, no such institutional changes occurred during the study period. Third, differences may have existed among the three classes of residents from year to year, and we were unable to control for USMLE Step 1 and 2 scores as a marker of academic achievement because of incomplete data. However, there were no significant differences in mean USMLE scores between academic years, and earlier research shows no correlation between USMLE scores and procedural skill performance.3,23,24,30 Finally, the number of residents who received training per year decreased, but over time (cumulatively) the total number of trained residents increased. This occurred because simulation training occurs before the first ICU rotation as a senior resident, and many residents received the intervention during the second year and not during the third.
In conclusion, the benefits of SBE for senior residents trickled down to junior trainees as evidenced by improved pretest scores on the CVC insertion skill examination across three consecutive years. SBE is a powerful tool that boosts the skills of individuals who participate directly, and it may have collateral effects on others. Further study is needed to clarify the cause-and-effect relationship and to determine the influence of this collateral effect in other procedures and skills and among other trainees in other settings.
The authors would like to thank the Northwestern University internal medicine residents for their dedication to education and patient care. They would also like to acknowledge Dr. Douglas Vaughan and Dr. Mark V. Williams for their support and encouragement of this work.
The authors received some financial support from the Excellence in Academic Medicine Act, which is supported by the Illinois Department of Healthcare and Family Services.
Dr. McGaghie's contribution was supported in part by the Jacob R. Sucker, MD, Professorship in Medical Education and by grant UL 1 RR 025741 from the National Center for Research Resources, National Institutes of Health (NIH). The NIH had no role in the preparation, review, or approval of the manuscript.
The Northwestern University institutional review board reviewed and approved this study.
1 Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with LapMentor training: Results of a randomized, double-blinded study. Ann Surg. 2006;243:854–863.
2 Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med. 2009;4:397–403.
3 Barsuk JH, McGaghie WC, Cohen ER, O'Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37:2697–2701.
4 Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169:1420–1423.
5 Blum MG, Powers TW, Sundaresan S. Bronchoscopy simulator effectively prepares junior residents to competently perform basic clinical bronchoscopy. Ann Thorac Surg. 2004;78:287–291.
6 Cohen J, Cohen SA, Vora KC, et al. Multicenter, randomized, controlled trial of virtual-reality simulator training in acquisition of competency in colonoscopy. Gastrointest Endosc. 2006;64:361–368.
7 Draycott TJ, Crofts JF, Ash JP, et al. Improving neonatal outcome through practical shoulder dystocia training. Obstet Gynecol. 2008;112:14–20.
8 Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: Results of a randomized, double-blinded study. Ann Surg. 2002;236:458–464.
9 Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case–control study. Chest. 2008;133:56–61.
10 Boulet JR, Murray D, Kras J, Woodhouse J, McAllister J, Ziv A. Reliability and validity of a simulation-based acute care skills assessment for medical students and residents. Anesthesiology. 2003;99:1270–1280.
11 Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861–866.
13 Berns JS, O'Neill WC. Performance of procedures by nephrologists and nephrology fellows at U.S. nephrology training programs. Clin J Am Soc Nephrol. 2008;3:941–947.
14 Duffy FD, Holmboe ES. What procedures should internists do? Ann Intern Med. 2007;146:392–393.
15 Lucas BP, Asbury JK, Wang Y, et al. Impact of a bedside procedure service on general medicine inpatients: A firm-based trial. J Hosp Med. 2007;2:143–149.
16 McGee DC, Gould MK. Preventing complications of central venous catheterization. N Engl J Med. 2003;348:1123–1133.
17 McGaghie WC. Medical education research as translational science. Sci Transl Med. 2010;2:19cm8.
19 Bandura A. Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice-Hall; 1986.
20 Block JH, ed. Mastery Learning: Theory and Practice. New York, NY: Holt, Rinehart and Winston; 1971.
21 McGaghie WC, Miller GE, Sajid AW, Telder TV. Competency-Based Curriculum Development in Medical Education: An Introduction. Geneva, Switzerland: World Health Organization; 1978. http://www.eric.ed.gov/PDFS/ED168447.pdf. Accessed August 20, 2011.
23 Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med. 2006;21:251–256.
24 Wayne DB, Barsuk JH, O'Leary KJ, Fudala MJ, McGaghie WC. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48–54.
25 Barsuk JH, Ahya SN, Cohen ER, McGaghie WC, Wayne DB. Mastery learning of temporary hemodialysis catheter insertion by nephrology fellows using simulation technology and deliberate practice. Am J Kidney Dis. 2009;54:70–76.
26 McGaghie WC. Implementation science: Addressing complexity in medical education. Med Teach. 2011;33:97–98.
27 Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.
28 Pawson R, Greenhalgh T, Harvey G, Walshe K. Realist review—A new method of systematic review designed for complex policy interventions. J Health Serv Res Policy. 2005;10(suppl 1):21–34.
29 Didwania A, McGaghie WC, Cohen ER, et al. Progress toward improving the quality of cardiac arrest medical team responses at an academic teaching hospital. J Grad Med Educ. 2011;3:211–216.