Research Reports

Making July Safer: Simulation-Based Mastery Learning During Intern Boot Camp

Cohen, Elaine R. MEd; Barsuk, Jeffrey H. MD, MS; Moazed, Farzad MD; Caprio, Timothy MD; Didwania, Aashish MD; McGaghie, William C. PhD; Wayne, Diane B. MD

Academic Medicine 88(2):233–239, February 2013. | DOI: 10.1097/ACM.0b013e31827bfc0a


Teaching hospitals worldwide experience the “July effect,” defined as decreased health care efficiency and increased patient morbidity and mortality during the month of July.1–5 Researchers postulate that academic year-end turnover and interns’ unfamiliarity with a new work environment are major reasons for these findings.1–5 Uneven baseline clinical skills of recent medical school graduates may also contribute to the July effect. Several studies from the University of Michigan and Michigan State University reveal gaps in new interns’ knowledge and skills in such tasks as maintaining aseptic technique and obtaining informed consent.6,7 All interns gain clinical experience during undergraduate medical education; however, early clinical training does not guarantee clinical competence, and deficiencies measured at the beginning of internship often persist into later training years.7

Medical educators have noted this gap between expected and actual skills, and for many years they have called for a rigorous intern orientation.8–10 One proposed solution for the July effect is the use of simulation-based education with rigorous, competency-based outcome measurement.9,11 Research has shown that simulation-based education improves learning and patient care outcomes in clinical skills such as laparoscopic surgery,12,13 central venous catheter insertion,14 colonoscopy,15 obstetrical emergency management,16 and advanced cardiac life support.17 Simulation has also been used in medical student boot camp experiences to provide trainees with short-duration, high-intensity training meant to improve their confidence,18,19 enhance their skills,20–22 and ease their transition to residency.19 Educators have also used the boot camp model for surgical interns, but the effectiveness of this approach is unknown because most studies have focused on a narrow range of basic skills such as instrument identification, knot tying, suturing, and wound closure.23–26 Outcome measures of surgical boot camps have largely been limited to course participants’ satisfaction and self-confidence rather than rigorous assessment of their clinical skills, although some studies have shown short-term improvement in core technical skills.23–26 One recent study linked boot camp assessments to faculty evaluations and in-service examination scores. In this study, interns underwent training in procedural skills and patient care scenarios using simulation technology. Individual simulation-based performance scores correlated with subsequent clinical and core curriculum evaluations for the remainder of the intern year. These findings support the validity of the boot camp approach for resident competency evaluation.27

Providing high-quality clinical care within training programs requires assurance that all interns have mastered basic clinical skills before assuming patient care responsibilities. Mastery learning is a form of competency-based education in which skills are measured rigorously against high achievement standards. Educational outcomes are uniform, whereas educational time varies among trainees.28 Simulation-based mastery learning (SBML) is a powerful tool that can help boost residents’ clinical skills to high levels, improve patient care quality and outcomes,14,17,29 and reduce hospital costs.30 Applying the mastery model to intern orientation helps ensure that all interns meet a minimum performance standard before they complete their orientation and start their clinical training with patients. Educators have not yet conducted in-depth studies of the use of the mastery model for incoming interns because there are no universally accepted minimum standards for the clinical skills needed to graduate from medical school.

We developed an SBML internal medicine intern boot camp curriculum that covers a range of clinical skills needed to provide safe patient care, including physical examination and procedural skills, management of critically ill patients, and communication with patients. The aim of this study was to evaluate the impact of SBML during intern boot camp on the clinical skills of new trainees.

Method

Study design

This was a cohort study of a boot camp educational intervention designed to increase interns’ clinical skills. Previous cohorts of interns who completed the same clinical skills examinations (CSEs) without the boot camp curriculum served as historical controls.

Procedure

We conducted this study in the Northwestern University Division of Hospital Medicine Simulation Center and the Northwestern University Simulation Technology and Immersive Learning facility. All 47 incoming internal medicine interns in June 2011 at Northwestern University were eligible to participate in the study. The comparison group consisted of 109 interns from two prior training years (2009 and 2010) who were assessed but did not participate in boot camp. All trainees in this program undergo clinical training at Northwestern Memorial Hospital (NMH), a tertiary care urban teaching hospital, and the Jesse Brown Veterans Affairs Medical Center.

We asked 2011 interns to report three days early to NMH so that they could participate in boot camp as an addition to the standard medical center orientation. NMH provided the salary support for the interns during this time. We told boot camp participants that we would provide small-group education followed by rigorous individual performance assessment of each skill. The Northwestern University institutional review board approved the study, and all 47 interns provided informed consent before participation.

Boot camp trainees received a standardized intervention and evaluation for each competency and skill. The educational intervention consisted of three days (24 total hours) of small-group teaching sessions and individualized feedback and assessment. Interns rotated in groups of six through five scenarios covering the following skills: (1) cardiac auscultation, (2) paracentesis, (3) lumbar puncture (LP), (4) intensive care unit (ICU) clinical skills, and (5) code status discussion. We selected these skills on the basis of feedback from historical control interns and program faculty, who reported that historical control interns frequently performed these particular skills inadequately. Further, these scenarios represent a range of core competencies (patient care, medical knowledge, communication, and interpersonal skills) and patient care skills that internal medicine interns require during residency.

Teaching sessions lasted two to four hours and included both didactic content and the opportunity for simulation-based deliberate skills practice with individualized feedback, which research has shown to be a key component of clinical skill mastery in medicine and its related domains.31 American Board of Internal Medicine (ABIM)-certified faculty preceptors led the standardized teaching sessions. All sessions began with objectives and an orientation to the skill. Interns experienced a variety of modalities including didactic teaching, hands-on skill practice with simulators, online modules, and interaction with standardized patients. Table 1 provides additional specifics regarding teaching methods and related core competencies for each clinical skill. In addition, we have described the details of each intervention in earlier reports.32–36

Table 1:
Details on the Teaching, Content, Assessment, and Comparison With Historical Controls for Each of the Five Clinical Skills Taught and Assessed During Intern Boot Camp, Northwestern University

Measurements

Interns completed a five-station CSE after the teaching sessions. We modeled the CSE after other CSEs that link the mastery model of education to skill transfer in the clinical environment and improved patient care quality.14,17,29 Assessments included performance of simulated procedures (cardiac auscultation, paracentesis, LP), bedside management of critically ill patients, and communication with patients. Data from all assessment tools had been previously evaluated for reliability and validity.32–36 Details regarding assessment tools for each competency, including reliability coefficients from earlier studies, are available in Table 1.

We required all boot camp participants to meet or exceed a previously set minimum passing standard (MPS) for each of the five components of the CSE.32–36 Clinical experts determined MPSs using the Angoff (item-based) and Hofstee (group-based) standard setting methods.37 Interns who did not achieve the MPS engaged in more deliberate practice, and we retested them until they reached the MPS; this continued practice until trainees reach a minimum level of competence is the key feature of mastery learning.28
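To make the Angoff approach concrete, the sketch below computes an item-based MPS from hypothetical judge ratings; it is an illustration only, not the study’s actual standard-setting data, and the complementary Hofstee (group-based) step is noted in a comment rather than implemented.

```python
# Illustrative Angoff (item-based) standard setting with hypothetical
# ratings. Each expert judge estimates, per checklist item, the
# probability that a borderline (minimally competent) intern would
# perform the item correctly; the MPS is the mean across judges.
from statistics import mean

# rows = judges, columns = checklist items (hypothetical estimates)
judge_estimates = [
    [0.80, 0.70, 0.90, 0.60],
    [0.75, 0.65, 0.85, 0.70],
    [0.85, 0.60, 0.95, 0.65],
]

# Each judge's expected total score for a borderline examinee
judge_cut_scores = [mean(row) for row in judge_estimates]

# Angoff MPS, expressed as percent correct. A Hofstee (group-based)
# check would then bound this value using judges' acceptable ranges
# for the cut score and the failure rate.
mps = mean(judge_cut_scores) * 100
print(f"Angoff minimum passing standard: {mps:.1f}%")
```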

Historical control interns (n = 109) did not participate in the boot camp educational intervention. We previously assessed them on the same CSEs to provide baseline performance data. Historical controls from 2010 (n = 58) completed the paracentesis, LP, and code status discussion CSEs.33,34,36 Historical control interns from 2009 (n = 51) completed the cardiac auscultation and ICU clinical skills CSEs.32,35

We assessed internal consistency reliability for the cardiac auscultation multiple-choice exam.32 We video-recorded the paracentesis, LP, ICU clinical skills, and code status discussion CSEs from interns in both the intervention and control groups to assess interrater reliability. A second rater rescored a 50% random sample of all video-recorded CSEs using the checklists; these examiners were blinded to the study design and aims.

Intervention and historical control group participants provided demographic data including age, gender, medical school, and United States Medical Licensing Exam (USMLE) Step 1 and 2 scores. Boot camp participants provided information about their prior clinical experiences managing patients requiring each procedure or skill, and they assessed their baseline confidence to perform each clinical skill on a self-rating scale (0 = not confident to 100 = very confident). We also surveyed boot camp participants about their satisfaction with the curriculum after completing boot camp.

Primary outcome measures were the performance of boot-camp-trained interns and historical controls on the five components of the CSE. Secondary outcome measures were intern satisfaction with the curriculum and the relationships between CSE performance and demographics, prior experience, and self-confidence.

Statistical analysis

We estimated internal consistency reliability for the cardiac auscultation examination using the KR-21 formula. We estimated checklist score reliability for the paracentesis, LP, ICU clinical skills, and code status discussion CSEs by calculating interrater reliability as the mean kappa (κn) coefficient across all checklist items. We compared demographic differences between boot-camp-trained interns and historical controls using independent t tests and the chi-square statistic. We compared mean boot camp intern CSE scores with mean historical control scores using independent t tests. We used multiple linear regression to evaluate the relationship between CSE scores and training status (boot-camp-trained interns versus historical controls) while controlling for age, gender, and USMLE Step 1 and 2 scores. We also used multiple linear regression to assess relationships within the boot-camp-trained group between CSE performance and age, gender, prior experience, self-confidence, and USMLE Step 1 and 2 scores. We conducted all analyses using IBM SPSS Statistics software, version 20.0 (SPSS Inc., Chicago, Illinois).
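For readers who want the mechanics, the sketch below implements both reliability estimates on hypothetical data (the study itself used SPSS): KR-21 from the number of items and the mean and variance of total scores, and a mean Cohen’s kappa across binary checklist items scored by two raters.

```python
# Minimal sketch (hypothetical data, not the study's) of the two
# reliability estimates used in the analysis.
from statistics import mean, pvariance

def kr21(num_items: int, totals: list[float]) -> float:
    """Kuder-Richardson formula 21: (k/(k-1)) * (1 - M(k-M)/(k*s^2)),
    where k = number of items, M and s^2 = mean and variance of totals."""
    m, var = mean(totals), pvariance(totals)
    return (num_items / (num_items - 1)) * (1 - m * (num_items - m) / (num_items * var))

def cohens_kappa(rater_a: list[int], rater_b: list[int]) -> float:
    """Cohen's kappa for two raters scoring one binary checklist item."""
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = sum(rater_a) / n, sum(rater_b) / n   # marginal "done" rates
    p_exp = pa * pb + (1 - pa) * (1 - pb)         # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical totals on a 25-item exam, and two checklist items each
# scored for five interns by both raters
print(kr21(25, [21, 18, 23, 19, 22, 17, 20, 24]))
items = [([1, 1, 0, 1, 1], [1, 1, 0, 1, 0]),
         ([0, 1, 1, 1, 0], [0, 1, 1, 1, 0])]
print(mean(cohens_kappa(a, b) for a, b in items))  # mean kappa across items
```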

Results

All 47 eligible interns consented to participate and completed the entire protocol. Demographic data for boot-camp-trained interns and historical controls are available in Table 2. USMLE Step 2 scores were slightly higher in the boot-camp-trained group than in two of the historical control groups. We observed no other differences between groups.

Table 2:
Demographic Characteristics of Boot-Camp-Trained Interns (n = 47) and Historical Controls (n = 109) for Each of Five Clinical Skills

Internal consistency reliability for boot camp cardiac auscultation exam scores was KR-21 = 0.70. Interrater reliability was high across all components of the boot camp CSE (κn = 0.85 for paracentesis, κn = 0.90 for LP, κn = 0.94 for ICU clinical skills, and κn = 0.70 for code status discussion).

Figure 1 presents a graphic portrayal of boot-camp-trained intern and historical control CSE performance. Boot-camp-trained interns performed significantly better (P < .01) than historical controls on all skills assessed. Means for each examination were as follows:

Figure 1:
Boot-camp-trained (2011) and historical control (2009, 2010) intern performance on clinical skills examinations at Northwestern University. *Difference between historical controls and boot-camp-trained interns is significant at P < .01.
  • 91.0% (standard deviation [SD] = 9.6%) versus 76.9% (SD = 14.6%) for cardiac auscultation,
  • 94.4% (SD = 4.8%) versus 33.0% (SD = 15.2%) for paracentesis,
  • 96.3% (SD = 4.2%) versus 46.3% (SD = 17.4%) for LP,
  • 89.0% (SD = 7.4%) versus 74.8% (SD = 14.1%) for ICU clinical skills, and
  • 76.4% (SD = 10.0%) versus 53.2% (SD = 16.2%) for code status discussions.

Results remained significant after controlling for age, gender, and USMLE Step 1 and 2 scores (P < .001).

Table 3 presents baseline experience and self-confidence for boot-camp-trained interns. Regression analysis also showed that there were no significant relationships within the boot-camp-trained group between CSE scores and age, gender, prior experience, self-confidence, or USMLE Step 1 and 2 scores.

Table 3:
Self-Reported Experience and Confidence of 47 Interns Prior to Boot Camp

Most interns met the MPS on the CSE for each skill after standard training. Interns who did not initially meet the MPS were referred for additional practice and retesting until they reached the MPS. We did not detect any demographic or academic differences between interns who initially met the MPS and those who required additional training. Of the 47 boot camp interns, 6 (13%) needed up to one hour of additional training in cardiac auscultation, 2 (4%) needed an additional hour of training in paracentesis, 7 (15%) in LP, 11 (23%) in ICU clinical skills, and 18 (38%) in code status discussions.

Interns rated the curriculum highly (Table 4). Participants agreed that the boot camp provided valuable preparation for their internship, boosted their self-confidence and clinical skills, and should be a required part of internal medicine residency education.

Table 4:
Satisfaction* of 47 Interns With the Boot Camp Curriculum

Discussion and Conclusions

This study shows that a three-day SBML boot camp curriculum is a feasible and effective way to help ensure competence for incoming interns in a broad range of important clinical skills before they start postgraduate training in internal medicine. Our results clearly demonstrate that boot-camp-trained interns performed significantly better than historical controls in a variety of skills essential to high-quality patient care. These skills span the core competencies of patient care (cardiac auscultation, paracentesis, LP), medical knowledge (ICU clinical skills), and communication (leading a code status discussion), as well as judgment, discernment, and participation in team-based care. To “graduate” from boot camp and begin their clinical training, interns had to demonstrate a level of skill predetermined by an expert panel as required for safe and effective patient care.

To our knowledge, this is the first study to impose rigorous competency measurements and an MPS for clinical skills of incoming internal medicine interns. Use of the mastery model requires physicians-in-training to attain a minimum standard of proficiency in a broad range of clinical skills before starting actual patient care. Most interns met the MPS for each clinical skill within the standard boot camp curriculum. Remediation was rarely needed for procedural skills but was necessary for a larger percentage of interns in leading a code status discussion. We think this outcome is likely due to the complexity of talking about code status and the limited opportunity for medical students to observe and participate in such conversations. The individualized approach to learning and assessment used in this study fits with new graduate medical education paradigms such as the Milestones Project of the ABIM, through which residents acquire levels of knowledge, skills, and attitudes to advance in and graduate from their training program.38

Our results confirm that current training models are insufficient to prepare medical school graduates to enter postgraduate training. Interns reported very low self-confidence to perform any of the clinical skills except cardiac auscultation. They reported low confidence despite performing well above average on USMLE Steps 1 and 2 and despite the fact that almost all of them graduated from U.S. medical schools, where one assumes that clinical education is rigorous and of high quality. The majority of interns had no experience with the common bedside procedures of paracentesis and LP, and almost 30% reported no experience with ICU clinical skills or code status discussions. Importantly, neither prior experience nor self-confidence predicted CSE performance. Even though interns are supervised during actual clinical care, skills such as these are necessary on the first day of residency. Because the U.S. medical school graduates in this study had not acquired these skills, the alternative to boot camp is the traditional method of learning on the job. This model is inefficient, inaccurate, and challenging because of new duty hour limitations.39 Incoming interns have a range of clinical experience and skill. As shown by the course evaluation questionnaire, interns are aware of their deficiencies, and they are motivated and enthusiastic about participating in a rigorous educational experience. A boot camp curriculum with required minimum performance standards reduces variation and allows for further skill development and assessment once clinical training begins.

Poor baseline performance of basic clinical skills by interns at well-known centers for graduate medical education in this study and others6,7 is cause for serious concern. Reliance on passive clinical training during medical school clerkships, without rigorous skill assessment and feedback, is insufficient preparation for internship and affects patient care quality.1–5 Medical educators must look beyond traditional clerkship models with 19th-century origins40 and develop curricula to ensure that 21st-century physicians are competent to provide safe and effective patient care. Changes in the senior year of medical school and modified graduation requirements are needed to assure the public that medical school graduates possess the necessary knowledge, skills, and attitudes to enter postgraduate training in their chosen specialty.

SBML shows great promise for boosting the skills of recent medical school graduates and correcting their deficiencies. Use of simulation-based education as an adjunct to traditional clinical training is superior to clinical training alone.41 A growing body of evidence also demonstrates that skills attained during SBML are retained over time,42,43 improve patient outcomes,14,17,29 and have collateral effects that further improve the residency training environment.44

This study has several limitations. First, we conducted it at a single institution with a limited number of trainees. Second, we did not evaluate all clinical skills. However, the ones we studied are a representative sample of the skills needed to begin internal medicine residency training. Future boot camp curricula might also include topics such as patient care handoffs and transitions of care. Third, we have no information on skill retention after boot camp, and further study is needed to determine the role of booster training. Fourth, we have not yet linked skills attained during boot camp with improved patient care, although the SBML model has demonstrated these effects in earlier work.14,17,29,32 Finally, our intervention cost $34,282 (including faculty time, resident salary, space rental, and supplies), and we have not assessed the cost-effectiveness of boot camp. Doing so is important, yet challenging, because the costs of inefficiency, redundant or erroneous test ordering, and poor physician–patient communication are often hidden. We are encouraged by findings that SBML was highly cost-effective (return on investment 7:1) when it was used previously in the residency training environment to reduce preventable patient care complications.30

In summary, SBML intern boot camp improves clinical skills, reduces variation, and enhances confidence. The mastery model allows for individualized training and assessment of interns and ensures that each intern has attained a baseline level of skill before providing actual patient care. This study is a first step toward the new educational models that medical educators must develop and use in order to make July safer.

Acknowledgments: The authors acknowledge Douglas E. Vaughan, MD, for his support and encouragement of this work.

Funding/Support: Northwestern Memorial Hospital and the Excellence in Academic Medicine Act, supported by the Illinois Department of Healthcare and Family Services.

Other disclosures: Dr. McGaghie’s contribution was supported in part by the Jacob R. Suker, MD, professorship in medical education and by grant UL 1 RR 025741 from the National Center for Research Resources, National Institutes of Health (NIH). The NIH had no role in the preparation, review, or approval of the manuscript.

Ethical approval: Northwestern University institutional review board approved this study.

Previous presentations: Material from this manuscript was presented as a poster at the Association of Program Directors in Internal Medicine Spring Meeting, April 2012, Atlanta, Georgia.

References

1. Phillips DP, Barker GE. A July spike in fatal medication errors: A possible effect of new medical residents. J Gen Intern Med. 2010;25:774–779
2. Haller G, Myles PS, Taffé P, Perneger TV, Wu CL. Rate of undesirable events at beginning of academic year: Retrospective cohort study. BMJ. 2009;339:b3974
3. Jen MH, Bottle A, Majeed A, Bell D, Aylin P. Early in-hospital mortality following trainee doctors’ first day at work. PLoS ONE. 2009;4:e7103
4. Anderson KL, Koval KJ, Spratt KF. Hip fracture outcome: Is there a “July effect”? Am J Orthop. 2009;38:606–611
5. Young JQ, Ranji SR, Wachter RM, Lee CM, Niehaus B, Auerbach AD. “July effect”: Impact of the academic year-end changeover on patient outcomes: A systematic review. Ann Intern Med. 2011;155:309–315
6. Lypson ML, Frohna JG, Gruppen LD, Woolliscroft JO. Assessing residents’ competencies at baseline: Identifying the gaps. Acad Med. 2004;79:564–570
7. Wagner D, Lypson ML. Centralized assessment in graduate medical education: Cents and sensibilities. J Grad Med Educ. 2009;1:21–27
8. Barach P, Philibert I. The July effect: Fertile ground for systems improvement. Ann Intern Med. 2011;155:331–332
9. Jarrett MP. Impact of the “July effect” on patient outcomes. Ann Intern Med. 2012;156:168
10. Glick S. Changing the guard. N Engl J Med. 1966;275:733
11. Young JQ, Auerbach AD, Ranji SR. Impact of the “July effect” on patient outcomes. Ann Intern Med. 2012;156:168
12. Andreatta PB, Woodrum DT, Birkmeyer JD, et al. Laparoscopic skills are improved with LapMentor training: Results of a randomized, double-blinded study. Ann Surg. 2006;243:854–860
13. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: Results of a randomized, double-blinded study. Ann Surg. 2002;236:458–463
14. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37:2697–2701
15. Cohen J, Cohen SA, Vora KC, et al. Multicenter, randomized, controlled trial of virtual-reality simulator training in acquisition of competency in colonoscopy. Gastrointest Endosc. 2006;64:361–368
16. Draycott TJ, Crofts JF, Ash JP, et al. Improving neonatal outcome through practical shoulder dystocia training. Obstet Gynecol. 2008;112:14–20
17. Wayne DB, Didwania A, Feinglass J, Fudala MJ, Barsuk JH, McGaghie WC. Simulation-based education improves quality of care during cardiac arrest team responses at an academic teaching hospital: A case–control study. Chest. 2008;133:56–61
18. Esterl RM Jr, Henzi DL, Cohn SM. Senior medical student “boot camp”: Can result in increased self-confidence before starting surgery internships. Curr Surg. 2006;63:264–268
19. Laack TA, Newman JS, Goyal DG, Torsher LC. A 1-week simulated internship course helps prepare medical students for transition to residency. Simul Healthc. 2010;5:127–132
20. Brunt LM, Halpin VJ, Klingensmith ME, et al. Accelerated skills preparation and assessment for senior medical students entering surgical internship. J Am Coll Surg. 2008;206:897–904
21. Naylor RA, Hollett LA, Castellvi A, Valentine RJ, Scott DJ. Preparing medical students to enter surgery residencies. Am J Surg. 2010;199:105–109
22. Antonoff MB, Swanson JA, Green CA, Mann BD, Maddaus MA, D’Cunha J. The significant impact of a competency-based preparatory course for senior medical students entering surgical residency. Acad Med. 2012;87:308–319
23. Parent RJ, Plerhoples TA, Long EE, et al. Early, intermediate, and late effects of a surgical skills “boot camp” on an objective structured assessment of technical skills: A randomized controlled study. J Am Coll Surg. 2010;210:984–989
24. Chipman JG, Schmitz CC. Using objective structured assessment of technical skills to evaluate a basic skills simulation curriculum for first-year surgical residents. J Am Coll Surg. 2009;209:364–370.e2
25. Schill M, Tiemann D, Klingensmith ME, Brunt LM. Year one outcomes assessment of a masters suturing and knot-tying program for surgical interns. J Surg Educ. 2011;68:526–533
26. Sonnadara RR, Van Vliet A, Safir O, et al. Orthopedic boot camp: Examining the effectiveness of an intensive surgical skills course. Surgery. 2011;149:745–749
27. Fernandez GL, Page DW, Coe NP, et al. Boot cAMP: Educational outcomes after 4 successive years of preparatory simulation-based training at onset of internship. J Surg Educ. 2012;69:242–248
28. McGaghie WC, Miller GE, Sajid A, Telder TV. Competency-Based Curriculum Development in Medical Education: An Introduction. Geneva, Switzerland: World Health Organization; 1978
29. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169:1420–1423
30. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5:98–102
31. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81
32. Butter J, McGaghie WC, Cohen ER, Kaye M, Wayne DB. Simulation-based mastery learning improves cardiac auscultation skills in medical students. J Gen Intern Med. 2010;25:780–785
33. Barsuk JH, Cohen ER, Vozenilek JA, O’Connor L, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4:23–27
34. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012;79:132–137
35. Schroedl CJ, Corbridge TC, Cohen ER, et al. Use of simulation-based education to improve resident learning and patient care in the medical intensive care unit: A randomized trial. J Crit Care. 2012;27:219.e7–219.e13
36. Szmuilowicz E, Neely KJ, Sharma RK, Cohen ER, McGaghie WC, Wayne DB. Improving residents’ code status discussion skills: A randomized trial. J Palliat Med. 2012;15:768–774
37. Downing SM, Tekian A, Yudkowsky R. Procedures for establishing defensible absolute passing scores on performance examinations in health professions education. Teach Learn Med. 2006;18:50–57
38. Weinberger SE, Pereira AG, Iobst WF, Mechaber AJ, Bronze MS; Alliance for Academic Internal Medicine Education Redesign Task Force II. Competency-based education and training in internal medicine. Ann Intern Med. 2010;153:751–756
39. Nasca TJ, Day SH, Amis ES Jr; ACGME Duty Hour Task Force. The new recommendations on duty hours from the ACGME Task Force. N Engl J Med. 2010;363:e3
40. Osler W. The hospital as a college. In: Aequanimitas With Other Addresses to Medical Students, Nurses and Practitioners of Medicine. 3rd ed. Philadelphia, PA: P. Blakiston’s Son & Co.; 1932:313–325
41. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86:706–711
42. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10 suppl):S9–S12
43. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents’ retention of advanced cardiac life support skills. Acad Med. 2006;81(10 suppl):S9–S12
44. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Unexpected collateral effects of simulation-based medical education. Acad Med. 2011;86:1513–1517
© 2013 Association of American Medical Colleges