Our analysis yielded four types of procedural skills training interventions: “see one, do one”17,32–52; educational-theory-informed (divided into mastery learning16,53–69 and other, including self-regulated learning and cognitive, theories70–77); medical procedural services (MPSs)78–86; and multifaceted quality improvement/patient safety (QI/PS) interventions.18,87–95 These four intervention types involved delivering procedural skills training in variable ways, and even within a single intervention type, studies described heterogeneous approaches to training. We describe each intervention type and the characteristics of the aggregated studies within each type in Chart 1.
Below, we outline the results of our syntheses from step 1 (within each intervention type) and step 2 (across all four intervention types) for context, mechanism, and outcome. We also provide a summary of our syntheses from step 3, which outlines our identified foundational training model.
Authors used educational technologies, especially simulation, as the training modality in all intervention types, except for QI/PS interventions, which used in-service presentations and workshops grounded in clinical practice. Rationales for using simulation included capitalizing on new educational technologies (see one, do one); adhering to ABIM recommendations that simulation-based training precede clinical practice (see one, do one); moving initial or early training away from patients, where harm may occur (see one, do one and educational-theory-informed); evaluating the impact of educational designs, like competency-based education, on learning outcomes (educational-theory-informed); and responding to the perceived decline in exposure to procedures during clinical training (see one, do one and MPS). When simulation was used, authors mostly delivered training in simulation centers, with some “just in time, just in place” use of simulation in the clinical setting.
When procedural skills training took place in clinical settings, authors described a need to increase the quality of supervision from staff (MPS), as well as a need to avoid financial penalties—for example, from the Centers for Medicare and Medicaid Services, related to high infection rates (QI/PS).
For see one, do one studies, the experimental groups’ procedural competence improved from baseline or as compared with control groups (who either had no training or traditional training). Most educational-theory-informed studies of mastery learning found that experimental groups outperformed control groups, though two large trials showed that a single mastery session did not improve future lumbar puncture success in pediatric patients.64,96 In the other educational-theory-informed studies, authors applied most educational principles successfully (e.g., group conformity). All MPS groups improved from baseline or outperformed control groups, though authors commented that despite the observed benefits, the MPS was often assigned the most challenging patients, which may have implications for procedural success rates. For QI/PS studies, all showed improved outcomes related to the multifaceted approach, though none could specify which facet (or facets) led to the observed benefit, and none identified education as a key factor.
We synthesized a heterogeneous literature to help stakeholders establish the key components of rigorous, evidence-based training for core invasive bedside procedures in IM. From 67 studies, we identified four intervention types, which we synthesized to identify key considerations for future IM procedural skills training curricula. The observed heterogeneity in how procedural skills training interventions are designed (mechanism) and in how competence is assessed (outcome) suggests that the official expectation that all residents develop competence in the five invasive bedside procedures is likely not fulfilled consistently. Our synthesis suggests that the most robust foundational model would be an adaptable MPS; this finding aligns with recent perspectives on procedural competence.97 After first comparing our findings with those of previous reviews, we describe and consider the implications of three interrelated lines of inquiry for studying IM invasive bedside procedural skills training in the future.
Assuming that the level of resources and funding allocated to IM procedural skills training remains static98 and that IM residents’ clinical exposure to procedures remains limited, program directors will likely be challenged to implement any adapted MPS training model. If that assumption holds true, then policy makers may need to make the difficult decision to recommend targeted training of a smaller group of trainees, who have been identified as needing to develop and maintain procedural competence throughout their careers. A reinvestment of resources and training opportunities in smaller groups of trainees would mark a shift from expecting core competence in all trainees to training a competent core with a specialization in procedures. In such a system, for example, all IM residents could be expected to achieve cognitive competence (i.e., understand the indications, limitations, contraindications, and complications of procedures), as presently required by the ABIM. Beyond this cognitive competence, though, a proceduralist selection system would need to be implemented, based on trainee interest and a career path requiring procedural competence (e.g., plan to practice IM in community settings or in academic centers with a responsibility for training and assessment), to ensure a core set of clinicians who are procedurally competent. We acknowledge that this proposal would require large-scale changes in the procedure service-delivery models of hospitals that currently rely on all IM residents to perform procedures, as well as a philosophical shift in the professional identity and scope of practice of general internists.
A 2009 study provides a practical example of how programs might use criteria to decide privileges for performing procedures.90 When pulmonologists working at an outpatient pulmonary clinic learned that they had a higher frequency of iatrogenic pneumothorax compared with a nearby radiology practice, they imposed numerous practice changes including required retraining on thoracentesis skills to competency standards. The clinic did not allow pulmonologists who did not meet the standard to perform thoracentesis on patients.90 The authors reported a significant decline in pneumothorax rates, which held constant for two years post intervention. This example demonstrates the potential of investing in a core group of trainees, which could be a prudent resource allocation strategy that helps to address the pressing factors of system accountability, patient safety, and the rising costs of clinical errors. Research will be needed to determine whether this approach to training is appropriate for all invasive bedside procedures or whether trainee competence in some procedures might be realistically achieved in core training.
While the MPS studies did use some notable practices of instructional design (i.e., integrating simulation with clinical training99), they did not cite or use notable practices from QI/PS interventions (i.e., appointing champions and emphasizing accountability).100 A shortcoming of many QI/PS studies, however, was that they did not use simulation, a common component of most procedural skills training interventions.21 Additionally, we found that authors of educational-theory-informed and QI/PS studies largely responded to different contextual drivers, emphasized different educational mechanisms, and generated different outcome measures, all while pursuing the same goal of ensuring that bedside procedures are performed competently. Hence, we agree with recent calls for a better alignment of efforts between these two research domains and believe that such alignment would produce optimized MPS models.101 Specifically, educational-theory-informed researchers should include systems-level QI/PS experts as team members in future studies, and hospital-based quality improvement teams should include education experts as members on their committees; both groups should work to align the design, implementation, and evaluation of procedural skills training that integrates the simulation and clinical settings.
A 2013 article calls for research programs that establish evidence for links between outcome measures collected in the nonclinical setting and those collected in the clinical setting.102 For example, a 2015 meta-analysis examined the relationship between simulation-based assessments and clinical assessments and found that tools requiring raters to observe individual performance directly (e.g., global ratings of a procedure) showed the highest correlations between the two settings.103 We suggest that direct observation may be beneficial because assessing at the individual level helps avoid unit-of-analysis errors, which arise when outcomes are measured at a group level (i.e., collapsing infection or complication rates for an entire intensive care unit likely masks multiple data points from high or low performers, reducing the specificity of the measurement). Although approaches are available to analyze such nested data, like hierarchical generalized linear models,104 none of the studies included in our review adjusted for such nesting using these techniques.
Researchers will need to collect a wide array of validity evidence to clarify “pathways that link training interventions to patient health outcomes.”105 Rather than using outcomes that are low-hanging fruit and for which there is little validity evidence, such as self-reported procedural success and group infection or complication rates, researchers will need to identify educationally sensitive outcomes in the clinical setting, especially those involving direct observation,106,107 and establish chains of evidence between outcomes measured in the nonclinical and clinical settings.102,103 Given the validity evidence supporting the use of global rating scales (with or without checklists) in the simulation setting,108–110 adapting these scales to the clinical setting is likely a fruitful research direction.
The primary literature on IM bedside procedural skills training had several limitations that affected our review. Authors reported nearly universal success and few failures of their training interventions, suggesting that publication bias toward positive studies may have influenced our dataset. Some procedures were studied more extensively than others, and nearly all studies emphasized the procedures’ technical components and excluded components such as judging whether a procedure needs to be performed, obtaining informed consent, coordinating care, and documenting the procedure.111 All but one study90 evaluated how training affects the development of procedural competence rather than the maintenance of competence. Although we judged the context–mechanism–outcome linkage independently and in duplicate, our evaluations remain subjective; nevertheless, the finding that only 45% of studies met our standard for sufficient information on context, mechanism, and outcome suggests important gaps in how research on procedural skills training has been conducted and reported. By using a realist synthesis approach, we excluded many studies, some of which might have unearthed additional themes. We did not conduct meta-analyses, in part because we believe that knowledge synthesis methods supported by qualitative research paradigms, like realist synthesis, provide more targeted answers regarding gaps in research, as well as potential solutions and next steps.
We found that actual practices in procedural skills training in IM are highly variable. Such variability is not surprising considering that regulatory organizations mandate procedural competence, yet do not provide guidelines for program directors to follow when implementing training programs. We have identified the MPS as a foundational training model and provided a list of potential key components that educators can incorporate into future procedural training curricula, which researchers can study and test systematically. In an era where evidence shows that high-quality training translates into high-quality care,101,104,112 the imperative to design the best educational experience for our trainees has never been stronger.
3. Reynolds MR, Cohen DJ, Kugelmass AD, et al. The frequency and incremental cost of major complications among Medicare beneficiaries receiving implantable cardioverter–defibrillators. J Am Coll Cardiol. 2006;47:2493–2497.
4. Baker GR, Norton PG, Flintoft V, et al. The Canadian Adverse Events Study: The incidence of adverse events among hospital patients in Canada. CMAJ. 2004;170:1678–1686.
5. Brennan TA, Hebert LE, Laird NM, et al. Hospital characteristics associated with adverse events and substandard care. JAMA. 1991;265:3265–3269.
6. Leape LL, Brennan TA, Laird N, et al. The nature of adverse events in hospitalized patients. Results of the Harvard Medical Practice Study II. N Engl J Med. 1991;324:377–384.
7. Neale G, Woloshynowych M, Vincent C. Exploring the causes of adverse events in NHS hospital practice. J R Soc Med. 2001;94:322–330.
8. Thomas EJ, Orav EJ, Brennan TA. Hospital ownership and preventable adverse events. Int J Health Serv. 2000;30:211–219.
9. Wilson RM, Harrison BT, Gibberd RW, Hamilton JD. An analysis of the causes of adverse events from the Quality in Australian Health Care Study. Med J Aust. 1999;170:411–415.
10. de Vries EN, Ramrattan MA, Smorenburg SM, Gouma DJ, Boermeester MA. The incidence and nature of in-hospital adverse events: A systematic review. Qual Saf Health Care. 2008;17:216–223.
11. Promes SB, Chudgar SM, Grochowski CO, et al. Gaps in procedural experience and competency in medical school graduates. Acad Emerg Med. 2009;16(suppl 2):S58–S62.
12. Wolf KS, Dooley-Hash S. Emergency medicine procedures: Examination of trends in procedures performed by emergency medicine residents. Acad Emerg Med. 2012;19:S171.
13. Ma IW, Teteris E, Roberts JM, Bacchus M. Who is teaching and supervising our junior residents’ central venous catheterizations? BMC Med Educ. 2011;11:16.
14. Huang GC, Smith CC, Gordon CE, et al. Beyond the comfort zone: Residents assess their comfort performing inpatient medical procedures. Am J Med. 2006;119:71.e17–71.e24.
15. Pugh C, Plachta S, Auyang E, Pryor A, Hungness E. Outcome measures for surgical simulators: Is the focus on technical skills the best approach? Surgery. 2010;147:646–654.
16. Cohen ER, Feinglass J, Barsuk JH, et al. Cost savings from reduced catheter-related bloodstream infection after simulation-based education for residents in a medical intensive care unit. Simul Healthc. 2010;5:98–102.
17. Sherertz RJ, Ely EW, Westbrook DM, et al. Education of physicians-in-training can decrease the risk for vascular catheter infection. Ann Intern Med. 2000;132:641–648.
18. Berenholtz SM, Pronovost PJ, Lipsett PA, et al. Eliminating catheter-related bloodstream infections in the intensive care unit. Crit Care Med. 2004;32:2014–2020.
19. Ma IW, Brindle ME, Ronksley PE, Lorenzetti DL, Sauve RS, Ghali WA. Use of simulation-based education to improve outcomes of central venous catheterization: A systematic review and meta-analysis. Acad Med. 2011;86:1137–1147.
20. Madenci AL, Solis CV, de Moya MA. Central venous access by trainees: A systematic review and meta-analysis of the use of simulation to improve success rate on patients. Simul Healthc. 2014;9:7–14.
21. Huang GC, McSparron JI, Balk EM, et al. Procedural instruction in invasive bedside procedures: A systematic review and meta-analysis of effective teaching approaches. BMJ Qual Saf. 2016;25:281–294.
22. Sawyer T, White M, Zaveri P, et al. Learn, see, practice, prove, do, maintain: An evidence-based pedagogical framework for procedural skill training in medicine. Acad Med. 2015;90:1025–1033.
23. Colquhoun HL, Levac D, O’Brien KK, et al. Scoping reviews: Time for clarity in definition, methods, and reporting. J Clin Epidemiol. 2014;67:1291–1294.
24. Wong G, Greenhalgh T, Westhorp G, Pawson R. Realist methods in medical education research: What are they and what can they contribute? Med Educ. 2012;46:89–96.
25. Arksey H, O’Malley L. Scoping studies: Towards a methodological framework. Int J Soc Res Methodol. 2005;8:19–32.
26. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R. RAMESES publication standards: Realist syntheses. BMC Med. 2013;11:21.
27. Gordon M, Gibbs T. STORIES statement: Publication standards for healthcare education evidence synthesis. BMC Med. 2014;12:143.
28. Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol. 2009;62:944–952.
29. Reed DA, Cook DA, Beckman TJ, Levine RB, Kern DE, Wright SM. Association between funding and quality of published medical education research. JAMA. 2007;298:1002–1009.
30. Tricco AC, Soobiah C, Antony J, et al. A scoping review identifies multiple emerging knowledge synthesis methods, but few studies operationalize the method. J Clin Epidemiol. 2016;73:19–28.
31. McCormack B, Rycroft-Malone J, Decorby K, et al. A realist review of interventions and strategies to promote evidence-informed healthcare: A focus on change agency. Implement Sci. 2013;8:107.
32. Britt RC, Novosel TJ, Britt LD, Sullivan M. The impact of central line simulation before the ICU experience. Am J Surg. 2009;197:533–536.
33. Chenkin J, Lee S, Huynh T, Bandiera G. Procedures can be learned on the Web: A randomized study of ultrasound-guided vascular access training. Acad Emerg Med. 2008;15:949–954.
34. Miller AH, Roth BA, Mills TJ, Woody JR, Longmoor CE, Foster B. Ultrasound guidance versus the landmark technique for the placement of central venous catheters in the emergency department. Acad Emerg Med. 2002;9:800–805.
35. Sekiguchi H, Tokita JE, Minami T, Eisen LA, Mayo PH, Narasimhan M. A prerotational, simulation-based workshop improves the safety of central venous catheter insertion: Results of a successful internal medicine house staff training program. Chest. 2011;140:652–658.
36. Froehlich CD, Rigby MR, Rosenberg ES, et al. Ultrasound-guided central venous catheter placement decreases complications and decreases placement attempts compared with the landmark technique in patients in a pediatric intensive care unit. Crit Care Med. 2009;37:1090–1096.
37. Gaies MG, Morris SA, Hafler JP, et al. Reforming procedural skills training for pediatric residents: A randomized, interventional trial. Pediatrics. 2009;124:610–619.
38. Griswold-Theodorson S, Hannan H, Handly N, et al. Improving patient safety with ultrasonography guidance during internal jugular central venous catheter placement by novice practitioners. Simul Healthc. 2009;4:212–216.
39. Grover S, Currier PF, Elinoff JM, Katz JT, McMahon GT. Improving residents’ knowledge of arterial and central line placement with a Web-based curriculum. J Grad Med Educ. 2010;2:548–554.
40. Kamdar G, Kessler DO, Tilt L, et al. Qualitative evaluation of just-in-time simulation-based learning: The learners’ perspective. Simul Healthc. 2013;8:43–48.
41. Kilbane BJ, Adler MD, Trainor JL. Pediatric residents’ ability to perform a lumbar puncture: Evaluation of an educational intervention. Pediatr Emerg Care. 2010;26:558–562.
42. Hasley P, Preisner R, Anish E, Bulova P, Collin T, Kim Y. Is doing superior to knowing? Simulation training improves junior faculty confidence to teach joint aspiration and injection. J Gen Intern Med. 2010;25:S447–S448.
43. Lenchus J, Issenberg SB, Murphy D, et al. A blended approach to invasive bedside procedural instruction. Med Teach. 2011;33:116–123.
44. Ma IW, Chapelsky S, Bhavsar S, et al. Procedural certification program: Enhancing resident procedural teaching skills. Med Teach. 2013;35:524.
45. Miranda JA, Trick WE, Evans AT, Charles-Damte M, Reilly BM, Clarke P. Firm-based trial to improve central venous catheter insertion practices. J Hosp Med. 2007;2:135–142.
46. Smith CC, Huang GC, Newman LR, et al. Simulation training and its effect on long-term resident performance in central venous catheterization. Simul Healthc. 2010;5:146–151.
47. Srivastava G, Roddy M, Langsam D, Agrawal D. An educational video improves technique in performance of pediatric lumbar punctures. Pediatr Emerg Care. 2012;28:12–16.
48. Thomas SM, Burch W, Kuehnle SE, Flood RG, Scalzo AJ, Gerard JM. Simulation training for pediatric residents on central venous catheter placement: A pilot study. Pediatr Crit Care Med. 2013;14:e416–e423.
49. Vogelgesang SA, Karplus TM, Kreiter CD. An instructional program to facilitate teaching joint/soft-tissue injection and aspiration. J Gen Intern Med. 2002;17:441–445.
50. Wayne DB, Cohen ER, Singer BD, et al. Progress toward improving medical school graduates’ skills via a “boot camp” curriculum. Simul Healthc. 2014;9:33–39.
51. White ML, Jones R, Zinkan L, Tofil NM. Transfer of simulated lumbar puncture training to the clinical setting. Pediatr Emerg Care. 2012;28:1009–1012.
52. Xiao Y, Seagull FJ, Bochicchio GV, et al. Video-based training increases sterile-technique compliance during central venous catheter insertion. Crit Care Med. 2007;35:1302–1306.
53. Barsuk JH, Cohen ER, Caprio T, McGaghie WC, Simuni T, Wayne DB. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012;79:132–137.
54. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med. 2009;169:1420–1423.
55. Barsuk JH, Cohen ER, McGaghie WC, Wayne DB. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85(10 suppl):S9–S12.
56. Barsuk JH, Cohen ER, Vozenilek JA, O’Connor LM, McGaghie WC, Wayne DB. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012;4:23–27.
57. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med. 2009;37:2697–2701.
58. Cohen ER, Barsuk JH, Moazed F, et al. Making July safer: Simulation-based mastery learning during intern boot camp. Acad Med. 2013;88:233–239.
59. Conroy SM, Bond WF, Pheasant KS, Ceccacci N. Competence and retention in performance of the lumbar puncture procedure in a task trainer model. Simul Healthc. 2010;5:133–138.
60. Diederich E, Rigler S, Mahnken J, Dong L, Williamson T, Sharpe M. The effect of model fidelity on learning outcomes of a simulation-based education program for central venous catheter insertion. Chest. 2012;142(4 suppl):535A.
61. Dodge KL, Lynch CA, Moore CL, Biroscak BJ, Evans LV. Use of ultrasound guidance improves central venous catheter insertion success rates among junior residents. J Ultrasound Med. 2012;31:1519–1526.
62. Evans LV, Dodge KL, Shah TD, et al. Simulation training in central venous catheter insertion: Improved performance in clinical practice. Acad Med. 2010;85:1462–1469.
63. Jiang G, Chen H, Wang S, et al. Learning curves and long-term outcome of simulation-based thoracentesis training for medical students. BMC Med Educ. 2011;11:39.
64. Kessler DO, Arteaga G, Ching K, et al. Interns’ success with clinical procedures in infants after simulation training. Pediatrics. 2013;131:e811–e820.
65. Kessler DO, Auerbach M, Pusic M, Tunik MG, Foltin JC. A randomized trial of simulation-based deliberate practice for infant lumbar puncture skills. Simul Healthc. 2011;6:197–203.
66. Taitz J, Wyeth B, Lennon R, et al. Effect of the introduction of a lumbar-puncture sticker and teaching manikin on junior staff documentation and performance of paediatric lumbar punctures. Qual Saf Health Care. 2006;15:325–328.
67. Wayne DB, Barsuk JH, O’Leary KJ, Fudala MJ, McGaghie WC. Mastery learning of thoracentesis skills by internal medicine residents using simulation technology and deliberate practice. J Hosp Med. 2008;3:48–54.
68. Kessler D, Pusic M, Chang TP, et al.; INSPIRE LP Investigators. Impact of just-in-time and just-in-place simulation on intern success with infant lumbar puncture. Pediatrics. 2015;135:e1237–e1246.
69. Barsuk JH, Cohen ER, Potts S, et al. Dissemination of a simulation-based mastery learning intervention reduces central line-associated bloodstream infections. BMJ Qual Saf. 2014;23:749–756.
70. Beran TN, McLaughlin K, Al Ansari A, Kassam A. Conformity of behaviors among medical students: Impact on performance of knee arthrocentesis in simulation. Adv Health Sci Educ Theory Pract. 2013;18:589–596.
71. Brydges R, Nair P, Ma I, Shanks D, Hatala R. Directed self-regulated learning versus instructor-regulated learning in simulation training. Med Educ. 2012;46:648–656.
72. Duncan JR, Henderson K, Street M, et al. Creating and evaluating a data-driven curriculum for central venous catheter placement. J Grad Med Educ. 2010;2:389–397.
73. Murphy MA, Neequaye S, Kreckler S, Hands LJ. Should we train the trainers? Results of a randomized trial. J Am Coll Surg. 2008;207:185–190.
74. Shanks D, Brydges R, den Brok W, Nair P, Hatala R. Are two heads better than one? Comparing dyad and self-regulated learning in simulation training. Med Educ. 2013;47:1215–1222.
75. Velmahos GC, Toutouzas KG, Sillin LF, et al. Cognitive task analysis for teaching technical skills in an inanimate surgical skills laboratory. Am J Surg. 2004;187:114–119.
76. Craft C, Feldon DF, Brown EA. Instructional design affects the efficacy of simulation-based training in central venous catheterization. Am J Surg. 2014;207:782–789.
77. Chan A, Singh S, Dubrowski A, et al. Part versus whole: A randomized trial of central venous catheterization education. Adv Health Sci Educ Theory Pract. 2015;20:1061–1071.
78. Chang W, Popa A, DeKorte M. A medical invasive procedure service and resident procedure training elective. J Hosp Med. 2012;7:S120.
79. Lenchus J, De Moraes AG, Garg M, Kalidindi V, Soto A, Pavon A. Impact of a standardized curriculum on reducing thoracentesis-induced pneumothorax. J Hosp Med. 2011;6:S69.
80. Lenchus JD. End of the “see one, do one, teach one” era: The next generation of invasive bedside procedural instruction. J Am Osteopath Assoc. 2010;110:340–346.
81. Lenhard A, Moallem M, Marrie RA, Becker J, Garland A. An intervention to improve procedure education for internal medicine residents. J Gen Intern Med. 2008;23:288–293.
82. Mourad M, Ranji S, Sliwka D. A randomized controlled trial of the impact of a teaching procedure service on the training of internal medicine residents. J Grad Med Educ. 2012;4:170–175.
83. Ramakrishna G, Higano ST, McDonald FS, Schultz HJ. A curricular initiative for internal medicine residents to enhance proficiency in internal jugular central venous line placement. Mayo Clin Proc. 2005;80:212–218.
84. Tolbert T, Haines L, Dickman E, MacArthur L, Terentiev V, Likourezos A. Central venous catheter location changes and complication rates after the institution of an emergency ultrasonography division. Ann Emerg Med. 2012;1:S7.
85. Lenchus JD, Carvalho CM, Ferreri K, et al. Filling the void: Defining invasive bedside procedural competency for internal medicine residents. J Grad Med Educ. 2013;5:605–612.
86. Tukey MH, Wiener RS. The impact of a medical procedure service on patient safety, procedure quality and resident training opportunities. J Gen Intern Med. 2014;29:485–490.
87. Burden AR, Torjman MC, Dy GE, et al. Prevention of central venous catheter-related bloodstream infections: Is it time to add simulation training to the prevention bundle? J Clin Anesth. 2012;24:555–560.
88. Cherry RA, West CE, Hamilton MC, Rafferty CM, Hollenbeak CS, Caputo GM. Reduction of central venous catheter associated blood stream infections following implementation of a resident oversight and credentialing policy. Patient Saf Surg. 2011;5:15.
89. Costello J, Livett M, Stride PJ, West M, Premaratne M, Thacker D. The seamless transition from student to intern: From theory to practice. Intern Med J. 2010;40:728–731.
90. Duncan DR, Morgenthaler TI, Ryu JH, Daniels CE. Reducing iatrogenic risk in thoracentesis: Establishing best practice via experiential training in a zero-risk environment. Chest. 2009;135:1315–1320.
91. McMullan C, Propper G, Schuhmacher C, et al. A multidisciplinary approach to reduce central line-associated bloodstream infections. Jt Comm J Qual Patient Saf. 2013;39:61–69.
92. McKee C, Berkowitz I, Cosgrove SE, et al. Reduction of catheter-associated bloodstream infections in pediatric patients: Experimentation and reality. Pediatr Crit Care Med. 2008;9:40–46.
93. Coopersmith CM, Rebmann TL, Zack JE, et al. Effect of an education program on decreasing catheter-related bloodstream infections in the surgical intensive care unit. Crit Care Med. 2002;30:59–64.
94. Wall RJ, Ely EW, Elasy TA, et al. Using real time process measurements to reduce catheter related bloodstream infections in the intensive care unit. Qual Saf Health Care. 2005;14:295–302.
95. See KC, Jamil K, Chua AP, Phua J, Khoo KL, Lim TK. Effect of a pleural checklist on patient safety in the ultrasound era. Respirology. 2013;18:534–539.
96. Kessler DO, Fein D, Chang TP, et al. Impact of a simulator based just-in-time refresher training for interns on their clinical success rate with infant lumbar puncture. Pediatr Emerg Care. 2011;27:1002.
97. Vaisman A, Cram P. Procedural competence among faculty in academic health centers: Challenges and future directions. Acad Med. 2017;92:31–34.
98. Mullan F, Salsberg E, Weider K. Why a GME squeeze is unlikely. N Engl J Med. 2015;373:2397–2399.
99. Cook DA, Hamstra SJ, Brydges R, et al. Comparative effectiveness of instructional design features in simulation-based education: Systematic review and meta-analysis. Med Teach. 2013;35:e867–e898.
100. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med. 2006;355:2725–2732.
101. Wong BM, Holmboe ES. Transforming the academic faculty perspective in graduate medical education to better align educational and clinical outcomes. Acad Med. 2016;91:473–479.
102. Cook DA, West CP. Perspective: Reconsidering the focus on “outcomes research” in medical education: A cautionary note. Acad Med. 2013;88:162–167.
103. Brydges R, Hatala R, Zendejas B, Erwin PJ, Cook DA. Linking simulation-based educational assessments and patient-related outcomes: A systematic review and meta-analysis. Acad Med. 2015;90:246–256.
104. Bansal N, Simmons KD, Epstein AJ, Morris JB, Kelz RR. Using patient outcomes to evaluate general surgery residency program performance. JAMA Surg. 2015;151:111–119.
105. Kalet AL, Gillespie CC, Schwartz MD, et al. New measures to establish the evidence base for medical education: Identifying educationally sensitive patient outcomes. Acad Med. 2010;85:844–851.
106. Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: A systematic review. JAMA. 2009;302:1316–1326.
107. Carraccio C, Englander R, Holmboe ES, Kogan JR. Driving care quality: Aligning trainee assessment and supervision through practical application of entrustable professional activities, competencies, and milestones. Acad Med. 2016;91:199–203.
108. Walzak A, Bacchus M, Schaefer JP, et al. Diagnosing technical competence in six bedside procedures: Comparing checklists and a global rating scale in the assessment of resident performance. Acad Med. 2015;90:1100–1108.
109. Ilgen JS, Ma IW, Hatala R, Cook DA. A systematic review of validity evidence for checklists versus global rating scales in simulation-based assessment. Med Educ. 2015;49:161–173.
110. Hatala R, Cook DA, Brydges R, Hawkins R. Constructing a validity argument for the Objective Structured Assessment of Technical Skills (OSATS): A systematic review of validity evidence. Adv Health Sci Educ Theory Pract. 2015;20:1149–1175.
111. MacMillan TE, Wu RC, Morra D. Quality of bedside procedures performed on general internal medicine in-patients: Can we do better? Can J Gen Intern Med. 2014;9:17–20.
112. Asch DA, Nicholson S, Srinivas S, Herrin J, Epstein AJ. Evaluating obstetrical residency programs using patient outcomes. JAMA. 2009;302:1277–1283.