Introduction: Simulation is a safe alternative to practicing procedural skills on patients. However, few published studies have examined the long-term effect of simulation technology on bedside procedures such as central venous catheter (CVC) insertion.
Methods: To determine whether simulation-based teaching improves procedural comfort, performance, and clinical events in CVC insertion over traditional methods of procedural teaching, and to assess the long-term effect of this training, we conducted a prospective, randomized controlled trial with 53 postgraduate year-1 and postgraduate year-2 medical residents at a tertiary-care teaching hospital. At the start of the study, we assessed all residents’ procedural comfort and previous training and experience with CVCs. We then measured their baseline performance in placing CVCs on simulators, using a validated assessment tool (pretest). For the intervention group, we reassessed performance immediately after simulation training (posttest). All subjects then placed actual CVCs as clinically indicated while on their medical intensive care unit rotations, under the supervision of critical care faculty. We measured clinical events associated with these CVCs. After their medical intensive care unit rotations, we reassessed CVC insertion skills on simulators and procedural comfort of all subjects (delayed posttest).
Results: Intervention subjects demonstrated a significant improvement in skills immediately after simulation training. At delayed posttesting, performance diminished somewhat in the intervention subjects and was not significantly different from control subjects; however, a significant increase over pretest scores persisted in both groups.
Conclusions: A CVC insertion simulation course improves procedural skills. These skills decline over time, and simulation conferred no long-term additional benefit over traditional methods of procedural teaching.
From the Internal Medicine Residency Program (C.C.S.), Beth Israel Deaconess Medical Center, Boston, MA; Center for Education (C.C.S., G.C.H., L.R.N., R.M.S.), Shapiro Institute for Education and Research, Beth Israel Deaconess Medical Center, and Harvard Medical School, Boston, MA; Department of Medicine (C.C.S., G.C.H., L.R.N., P.F.C., R.M.S.), Beth Israel Deaconess Medical Center, Boston, MA; Intensive Care Units (P.F.C.), Beth Israel Deaconess Medical Center, Boston, MA; Interventional Pulmonology and School of Medicine (D.F.-K.), Johns Hopkins University, Baltimore, MD; Pulmonary, Critical Care and Sleep Medicine (M.C., T.E.), Harvard Combined Fellowship in Pulmonary and Critical Care Medicine, Boston, MA.
Reprints: C. Christopher Smith, MD, Beth Israel Deaconess Medical Center, 330 Brookline Avenue, Healthcare Associates, Shapiro 1, Boston, MA 02215 (e-mail: email@example.com).
Central venous catheter (CVC) insertion is a commonly performed bedside procedure and a frequent source of morbidity and mortality in hospitalized patients, with an observed complication rate of 15%.1–4 CVC insertion, like other invasive procedures, is commonly taught on real patients using the “see one, do one, teach one” model of procedural teaching. This approach requires inexperienced trainees to perform complex procedures under suboptimal conditions that lack standardization of teaching or assessment of skill. Not surprisingly, trainees report that they are uncomfortable and inadequately prepared to perform these procedures.5–7 Moreover, patients themselves are not comfortable with having inexperienced trainees perform these procedures.8
Detailed attention to technique along with increased procedural experience can reduce the errors associated with procedures such as CVC insertion.9 Recognizing the deficiencies in the present training system, the need to improve patient safety, and the call for increased teaching and supervision of procedures, some programs have sought to move beyond the apprenticeship method of training by providing more direct observation and feedback by expert faculty.10–14 To lower the risk of procedural error even further, some programs have also begun to establish a minimal level of procedural skill proficiency for novice learners before they are allowed to perform a procedure on patients.15
In clinical training, simulation has become accepted as a safe alternative to practicing procedural skills on patients. Simulation offers an opportunity for focused, deliberate practice in a safe, controlled learning environment. By allowing a learner to practice skills repeatedly in a controlled setting until mastery is achieved, simulation offers educational advantages for novice learners.16 In addition, simulation allows exposure to procedures or scenarios that occur infrequently in clinical practice.
Given the inherent appeal of simulation training and positive findings from multiple studies, many organizations have called for broader adoption of simulation in medical education.17,18 Recent studies have demonstrated that simulation-based training can improve CVC insertion skills and reduce associated complications.7,19–21 However, there are few data to show that procedural skills acquired in a simulated setting are maintained longitudinally.22,23 In addition, validated assessment tools do not exist for most procedures, limiting the ability of investigators to study procedural skill performance.
We sought to examine the additional benefit of simulation-based CVC training over traditional methods of teaching CVC insertion, by measuring knowledge, self-reported procedural comfort, clinical events, and procedural performance. We hypothesized that residents participating in the simulation course would demonstrate higher proficiency in procedural skill and long-term retention of improved CVC insertion skills.
Our study was a prospective, randomized controlled trial to assess the effectiveness of a simulation-based CVC educational intervention. We assessed postgraduate years 1 and 2 (PGY-1 and PGY-2) internal medicine residents at a 556-bed tertiary care Boston teaching hospital. The internal medicine training program in 2007 included 62 PGY-1 residents (14 of whom were in a 1-year preliminary program) and 47 PGY-2 categorical residents. Study participants were PGY-1 and PGY-2 residents assigned to medical intensive care unit (MICU) rotations during the second half of the 2006–2007 academic year and PGY-1 residents assigned to the MICU in the first 3 months of the 2007–2008 academic year. We randomly assigned residents to intervention or control groups.
We gathered demographic data and information about previous training and experience with CVCs from all residents. To establish baseline characteristics, all subjects completed a 14-item multiple-choice quiz on CVC placement and a self-assessment of procedural comfort and competence using a five-point Likert-type scale. All subjects then performed a baseline CVC insertion on partial-task simulators (Central Venous Access Head/Neck/Upper Torso; Blue Phantom, Kirkland, WA) in our Skills and Simulation Center. Each CVC insertion was digitally videotaped for later review by faculty evaluators.
Figure 1 summarizes the study design. The week before their MICU rotations, subjects in the intervention group (two to four residents at a time) underwent a 2-hour CVC insertion course. The course, designed by a faculty member with expertise in teaching medical procedures, was led by one of two senior pulmonary fellows, who were trained in procedural teaching, CVC insertion, and use of the assessment tool, and who received feedback on their procedural instruction. The course began with a case-based didactic discussion to review cognitive aspects of CVC insertion such as indications, contraindications, and complications. Fellows then reviewed components of the CVC insertion kit and performed a complete CVC insertion on the simulator. The procedure was then broken down into individual components to allow participants to visualize each step. The residents had opportunities to practice the procedure in its entirety on the simulators with supervision and feedback. At the completion of the course, intervention subjects performed CVC insertions on the simulators without guidance while being videotaped.
While in the MICU, both intervention and control groups performed CVC insertions as clinically indicated. All CVCs at our institution are supervised by pulmonary critical care faculty who provide guidance and education before and during each procedure. After each procedure, a faculty evaluator provides feedback to the trainee and completes a written evaluation of procedural performance. A general CVC insertion checklist was used for each procedure; this checklist emphasizes patient safety parameters (eg, barrier precautions and time out) in addition to tracking procedural events, such as number of needle passes, number of operators required to complete the procedure, and immediate complications. All residents also have access to online CVC insertion teaching materials, including videos, still graphics, a comprehensive written curriculum, and self-assessment quizzes. These online materials are publicly available; therefore, we did not track usage.
After every CVC insertion on a MICU patient during the study period, a nurse practitioner analyzed medical records for patient characteristics and delayed procedure-related complications.
An average of 3 months after their MICU rotations (delayed posttesting), all subjects completed an online assessment of knowledge, procedural comfort, and self-reported competence and then performed CVC insertions on the simulators without guidance while being videotaped.
Two trained faculty evaluators used a CVC insertion assessment tool to review videotapes of the CVC insertions on simulators. In a previous study, we demonstrated evidence of content and construct validity for the use of the assessment tool in internal medicine residents at our hospital.15 This assessment tool was not available to study subjects. Because of the videotaping technique, the faculty evaluators were blinded to subjects’ identities and study status. The faculty evaluators had previously demonstrated high interrater agreement (R = 0.92) on a subset of resident data after having been trained to use the tool.15 The study protocol was approved in advance by the hospital institutional review board.
We tabulated resident characteristics by whether they received training on the CVC simulator. These characteristics included postgraduate year level, gender, self-reported previous experience performing CVCs during residency (including number of CVCs attempted and number placed), previous experience of performing CVCs during medical school (including number of CVCs attempted and number placed), and extent of CVC teaching during medical school (amount, nature of education, and primary trainer).
We performed bivariable analyses to compare the control and trained subjects as groups. The dependent variables were total number of correct answers on the cognitive test, number of subjects with overall comfort (defined in our previous study24 as a response of “somewhat comfortable” or “extremely comfortable” on the question “How comfortable are you with the procedure as a whole?”), and number of checklist items achieved (out of 23). We used the following tests of statistical significance: χ2 for frequencies, Fisher exact test for frequencies with sample size less than 5, and unpaired two-sample t-tests and analysis of variance for means.
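The Fisher exact test mentioned above, used when expected frequencies are small, can be sketched in pure Python for the 2 × 2 case. This is an illustrative implementation, not the study’s actual analysis code; the function name and structure are our own:

```python
from math import comb

def fisher_exact_2x2(table):
    """Two-sided Fisher exact test p-value for a 2x2 contingency table.

    Under fixed margins, cell (0,0) follows a hypergeometric distribution;
    the two-sided p-value sums the probabilities of all tables no more
    likely than the observed one.
    """
    (a, b), (c, d) = table
    r1, r2 = a + b, c + d          # row totals
    c1 = a + c                     # first-column total
    n = r1 + r2                    # grand total
    denom = comb(n, c1)

    def prob(x):                   # P(cell (0,0) == x) given the margins
        return comb(r1, x) * comb(r2, c1 - x) / denom

    p_obs = prob(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    # the (1 + 1e-9) factor guards against floating-point ties
    return sum(p for x in range(lo, hi + 1)
               if (p := prob(x)) <= p_obs * (1 + 1e-9))
```

For small tables this agrees with standard statistical packages (eg, `scipy.stats.fisher_exact`); for larger samples the χ2 test used elsewhere in the analysis is the conventional choice.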
We eliminated observations of subjects who did not complete both pre- and delayed posttests of performance. We then compared the number of checklist items achieved (out of 23) on the delayed posttests with the same subjects’ performance on the pretests. For subjects in the intervention group, we compared performance results on the immediate posttests with the same subjects’ performance on the pretests.
We performed bivariable analyses of clinical events associated with actual CVCs performed by subjects participating in the study. Measures included complications (pneumothorax, bleeding, and arterial access), number of passes required before venous access, and whether the CVC was completed by the initial resident operator.
We display resident characteristics in Table 1. Fifty-two residents participated in the study; 38 (73%) were PGY-1 and 14 (27%) were PGY-2. Twenty-nine (56%) were women and 23 (44%) were men. Approximately half of subjects had had experience with CVCs in residency and most did not have experience with CVCs in medical school. The differences among these characteristics were not statistically significant between the control and training groups except for the extent of CVC experience in medical school (greater in the control group).
Analysis Including All Participants
We report group-level comparisons in Table 2, categorized into cognitive test scores, procedural comfort, and checklist results, before and after training. There were no statistically significant differences between the control and trained groups of subjects in terms of number of cognitive test items answered correctly, reported overall comfort, or total number of checklist criteria achieved.
Analyses Limited to Participants Completing All Assessments
We report checklist performance comparisons of the control and trained groups by pretest, immediate posttest (for trained group), and delayed posttest results in Table 3. All bivariable comparisons using analysis of variance were statistically significant, indicating that trained subjects improved their checklist scores immediately after training and sustained a statistically significant increase over pretest scores at the time of delayed posttests. There was also a statistically significant decrement of performance from immediate posttests to delayed posttests in the trained group.
Results from bivariable comparisons of clinical data showed no statistically significant difference between actual CVCs performed by control subjects and intervention subjects, including no difference in overall complications, number of needle passes to complete the procedure (a surrogate marker for mechanical complications25), or percentage of procedures completed by the initial resident operator (Table 4).
To the best of our knowledge, this is the first controlled study that examines the additional contribution of CVC simulation training to traditional procedural training in terms of long-term retention of cognitive, affective, psychomotor, and clinical metrics. As found in previous studies,7,19,20 residents who participated in CVC simulation training demonstrated a significant and immediate improvement in performance. At an average of 3 months, residents’ performance remained above baseline but showed a statistically significant decrement from immediate posttest levels. Finally, residents who did not undergo simulation training achieved the same level of performance as the trained subjects in delayed posttesting.
We confirmed our hypothesis that residents would improve their skills in CVC insertion after additional teaching. Procedural expertise is thought to be achieved through a three-step process in which the learner passes from a cognitive stage (learning the steps of a procedure), to an associative stage (where the learner performs these steps), to an autonomous stage (where actions become subconscious and automatic).26 We believe our CVC simulation course advanced the learner to the second stage of procedural expertise and additionally provided a safe, controlled learning environment away from the responsibilities of clinical care.
We found a significant diminution of these skills in the trained group several months after the course (Fig. 2). Although counter to our initial hypothesis, this decay might be explained by the relative infrequency of CVCs placed; residents inserted a mean of 1.7 CVCs during a 3-week ICU rotation, which provided limited opportunities to reinforce the skills learned during the simulation session. The medical literature offers little data on the decay of procedural skills, but research from other fields indicates that long periods of nonuse of newly learned and complex skills lead to rapid deterioration.27 As Kneebone28 states, “Any recently learned skill is a fragile bloom which can wither if not carefully nurtured.” Consequently, the most important factor in acquiring expertise is sustained, deliberate practice, which may require years to achieve.29–31 Thus, a single training exercise is insufficient to reach a threshold of mastery, a contention supported by our previous study.15 Furthermore, an insidious decay in skills may go unrecognized by the learner, as comfort and self-confidence do not always correlate with actual performance,19 leading to increased medical error.27
Although not consistent with other recent studies,7,19,20 our finding that the control group of residents improved to the same degree as the trained residents at delayed posttesting is supported by other research showing that simulation has a limited effect on procedural skills acquisition in comparison with actual experience. One study of gastroenterology fellows demonstrated an initial improvement in colonoscopy skills by those taught using virtual reality; however, after 30 procedures on real patients, both the case and control groups performed similarly.32 In a study of internal medicine residents who underwent CVC insertion training, there was no improvement in CVC insertion practices compared with controls.13 In our case, we believe the improvement in the control group is largely explained by the fact that all residents placing CVCs are supervised by expert faculty members. As such, residents receive guidance, education, and feedback after each procedure that likely contributes to incremental improvement in skill and reduces complications. In other recent similar studies,7,19,20 residents were allowed to perform procedures without supervision after performing only five supervised procedures; it is likely that this lack of supervision accentuated the benefits of additional CVC insertion simulation training. In contrast, the high level of direct faculty supervision and teaching at our institution likely reduced the number of complications in our control group,33 minimizing the effect of simulation training and improving the delayed test scores of the control group.
We speculate that simulation accelerated learning in the intervention group compared with the control group, which improved through experience and guidance under supervision. Thus, simulation would potentially mitigate the risk to patients by allowing residents to achieve a minimal level of proficiency earlier than residents in the control group. One must therefore consider the contribution of simulation not in terms of ultimate performance achieved but in terms of patient safety attained at earlier time points. Is this hypothesized gain worth the expense? We calculated the overall financial burden to be about $143 per resident, based on the cost of a single simulator ($2395) and faculty teaching time ($75/h). With larger training programs and over time, the cost per resident decreases, and to some extent, protection of the patient is “priceless.”
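The per-resident cost estimate above is simple amortized arithmetic. A minimal sketch of the calculation follows; the simulator cost ($2395), faculty rate ($75/h), and 2-hour course length come from the paper, while the session and resident counts below are our assumptions for illustration (the paper does not report them), so the result approximates rather than reproduces the $143 figure:

```python
def cost_per_resident(simulator_cost, faculty_rate_per_hr,
                      hours_per_session, n_sessions, n_residents):
    """Amortized training cost: a one-time simulator purchase plus paid
    faculty teaching time, spread across all trained residents."""
    faculty_cost = faculty_rate_per_hr * hours_per_session * n_sessions
    return (simulator_cost + faculty_cost) / n_residents

# $2395 simulator, $75/h faculty, 2-h sessions (from the paper);
# 9 sessions and 26 trained residents are assumed values.
print(round(cost_per_resident(2395, 75, 2, 9, 26), 2))  # → 144.04
```

Because the simulator is a one-time purchase, the per-resident figure falls as more residents are trained, which is the basis for the claim that larger programs face a lower marginal cost.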
Our study has several limitations. Our findings are limited to a single teaching hospital and may not be generalizable to other teaching settings. The study subjects were internal medicine residents, which is not traditionally considered a procedure-oriented field; hence, results may differ for surgical programs. Our study was underpowered to detect differences in clinical metrics (requiring 800 subjects in each study arm to detect a 1% difference in complication rates). Our control group was exposed to an environment in which all CVC insertions are supervised and online training is continually available, which may attenuate the effect of simulation training compared with a setting where CVCs are performed without supervision. Some residents, predominantly from the control group, did not complete all portions of the study; this attrition may have introduced some bias in our analyses and also reduced the effective sample size. Additionally, although the simulation training resulted in a marked immediate improvement in their overall performance on simulators, we did not guarantee that each individual reached a predetermined level of competence. Finally, in using a binary checklist of steps, we evaluated gross operational skills and did not assess gradations of hand-eye coordination, such as proprioception or respect for tissue.
Through a relatively inexpensive CVC simulation course, learners made rapid advancement in procedural skill, thereby reducing risk of procedural complications and increasing patient safety. In the absence of sustained practice, however, these skills declined significantly. After ∼3 months, the skills of learners who participated in the CVC insertion course were similar to those who did not. Although CVC simulation sessions provide an opportunity for intensive learning in a safe environment, it is clear that a single teaching session is not adequate and should not be viewed as a sole means of conducting procedural training. To prevent a decline in skill level, we recommend that learners undergo regular refresher courses; for example, residents should have CVC simulation refresher courses before each ICU rotation. Frequently scheduled training sessions and ongoing teaching, monitoring, and feedback by experts are needed to reinforce and advance procedural skills and prevent decay.
1. Kohn L, Corrigan J, Donaldson M, eds. To Err Is Human: Building a Safer Health System. Washington, DC: Committee on Quality of Health Care in America, Institute of Medicine, National Academy Press; 2000.
2. Leape LL, Brennan TA, Laird N, et al. The nature of adverse events in hospitalized patients. Results of the Harvard Medical Practice Study II. N Engl J Med
3. Merrer J, De Jonghe B, Golliot F, et al. French Catheter Study Group in Intensive Care. Complications of femoral and subclavian venous catheterization in critically ill patients: a randomized controlled trial. JAMA
4. Sznajder JI, Zveibil FR, Bitterman H, Weiner P, Bursztein S. Central vein catheterization. Failure and complication rates by three percutaneous approaches. Arch Intern Med
5. Hicks CM, Gonzalez R, Morton MT, Gibbons RV, Wigton RS, Anderson RJ. Procedural experience and comfort level in internal medicine trainees. J Gen Intern Med
6. Wickstrom GC, Kolar MM, Keyserling TC, et al. Confidence of graduating internal medicine residents to perform ambulatory procedures. J Gen Intern Med
7. Barsuk JH, McGaghie WC, Cohen ER, Balachandran JS, Wayne DB. Use of simulation-based mastery learning to improve the quality of central venous catheter placement in a medical intensive care unit. J Hosp Med
8. Santen SA, Hemphill RR, McDonald MF, Jo CO. Patients’ willingness to allow residents to learn to practice medical procedures. Acad Med
9. Pronovost P, Needham D, Berenholtz S, et al. An intervention to decrease catheter-related bloodstream infections in the ICU. N Engl J Med
10. Smith CC, Gordon CE, Feller-Kopman D, et al. Creation of an innovative inpatient medical procedure service and a method to evaluate house staff competency. J Gen Intern Med
11. Ramakrishna G, Higano ST, McDonald FS, Schultz HJ. A curricular initiative for internal medicine residents to enhance proficiency in internal jugular central venous line insertion. Mayo Clin Proc
12. Lucas BP, Asbury JK, Wang Y, et al. Impact of a bedside procedure service on general medicine inpatients: a firm-based trial. J Hosp Med
13. Miranda JA, Trick WE, Evans AT, Charles-Damte M, Reilly BM, Clarke P. Firm-based trial to improve central venous catheter insertion practices. J Hosp Med
14. Lenhard A, Moallem M, Marrie RA, Becker J, Garland A. An intervention to improve procedure education for internal medicine residents. J Gen Intern Med
15. Huang GC, Newman LN, Schwartzstein RM, et al. Procedural competence in internal medicine residents: validity of a central venous catheter insertion assessment instrument. Acad Med
16. Wayne DB, Butter J, Siddall VJ, et al. Mastery learning of advanced cardiac life support skills by internal medicine residents using simulation technology and deliberate practice. J Gen Intern Med
17. Gallagher AG, Cates CU. Approval of virtual reality training for carotid stenting: what this means for procedural-based medicine. JAMA
19. Barsuk JH, McGaghie WC, Cohen ER, O’Leary KJ, Wayne DB. Simulation-based mastery learning reduces complications during central venous catheter insertion in a medical intensive care unit. Crit Care Med
20. Barsuk JH, Cohen ER, Feinglass J, McGaghie WC, Wayne DB. Use of simulation-based education to reduce catheter-related bloodstream infections. Arch Intern Med
21. Britt RC, Reed SF, Britt LD. Central line simulation: a new training algorithm. Am Surg 2007;73:680–682; discussion 682–683.
22. Wayne DB, Siddall VJ, Butter J, et al. A longitudinal study of internal medicine residents’ retention of advanced cardiac life support skills. Acad Med
23. Crofts JF, Bartlett C, Ellis D, Hunt LP, Fox R, Draycott TJ. Management of shoulder dystocia: skill retention 6 and 12 months after training. Obstet Gynecol
24. Huang GC, Smith CC, Gordon CE, et al. Beyond the comfort zone: residents assess their comfort performing inpatient medical procedures. Am J Med
25. Mansfield PF, Hohn DC, Fornage BD, Greguirch MA, Ota DM. Complications and failures of subclavian-vein catheterization. N Engl J Med
26. Hamdorf JM, Hall JC. Acquiring surgical skills. Br J Surg
27. Arthur W Jr, Bennett W Jr, Stanush PL, McNelly TL. Factors that influence skill decay and retention: a quantitative review and analysis. Hum Perform
28. Kneebone R, Scott W, Darzi A, Horrocks M. Simulation and clinical practice: strengthening the relationship. Med Educ
29. Ericsson KA, Krampe RT, Tesch-Romer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev
30. Ernst A, Silvestri GA, Johnstone D. Interventional pulmonary procedures: guidelines from the American College of Chest Physicians. Chest
31. Bolliger CT, Mathur PN, Beamis JF, et al; European Respiratory Society/American Thoracic Society. ERS/ATS statement on interventional pulmonology. Eur Respir J
32. Sedlack R, Kolars J. Computer simulator training enhances the competency of gastroenterology fellows at colonoscopy: results of a pilot study. Am J Gastroenterol
33. Smith C, Gordon C, Feller-Kopman D, et al. Creation of an innovative inpatient medical procedure service and a method to evaluate housestaff competency. J Gen Intern Med