All simulation exercises were scheduled and conducted from June 2012 to April 2013. After each exercise, every resident received an e-mail with a link to an online Likert-type evaluation built in Google Forms (Figure 3A). Residents were asked to rate themselves on the Physician Performance Diagnostic Inventory Scale (PPDIS) as unsatisfactory, early learner, competent, or proficient. The expert level was not included in the PPDIS because that category is reserved for practicing neurosurgeons. Responses were analyzed blinded to training level: junior (postgraduate years 1-3) or senior (postgraduate years 4-6). Wilcoxon testing was used to detect differences within (signed-rank) and between (rank-sum) groups. Generalized linear mixed models with a multinomial distribution and cumulative logit link were built to assess the overall difference across training levels and types of simulation.
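The within-group (signed-rank) and between-group (rank-sum) comparisons described above can be sketched as follows. This is a minimal illustration assuming SciPy is available; the rating values are hypothetical and do not reproduce the study data.

```python
# Hypothetical PPDIS self-ratings coded 1-4 (unsatisfactory=1 ... proficient=4).
# Illustrative values only; the actual study data are not reproduced here.
from scipy import stats

junior_pre = [1, 2, 1, 2, 2, 1, 3, 1]   # before a simulation exercise
junior_post = [2, 3, 3, 3, 3, 2, 4, 2]  # after the same exercise (paired)

# Hypothetical improvement indicators (1 = improved) for the two groups.
junior_change = [1, 1, 2, 1, 1, 1, 1, 1]
senior_change = [0, 1, 0, 0, 1, 0, 0, 1]

# Within-group comparison: Wilcoxon signed-rank test on paired pre/post ratings.
w_stat, w_p = stats.wilcoxon(junior_pre, junior_post)

# Between-group comparison: Wilcoxon rank-sum test on improvement between groups.
r_stat, r_p = stats.ranksums(junior_change, senior_change)
```

The ordinal outcome and repeated measurements per resident are what motivate the cumulative-logit mixed models mentioned above; the rank tests shown here cover only the simpler paired and two-sample comparisons.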
Using Global Rating Scales of Operative Performance adapted from Reznick et al,9,10 we studied the correlation of self-perceived (subjective) and faculty-perceived (objective) levels in the PPDIS for a selected group of procedures performed at the end of the curriculum schedule (Figure 3B).
One hundred eighty surveys were completed online. Considering all residents, cadaver simulations accrued the highest reported benefit (71.5%; P < .001), followed by physical simulators (63.8%; P < .001) and haptic/computerized simulators (59.1%; P < .001).
Junior residents reported proficiency improvements in 82% of simulations performed (P < .001). The reported improvement rate was highest for cadaver (91.7%; P < .001), followed by physical simulators (83%; P < .001) and haptic simulators (69.6%; P = .002). Overall, senior residents reported improvement in 42.5% of the simulations performed (P < .001 for within-group comparison). Their reported improvement rate was highest with haptic simulations (47.6%; P < .001), followed by cadaver (44.5%; P = .008) and physical models (39%; P < .001; Table). Multivariate analysis indicated that senior residents are approximately 83% less likely to improve at any given time than junior residents (odds ratio, 0.17; P = .05).
The simulators and equipment necessary to carry out the described curriculum, along with the initial and annual costs, are presented in Figure 4. The minimum calculated initial outlay is $341 978.00, with $27 876.36 in annual operational expenses. The estimated faculty teaching value was $21 125.00 for the 130 hours of laboratory time (mean core faculty salary/2000 work hours per year × 130 hours of laboratory time). Exercises conducted in the operating room were supported by the Operating Room Management Office as quality improvement exercises at no additional cost to our division. In the exercises performed in the operating room environment, 33% of residents overrated their performance by 1 PPDIS level, 50% evaluated themselves correctly, and 16.7% underrated themselves by 1 PPDIS level. There was no statistically significant difference between the subjective and objective ratings (P = .60, McNemar test). Video I (Supplemental Digital Content I, http://www.youtube.com/watch?v=mWb8C5PI4mA) presents some simulation exercises executed in the operating room. The residents interact with scrub nurses and technicians and may be presented with intraoperative challenges. Performance assessment and debriefing are supervised by a faculty neurosurgeon.
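The faculty teaching value formula above can be checked arithmetically. Working backward from the reported $21 125.00 over 130 laboratory hours, the implied mean core faculty salary (an inferred figure, not one stated in the text) is $325 000:

```python
# Formula from the text:
#   teaching value = (mean core faculty salary / 2000 work hours per year) * 130 lab hours
lab_hours = 130
annual_work_hours = 2000
teaching_value = 21_125.00  # value reported in the text

# Solve the formula for the mean salary (inferred, not stated in the article).
implied_mean_salary = teaching_value * annual_work_hours / lab_hours  # 325000.0
```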
Simulation in neurosurgical training is still a relatively nascent field of study requiring further investigation and validation. Establishing curricula and using reproducible measures of objective and subjective improvement are critical as we move forward. Although it is intuitive that simulation in neurosurgery will make us better at what we do, as scientists, we have an obligation to test this hypothesis and to determine which simulation techniques result in improved technical skill and better patient outcomes. Given core competency expectations, duty-hour limitations, and milestone requirements, it would not be surprising if, in the not-so-distant future, programs were required to report their simulation training efforts and outcomes.
Results and Limitations
Although we focused on subjective data, we realize the importance of objectively ensuring that proficiency levels are attained. Benchmarks for proficiency (defined as the experts' mean minus 1 SD) are under development for some simulators but require graduated neurosurgeons to participate.11 In our opinion, an end-of-year random selection of exercises for proficiency evaluation, rather than formal evaluations in every session, helps maintain the spirit of deliberate faculty-resident teaching/learning time. Future work is needed to objectively assess whether the perceived benefits of simulation training actually translate into improved tangible metrics in patient care, eg, decreased operating room time, complications, and length of stay.
We found that, with all exercises and residents considered, cadaver dissection was reported to be the most efficacious in improving level of competency, followed by physical and haptic simulators. This could be related to the fact that dissections provide anatomic fidelity and require psychomotor skills similar to those used in the operating room. The results also indicated that senior residents are less likely to improve after simulation exercises, which could be attributed to a steeper learning curve from the competent to the proficient level compared with earlier stages. Simulation nonetheless remains useful for senior residents, as indicated by their positive feedback, probably because certain skills required in the process of becoming excellent can be identified and refined without risking patient safety.
We realize that costs will vary across different programs; however, our analysis provides an estimate of the cost for a 1-resident-per-year program from which calculations can be extrapolated to larger training programs.8 Outlay costs can be, and in our case were, reduced by obtaining grants, using simulator rental programs, or obtaining industry collaboration. We realize that exercises in the operating room may carry a cost at other institutions: operating room charges are based on surgical drivers (elements that define 5 levels of surgical complexity), accounting for equipment, instruments, setup time, staff, and supplies. At our institution, charges for levels 1 and 5 range from $2300 to $5500, respectively, for the first 30 minutes and from $926 to $2756, respectively, for each 30 minutes thereafter. One hour of simulation laboratory training would cost approximately $200 (annual expenses divided by the hours spent in the laboratory throughout the year). Simulation training, although not a substitute for operating room experience, may decrease the time we spend teaching intraoperatively and reduce morbidity from resident error. If we demonstrate this in future studies, pursuing financial or logistic support to improve the resources available for residents will likely become an easier endeavor.
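The approximately $200 per laboratory hour quoted above can be verified from the figures reported earlier, assuming the 130 hours of laboratory time stated in the cost analysis:

```python
# Per-hour laboratory cost = annual operational expenses / laboratory hours per year.
annual_expenses = 27_876.36     # annual operational expenses reported in the text
lab_hours_per_year = 130        # laboratory hours per year, per the cost analysis

cost_per_hour = annual_expenses / lab_hours_per_year  # ~214, i.e., roughly $200
```

The exact quotient is about $214 per hour, consistent with the article's rounded figure of approximately $200.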
Sources of Logistical Support
Boundaries with industry must be transparent and well defined to avoid conflicts of interest. Engaging core faculty and providing them with protected time is fundamental to ensuring quality teaching and the continuity of this educational effort. Larger programs may encounter more logistical barriers in scheduling exercises and will likely require full-time-equivalent support staff positions to assist in the daily running of the simulation laboratory.
The curriculum presented worked well for our residents' needs and environment, but we understand that it may not work for every program and that some may prefer other simulators and alternatives. An objective, unbiased system for reporting simulator performance via online evaluations could be useful in determining which simulators produce the best results.12 We believe it is important to tailor individual program efforts to the needs of the residents and the program. Future efforts should focus on translating simulation efficiency into improved metrics such as decreased morbidity for procedures that are being simulated or decreased operating room time for the same type of procedure. Developing a formally approved curriculum in neurosurgery is an intriguing idea, but because every training environment is different, it would be challenging, and perhaps even undesirable, to provide a solution that fits all.
Although implementing simulation could seem costly or time-consuming, we remain encouraged by the following thought: if a single patient can benefit from this adjuvant form of training, then it becomes unarguably worthwhile. Every patient is important, and high-quality, standardized surgery is expected of us and of those we are obligated to educate, with results that carry across to all surgeons.
The systematic implementation of a simulation curriculum in a neurosurgery training program is feasible, is favorably regarded, and has a positive impact on trainees of all levels, particularly in the first 3 years of postgraduate education. Cadaver dissection, physical models, and haptic/computerized simulators have a role in different stages of learning and should be considered in the development of simulation curricula.
This work was funded in part by the Academy of Master Teachers Educational Technology Grant, University of Texas Medical Branch. The authors have no personal financial or institutional interest in any of the drugs, materials, or devices described in this article.
We sincerely thank Karen Martin, Steve Schuenke, and Christian von Eschenbach for their assistance in developing this manuscript.
1. Ganju A, Aoun SG, Daou MR, et al. The role of simulation in neurosurgical education: a survey of 99 United States neurosurgery program directors [published online ahead of print November 24, 2012]. World Neurosurg. doi: 10.1016/j.wneu.2012.11.066. Accessed April 1, 2013.
2. Ganju A, Kahol K, Lee P, et al. The effect of call on neurosurgery residents' skills: implications for policy regarding resident call periods. J Neurosurg. 2012;116(3):478–482.
3. El Ahmadieh TY, El Tecle NE, Aoun SG, Yip BK, Ganju A, Bendok BR. How can simulation thrive as an educational tool? Just ask the residents. Neurosurgery. 2012;71(6):N18–N19.
4. Aoun SG, McClendon J Jr, Ganju A, Batjer HH, Bendok BR. The Association for Surgical Education's roadmap for research on surgical simulation. World Neurosurg. 2012;78(1-2):4–5.
5. Kirton OC, Reilly P, Staff I, Burns K. Development and implementation of an interactive, objective, and simulation-based curriculum for general surgery residents. J Surg Educ. 2012;69(6):718–723.
6. Selden NR, Barbaro N, Origitano TC, Burchiel KJ. Fundamental skills for entering neurosurgery residents: report of a Pacific region “boot camp” pilot course, 2009. Neurosurgery. 2011;68(3):759–764; discussion 764.
7. Selden NR, Origitano TC, Burchiel KJ, et al. A national fundamentals curriculum for neurosurgery PGY1 residents: the 2010 Society of Neurological Surgeons boot camp courses. Neurosurgery. 2012;70(4):971–981; discussion 981.
8. Danzer E, Dumon K, Kolb G, et al. What is the cost associated with the implementation and maintenance of an ACS/APDS-based surgical skills curriculum? J Surg Educ. 2011;68(6):519–525.
9. Scott DJ, Bergen PC, Rege RV, et al. Laparoscopic training on bench models: better and more cost effective than operating room experience? J Am Coll Surg. 2000;191(3):272–283.
10. Reznick R, Regehr G, MacRae H, Martin J, McCulloch W. Testing technical skill via an innovative “bench station” examination. Am J Surg. 1997;173(3):226–230.
11. Zevin B, Levy JS, Satava RM, Grantcharov TP. A consensus-based framework for design, validation, and implementation of simulation-based training curricula in surgery. J Am Coll Surg. 2012;215(4):580–586.e3.
12. Damanakis A, Blaum WE, Stosch C, Lauener H, Richter S, Schnabel KP. Simulator network project report: a tool for improvement of teaching materials and targeted resource usage in skills labs. GMS Z Med Ausbild. 2013;30(1):Doc4.
Keywords: Cost; Education; Laboratory; Neurosurgery; Simulation; Training

Copyright © by the Congress of Neurological Surgeons