Neurosurgery, October 2013, Volume 73
doi: 10.1227/NEU.0000000000000116
Cranial Neurosurgery Simulators

A Novel Craniotomy Simulator Provides a Validated Method to Enhance Education in the Management of Traumatic Brain Injury

Lobel, Darlene A. MD*; Elder, J. Bradley MD‡; Schirmer, Clemens M. MD, PhD§,‖; Bowyer, Mark W. MD¶; Rezai, Ali R. MD‡

Section Editor(s): Harrop, James S. MD; Bendok, Bernard R. MD


Author Information

*Department of Neurologic Surgery, Mayo Clinic, Rochester, Minnesota

‡Department of Neurological Surgery, The Ohio State University Medical Center, Columbus, Ohio

§Division of Neurosurgery, Baystate Medical Center, Springfield, Massachusetts

‖Department of Neurosurgery, Tufts University School of Medicine, Boston, Massachusetts

¶The Norman M. Rich Department of Surgery, Uniformed Services University, Bethesda, Maryland

Correspondence: Darlene A. Lobel, MD, Department of Neurologic Surgery, Mayo Clinic, 200 1st St SW, Rochester, MN 55905. E-mail: Darlene.a.lobel@hotmail.com

Information detailed in this article has been submitted as an abstract to be considered for presentation at the 2013 CNS Annual Meeting.

Supplemental digital content is available for this article. Direct URL citations appear in the printed text and are provided in the HTML and PDF versions of this article on the journal's Web site (www.neurosurgery-online.com).

Received April 22, 2013

Accepted June 12, 2013

Abstract

BACKGROUND: In a variety of surgical specialties, simulation-based technologies play an important role in resident training. The Congress of Neurological Surgeons (CNS) established an initiative to enhance neurosurgical training by developing a simulation-based curriculum to complement standard didactic and clinical learning.

OBJECTIVE: To enhance resident education in the management of traumatic brain injury by the use of simulation-based training.

METHODS: A course-based neurosurgical simulation curriculum was developed and offered at the 2012 CNS annual meeting. Within this curriculum, a trauma module was developed to teach skills necessary in the management of traumatic brain injury, including the performance of craniotomy for trauma. Didactic and simulator-based instruction were incorporated into the course. Written and practical pre- and posttests, as well as questionnaires, were used to assess the improvement in skill level and to validate the simulator as a teaching tool.

RESULTS: Fourteen trainees participated in the didactic section of the trauma module. Average performance improved significantly in written scores from pretest (75%) to posttest (87.5%, P < .05). Eight participants completed the trauma craniotomy simulator. Incision planning, burr hole placement (P < .02), and craniotomy size (P < .05) improved significantly. Junior residents (postgraduate years 1-3) demonstrated the most improvement during the course.

CONCLUSION: The CNS simulation trauma module provides a complementary method for residents to acquire necessary skills in the management of traumatic brain injury. Preliminary data indicate improvement in didactic and hands-on knowledge after training. Additional data are needed to confirm the validity of the simulator.

ABBREVIATIONS: ACGME, Accreditation Council for Graduate Medical Education; CA, Cochran-Armitage; CNS, Congress of Neurological Surgeons; NCAMSC, National Capital Area Medical Simulation Center; PGY, postgraduate year; TBI, traumatic brain injury; VR, virtual reality

Simulation-based training provides a novel method to teach neurosurgical trainees the skills necessary to perform basic neurosurgical procedures. In recent years, simulation training has been integrated into general surgery and surgical subspecialty resident training programs. In fact, simulation-based training is now mandated by the Accreditation Council for Graduate Medical Education (ACGME) as a requirement for general surgery training programs.1 Because of the complexity of neurosurgical procedures and, in particular, the importance of haptic feedback in performing these procedures, simulation technologies have been instituted on only a limited basis in neurosurgical training programs. As simulators have advanced technologically, and as we now have greater understanding of the necessary components of simulation-based curricula to enhance surgical training, the Congress of Neurological Surgeons (CNS) has worked toward developing a neurosurgical simulation training program that not only enhances residents' ability to acquire neurosurgical skills, but also provides trainees with the necessary education in ACGME core competencies.

Neurotrauma-related procedures, including craniotomy for traumatic hemorrhage or treatment of refractory intracranial pressure elevations, are among the most commonly encountered procedures in resident training. Typically, skills in the performance of such neurosurgical procedures are acquired under an apprenticeship model, in which a resident's first experience performing these procedures occurs during a neurosurgical emergency. The stress of an emergency situation, combined with minimal previous hands-on experience, may increase the risk of surgical complications. Additionally, because craniotomy for trauma is among the earliest cranial procedures to which residents are exposed during training, the techniques that residents acquire in learning this procedure may form the basis for surgical techniques for more complex craniotomies. Simulation-based training provides residents an opportunity to master these critical skills before performing their first craniotomy on a patient, which may improve patient safety.

Additionally, the military has applied advances in simulation technology to develop medical simulators for training military personnel in the management of field trauma. The need to train deployed military surgeons who may lack immediate neurosurgical support has led the National Capital Area Medical Simulation Center (NCAMSC; Uniformed Services University, Bethesda, Maryland) to develop both virtual reality (VR)2 and physical model-based simulators for trauma craniotomy. These simulators, currently in the final stages of development, are undergoing rigorous validation studies, which have yet to be published. The mutual need to train neurosurgical residents and military surgeons in skills critical to the management of traumatic brain injury (TBI) provided a unique opportunity to combine the efforts of experts at the NCAMSC and the trauma section of the CNS Simulation Committee to develop a course-based simulation curriculum focusing on TBI.

MATERIALS AND METHODS

The trauma simulation module was introduced at the 2012 CNS Simulation Symposium, a course offered to neurosurgical trainees at the CNS annual meeting. The curriculum for the trauma module was designed through an iterative process after evaluating national and international centers that had developed successful simulation programs in emergency medicine, general surgery, and other surgical subspecialties. A case-based approach integrated both didactic and hands-on components to train residents in the skills necessary to successfully perform procedures commonly encountered in the treatment of TBI. Careful attention was given to addressing the 6 ACGME core competencies through the trauma simulation curriculum (Table). Simulators were chosen and developed to best meet the established curricular goals. Evaluation tools were created to assess the performance of the individual trainees, the validity of the simulators, and the course design.

Choice of Simulators

The leaders of the CNS trauma simulation team conducted an evaluation of commercially available trauma craniotomy simulators. The search revealed a paucity of such simulators. We identified the previously described VR-based craniotomy simulator under development at the NCAMSC.2 Although nearly complete, this simulator was not yet ready for commercialization and was therefore not included in the CNS simulation curriculum. In collaboration with Operative Experience, Inc (North East, Maryland), the NCAMSC has developed a hyperrealistic physical model of the human head for the performance of trauma craniotomy (Figure 1A). The model is designed to portray all 5 layers of the scalp, as well as the temporalis muscle and fascia, the superficial temporal and middle meningeal arterial supply, bone, dura, and brain. Additionally, pathology such as subdural or epidural hematoma can be added to the model to increase realism. Each model can be used to perform 2 craniotomies, 1 on each side of the head, and allows for the use of the actual instruments used in the operating room. This craniotomy model was integrated into the 2012 CNS simulation curriculum, replacing a BrainLab-based (Brainlab, Inc, Westchester, Illinois) craniotomy model that had been used in the 2011 course.

Course Design

The trauma module was designed along the same guidelines as all modules in the CNS Simulation Symposium. A precourse questionnaire was sent out in the week before the course to query familiarity with simulators and baseline skill level in common neurosurgical procedures. Based on data from the 2011 session, a precourse didactic and practical evaluation was developed for participants in the 2012 course. All attendees in the trauma module of the symposium took the same written pretest and posttest and participated in the same didactic session. The written pretest consisted of 12 multiple-choice questions seeking a single best answer. The questions were designed to assess knowledge of commonly used trauma-rating scales, clinical diagnostic and management skills, common complications, and neuroanatomy relevant to the surgical procedures. Following this, participants were instructed in these key concepts during a 15-minute didactic session. A written posttest identical to the pretest was administered at the conclusion of the trauma module. The posttest included an additional ethics-based question that intentionally did not have a single best correct answer. The purpose of this question was to stimulate discussion between the participants and the faculty. This answer was not analyzed in the results. After the posttest, the faculty reviewed incorrect answers with the participants to reinforce critical teaching points for the module.

After participating in the didactic portion of the course, participants were assigned to 1 of 2 simulator models. Attendees were allowed to participate in both sessions if time allowed. This portion of the course included a brief 5-minute video demonstrating the use of the craniotomy simulator (see Video 1, Supplemental Digital Content 1, http://www.youtube.com/watch?v=A7DbPF6Rk4g), followed by a hands-on practical pretest in which the trainee performed the simulated task. The faculty evaluated the participant's knowledge of anatomy and technical skills in performance of the procedure. The practical pretest was followed by hands-on training with the same faculty instructor and then 45 minutes of independent practice time with the simulator. A practical posttest was then administered, and the same scoring metrics were applied as during the practical pretest. Hands-on training was followed by an immediate postcourse evaluation with the entire cohort. A final questionnaire was sent out 6 months after the course to obtain information on how the training had affected the skill and comfort level of course participants with common neurosurgical procedures.

Back to Top | Article Outline
Evaluation Tools

Pre- and postcourse online questionnaires were used to assess the baseline experience of the participants and to support evaluation of the face and content validity of the simulator and the course. Both didactic and practical pre- and posttests were administered during the course to obtain validation data for the course and simulator. Participants were graded by the faculty on the accuracy and efficiency of their neurosurgical skills during simulator use, with a technical assessment form developed from the objective structured assessment of technical skill.3 Performance measures included knowledge of anatomy and landmarks, dexterity in instrument use, and technical skills relevant to the craniotomy procedure (Figure 2). Faculty judged accuracy and efficiency for each task, and data were acquired in real time via entry into a tablet database program. For each simulation, there was a pretraining evaluation, followed by one-on-one interaction with a faculty member who provided instruction and critique, and then a posttraining evaluation. Scores between 1 and 5 were given for each performance measure (Figure 2). Participants were classified by level of training into a junior group (medical students and postgraduate year [PGY] 1-3 residents) and a senior group (PGY 4-7 residents). Performance measures were analyzed using contingency tables with Pearson χ2 tests and the asymptotic Cochran-Armitage (CA) test for trend. Statistical analysis was performed using SAS JMP (SAS Institute, Cary, North Carolina). Pretest and posttest data were then evaluated to assess the validity of the craniotomy simulator as a teaching tool.
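
To make the analysis concrete, the sketch below shows how score data of this kind could be tabulated and tested in Python. It is not the authors' analysis code (which was run in SAS JMP); the counts are hypothetical, and the Cochran-Armitage test is implemented directly from its standard asymptotic formula purely for illustration.

import numpy as np
from scipy.stats import chi2_contingency, norm


def cochran_armitage_trend(successes, totals, scores):
    """Asymptotic two-sided Cochran-Armitage test for trend in proportions
    across ordered groups (successes[i] out of totals[i] at score scores[i])."""
    r = np.asarray(successes, dtype=float)
    n = np.asarray(totals, dtype=float)
    t = np.asarray(scores, dtype=float)
    N = n.sum()
    p_bar = r.sum() / N
    stat = np.sum(t * (r - n * p_bar))                       # trend statistic
    var = p_bar * (1 - p_bar) * (np.sum(n * t ** 2) - np.sum(n * t) ** 2 / N)
    z = stat / np.sqrt(var)
    return z, 2 * norm.sf(abs(z))                            # z value, two-sided P

# Hypothetical 2 x 5 contingency table: rows = pre- and posttraining attempts,
# columns = performance score 1 through 5 for a single measure.
table = np.array([[2, 3, 2, 0, 1],    # pretraining
                  [0, 1, 2, 2, 3]])   # posttraining

chi2, p_chi2, dof, _ = chi2_contingency(table)               # Pearson chi-square

# CA trend: does the share of posttraining attempts rise with the score level?
z, p_trend = cochran_armitage_trend(successes=table[1],
                                    totals=table.sum(axis=0),
                                    scores=[1, 2, 3, 4, 5])

print(f"Pearson chi2 = {chi2:.2f} (P = {p_chi2:.3f}); "
      f"CA trend z = {z:.2f} (P = {p_trend:.3f})")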


RESULTS

Written Test Performance

Fourteen participants, each enrolled in a neurosurgery training program at a US, Canadian, or international institution, were assigned to the trauma module of the 2012 CNS Simulation Symposium. All 14 took the pretest and attended the structured didactic session. Twelve participants completed the posttest. Data analysis includes only the 12 participants who completed both the pre- and posttests.

The median score for the pretest was 75%. The median posttest score, obtained after didactic instruction and completion of the simulator session, improved significantly to 87.5% (P < .05). The time taken by participants to complete the test also decreased significantly, by 22% (P < .02; Figure 3).
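
The article does not name the test used for this paired pre/post comparison. A matched, nonparametric comparison such as the Wilcoxon signed-rank test is one conventional choice for such data; the sketch below applies it to 12 hypothetical score pairs purely for illustration (these are not the study data).

from scipy.stats import wilcoxon

# Hypothetical paired written-test scores (%) for the 12 completers.
pretest  = [58, 67, 67, 75, 75, 75, 75, 83, 83, 83, 92, 92]
posttest = [75, 75, 83, 83, 83, 92, 92, 92, 92, 100, 100, 100]

stat, p = wilcoxon(posttest, pretest)   # two-sided, paired comparison
print(f"Wilcoxon signed-rank statistic = {stat:.1f}, P = {p:.3f}")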

Hands-on Simulation Performance

Eight participants completed the trauma craniotomy simulator (see Video 2, Supplemental Digital Content 2, http://www.youtube.com/watch?v=5H1zSdWwTXU), whereas the remaining participants completed a different simulator. Participants improved across all measures (Figure 4). For the task of identifying anatomy and relevant landmarks, 75% of all participants reached a perfect score after instruction, practice, and feedback, hereafter referred to as “training,” representing a 13% relative improvement. Skin incision planning and burr hole placement improved significantly, with 63% of participants reaching a perfect score after training compared with 13% before (P = .23, CA test P < .021). The size of the craniotomy created by participants was judged ideal in 63% of attempts after training, significantly improved from 38% (P = .24, CA test P < .05) (Figure 1B). Dexterity did not improve significantly; most participants reached a score of 3, with an average improvement of only 0.8 points. One participant (13%) was judged to have used ideal technique after training (P = .36, CA test P = .07). The predefined time limit of 10 minutes to complete all steps of the simulated craniotomy proved an adequate challenge for all participants; no participant was able to complete all steps before training, although 50% completed most steps. This improved to 3 completions (38%) after training; these improvements did not reach statistical significance. Although the overall summary measures for complication recognition and management improved from 25% to 63% of participants reaching a perfect score, the improvement was not statistically significant. Some participants obtained lower scores on certain measures after training because they encountered new, unforeseen complications that had not been covered in the training given after their practical pretest evaluation.

Results Stratification by Training Level

We analyzed the course design and simulator validity by stratifying the scores according to training level. We hypothesized that junior participants would initially perform more poorly and show more improvement after training than senior participants.

Junior participants unexpectedly answered more questions correctly in the pretest (79% vs 75%) and improved more in the posttest (89% vs 81%), representing a significant 11% improvement (P < .01, matched pairs). Senior residents were significantly more facile than the junior group in creating the optimal skin flap and placing burr holes (P = .26, CA test P < .03) and displayed significantly better dexterity (P = .07, CA test P < .05). There were no significant differences in the level of anatomic knowledge (P = .31, CA test P = .08), optimal craniotomy size (P = .57, CA test P = .08), time to complete the procedure (P = .66, CA test P = .21), or complication recognition and management (P = .32, CA test P = .13), although a trend was noted for the former 2 measures. Junior participants demonstrated significantly improved performance after training when planning the skin incision and burr hole placement (P < .04, CA test P < .04); however, improvements in dexterity (P = .07, CA test P < .03), anatomic knowledge (P = .24, CA test P = .17), craniotomy size (P = .18, CA test P = .16), time to complete the procedure (P = .63, CA test P = .5), and complication management (P = .13, CA test P = .21) failed to reach statistical significance (Figure 5).


Overall, the trainees' performance data support the construct validity of the course and the craniotomy simulator. Postcourse evaluations revealed that participants enjoyed the simulator and felt that it was an effective teaching tool, supporting its face and content validity. Because of the low number of participants completing the postcourse questionnaire, more definitive, statistically supported statements regarding face and content validity are not possible.


DISCUSSION

The trauma module of the 2012 CNS Simulation Symposium was designed to provide neurosurgical trainees an opportunity to enhance their knowledge of TBI management and acquire the skills necessary to perform craniotomy for trauma. Studies have shown that training with simulators alone, without integrating their use into a well-defined curriculum, does not effectively meet training goals.4 The goal of a simulator-based curriculum is to provide an opportunity to learn and practice skills in a setting that allows participants to attain a level of technical facility that can be transferred from the simulation forum to the operating room. Therefore, the CNS simulation course was designed around a curriculum-based approach to best meet the needs of neurosurgical trainees and the educational requirements set forth by the ACGME (Table). Patient care and medical knowledge are addressed throughout the course, during the written examinations, didactic sessions, and simulator training sessions. Training sessions also provide ideal opportunities for faculty to observe and instruct participants in interpersonal skills and professionalism. Finally, practice-based learning and systems-based practice skills are addressed with the precourse and postcourse evaluations completed by the participants. The questionnaires ask participants to assess their experience with the procedures taught during the CNS simulation course and to compare their complication rate, degree of autonomy in performing a procedure, and comfort level with a procedure before the course, as a baseline assessment, and at a 6-month interval after completing the course.

The central objective of the CNS neurosurgical simulation program is to aid in the surgical training of neurosurgeons and neurosurgical residents. Deficiencies in certain areas of neurosurgical education have been identified5 that may be addressed by providing simulation-based training as a complement to current training techniques. Implementation of simulation technology into a formal neurosurgical curriculum may also address patient safety concerns raised since the introduction of resident work hour restrictions.6,7 Recent literature underscores the increasing implementation of these training adjuncts, but further validation of the various technologies is needed.8 Validated simulators may shift the learning curve associated with various technical skills from the operating room to simulation courses and centers, thereby enhancing resident training while improving patient safety.6

A variety of simulator modules are discussed in this supplement, including haptic-based devices, physical models, and virtual reality models, as well as combinations of each. Many neurosurgical simulators currently available or in development are designed for endovascular9 and spinal applications.10,11 Only a few cranial simulators have been developed, and nearly all are either Web-based12 or VR-based models that focus on techniques for handling specific intracranial pathology, such as tumor resection,13 aneurysm clipping,14 or petrosectomy,15 rather than on the key elements of trauma craniotomy, including incision planning, burr hole placement, and extent of bone removal.2,16 VR-based craniotomy models offer significant benefits, including the ability to provide trainees with an unlimited number of simulated practice sessions. Additionally, certain models can integrate scoring metrics into the simulator,17 thereby minimizing bias in the evaluation of the trainee. The disadvantages of VR models include high upfront purchase costs and imperfect haptic feedback, which limits their realism. The physical craniotomy model used in the CNS Simulation Symposium provides a realistic experience in the performance of craniotomy and allows trainees to experience the subtleties of the procedure, including the nuances of handling actual operative instruments, confronting difficulty with an electric drill that can become “stuck” in the bone, and managing dural tears that occur during the bony opening.

Validation of simulators and simulation-based training modules is central to their use and development in neurosurgical training. Compared with the literature on validation of general surgery simulation-based training, there is a relative paucity of peer-reviewed neurosurgical literature on this topic. A review of this literature reveals that multiple validation concepts and a number of methods of assessing validity have been used to validate surgical simulators, including construct, face, and content validation techniques.18,19 Within the general surgery literature, these mechanisms for validating haptic simulators,20,21 virtual reality devices,22 and physical models have laid the groundwork for implementing these tools in surgical training programs. The neurosurgical trauma simulation course described here aimed to validate the craniotomy simulator as a useful training tool.

Construct or internal validation of a simulator is achieved if a simulator can distinguish between novice and expert level experience.22,23 In this sense, a simulator's validity is judged based on its ability to predict the level of proficiency of the course participant. The craniotomy simulator was reviewed for the ability of course participants to optimize the skin incision, size and location of the craniotomy, and time to complete the craniotomy. The simulator accurately distinguished the level of expertise based on data for optimization of skin flap and burr hole placement, as well as dexterity. Other results demonstrated expected trends, but did not achieve statistical significance. In particular, it is surprising that, although there was a trend toward senior residents completing the procedure in less time than junior residents, statistical significance was not achieved for this measure. This may be due, in part, to the small sample size. Additionally, there were some limitations with available operating instruments that may have impacted the results. Additional equipment will be provided at future courses to address this. Finally, there were some characteristics of the model itself that may have prolonged the procedure. For example, the thickness of the skull presented drilling challenges to some of the participants. As the model becomes further refined, we expect to see such issues minimized. Despite these limitations, the overall results are supportive of construct validity for this physical simulator. Additionally, the more junior level residents demonstrated the greatest improvement in hands-on skills (Figure 4), consistent with the results of other simulation studies.24

Face and content validity reflect the participants' assessment of the relevance of the simulator to the clinical task being modeled. Face validity is best judged by novices with little experience, whereas content validity is best assessed by expert-level participants and nonparticipant judges. Both validations are aided by pre- and postcourse questionnaires. Previous work on general surgery simulators has evaluated face and content validity primarily with validated questionnaires that assess the participants' opinions of the relevance, ease of use, and feasibility of the simulator.25 Trainees in the trauma module were asked for feedback on a variety of aspects of the simulator, including its ability to improve their technical skills and their likelihood of recommending the trauma simulation course to other trainees. In general, responses on the questionnaires indicated that the trauma simulator engaged the participants in tasks they felt were relevant to their training, although the results could not be evaluated statistically. Future courses will aim to improve the percentage of residents who complete the questionnaire.

Concurrent validity assesses the degree to which a simulator correlates with a “gold standard.” Determining concurrent validity depends on reliable data from another form of simulator, such as a cadaver-based model, which is not readily available in the current design of the neurosurgery trauma simulation module. Predictive validity is beyond the scope of this course, because it relates to the ability of a simulator to predict future performance of the real-life task. However, in the postcourse survey sent out 6 months after course completion, participants were asked whether they felt their experience in the trauma module, and specifically with the craniotomy simulator, had led to improvements in their performance of actual craniotomy, including time to complete the procedure and complication rate. Although the low response rate to this survey precluded statistical analysis, the responses overwhelmingly suggested that residents felt more comfortable performing the procedure and had fewer complications compared with their experience before the course. It should be noted that assessing the impact of the course through a postcourse survey introduces potential bias: those whose operative experience was positively affected by the course may have been more likely to respond, or respondents may have been concerned about the anonymity of their responses, leading them to give positive rather than negative feedback. Ideally, a study that randomly assigns residents at a single institution to simulation training with the trauma craniotomy model or to standard apprenticeship training, and compares key components of craniotomy, including procedure time and complication rate, on a longitudinal basis, would provide a more powerful analysis of the long-term effect of trauma craniotomy simulation on resident education.

Effective simulation-based training must also demonstrate feasibility, reliability, and educational impact.18 The participants completed the simulator tasks in a reasonable period of time. Data on time to completion and scores from the judges indicate that consistent results were generated among the participants. When participants repeated a simulator trial, their performance typically improved significantly, underscoring the potential for positive educational impact.

The limitations of this study include the unblinded nature of the faculty evaluation of the participants' hands-on skills with the simulator. The same instructor who graded the pretraining evaluation gave feedback to the participant and then graded the posttraining evaluation, introducing a potential bias for the instructor to perceive improvement after feedback. Having 1 instructor conduct the baseline and posttraining evaluations and a second instructor perform the one-on-one training would eliminate this bias. Alternatively, videotaping the performance of participants for subsequent review by blinded evaluators would further refine the design. Furthermore, dexterity is to some degree difficult to judge, and different instructors may have applied different personal standards.

Finally, the number of attempts during the simulation was limited to 1 pretraining evaluation and 1 posttraining evaluation. Although some measures relate to anatomy and conceptual understanding of the procedure, others, including dexterity, are either difficult to improve over such a short training period or sensitive to interattempt variation, increasing the chance of a worse score in the posttraining evaluation despite a good score in the pretraining assessment.

The general disadvantages of physical simulators are largely cost and the need to maintain additional facilities. Such models, including cadavers and the physical craniotomy model described above, tend to have higher ongoing costs. The estimated cost of this physical craniotomy simulator at the time of the course was approximately $1500 per head, which includes 2 sides on which to perform craniotomy. The heads are designed to be refurbished and reused at a reduced cost, but their use represents an ongoing training expense. Added costs include surgical instruments and storage facilities.

A primary goal is to improve the power of the validation studies by increasing the number of resident participants in the CNS simulation course. The implementation of regional simulation courses, in addition to the annual CNS Simulation Symposium, is currently underway. The trauma module course directors will also reevaluate the simulators to optimize educational benefit and validity and to minimize cost.


CONCLUSION

The CNS simulation course introduces trainees to the CNS simulation curriculum, developed as part of a CNS initiative to enhance current resident training in neurosurgery. Didactic and hands-on sessions provided training in scenario- and case-specific critical problem solving, focused decision making, and complication avoidance and management. Data analysis revealed that the trauma craniotomy simulator incorporated into the trauma module of the simulation course provides a validated model to enhance resident training.

In the future, residency programs will increasingly incorporate simulation-based training at local simulation centers. Advantages include shifting a portion of the learning curve for a specific surgical task out of the operating room and into the simulation laboratory, which will likely increase patient safety. Strategic course design to allow ongoing validation of simulator technology will be an important adjunct to the continued development of neurosurgical simulation programs. In addition to providing needed supplemental educational opportunities for neurosurgical residents, this simulator model introduced with the described training protocols may also be adapted to train military surgeons before deployment.

Disclosures

Dr Lobel reports a minor consultant relationship with St. Jude Medical. Dr Rezai is currently the President of the Congress of Neurological Surgeons. Dr Rezai reports ownership interests in Autonomic Technologies and MRI Interventions and a major consulting agreement with Autonomic Technologies. The other authors have no personal, financial, or institutional interest in any of the drugs, materials, or devices described in this article.

Acknowledgment

We gratefully acknowledge Dr Robert Buckman for his contribution as faculty during the CNS Simulation Symposium.


REFERENCES

1. Accreditation Council for Graduate Medical Education (ACGME). Common Program Requirements. 2011. Available at: http://www.acgme.org/acgmeweb/Portals/0/dh_dutyhoursCommonPR07012007.pdf.

2. Acosta E, Liu A, Armonda R, et al. Burrhole simulation for an intracranial hematoma simulator. Stud Health Technol Inform. 2007;125:1–6.

3. Faulkner H, Regehr G, Martin J, Reznick R. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med. 1996;71(12):1363–1365.

4. Gallagher AG, Ritter EM, Champion H, et al. Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg. 2005;241(2):364–372.

5. Mazzola CA, Lobel DA, Krishnamurthy S, Bloomgarden GM, Benzil DL. Efficacy of neurosurgery resident education in the new millennium: the 2008 Council of State Neurosurgical Societies post-residency survey results. Neurosurgery. 2010;67(2):225–232; discussion 232–233.

6. Palter VN, Orzech N, Reznick RK, Grantcharov TP. Validation of a structured training and assessment curriculum for technical skill acquisition in minimally invasive surgery: a randomized controlled trial. Ann Surg. 2013;257(2):224–230.

7. Dumont TM, Rughani AI, Penar PL, Horgan MA, Tranmer BI, Jewell RP. Increased rate of complications on a neurological surgery service after implementation of the Accreditation Council for Graduate Medical Education work-hour restriction. J Neurosurg. 2012;116:483–486.

8. Ganju A, Aoun SG, Daou MR, et al. The role of simulation in neurosurgical education: a survey of 99 United States neurosurgery program directors. World Neurosurg. 2012. Epub ahead of print.

9. Tedesco MM, Pak JJ, Harris J Jr, Krummel TM, Dalman RL, Lee JT. Simulation-based endovascular skills assessment: the future of credentialing? J Vasc Surg. 2008;47(5):1008–1014.

10. Alaraj A, Charbel FT, Birk D, et al. Role of cranial and spinal virtual and augmented reality simulation using immersive touch modules in neurosurgical training. Neurosurgery. 2013;72(suppl 1):115–123.

11. Ghobrial GM, Anderson PA, Chitale R, Campbell PG, Lobel DA, Harrop JS. Simulated spinal cerebrospinal fluid leak repair: an educational model with didactic and technical components. Neurosurgery. 2013;73(suppl 1):S111–S115.

12. Sim P. Web-based Surgical Simulation. [MSc Thesis]. School of Computing, University of Leeds, 2000.

13. Delorme S, Laroche D, DiRaddo R, Del Maestro RF. NeuroTouch: a physics-based virtual simulator for cranial microsurgery training. Neurosurgery. 2012;71(suppl 1 operative):32–42.

14. Wong GK, Zhu CX, Ahuja AT, Poon WS. Craniotomy and clipping of intracranial aneurysm in a stereoscopic virtual reality environment. Neurosurgery. 2007;61(3):564–568.

15. Pflesser B, Petersik A, Tiede U, Höhne K, Leuwer R. Volume cutting for virtual petrous bone surgery. Comput Aided Surg. 2002;7(2):74–83.

16. Scerbo MW, Turner TR, Newlin-Canzone E, et al. A preliminary evaluation of a burr hole drilling simulator for craniotomy. Paper presented at: Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting. 2010;54(27):2361–2365.

17. Lemole GM Jr, Banerjee PP, Luciano C, Neckrysh S, Charbel FT. Virtual reality in neurosurgical education: part-task ventriculostomy simulation with dynamic visual and haptic feedback. Neurosurgery. 2007;61(1):142–148.

18. Abboudi H, Khan MS, Aboumarzouk O, et al. Current status of validation for robotic surgery simulators - a systematic review. BJU Int. 2013;111(2):194–205.

19. Kelly DC, Margules AC, Kundavaram CR, et al. Face, content, and construct validation of the da Vinci skills simulator. Urology. 2012;79(5):1068–1072.

20. Liss MA, Abdelshehid C, Quach S, et al. Validation, correlation, and comparison of the da Vinci trainer™ and the daVinci surgical skills simulator™ using the Mimic™ software for urologic robotic surgical education. J Endourol. 2012;26(12):1629–1634.

21. Singapogu RB, DuBose S, Long LO, et al. Salient haptic skills trainer: initial validation of a novel simulator for training force-based laparoscopic surgical skills. Surg Endosc. 2013;27(5):1653–1661.

22. Plooy AM, Hill A, Horswill MS, et al. Construct validation of a physical model colonoscopy simulator. Gastrointest Endosc. 2012;76(1):144–150.

23. Slade Shantz JA, Leiter JR, Gottschalk T, Macdonald PB. The internal validity of arthroscopic simulators and their effectiveness in arthroscopic education. Knee Surg Sports Traumatol Arthrosc. 2012. Epub ahead of print.

24. Dayal R, Faries PL, Lin SC, et al. Computer simulation as a component of catheter-based training. J Vasc Surg. 2004;40(6):1112–1117.

25. Shamim Khan M, Ahmed K, Gavazzi A, et al. Development and implementation of centralized simulation training: evaluation of feasibility, acceptability and construct validity. BJU Int. 2013;111(3):518–523.

Keywords:

Craniotomy; Curriculum; Education; Neurotrauma; Residency; Simulation; Traumatic brain injury

Copyright © by the Congress of Neurological Surgeons
