
Teaching Technical Skills to Surgical Residents: A Survey of Empirical Research

Hamstra, Stanley J; Dubrowski, Adam; Backstein, David

Clinical Orthopaedics and Related Research: August 2006 - Volume 449 - Issue - p 108-115
doi: 10.1097/01.blo.0000224058.09496.34
SECTION I: SYMPOSIUM I: C. T. Brighton/ABJS Workshop on Orthopaedic Education

We review a series of empirical studies on the use of simulators and bench models in training technical skills and subsequent retention of those skills. We discuss recent research on the transfer of training from bench models and simulators to the clinical setting and provide a theoretical structure to organize the findings. The transfer of training from inanimate bench models and simulators to live patients has recently been demonstrated in a number of areas. The effectiveness of this training is enhanced if focus is placed on the operative, or process-oriented, aspects of the procedure, with suspension of disbelief regarding the physical structure of the training platform. The retention of trained skills is an area of research only beginning to evolve, with recent results suggesting that effective retention can be demonstrated if training is tightly focused and involves an entire procedure. An emerging area of research involves the use of simulators as assessment instruments for high-stakes testing, and recent results involving simulated trauma management support this novel application. Based on these findings, we encourage the use of a wide variety of high- and low-fidelity platforms, with emphasis on training procedural knowledge involving an entire procedure.

From the *Department of Medical Education, University of Michigan, Ann Arbor, MI; the †Department of Surgery, University of Toronto, Toronto, Canada; and the ‡Division of Orthopaedic Surgery, Mount Sinai Hospital, Toronto, Canada.

Each author certifies that he has no commercial associations (eg, consultancies, stock ownership, equity interest, patent/licensing arrangements, etc) that might pose a conflict of interest in connection with the submitted article.

Correspondence to: Stanley J. Hamstra, PhD, Department of Medical Education, University of Michigan, G1208 Towsley Center, 1500 E. Medical Center Drive, Box 201, Ann Arbor, MI 48109. Phone: 734-763-1424; Fax: 734-936-1641; E-mail: shamstra@umich.edu.

While it is widely believed that surgical skill is a reflection of innate talent, initial empirical research suggests surgical expertise is acquired through considerable practice.10 This is consistent with research in other domains (eg, violin performance), suggesting the attainment of true expertise requires 10,000 hours of deliberate practice and perhaps 10 years of commitment to the field.11 Certainly, this fits with recent data showing a positive relation between surgeon volume and patient outcome.19 In addition, there is evidence for specificity in training: the ability to perform a particular surgical procedure seems to derive from specific practice with that procedure and does not generalize to other, even apparently similar, surgical procedures.30

However, when training residents we are less concerned with the attainment of expertise than with the attainment of competence. While the term expertise is usually associated with attainment of the highest level of proficiency in a given field, competence represents the other end of the spectrum, implying the minimum necessary to demonstrate knowledge of the subject matter. Competence in a particular surgical skill requires much less training than expertise and can be attained through tightly focused instruction. For example, Matsumoto et al25 found 1 hour of hands-on practice with feedback substantially improved performance on mid-ureteric stone extraction when training was focused on the surgical procedure. Similarly, Wanzel et al41 showed 5 minutes of one-on-one hands-on practice with feedback improved performance among a group of junior residents on a plastic surgery flap procedure. These disparate numbers highlight the importance of feedback and raise a number of questions about the training of technical skills. If we teach one procedure, will it transfer to another procedure? If we train on a model or simulator, does this transfer to the live patient, and if so, under what conditions? At what stage of training are simulators most effective? If we teach using simulators, is it appropriate to assess competence using oral and written examinations? Are trained skills retained over time?

We present the findings of recent experimental research examining: (1) whether simulators and low-fidelity bench models promote the transfer of technical skills training to the clinical setting; (2) what conditions of training promote the retention of technical skills training; and (3) whether simulators can be used effectively as assessment instruments for high-stakes exams. We also identify a theoretical framework in which these findings can be organized.

Transfer

The Effectiveness of Low-Fidelity Simulators

Given the pervasiveness of simulators in the training environment for all aspects of surgery, the issue of learning transfer has become an important area of research. Learning transfer is the application of skills and knowledge learned in one context to another context. In surgical education, transfer can occur at several levels; simulators and models can promote transfer from the simulator to the cadaver, to the live animal, or to live human patients (the ultimate goal of simulator training). It has become clear simulators are an essential part of any training program, and the reasons have been well documented.15,33,34 To summarize, the general reason for adopting simulators in medical schools is to address reduced opportunities for training in the clinical setting. Additional forces for change include a mandated 80-hour work week, new pressures for continuing medical education (CME), pressure to keep operating room turnover high and the associated fiscal constraints, pressure to reduce surgical delay, ethical issues regarding training on live patients, and technical innovations requiring practice before implementation in vivo. The use of simulators in surgical education has exploded in recent years, with models and virtual reality (VR) systems available for a wide variety of open and minimal access procedures,12,27 including developments in orthopaedic surgery.3,22,32 This recent emphasis on the development of models and simulators raises important research issues related to evidence for transfer of learning across physical platforms and to the clinical setting. Several studies have now shown technical skills acquired on low-fidelity bench models transfer to improved performance on higher fidelity models, such as human cadavers. While there is currently a great deal of research investigating transfer from the simulator to the clinical setting, there have to date been few well-designed studies demonstrating its effectiveness.

Early related research focused on the development of better methods of measuring technical skills performance, including transfer to the cadaver or other models. Anastakis et al1 found technical skills acquired on low-fidelity bench models transfer to improved performance on higher fidelity human cadaver models, strongly supporting the potential for transfer into the operating room with real patients. In one of the most dramatic examples of transfer from low-fidelity to high-fidelity models, Matsumoto et al25 randomly assigned 40 final-year medical students to a low-fidelity bench model, high-fidelity model, or didactic training group for a mid-ureteric stone extraction. Performance was measured using an expert rating scale, checklist, pass rating, and time needed to complete the task. The results demonstrated a substantial effect of hands-on training on performance in all measurement domains when compared with didactic training, but no difference in performance between the low-fidelity and high-fidelity models (Fig 1). Thus, transfer of learning from one model to another was demonstrated, compared with no effect of didactic teaching. This indicates a bench model can be effective in training a surgical procedure, and it makes little difference on what physical platform the procedure is trained. In this case, we were able to identify the characteristics of the context relevant to training the essential constructs and demonstrate that superficial aspects of the physical structure are unimportant.

Fig 1A

A second remarkable study on the effectiveness of low-fidelity models showed transfer of learning from a low-fidelity bench model to live patients in the operating room.29 In this case, the procedure was fiberoptic oral intubation and the model was constructed from wood (Fig 2). We demonstrated that novice anesthesiologists who received technical skills training (fiberoptic intubation) on a simple bench model were able to effectively transfer these skills to the clinical setting on live patients. Again, we were able to identify the essential constructs involved in the clinical procedure and develop a model allowing for the training of those specific constructs.

Fig 2

Another study17 involving low-fidelity models examined the effect of technical skills instruction on clinically relevant outcomes in laboratory rats after surgical reexploration 1 month later. In this study, we examined whether skills learned on low-fidelity and high-fidelity bench models transferred to clinically relevant outcomes in urological microsurgery (ie, vas deferens anastomosis). The low-fidelity model consisted of a portion of silicone tubing on which the participants practiced the procedure, while the high-fidelity model was a rat vas deferens in vivo. Fifty junior surgery residents participated in a 1-day microsurgical training course. Participants were randomized into one of three groups: high-fidelity model training, low-fidelity model training, or didactic training. All participants were assessed on their performance during the procedure using expert ratings, checklists, procedure time, hand motion analysis, and measures of anastomotic patency and the presence of sperm on microscopy after surgery. After training, anastomotic patency rates were higher for the groups receiving hands-on training when compared with didactic training, and rates of sperm presence on microscopy 1 month later were higher for the high-fidelity model group versus the didactic group, with no difference between the high-fidelity and low-fidelity groups.

The Effectiveness of High-Fidelity Simulators

In one of the first studies to examine transfer to the clinical setting, Scott et al38 assessed whether laparoscopic training on a bench model video trainer (Guided Endoscopic Module, Karl Storz Endoscopy, Culver City, CA) improved residents' operative performance. Operating room performance was assessed on a laparoscopic cholecystectomy using a pretest and posttest design. The group trained on the video trainer showed greater improvement in expert ratings of performance during the operative procedure than residents in the control group. In a similar study, Seymour et al39 assessed whether laparoscopic VR training (MIST-VR, Mentice Medical Simulation, Gothenburg, Sweden) transferred technical skills to the operating room. VR-trained residents were faster and much less likely to fail to make progress, burn nontarget tissue, or injure the gallbladder during a laparoscopic cholecystectomy than residents in the control group. Similar studies in general surgery have replicated these findings for the MIST-VR16,20 and an upper GI simulator7 (GI-Mentor, Simbionix Ltd, Lod, Israel).

Transfer of Learning across Surgical Procedures

In one of the few studies investigating the transfer of learning across surgical procedures, Wanzel et al40 had novice trainees learn a simple two-flap Z-plasty procedure and subsequently tested them on their ability to transfer their training to a more complex version of the same procedure, a four-flap Z-plasty. Participants who had scored well on tests of visual-spatial ability were able to transfer well to the more complex task, but those who scored lower on those tests did not transfer well, suggesting that successful transfer of learning across procedures may involve preexisting (perhaps innate) abilities or some other as yet undetermined special skill. Research in this area continues.

Simulators and Bench-top Models in the Surgical Curriculum

Although models and simulators can be effective educational platforms for transfer to the clinical setting, and the research regarding low-fidelity models clearly has implications for cost-effectiveness, it remains to be seen whether low-fidelity models are accepted by trainees to the same degree as state-of-the-art VR systems. In the study by Grober et al,17 residents were asked to rate the educational value of the low-fidelity versus high-fidelity models on a scale of 1 to 7. The mean responses were 5.2 and 6.7, respectively, indicating some appreciation of the value of training on both platforms. However, when asked directly which model they preferred to work with, the residents responded quite clearly in favor of the high-fidelity model, 90% versus 10%, respectively (in this case, a live rat vas deferens versus silicone tubing). Thus, an interesting research direction to explore is exactly where in the curriculum low-fidelity models should be substituted for high-fidelity models. It is not clear from the existing research under what conditions each of these platforms is most effective. We propose novices, such as medical students, may be adequately motivated to reap the benefits of training with low-fidelity models, while junior and senior residents would need exposure to progressively higher fidelity models, such as VR systems and live animals. This may also hold true for the use of simulators as assessment platforms, as it may be most effective to reserve the high-fidelity simulators for assessing advanced trainees who have already demonstrated an understanding of the basics on the low-fidelity simulator. A caveat with using low-fidelity models for training is that acceptance by trainees may be low. To counteract this, surgical educators at McGill University14 have instituted a system whereby residents must demonstrate competence on low-fidelity models before being given an opportunity to practice on higher fidelity models or patients.

A Theoretical Basis for Interpreting the Effectiveness of Low-Fidelity Models

Several studies have now shown technical skills acquired on simulators and bench models transfer to improved performance on higher fidelity models (such as human cadavers) and live patients.17,25 By and large, these studies have shown training on a low-fidelity bench model can confer the same degree of benefit as training on a high-fidelity model. To obtain successful transfer, the authors in these studies identified the essential constructs inherent in the relevant surgical procedures and developed models to facilitate transfer of these constructs to the clinical setting. These results reflect careful consideration of functional and structural components of the task, embodying critical aspects of both the procedure and the physical structure of the model itself.

An explanation of these findings can be developed using constructivist theory, which is particularly suited to technical skills acquisition.28,31 In brief, this theoretical approach assumes knowledge arises from collections of structures, which are essentially mental representations of information. Structures can be figurative (representing objects) or operative (representing processes). To illustrate this approach, consider tying a shoe. In this case, the shoelaces and the shoe are figuratives while the procedural knowledge of tying is operative. The implication for transfer is that the operative (ie, the act of tying) is learned without regard for the figurative (the physical substrate), allowing for transfer of this skill (eg, to a boot or skate). In the same way, a surgeon can learn to tie a surgical knot on a lab coat button (or indeed, learn to extract a kidney stone from a plastic straw) and transfer this learned procedural knowledge to the clinical setting, effectively using a new physical substrate. Thus, the key when training procedural skills on a model or simulator is that the trainee learns to ignore the physical features of the training platform and focus on the procedure, effectively suspending disbelief for the moment. In the Matsumoto et al25 study, the primary figuratives were the urethra, verumontanum, bladder, ureteric orifice, and kidney stone. These were all represented effectively by simple physical structures, such as a Penrose drain for the urethra, or by a symbol, such as the black marker spot at the base of the urethra for the verumontanum. (The bladder was represented by the inverted coffee cup and the ureters by the plastic straws.) The operatives included: insert scope at 45° to 60°, support camera correctly, inspect verumontanum, inspect bladder, identify ureteric orifices, shut off water flow to scope, pass guide wire through scope to correct orifice, advance guide wire until resistance, etc.

In the case of surgical technical skills, then, an effective approach is apparently to focus training on the process and ask trainees to suspend disbelief about the physical substrate. Moorthy et al27 have recently shown this can be made easier in the context of total immersion in a simulated operating room. Another way of thinking about this is that superficial physical aspects of the simulator are often unimportant, as long as the procedural aspects are trained, reflecting the essential constructs of the technical skill.

Systematic task analysis of any procedure can identify all relevant operative structures and the superficial figurative structures. This theoretical approach is particularly well-suited to surgical skills training because trainees' knowledge of anatomy can be assumed to be excellent, ensuring the trainee focuses on the operatives. The implication is that we can demonstrate effectiveness if we identify and train the essential constructs.
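To make the figurative/operative decomposition concrete, the sketch below encodes the Matsumoto et al25 bench model described above as a simple data structure. This is a minimal illustration; the class and field names (Figurative, TaskAnalysis, and so on) are our own choices and are not part of the constructivist framework or the original study.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Figurative:
    """An anatomical object and the physical substrate standing in for it."""
    construct: str   # what the trainee must recognize
    substrate: str   # the low-fidelity stand-in on the bench model

@dataclass
class TaskAnalysis:
    procedure: str
    figuratives: List[Figurative] = field(default_factory=list)
    operatives: List[str] = field(default_factory=list)  # ordered process steps

# Encoding of the Matsumoto et al25 bench model described above.
stone_extraction = TaskAnalysis(
    procedure="mid-ureteric stone extraction",
    figuratives=[
        Figurative("urethra", "Penrose drain"),
        Figurative("verumontanum", "black marker spot at base of urethra"),
        Figurative("bladder", "inverted coffee cup"),
        Figurative("ureters", "plastic straws"),
    ],
    operatives=[
        "insert scope at 45 to 60 degrees",
        "support camera correctly",
        "inspect verumontanum",
        "inspect bladder",
        "identify ureteric orifices",
        "shut off water flow to scope",
        "pass guide wire through scope to correct orifice",
        "advance guide wire until resistance",
    ],
)

# Training targets the operatives; the figuratives need only be good enough
# for the trainee to suspend disbelief about the physical substrate.
for step in stone_extraction.operatives:
    print(step)
```

Framed this way, the training question for any new model reduces to whether its substrates support rehearsal of the full ordered list of operatives, not whether they resemble the anatomy.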

The Simulator as a High-Stakes Assessment Device

Another potential use for the high-fidelity simulator is as an assessment tool. Clearly if the performance metrics have been properly validated, it would be feasible to use simulators to determine technical ability at the end of a training program to demonstrate fitness to practice independently. Such research is just beginning.

There has always been recognition among attending faculty that some residents have greater ability in the social, communicative, or cognitive aspects of clinical performance while other residents are stronger in the technical domains. Yet high-stakes exams are still based almost exclusively on written and oral tests. As we change our curricula to include more emphasis on teaching technical skills as a separate focus, and add simulators into the curriculum, the natural question to ask is whether trainees should be assessed on a simulator if they have been trained on a simulator.

Anecdotal reports among teaching faculty indicate some trainees strong in the technical aspects of clinical performance are disadvantaged by high-stakes exams focusing on written and oral skills. Simply put, some trainees may be better at doing than telling, and the question is whether we want to identify those strengths (ie, in certain situations, it may be desirable to have someone in charge who can demonstrate management skills rather than simply describe what he or she would do).

While this question has not been subject to empirical study in surgery, a recent paper by Savoldelli et al35 in the field of anesthesia has addressed this issue directly. Using official examiners from a federal licensing body (the Royal College of Physicians and Surgeons of Canada), we assessed senior anesthesia residents on an oral examination and a simulator-based examination using two scenarios, a trauma scenario and a resuscitation scenario. We found some examinees who did well on the simulator-based examination did not perform well on the oral examination and vice versa. In addition, the difference in test modality (ie, simulator versus oral) explained more of the variance in residents' performance scores than did the difference in scenario (resuscitation versus trauma). In other words, knowledge of particular clinical content (eg, how to deal with resuscitation or trauma) did not necessarily help the residents when shifting from one testing modality to the other. However, those residents doing better on the oral examination were able to demonstrate 'knows how' for both content domains (ie, both scenarios) while those doing better on the simulator-based examination were able to demonstrate 'shows how' for both content domains.26 This indicates some residents did well on the simulator independent of the scenario, while other residents did well on the oral examination independent of the scenario.
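As a rough numerical illustration of that variance comparison, the following sketch computes balanced two-way sums of squares for test modality and scenario. The scores and effect sizes are synthetic values invented purely for illustration; they are not the data of Savoldelli et al.35

```python
import numpy as np

# Synthetic scores for illustration only (not data from Savoldelli et al35).
# Layout: scores[modality, scenario, examinee], balanced design.
rng = np.random.default_rng(0)
n = 20                                    # examinees per cell (hypothetical)
modality_effect = np.array([0.0, 8.0])    # oral vs simulator (large, by design)
scenario_effect = np.array([0.0, 2.0])    # trauma vs resuscitation (small)
scores = (60.0
          + modality_effect[:, None, None]
          + scenario_effect[None, :, None]
          + rng.normal(0.0, 5.0, size=(2, 2, n)))

grand = scores.mean()
# For a balanced design, each main-effect sum of squares weights the squared
# deviation of a factor-level mean by the number of observations at that level.
ss_modality = 2 * n * ((scores.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_scenario = 2 * n * ((scores.mean(axis=(0, 2)) - grand) ** 2).sum()
ss_total = ((scores - grand) ** 2).sum()

print(f"modality: {ss_modality / ss_total:.1%} of total variance")
print(f"scenario: {ss_scenario / ss_total:.1%} of total variance")
```

A modality term that dominates the scenario term, as in the study's reported pattern, means examinees' scores tracked how they were tested more than what content they were tested on.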

Retention

In one of the few studies looking at the retention of technical skills after training with a bench model, Grober et al18 measured performance of 18 surgical residents 4 months after training on a bench model or in a didactic setting. Performance was reevaluated using a live animal (rat vas deferens anastomosis), and results (expert ratings and anastomotic patency) showed the bench-model group maintained their technical ability to a degree substantially greater than the didactic group. They concluded initial measured performance improvements appear to be durable over time, yielding a substantial retention of technical skill by novice surgeons.

There has been one study on the evaluation of an entire curriculum at a surgical skills center.2 In this study, designed to assess the degree of training received, a 2-year research project was initiated in 1998 using previously validated expert rating scales and checklists.13,24 Nineteen postgraduate-year 1 (PGY-1) residents entering the program in 1998 comprised the experimental treatment group, while an historical control group consisted of 31 PGY-3 residents from 1998. The control group was assessed on technical performance in 1998, and these results were compared with the results of an assessment of identical technical skills for the treatment group in 2000, ie, after they had received 2 years of training in the core curriculum. Using a mixed-methods approach (quantitative and qualitative analyses), we found no differences in the quantitative measures between the treatment and control groups, although the majority of student and faculty evaluation remarks were highly positive. The main conclusion was that isolated training on individual skills without repetition and practice is probably not adequate to demonstrate substantial effects, and the authors recommended a more systematic approach in the curriculum involving repetition with feedback.

The key difference between these two studies on the retention of skills appears to be the intensity and focus of training on a specific skill. In the Anastakis et al2 study, there was no emphasis on practicing the procedure once it had been demonstrated and initially taught. This last point raises the issue of the optimal practice schedule.

Although we could find no studies specifically addressing the effect of practice schedule on retention of technical skills, there has been some work on determining the optimal learning schedule for training technical skills. This was investigated in a study by Dubrowski et al9 in which an orthopaedic bone-plating task was practiced either in functional order, as it would be performed in the operating room, or with the individual skills practiced in random or blocked order, as they would be practiced in a laboratory setting. Practice of the skills in a blocked order yielded the least amount of learning, whereas practice in the functional order and random order resulted in the same amount of task retention. Based on these results, it is recommended that surgical tasks composed of several discrete skills be practiced in a functional order. However, if practice of individual skills is necessary because of constraints in the laboratory environment, these skills should be arranged in random order to optimize learning. While this study does not address the same time frame as the retention studies, it may be interpreted to suggest it is better to have trainees understand the entire procedure with an end goal in mind. Thus, in the Anastakis et al2 study, the emphasis was on general skills the residents needed to know for some future undefined purpose, whereas it might be more effective to have the trainees learn with some well-defined goal in mind.
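The three schedules compared in that study can be made concrete with a short sketch. The component skill names below are placeholders of our own, not the actual task breakdown used by Dubrowski et al.9

```python
import random

# Hypothetical component skills of a bone-plating task, listed in the
# functional (operative) order; the study's actual decomposition may differ.
skills = ["reduce fracture", "contour plate", "drill pilot hole",
          "measure depth", "tap thread", "insert screw"]

def blocked(skills, reps):
    """All repetitions of one skill before the next (least retention)."""
    return [s for s in skills for _ in range(reps)]

def functional(skills, reps):
    """The whole procedure repeated in operative order (best retention)."""
    return [s for _ in range(reps) for s in skills]

def randomized(skills, reps, seed=0):
    """Skills interleaved at random (retention comparable to functional)."""
    order = blocked(skills, reps)
    random.Random(seed).shuffle(order)
    return order

print(functional(skills, 3))
```

The practical point is that all three schedules contain identical numbers of repetitions per skill; only the ordering differs, and ordering alone altered retention.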

Focus on Outcome versus Process in Training

Another recent line of research relating to technical skills acquisition involves the question of whether training programs should focus on outcomes or processes. Researchers in surgical education have traditionally focused on the assessment of technical skills using measures that evaluate the actions and their products, downplaying the importance of the motor processes leading to the product. Quantification of the economy of hand movement during skill performance is an example.5 Although these measures are effective in discriminating between levels of expertise in surgical performance, they provide limited feedback for teaching complex technical skills, such as orthopaedic bone drilling.23,36,37 Movement process measures may provide better insight into the control system producing a particular motor skill by describing how a movement is carried out in terms of its trajectory or the forces involved. Surgical educators who see technical skills not simply as outcomes but as the result of complex processes, such as planning and movement determination, will recognize teaching these skills requires new strategies that refer to the precise characteristics of expert performance to give trainees feedback about their own performance. One example where process measures may be important to understanding the execution of a technical surgical skill is drilling through bone. In this case, the surgeon must be able to cease quickly any advancement of the drill once the full thickness of bone has been traversed, to avoid plunging into the underlying soft tissue. The length of drill protruding beyond the drilled bone is an example of an outcome measure. Information about the nature of the application of force resulting in a particular plunging outcome when drilling through the bone is an example of a process measure describing the movement.

The relative contributions of outcome and process measures to our understanding of the performance of a surgical technical skill have been explored in a recent study of orthopaedic bone drilling involving expert surgeons and junior surgical residents.8 As expected, the findings showed surgeons demonstrated less plunging than did the junior residents (Fig 3). However, understanding why the junior residents plunged more was of primary interest. It appears the surgeons performed the drilling skill based on an anticipatory representation of bone density, whereas the junior residents performed the same skill based on sensory information about the bone resistance. This was evidenced by the surgeons applying lower drilling forces when penetrating the second cortex in comparison to the first cortex and by applying the forces early in the movement, indicating a control strategy depending on prediction and anticipation of the bone density. By contrast, the junior residents used higher forces when drilling through the second in comparison to the first cortex and applied the forces evenly throughout the drilling process, indicating a reactive control strategy (Fig 3).42

Fig 3
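To illustrate the outcome/process distinction on this drilling example, the sketch below derives both kinds of measure from a force-time trace. The trace, thresholds, and variable names are synthetic values of our own for illustration; they are not measurements or instrumentation from the cited study.

```python
import numpy as np

# Synthetic force-time trace for one bicortical drilling trial (illustrative
# values only; not data from the cited study).
t = np.linspace(0.0, 4.0, 400)                    # time (s)
force = np.where(t < 2.0,                         # first vs second cortex
                 12.0 + 3.0 * t,                  # ramping force, cortex 1
                 20.0 - 4.0 * (t - 2.0))          # decaying force, cortex 2
cortex2_start = 2.0                               # time cortex 2 is engaged (s)

# Outcome measure: drill protrusion past the far cortex, measured in the rig.
plunge_depth_mm = 6.5                             # placeholder measurement

# Process measures: describe how the force was applied over the movement.
mean_c1 = force[t < cortex2_start].mean()
mean_c2 = force[t >= cortex2_start].mean()
force_ratio = mean_c2 / mean_c1              # > 1 suggests a reactive strategy
peak_fraction = t[np.argmax(force)] / t[-1]  # early peak suggests anticipation

print(f"outcome:  plunge depth = {plunge_depth_mm} mm")
print(f"process:  cortex2/cortex1 force ratio = {force_ratio:.2f}; "
      f"peak force at {peak_fraction:.0%} of trial")
```

In these terms, the experts' anticipatory strategy would appear as a force ratio below 1 with force applied early, whereas the residents' reactive strategy would appear as a ratio above 1 with force spread evenly across the trial.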

DISCUSSION

The training of technical skills in surgery is enhanced by focused skills practice with immediate feedback and by a distinction made between training to competence and training to expertise. Recent research has shown that technical skills trained on inanimate bench models and simulators transfer effectively to the live patient setting. The effectiveness of this training is enhanced if focus is placed on the operative, or process-oriented, aspects of the procedure, with suspension of disbelief regarding the physical structure of the training platform.

There are very few well-controlled published studies relevant to orthopaedic surgery. Although there is a great deal of published literature on the topic of simulator-based training across all specialties, a recent review has shown that only 18% of these papers are empirical studies with adequate power.21 More to the point for this review, there have been very few randomized controlled trials from which we can draw definitive recommendations for training orthopaedic skills. Before administrators can justify capital outlays for simulation training, we require clear evaluations of specific simulators or training platforms with demonstrated evidence of validity and effectiveness.

Also, given the resources being poured into simulation, it is important to determine the role fidelity plays in simulator effectiveness, at least in terms of cost-effectiveness. We have heard stories from many centers in which a chair or dean decides to buy some expensive training equipment (usually simulator-based) and this is then handed off to the program director, who hires a simulation director to develop and run a curriculum. It is, of course, natural for a dean or medical school administrator to provide resources for a curriculum, but simulators represent a relatively large investment. Eventually there are questions from the dean's office about the effectiveness of this program, especially when there are requests for more money to replace worn-out systems or parts. At this point the simulation director needs evidence to justify the expenditure, and to date the evidence for effectiveness is thin. This blind faith is remarkable, especially given the current emphasis in our curricula on training in the methods of evidence-based medicine. On the other hand, if one of the purposes of training is to foster enthusiasm for the subject matter and promote life-long learning, more elaborate training platforms may represent a symbolic appeal to motivate and engage learners.

A more recent development, and a possible burgeoning area of research, is the use of simulators as high-stakes assessment instruments. Finally, the retention of trained skills is an area of research only beginning to evolve, with preliminary results suggesting the training must be tightly focused and involve an entire procedure to ensure retention. Although this model of focused skill training with immediate feedback has been discussed in the context of trainees, it is congruent with principles of medical education associated with improved performance among practicing physicians (ie, best evidence for CME activities) and the principles for teaching procedural and technical skills throughout the continuum of medical training.6

Acknowledgments

We thank R. Brent Stansfield and Patricia B. Mullan for helpful comments on an earlier draft. We also thank the participants of the Brighton Workshop on Orthopaedic Education for their comments and stimulating discussion.

References

1. Anastakis DJ, Regehr G, Reznick RK, Cusimano M, Murnaghan J, Brown M, Hutchison C. Assessment of technical skills transfer from the bench training model to the human model. Am J Surg. 1999;177:167-170.
2. Anastakis DJ, Wanzel KR, Brown MH, McIlroy JH, Hamstra SJ, Ali J, Hutchison CR, Murnaghan J, Regehr G, Reznick RK. Evaluating the effectiveness of a two-year curriculum in a surgical skills center. Am J Surg. 2003;185:378-385.
3. Cameron B, Robb R. Virtual-reality-assisted interventional procedures. Clin Orthop Relat Res. 2006;442:63-73.
4. Cole AF, Mallon JS, Rolbin SH, Ananthanarayan C. Fiberoptic intubation using anesthetized, paralyzed, apneic patients: results of a resident training program. Anesthesiology. 1996;84:1101-1106.
5. Datta V, Chang A, Mackay S, Darzi A. The relationship between motion analysis and surgical technical assessments. Am J Surg. 2002;184:70-73.
6. Davis D, Thompson M, Oxman A, Haynes R. Changing physician performance: a systematic review of the effect of continuing medical education activities. JAMA. 1995;274:700-705.
7. Di Giulio E, Fregonese D, Casetti T, Cestari R, Chilovi F, D'Ambra G, Di Matteo G, Ficano L, Delle Fave G. Training with a computer-based simulator achieves basic manual skills required for upper endoscopy: a randomized controlled trial. Gastrointest Endosc. 2004;60:196-200.
8. Dubrowski A, Backstein D. The contributions of kinesiology to surgical education. J Bone Joint Surg Am. 2004;86:2778-2781.
9. Dubrowski A, Backstein D, Abughaduma R, Leidl D, Carnahan H. The influence of practice schedules in the learning of a complex bone-plating surgical task. Am J Surg. 2005;190:359-363.
10. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79:S70-S81.
11. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363-406.
12. Eversbusch A, Grantcharov TP. Learning curves and impact of psychomotor training on performance in simulated colonoscopy: a randomized trial using a virtual reality endoscopy trainer. Surg Endosc. 2004;18:1514-1518.
13. Faulkner H, Regehr G, Martin J, Reznick R. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med. 1996;71:1363-1365.
14. Fried GM. The Steinberg-Bernstein Centre for Minimally Invasive Surgery at McGill University. Surg Innov. 2005;12:345-348.
15. Gallagher AG, Ritter EM, Champion H, Higgins G, Fried MP, Moses G, Smith CD, Satava RM. Virtual reality simulation for the operating room: proficiency-based training as a paradigm shift in surgical skills training. Ann Surg. 2005;241:364-372.
16. Grantcharov TP, Kristiansen VB, Bendix J, Bardram L, Rosenberg J, Funch-Jensen P. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg. 2004;91:146-150.
17. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA. The educational impact of bench model fidelity on the acquisition of technical skill: the use of clinically-relevant outcome measures. Ann Surg. 2004;240:374-381.
18. Grober ED, Hamstra SJ, Wanzel KR, Reznick RK, Matsumoto ED, Sidhu RS, Jarvi KA. Laboratory-based training in urologic microsurgery with bench model simulators: a randomized controlled trial evaluating the durability of technical skill. J Urol. 2004;172:378-381.
19. Halm EA, Lee C, Chassin MR. Is volume related to outcome in health care? A systematic review and methodological critique of the literature. Ann Intern Med. 2002;137:511-520.
20. Hamilton EC, Scott DJ, Fleming JB, Rege RV, Laycock R, Bergen PC, Tesfay ST, Jones DB. Comparison of video trainer and virtual reality training systems on acquisition of laparoscopic skills. Surg Endosc. 2002;16:406-411.
21. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27:10-28.
22. Jaramaz B, Eckman K. Virtual reality simulation of fluoroscopic navigation. Clin Orthop Relat Res. 2006;442:30-34.
23. Magill RA. Motor Learning: Concepts and Applications. New York, NY: McGraw-Hill; 2000.
24. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84:273-278.
25. Matsumoto ED, Hamstra SJ, Radomski SB, Cusimano MD. The effect of bench model fidelity on endourologic skills: a randomized controlled study. J Urol. 2002;167:1243-1247.
26. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65:S63-S67.
27. Moorthy K, Munz Y, Adams S, Pandey V, Darzi A. A human factors analysis of technical and team skills among surgical trainees during procedural simulations in a simulated operating theatre. Ann Surg. 2005;242:631-639.
28. Muller U, Sokol B, Overton WF. Reframing a constructivist model of the development of mental representation: the role of higher-order operations. Dev Rev. 1998;18:155-201.
29. Naik V, Matsumoto ED, Houston P, Hamstra SJ, Yeung R, Mallon J, Martire TM. Fiberoptic orotracheal intubation on anesthetized patients: do manipulation skills learned on a simple model transfer into the operating room? Anesthesiology. 2001;95:343-348.
30. Norman GR, Eva K, Brooks L, Hamstra SJ. Expertise in medicine and surgery. In: Ericsson KA, Charness N, Feltovich PJ, Hoffman RR, eds. The Cambridge Handbook of Expertise and Expert Performance. New York, NY: Cambridge University Press; 2006.
31. Piaget J, Inhelder B. Mental Imagery in the Child. New York: Basic Books; 1971.
32. Reinig K, Lee C, Rubinstein D, Bagur M, Spitzer V. The United States Military's thigh trauma simulator. Clin Orthop Relat Res. 2006;442:45-56.
33. Reznick RK. Surgical simulation: a vital part of our future. Ann Surg. 2005;242:640-641.
34. Satava RM. Disruptive visions: surgical education. Surg Endosc. 2004;18:779-781.
35. Savoldelli GS, Naik VN, Joo HS, Houston PL, Graham M, Yee B, Hamstra SJ. Evaluation of patient simulator performance as an adjunct to oral examination for senior anesthesia residents. Anesthesiology. 2006;104:475-481.
36. Schmidt RA, Lee TD. Motor Control and Learning: A Behavioral Emphasis. Champaign, IL: Human Kinetics; 1999.
37. Schmidt RA, Wrisberg CA. Motor Learning and Performance: A Problem-Based Learning Approach. 3rd ed. Champaign, IL: Human Kinetics; 2004.
38. Scott DJ, Bergen PC, Rege RV, Laycock R, Tesfay ST, Valentine RJ, Euhus DM, Jeyarajah DR, Thompson WM, Jones DB. Laparoscopic training on bench models: better and more cost effective than operating room experience? J Am Coll Surg. 2000;191:272-283.
39. Seymour NE, Gallagher AG, Roman SA, O'Brien MK, Bansal VK, Andersen DK, Satava RM. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg. 2002;236:458-463.
40. Wanzel KR, Hamstra SJ, Anastakis DJ, Matsumoto ED, Cusimano MD. Effect of visual-spatial ability on learning of spatially-complex surgical skills. Lancet. 2002;359:230-231.
41. Wanzel KR, Matsumoto ED, Hamstra SJ, Anastakis DJ. Teaching technical skills: training on a simple, inexpensive, and portable model. Plast Reconstr Surg. 2002;109:258-263.
42. Wolpert DM, Kawato M. Multiple paired forward and inverse models for motor control. Neural Netw. 1998;11:1317-1329.
© 2006 Lippincott Williams & Wilkins, Inc.