
Evidence-Based Systematic Reviews

Analysis of Tools Used in Assessing Technical Skills and Operative Competence in Trauma and Orthopaedic Surgical Training

A Systematic Review

James, Hannah K. MBChB(Hons), MRCS, MMedEd1,2,a; Chapman, Anna W. FRCS, MMedEd2; Pattison, Giles T.R. FRCS, MMedEd2; Fisher, Joanne D. PhD1; Griffin, Damian R. BM, BCh(Oxon), MA, MPhil(Cantab), FRCS1,2

doi: 10.2106/JBJS.RVW.19.00167

Abstract

Within an educational paradigm shift toward competency-based measures of performance in surgical training1, there is a need to evaluate surgical skills objectively and systematically; hence, there is a drive toward developing more reliable and valid measures of surgical competence1-3.

Several surgical skill-assessment tools are currently in use in orthopaedic training, and studies evaluating the ability of these tools to objectively measure surgical performance have been performed. To our knowledge, this is the first systematic appraisal of the evidence for these assessment tools. It is imperative that the modernization of surgical curricula be supported by evidence-based tools for assessing technical skill that enable summative judgments on progression through training and readiness for unsupervised operating.

The aim of this systematic review was to evaluate the orthopaedic surgical-competency literature and report on the metrics and tools used for skills assessment in trauma and orthopaedic surgical training; their utility with respect to validity, reliability, and impact on learning; and evidence for strengths and weaknesses of the various tools.

Materials and Methods

This review was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines4 and registered with PROSPERO (International Prospective Register of Systematic Reviews)5.

Data Sources

We performed a comprehensive literature search of MEDLINE, Embase, and Google Scholar electronic databases. The search strategy was developed by collating keywords from an initial scoping search (Table I). Categories 1, 2, and 3 were combined using Boolean “AND/OR” operators and results were limited to human subjects. No date or language limits were applied. The last search was performed in June 2019. Duplicates were removed, and retrieved titles were screened for initial eligibility.

TABLE I - Search Strategy
  1. Competence.mp. OR assessment$.mp. OR skills$.mp. OR training.mp. OR performance.mp.

  2. Technical.mp. OR operative.mp. OR simulation.mp.

  3. Orthop$.mp.

  4. Combine 1 AND 2 AND 3
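
For readers unfamiliar with Ovid-style syntax, the combination step of Table I can be sketched programmatically. This is an illustrative reconstruction only (".mp." searches free-text fields and "$" is a truncation wildcard), not the exact string executed against MEDLINE or Embase.

```python
# Illustrative reconstruction of the Table I search strategy. ".mp."
# searches multiple free-text fields and "$" is a truncation wildcard
# in Ovid syntax; the string actually executed may have differed.
categories = [
    ["competence.mp.", "assessment$.mp.", "skills$.mp.",
     "training.mp.", "performance.mp."],                   # category 1
    ["technical.mp.", "operative.mp.", "simulation.mp."],  # category 2
    ["orthop$.mp."],                                       # category 3
]

# Step 4 of Table I: terms within a category are alternatives (OR);
# all three categories must be present (AND).
query = " AND ".join("(" + " OR ".join(terms) + ")" for terms in categories)
print(query)
```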


Study Selection

Eligible for inclusion were primary empirical research studies assessing postgraduate surgical resident performance in open or arthroscopic orthopaedic surgical skills in a simulated or live operative theater environment. Nonempirical studies and those that focused solely on patient or procedural outcome, or only described a training intervention, were excluded. A deliberately broad search strategy was employed to capture all studies in which an orthopaedic surgical skill was assessed.

Title and Abstract Review

The search identified 2,305 citations. Initial title screening was undertaken by 1 author (H.K.J., a doctoral researcher), with studies that were obviously irrelevant excluded. One hundred and eighty-seven abstracts subsequently underwent screening by 2 authors (H.K.J. and A.W.C., an attending surgeon), and 106 were retrieved in full text. Of these, 105 were included in the final review (1 study was excluded at full-text review as the participants were not surgical residents). Studies were rejected at screening if they were not empirical research, if the study participants were undergraduates, or if nontechnical skills were being assessed; studies reporting simulator protocol development or validation were also excluded at this stage. The reference lists of full-text articles were examined for relevant studies, and those found by hand searching were subject to the same eligibility screening process.

Data Extraction and Analysis

Data items relevant to the review objectives were extracted into a structured form to ensure consistency. The first reviewer undertook data extraction for all studies. Extracted data included study aim, setting, assessment format, number and training stage of participants, skills assessed, assessment tool and/or metrics, assessment tool category, study results, and “take-home” message related to the assessment tool. Assessment tools were classified by the type of method; the following categories were defined: traditional written assessments, objective assessment of technical skill, procedure-specific rating scale, individual procedural metrics, movement analysis, psychomotor testing, and subjective assessments.
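
As a minimal sketch of what such a structured extraction form might look like, consider the record below; the field names paraphrase the data items listed above and are hypothetical, not the authors' actual form.

```python
from dataclasses import dataclass

@dataclass
class ExtractionRecord:
    """One row of the structured data-extraction form (hypothetical field names)."""
    study_aim: str            # e.g., training intervention, simulator validity, tool validation
    setting: str              # simulated, live operative theater, or both
    assessment_format: str    # live observation, video review, simulator-derived metrics
    n_participants: int
    training_stage: str       # e.g., "PGY1-PGY5"
    skills_assessed: str      # e.g., "diagnostic knee arthroscopy"
    tool_or_metrics: str      # e.g., "ASSET", "procedure time"
    tool_category: str        # one of the 7 categories defined above
    results: str
    take_home_message: str    # "take-home" message related to the assessment tool
```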

Results

Search Results

One hundred and six articles were evaluated in detail, 1 of which was excluded at full-text review because the participants were not surgeons-in-training; 105 articles were therefore included in the review. The flow of studies is shown in Figure 1.

Fig. 1: PRISMA flowchart.

Study Aims, Setting, and Participants

The studies were broadly split into 3 categories: studies measuring the impact of a simulation training intervention (26 studies6-31), studies assessing the construct validity of a simulator designed for training surgeons (42 studies32-73), and studies validating an assessment tool (37 studies74-110) (see Appendix Tables 1 and 2, column 1). Of the included studies, 60% assessed arthroscopic skill involving the knee (34 studies)6,8,9,13,15,17,19,31-33,36,38,39,41,42,47,48,54-56,63,74,75,77-83,86,87,89,91, the shoulder (25 studies)7,12,15,16,18,20,37,40,44,45,49,51,53,54,56,63,76,77,81,82,88,90-92,110, the hip (3 studies)43,50,85, the ankle (1 study)14, and basic general arthroscopic skills (6 studies)10,11,34,52,57,84. The majority (70%) of the studies assessing arthroscopic skill concerned diagnostic arthroscopy; procedural arthroscopic skills assessed included arthroscopic Bankart repair (3 studies), rotator cuff repair (1 study), labral repair (1 study), meniscal repair (2 studies), and anterior cruciate ligament graft preparation (2 studies) and insertion (1 study) (see Appendix Table 1, column 5). The 42 studies that assessed open surgical procedures are shown in Appendix Table 2; as shown in column 5 of the table, the open procedures assessed included dynamic hip screw (DHS) fixation (4 studies), cannulated hip pinning (2 studies), and hemiarthroplasty (1 study) for a fractured femoral neck; spinal pedicle screw placement (6 studies); open surgical approaches to the shoulder (1 study); hand-trauma skills including nail-bed repair, Z-plasty, metacarpal fracture fixation, and tendon repair (1 study each); and various open reduction and internal fixation (ORIF) procedures for fractures of the forearm (7 studies), ankle (2 studies), and tibia (1 study), and complex articular fractures (1 study). Elective hand procedures, including trigger-finger release (1 study) and carpal tunnel decompression (3 studies), and elective hip (1 study) and knee (1 study) arthroplasty were also assessed.

The majority of the studies (85%) assessed skills in the simulated setting, 10 studies assessed skills in the live operative theater, and 10 studies assessed skills in both the simulated and live operative theater. Overall, 2,088 orthopaedic resident participants were involved in the studies, with experience levels ranging from PGY (postgraduate year) 1 to 10.

Assessment Format

The assessment format varied considerably (see Appendix Tables 1 and 2, column 3). Fifty-nine studies assessed performance using live observation, and 50 used post-hoc analysis of video footage by experts. Simulator-derived metrics were used in 72 studies. Final-product analysis by expert assessors was used for 3 studies, and biomechanical testing of the final product was used in 7.

Assessment Tools or Metrics

A wide variety of assessment tools were used (see Appendix Table 3). Traditional assessments, such as written examinations, were used in 5 studies. Objective assessment of technical skills was widely used and took many forms: task-specific checklists (20 studies), global rating scales (19 studies), and novel objective skills-assessment tools for both arthroscopy (22 studies) and open surgery (6 studies). Procedure-specific rating scales were used for both arthroscopic (7 studies) and open procedures (6 studies). Individual procedural metrics, such as final-product analysis and procedure time, were used in 56 studies. Movement analysis using simulator-derived metrics, such as hand movements, gaze tracking, hand-position checking, and instrument speed and path length, was used in 22 studies. Psychomotor testing using commercial dexterity tests was used in 5 studies. Subjective assessment measures were used in 4 studies.

Quality Assessment

Van Der Vleuten described a series of utility criteria, known as the “utility index,” which is a widely accepted framework for assessing an evaluation instrument111. The features of the utility index are described in Table II. Each assessment tool was appraised for utility; the evidence for each of the various technical skills-assessment tools in current use is summarized according to the utility index criteria (see Appendix Table 3, columns 5 to 11). There was a wide spread of utility characteristics among the different tools, and their heterogeneity precludes any formal analysis. The strengths and limitations of the respective tools are presented in Appendix Table 3, columns 3 and 4.

TABLE II - Utility Criteria111 for Effective Assessment
Validity: the extent to which the instrument assesses the skills it claims to assess
 Content validity: the appropriateness of the variables measured by the assessment instrument122
 Construct validity: the effectiveness of the assessment instrument at differentiating between skill levels122
 Concurrent validity: the extent to which the assessment instrument agrees with existing performance measures122
Reliability: the reproducibility of the results
Feasibility/acceptability: the extent to which the instrument is usable by the target audience
Educational impact: the extent to which the instrument itself influences learning
Cost-effectiveness*: the extent to which the assessment instrument delivers value for money
*Not evaluated in this review.

Discussion

Robust assessment of competency and operative skill in trauma and orthopaedic surgery is a topical issue in training. The primary goals of surgical-competency assessment are to provide a platform for learning through feedback, to make summative judgments about capability and progression through training, to maintain standards within the profession, and ultimately, to protect patients from incompetent surgeons1.

To our knowledge, this review is the first comprehensive analysis of the tools currently available for assessing technical skill and operative competency in trauma and orthopaedic surgical training.

The results show that none of the tools currently used for assessing technical skill in orthopaedic surgical training fulfill the criteria of Norcini et al. for effective assessment112. There is a similar deficiency of utility evidence in technical skills-assessment tools in general surgery113 and vascular surgery1, which face the same challenges as trauma and orthopaedics in moving toward a competency-based approach to training1.

Checklists and global rating scales were commonly used tools for technical skills assessment in the review studies (see Appendix Table 3). Checklists deconstruct a task into discrete steps, and may have educational value for teaching procedural sequencing to novice residents. They do not capture the quality of performance, and their rigid binary scoring does not accommodate deviation when there is >1 acceptable way of undertaking a procedure. Another disadvantage of checklists is an early ceiling effect1. Checklists do have the advantage that they can be administered by nonexpert assessors, and judgment on performance can be made either live or from video footage. They also can be used in both the simulated and live theater environment. They show reasonable construct validity68,77,96,98, concurrent validity37,77,96,102,103, and reliability37,88,114. With their limitations in mind, checklists are perhaps most appropriate for novice learners in a formative setting1.

Global rating scales use generic domains with a Likert-type scale and descriptive anchors to capture the quality of performance61,66,93. They are generalizable between procedures and can be used to assess complex procedures for which there is >1 accepted method. They can discriminate between competent and expert performance, and many studies have demonstrated their content17,96 and concurrent validity17,77,85,96,98,103 and their reliability17,37,66,96. However, they require expert surgeon evaluators, are more time-consuming to administer, and may be susceptible to assessor bias, as domains such as instrument handling and respect for tissue are inherently quite subjective. The ability of global rating scales to distinguish between all levels of performance, and the absence of a ceiling effect, make them useful for high-stakes, summative assessment1 and for the assessment of advanced residents.
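
To illustrate the mechanics, a global rating scale score is typically assembled by summing anchored Likert ratings across domains. The sketch below is generic; the domain names are typical examples, not those of any validated instrument cited above.

```python
# Generic sketch of global rating scale scoring: each domain is rated
# on an anchored 1-5 Likert scale, and the domain scores are summed.
# Domain names are illustrative, not from a specific published tool.
GRS_DOMAINS = [
    "instrument handling",
    "respect for tissue",
    "economy of movement",
    "flow of procedure",
    "knowledge of procedure",
]

def grs_total(ratings: dict[str, int]) -> int:
    """Sum the 1-5 Likert ratings across all domains."""
    assert set(ratings) == set(GRS_DOMAINS), "rate every domain exactly once"
    assert all(1 <= r <= 5 for r in ratings.values()), "anchored 1-5 scale"
    return sum(ratings.values())

# e.g., a uniformly mid-level performance: 5 domains x 1-5 -> total out of 25
print(grs_total({domain: 3 for domain in GRS_DOMAINS}))  # prints 15
```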

Several novel objective assessment tools that combine task-specific checklists with a global rating scale have been developed. The front-runners among these are the Arthroscopic Surgical Skill Evaluation Tool (ASSET)36,37,77, which combines a task-specific checklist with an 8-domain global rating scale with end and middle descriptive anchors, and the Objective Structured Assessment of Technical Skills (OSATS) tool23,93 (see Appendix Table 3). While the ASSET is obviously restricted to arthroscopic procedures, both have a growing body of evidence across all domains of the utility index (Table II). The hybrid approach brings the strengths of checklists and global rating scales together within a single tool, but such instruments can become long and burdensome to complete, which negatively impacts their feasibility and acceptability in a busy workplace in which training assessment conflicts with service pressures.

The OSATS tool is in current use in training programs in obstetrics/gynecology115 and ophthalmology116 and is popular with residents117. It captures the quality of performance and can distinguish competence from mastery, and the stages of progression in between. There were several studies in this review that demonstrated the validity, reliability, feasibility, and educational value of the OSATS tool in trauma and orthopaedics in the simulated setting (see Appendix Table 3, columns 5 to 11). Further work is required to assess its utility in the live operative theater.

There are a variety of procedure-specific rating scales that have been developed for both open21,32,58,70,99,118 and arthroscopic7,76,81,82,90,92 procedures (see Appendix Table 3). Most are in the early stages of validation and are likely to be most useful for the research setting. They are not practical for the live workplace environment given the variety of procedures that are undertaken within a typical training rotation; a generic tool that may be applied to the assessment of all procedures is more feasible.

Motion analysis (see Appendix Table 3) is also promising for assessing technical skill, particularly in arthroscopy, and several studies in this review demonstrated its utility6,13,31,34,41,50,66,74,75,86. Its use to date has been largely restricted to the research setting, and further work on transfer validity and potential educational impact is required. Some of the obvious barriers, such as sterility concerns, have been mitigated by using elbow-mounted instead of hand-mounted sensors in the live operative theater31. Hand-motion analysis can generate a sophisticated data profile that can detect subtle improvement in surgical performance, and may be able to measure the attainment of mastery. Other motion parameters, such as gaze tracking6, triangulation time74, instrument path length12,15,40,48,49,51,55,56,63,110, and collisions38,55, have demonstrated construct validity and feasibility in the simulated environment but are unlikely to be useful in the live operative theater, as most of these measurements are derived from the simulator itself.
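
To make these metrics concrete, instrument path length, for example, is simply the cumulative distance traveled by the tracked instrument tip. The sketch below is illustrative; sampling rate, units, and the source of the position data are assumptions, since commercial simulators compute such metrics internally.

```python
import math

# Illustrative sketch of one simulator-derived motion metric: total
# instrument path length from a sampled sequence of 3-D tip positions.
def path_length(positions):
    """Sum of Euclidean distances between consecutive (x, y, z) samples."""
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

# Shorter, smoother paths are generally read as economy of movement;
# the sample coordinates below are made up for demonstration.
samples = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (1.2, 0.6, 0.3)]
print(f"path length: {path_length(samples):.2f} (simulator units)")
```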

Individual procedural metrics can also be used to assess technical skill (see Appendix Table 3). Final-product analysis provides an objective assessment of final product quality, from which technical proficiency is inferred. Examples include tip-apex distance in DHS fixation58,62, screw position22,30,59,71,95, and articular congruency73,93. Orthopaedics has the advantage of the routine use of intraoperative and postoperative radiographs, from which relevant, real-life final-product metrics such as implant position can easily be measured. Final-product analysis is objective, easy, and efficient to perform, and an appropriately trained nonspecialist assessor can make the measurements. In the simulated setting, invasive final-product measures, such as biomechanical testing of a fracture construct, can be used to assess procedural success. Final-product analysis is appealing because it relates technical performance to real-world, clinically relevant measures of operative success. Conclusions regarding its construct validity are, however, mixed, with almost as many studies refuting it59,65,68,73,84 as demonstrating it22,24,30,58,61,71,72,97, and the studies analyzed did not demonstrate evidence of reliability.
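
As an example of such a metric, the tip-apex distance used in DHS fixation sums the tip-to-apex distances measured on anteroposterior and lateral radiographs, each corrected for magnification using the known true diameter of the lag screw. The sketch below uses hypothetical measurements and an assumed screw diameter.

```python
# Sketch of one final-product metric named above: tip-apex distance (TAD).
def tip_apex_distance(x_ap, x_lat, d_ap, d_lat, d_true):
    """TAD (mm) = Xap*(Dtrue/Dap) + Xlat*(Dtrue/Dlat).

    x_*    : apparent tip-to-apex distance on each radiograph (mm)
    d_*    : apparent lag-screw diameter on each radiograph (mm)
    d_true : known true diameter of the lag screw (mm)
    """
    return x_ap * (d_true / d_ap) + x_lat * (d_true / d_lat)

# Hypothetical measurements: 14 mm (AP) and 12 mm (lateral) apparent
# distances, ~10% radiographic magnification, 8 mm true screw diameter.
print(f"TAD = {tip_apex_distance(14, 12, 8.8, 8.8, 8.0):.1f} mm")
```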

Procedure time was extensively used as a procedural metric to assess technical skill in the included studies. It is easy to measure in both the simulated and in vivo settings, and it relies on the intuitive assumption that speed equates to proficiency. This assumption is potentially problematic: extrinsic patient and staff factors beyond the surgeon's immediate control can influence procedure time, and the metric gives no indication of the quality of performance; a fast procedure time may reflect masterful efficiency, or it may reflect a rushed and careless performance. The evidence for the construct and concurrent validity of procedure time is mixed: many studies showed that it can discriminate between experience levels6,11,12,30,31,33,34,40,43-45,47,48,50,51,53-56,61,63,64,67,78,86,110 and that it performs well against other types of assessment6,18,47,86, whereas others showed that it cannot18,20,23,57,60,62,72,73,99,102. Both final-product analysis and procedure time are therefore unlikely to be useful in isolation, but rather could be used as adjunctive measures of technical proficiency.

Limitations

This review is limited to the assessment of technical skills in trauma and orthopaedic surgery; the assessment of nontechnical skills for surgeons was not considered in our analysis. Nontechnical skills are undoubtedly an essential dimension of surgical competence and are rightly beginning to receive attention in the surgical education literature119. Even a perfect technical skills-assessment tool could therefore never be used in isolation to comprehensively assess competence; rather, it should form a key part of a battery of evidence-based assessment tools.

Implications and Recommendations

There is growing dissatisfaction with the current technical skills-assessment tools within the surgical education community105,120, and an increasingly urgent need to develop an evidence-based assessment tool that is generalizable to the broad range of technical and nontechnical skills in trauma and orthopaedic surgery, and that satisfies the utility criteria.

The Procedure Based Assessment, which is the main tool currently used for high-stakes assessment in the U.K. training system, is lengthy to complete, comprising 40 to 50 tick boxes and 12 free-text spaces105. It was initially implemented prior to any formal validation beyond an initial consensus-setting (Delphi) process to define the domains105,121. Several years after its introduction, a large, pan-surgical-specialty validation study was undertaken109, with a particular focus on demonstrating the reliability of the rating scales105. Within this study, orthopaedics appears underrepresented: the totality of the Procedure Based Assessment validity evidence relates to 2 orthopaedic procedures involving 7 residents. Subsequent validation work, using more traditional frameworks in general and vascular surgery, has demonstrated that the Procedure Based Assessment is a valid and reliable measure of performance105 and responsive to change105, but there remains a deficiency of evidence for its utility in orthopaedics, which is surprising given that it is the current gold-standard assessment in the U.K. training system (see Appendix Table 3). Adding to the problem, engagement with the Procedure Based Assessment has been poor105, and it remains unpopular120. A national survey of trauma and orthopaedic resident attitudes toward procedure-based assessments (PBAs) in the U.K. found that more than half agreed or strongly agreed with the statement “completing PBAs is nothing but a form-filling exercise,” 60% agreed or strongly agreed that there are “barriers to the successful use of PBAs by residents”120, and only one-third believed that PBAs should be used for high-stakes assessment in training, such as the Annual Review of Competence Progression120. Further work has found that engagement is poor because the Procedure Based Assessment is burdensome to complete; because its coarse rating scale of blunt, binary descriptors cannot distinguish mastery or higher-order skills; and because it contributes to general assessment fatigue105.

The Procedure Based Assessment was among the earliest formal tools for technical skills assessment in orthopaedic surgical training, and its creators deserve recognition for beginning the process of objectively assessing the technical skills of surgeons-in-training. We propose that the Procedure Based Assessment is no longer appropriate for use in summative assessment in a modern competency-based training environment. The OSATS tool and the ASSET show promise as replacements for the Procedure Based Assessment, and validation work on these, with a particular focus on their use in the live operative theater, should continue.

Conclusions

The evidence for the utility of the technical skills-assessment tools currently used in trauma and orthopaedic surgical training is inadequate to support their use in summative, high-stakes assessment of competency. An assessment tool that is generalizable to the broad range of technical and nontechnical skills relevant to trauma and orthopaedics, that satisfies the utility criteria, and that is cost-effective and feasible has yet to be developed.

Appendix

Supporting material provided by the authors is posted with the online version of this article as a data supplement at jbjs.org (http://links.lww.com/JBJSREV/A611).

References

1. Mitchell EL, Arora S, Moneta GL, Kret MR, Dargon PT, Landry GJ, Eidt JF, Sevdalis N. A systematic review of assessment of skill acquisition and operative competency in vascular surgical training. J Vasc Surg. 2014 May;59(5):1440-55. Epub 2014 Mar 19.
2. Okoro T, Sirianni C, Brigden D. The concept of surgical assessment: part 1 – introduction. Bulletin of the Royal College of Surgeons of England. 2010 Oct;92(9):322-3.
3. Okoro T, Sirianni C, Brigden D. The concept of surgical assessment: part 2 – available tools. Bulletin of the Royal College of Surgeons of England. 2010 Oct;92(9):324-6.
4. Shamseer L, Moher D, Clarke M, Ghersi D, Liberati A, Petticrew M, Shekelle P, Stewart LA; PRISMA-P Group. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: elaboration and explanation. BMJ. 2015 Jan 2;350:g7647. Erratum in: BMJ. 2016 Jul 21;354:i4086.
5. National Institute for Health Research. PROSPERO international prospective register of systematic reviews. Accessed 15 April 2020. https://www.crd.york.ac.uk/prospero/
6. An VVG, Mirza Y, Mazomenos E, Vasconcelos F, Stoyanov D, Oussedik S. Arthroscopic simulation using a knee model can be used to train speed and gaze strategies in knee arthroscopy. Knee. 2018 Dec;25(6):1214-21. Epub 2018 Jun 20.
7. Angelo RL, Ryu RKN, Pedowitz RA, Beach W, Burns J, Dodds J, Field L, Getelman M, Hobgood R, McIntyre L, Gallagher AG. A proficiency-based progression training curriculum coupled with a model simulator results in the acquisition of a superior arthroscopic Bankart skill set. Arthroscopy. 2015 Oct;31(10):1854-71. Epub 2015 Sep 2.
8. Bhattacharyya R, Davidson DJ, Sugand K, Bartlett MJ, Bhattacharya R, Gupte CM. Knee arthroscopy simulation: a randomized controlled trial evaluating the effectiveness of the Imperial Knee Arthroscopy Cognitive Task Analysis (IKACTA) tool. J Bone Joint Surg Am. 2017 Oct 4;99(19):e103.
9. Camp CL, Krych AJ, Stuart MJ, Regnier TD, Mills KM, Turner NS. Improving resident performance in knee arthroscopy: a prospective value assessment of simulators and cadaveric skills laboratories. J Bone Joint Surg Am. 2016 Feb 3;98(3):220-5.
10. Çetinkaya E, Çift H, Aybar A, Erçin E, Güler GB, Poyanlı O. The timing and importance of motor skills course in knee arthroscopy training. Acta Orthop Traumatol Turc. 2017 Jul;51(4):273-7. Epub 2017 Jul 8.
11. Chong ACM, Pate RC, Prohaska DJ, Bron TR, Wooley PH. Validation of improvement of basic competency in arthroscopic knot tying using a bench top simulator in orthopaedic residency education. Arthroscopy. 2016 Jul;32(7):1389-99. Epub 2016 Apr 23.
12. Gomoll AH, Pappas G, Forsythe B, Warner JJP. Individual skill progression on a virtual reality simulator for shoulder arthroscopy: a 3-year follow-up study. Am J Sports Med. 2008 Jun;36(6):1139-42. Epub 2008 Mar 6.
13. Jackson WFM, Khan T, Alvand A, Al-Ali S, Gill HS, Price AJ, Rees JL. Learning and retaining simulated arthroscopic meniscal repair skills. J Bone Joint Surg Am. 2012 Sep 5;94(17):e132.
14. Martin KD, Patterson D, Phisitkul P, Cameron KL, Femino J, Amendola A. Ankle arthroscopy simulation improves basic skills, anatomic recognition, and proficiency during diagnostic examination of residents in training. Foot Ankle Int. 2015 Jul;36(7):827-35. Epub 2015 Mar 11.
15. Rahm S, Wieser K, Bauer DE, Waibel FW, Meyer DC, Gerber C, Fucentese SF. Efficacy of standardized training on a virtual reality simulator to advance knee and shoulder arthroscopic motor skills. BMC Musculoskelet Disord. 2018 May 16;19(1):150.
16. Rebolledo BJ, Hammann-Scala J, Leali A, Ranawat AS. Arthroscopy skills development with a surgical simulator: a comparative study in orthopaedic surgery residents. Am J Sports Med. 2015 Jun;43(6):1526-9. Epub 2015 Mar 13.
17. Cannon WD, Garrett WE Jr, Hunter RE, Sweeney HJ, Eckhoff DG, Nicandri GT, Hutchinson MR, Johnson DD, Bisson LJ, Bedi A, Hill JA, Koh JL, Reinig KD. Improving residency training in arthroscopic knee surgery with use of a virtual-reality simulator. A randomized blinded study. J Bone Joint Surg Am. 2014 Nov 5;96(21):1798-806.
18. Dunn JC, Belmont PJ, Lanzi J, Martin K, Bader J, Owens B, Waterman BR. Arthroscopic shoulder surgical simulation training curriculum: transfer reliability and maintenance of skill over time. J Surg Educ. 2015 Nov-Dec;72(6):1118-23. Epub 2015 Aug 19.
19. Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL. Transferring simulated arthroscopic skills to the operating theatre: a randomised blinded study. J Bone Joint Surg Br. 2008 Apr;90(4):494-9.
20. Waterman BR, Martin KD, Cameron KL, Owens BD, Belmont PJ Jr. Simulation training improves surgical proficiency and safety during diagnostic shoulder arthroscopy performed by residents. Orthopedics. 2016 May 1;39(3):e479-85. Epub 2016 May 2.
21. Butler BA, Lawton CD, Burgess J, Balderama ES, Barsness KA, Sarwark JF. Simulation-based educational module improves intern and medical student performance of closed reduction and percutaneous pinning of pediatric supracondylar humeral fractures. J Bone Joint Surg Am. 2017 Dec 6;99(23):e128.
22. Gottschalk MB, Yoon ST, Park DK, Rhee JM, Mitchell PM. Surgical training using three-dimensional simulation in placement of cervical lateral mass screws: a blinded randomized control trial. Spine J. 2015 Jan 1;15(1):168-75. Epub 2014 Sep 4.
23. LeBlanc J, Hutchison C, Hu Y, Donnon T. A comparison of orthopaedic resident performance on surgical fixation of an ulnar fracture using virtual reality and synthetic models. J Bone Joint Surg Am. 2013 May 1;95(9):e60-5: S1-5.
24. Nousiainen MT, Omoto DM, Zingg PO, Weil YA, Mardam-Bey SW, Eward WC. Training femoral neck screw insertion skills to surgical trainees: computer-assisted surgery versus conventional fluoroscopic technique. J Orthop Trauma. 2013 Feb;27(2):87-92.
25. Ruder JA, Turvey B, Hsu JR, Scannell BP. Effectiveness of a low-cost drilling module in orthopaedic surgical simulation. J Surg Educ. 2017 May - Jun;74(3):471-6. Epub 2016 Nov 7.
26. Sonnadara RR, Van Vliet A, Safir O, Alman B, Ferguson P, Kraemer W, Reznick R. Orthopedic boot camp: examining the effectiveness of an intensive surgical skills course. Surgery. 2011 Jun;149(6):745-9. Epub 2011 Jan 14.
27. Sonnadara RR, Garbedian S, Safir O, Nousiainen M, Alman B, Ferguson P, Kraemer W, Reznick R. Orthopaedic boot camp II: examining the retention rates of an intensive surgical skills course. Surgery. 2012 Jun;151(6):803-7.
28. Sonnadara RR, Garbedian S, Safir O, Mui C, Mironova P, Nousiainen M, Ferguson P, Alman B, Kraemer W, Reznick R. Toronto orthopaedic boot camp III: examining the efficacy of student-regulated learning during an intensive, laboratory-based surgical skills course. Surgery. 2013 Jul;154(1):29-33.
29. Tonetti J, Vadcard L, Girard P, Dubois M, Merloz P, Troccaz J. Assessment of a percutaneous iliosacral screw insertion simulator. Orthop Traumatol Surg Res. 2009 Nov;95(7):471-7. Epub 2009 Oct 3.
30. Xiang L, Zhou Y, Wang H, Zhang H, Song G, Zhao Y, Han J, Liu J. Significance of preoperative planning simulator for junior surgeons’ training of pedicle screw insertion. J Spinal Disord Tech. 2015 Feb;28(1):E25-9.
31. Garfjeld Roberts P, Alvand A, Gallieri M, Hargrove C, Rees J. Objectively assessing intraoperative arthroscopic skills performance and the transfer of simulation training in knee arthroscopy: a randomized controlled trial. Arthroscopy. 2019 Apr;35(4):1197-1209.e1. Epub 2019 Mar 14.
32. Brusalis CM, Lawrence JTR, Ranade SC, Kerr JC, Pulos N, Wells L, Ganley TJ. Can a novel, low-cost simulation model be used to teach anterior cruciate ligament graft preparation? J Pediatr Orthop. 2017 Jun;37(4):e277-81.
33. Cannon WD, Nicandri GT, Reinig K, Mevis H, Wittstein J. Evaluation of skill level between trainees and community orthopaedic surgeons using a virtual reality arthroscopic knee simulator. J Bone Joint Surg Am. 2014 Apr 2;96(7):e57.
34. Colaco HB, Hughes K, Pearse E, Arnander M, Tennent D. Construct validity, assessment of the learning curve, and experience of using a low-cost arthroscopic surgical simulator. J Surg Educ. 2017 Jan - Feb;74(1):47-54. Epub 2016 Oct 5.
35. Coughlin RP, Pauyo T, Sutton JC 3rd, Coughlin LP, Bergeron SG. A validated orthopaedic surgical simulation model for training and evaluation of basic arthroscopic skills. J Bone Joint Surg Am. 2015 Sep 2;97(17):1465-71.
36. Dwyer T, Slade Shantz J, Chahal J, Wasserstein D, Schachar R, Kulasegaram KM, Theodoropoulos J, Greben R, Ogilvie-Harris D. Simulation of anterior cruciate ligament reconstruction in a dry model. Am J Sports Med. 2015 Dec;43(12):2997-3004. Epub 2015 Oct 12.
37. Dwyer T, Schachar R, Leroux T, Petrera M, Cheung J, Greben R, Henry P, Ogilvie-Harris D, Theodoropoulos J, Chahal J. Performance assessment of arthroscopic rotator cuff repair and labral repair in a dry shoulder simulator. Arthroscopy. 2017 Jul;33(7):1310-8. Epub 2017 Mar 25.
38. Escoto A, Trejos AL, Naish MD, Patel RV, Lebel ME. Force sensing-based simulator for arthroscopic skills assessment in orthopaedic knee surgery. Stud Health Technol Inform. 2012;173:129-35.
39. Fucentese SF, Rahm S, Wieser K, Spillmann J, Harders M, Koch PP. Evaluation of a virtual-reality-based simulator using passive haptic feedback for knee arthroscopy. Knee Surg Sports Traumatol Arthrosc. 2015 Apr;23(4):1077-85. Epub 2014 Feb 12.
40. Gomoll AH, O’Toole RV, Czarnecki J, Warner JJP. Surgical experience correlates with performance on a virtual reality simulator for shoulder arthroscopy. Am J Sports Med. 2007 Jun;35(6):883-8. Epub 2007 Jan 29.
41. Howells NR, Brinsden MD, Gill RS, Carr AJ, Rees JL. Motion analysis: a validated method for showing skill levels in arthroscopy. Arthroscopy. 2008 Mar;24(3):335-42.
42. Insel A, Carofino B, Leger R, Arciero R, Mazzocca AD. The development of an objective model to assess arthroscopic performance. J Bone Joint Surg Am. 2009 Sep;91(9):2287-95.
43. Khanduja V, Lawrence JE, Audenaert E. Testing the construct validity of a virtual reality hip arthroscopy simulator. Arthroscopy. 2017 Mar;33(3):566-71. Epub 2016 Dec 16.
44. Martin KD, Belmont PJ, Schoenfeld AJ, Todd M, Cameron KL, Owens BD. Arthroscopic basic task performance in shoulder simulator model correlates with similar task performance in cadavers. J Bone Joint Surg Am. 2011 Nov 2;93(21):e1271-5.
45. Martin KD, Cameron K, Belmont PJ, Schoenfeld A, Owens BD. Shoulder arthroscopy simulator performance correlates with resident and shoulder arthroscopy experience. J Bone Joint Surg Am. 2012 Nov 7;94(21):e160.
46. Martin KD, Akoh CC, Amendola A, Phisitkul P. Comparison of three virtual reality arthroscopic simulators as part of an orthopedic residency educational curriculum. Iowa Orthop J. 2016;36:20-5.
47. McCarthy A, Harley P, Smallwood R. Virtual arthroscopy training: do the “virtual skills” developed match the real skills required? Stud Health Technol Inform. 1999;62:221-7.
48. McCarthy AD, Moody L, Waterworth AR, Bickerstaff DR. Passive haptics in a knee arthroscopy simulator: is it valid for core skills training? Clin Orthop Relat Res. 2006 Jan;442:13-20.
49. Pedowitz RA, Esch J, Snyder S. Evaluation of a virtual reality simulator for arthroscopy skills development. Arthroscopy. 2002 Jul-Aug;18(6):E29.
50. Pollard TCB, Khan T, Price AJ, Gill HS, Glyn-Jones S, Rees JL. Simulated hip arthroscopy skills: learning curves with the lateral and supine patient positions: a randomized trial. J Bone Joint Surg Am. 2012 May 16;94(10):e68.
51. Rahm S, Germann M, Hingsammer A, Wieser K, Gerber C. Validation of a virtual reality-based simulator for shoulder arthroscopy. Knee Surg Sports Traumatol Arthrosc. 2016 May;24(5):1730-7. Epub 2016 Feb 9.
52. Rose K, Pedowitz R. Fundamental arthroscopic skill differentiation with virtual reality simulation. Arthroscopy. 2015 Feb;31(2):299-305. Epub 2014 Oct 11.
53. Srivastava S, Youngblood PL, Rawn C, Hariri S, Heinrichs WL, Ladd AL. Initial evaluation of a shoulder arthroscopy simulator: establishing construct validity. J Shoulder Elbow Surg. 2004 Mar-Apr;13(2):196-205.
54. Tuijthof GJM, van Sterkenburg MN, Sierevelt IN, van Oldenrijk J, Van Dijk CN, Kerkhoffs GMMJ. First validation of the PASSPORT training environment for arthroscopic skills. Knee Surg Sports Traumatol Arthrosc. 2010 Feb;18(2):218-24. Epub 2009 Jul 24.
55. Tashiro Y, Miura H, Nakanishi Y, Okazaki K, Iwamoto Y. Evaluation of skills in arthroscopic training based on trajectory and force data. Clin Orthop Relat Res. 2009 Feb;467(2):546-52. Epub 2008 Sep 13.
56. Tofte JN, Westerlind BO, Martin KD, Guetschow BL, Uribe-Echevarria B, Rungprai C, Phisitkul P. Knee, shoulder, and fundamentals of arthroscopic surgery training: validation of a virtual arthroscopy simulator. Arthroscopy. 2017 Mar;33(3):641-646.e3. Epub 2016 Dec 16.
57. Wong IH, Denkers M, Urquhart N, Farrokhyar F. Construct validity testing of the Arthroscopic Knot Trainer (ArK). Knee Surg Sports Traumatol Arthrosc. 2015 Mar;23(3):906-11. Epub 2013 May 18.
58. Akhtar K, Sugand K, Sperrin M, Cobb J, Standfield N, Gupte C. Training safer orthopedic surgeons. Construct validation of a virtual-reality simulator for hip fracture surgery. Acta Orthop. 2015;86(5):616-21.
59. Aoude A, Alhamzah H, Fortin M, Jarzem P, Ouellet J, Weber MH. The use of computer-assisted surgery as an educational tool for the training of orthopedic surgery residents in pedicle screw placement: a pilot study and survey among orthopedic residents. Can J Surg. 2016 Dec;59(6):391-8.
60. Blyth P, Stott NS, Anderson IA. Virtual reality assessment of technical skill using the Bonedoc DHS simulator. Injury. 2008 Oct;39(10):1127-33. Epub 2008 Jun 13.
61. Christian MW, Griffith C, Schoonover C, Zerhusen T Jr, Coale M, OʼHara N, Henn RF 3rd, OʼToole RV, Sciadini M. Construct validation of a novel hip fracture fixation surgical simulator. J Am Acad Orthop Surg. 2018 Oct 1;26(19):689-97.
62. Froelich JM, Milbrandt JC, Novicoff WM, Saleh KJ, Allan DG. Surgical simulators and hip fractures: a role in residency training? J Surg Educ. 2011 Jul-Aug;68(4):298-302. Epub 2011 Apr 16.
63. Garfjeld Roberts P, Guyver P, Baldwin M, Akhtar K, Alvand A, Price AJ, Rees JL. Validation of the updated ArthroS simulator: face and construct validity of a passive haptic virtual reality simulator with novel performance metrics. Knee Surg Sports Traumatol Arthrosc. 2017 Feb;25(2):616-25. Epub 2016 Apr 16.
64. Giurin I, Bréaud J, Rampal V, Rosello O, Solla F. A simulation model of nail bed suture and nail fixation: description and preliminary evaluation. J Surg Res. 2018 Aug;228:142-6. Epub 2018 Apr 11.
65. Hohn EA, Brooks AG, Leasure J, Camisa W, van Warmerdam J, Kondrashov D, Montgomery W, McGann W. Development of a surgical skills curriculum for the training and assessment of manual skills in orthopedic surgical residents. J Surg Educ. 2015 Jan-Feb;72(1):47-52. Epub 2014 Aug 6.
66. Leong JJH, Leff DR, Das A, Aggarwal R, Reilly P, Atkinson HDE, Emery RJ, Darzi AW. Validation of orthopaedic bench models for trauma surgery. J Bone Joint Surg Br. 2008 Jul;90(7):958-65.
67. Lopez G, Wright R, Martin D, Jung J, Bracey D, Gupta R. A cost-effective junior resident training and assessment simulator for orthopaedic surgical skills via fundamentals of orthopaedic surgery: AAOS exhibit selection. J Bone Joint Surg Am. 2015 Apr 15;97(8):659-66.
68. Mayne IP, Brydges R, Moktar J, Murnaghan ML. Development and assessment of a distal radial fracture model as a clinical teaching tool. J Bone Joint Surg Am. 2016 Mar 2;98(5):410-6.
69. Qassemyar Q, Boulart L. A 4-task skills examination for residents for the assessment of technical ability in hand trauma surgery. J Surg Educ. 2015 Mar-Apr;72(2):179-83. Epub 2014 Dec 10.
70. Rambani R, Ward J, Viant W. Desktop-based computer-assisted orthopedic training system for spinal surgery. J Surg Educ. 2014 Nov-Dec;71(6):805-9. Epub 2014 Jun 23.
71. Shi J, Hou Y, Lin Y, Chen H, Yuan W. Role of Visuohaptic surgical training simulator in resident education of orthopedic surgery. World Neurosurg. 2018 Mar;111:e98-104. Epub 2017 Dec 15.
72. Sugand K, Wescott RA, Carrington R, Hart A, Van Duren BH. Teaching basic trauma: validating FluoroSim, a digital fluoroscopic simulator for guide-wire insertion in hip surgery. Acta Orthop. 2018 Aug;89(4):380-5. Epub 2018 May 10.
73. Yehyawi TM, Thomas TP, Ohrt GT, Marsh JL, Karam MD, Brown TD, Anderson DD. A simulation trainer for complex articular fracture surgery. J Bone Joint Surg Am. 2013 Jul 3;95(13):e92.
74. Alvand A, Khan T, Al-Ali S, Jackson WF, Price AJ, Rees JL. Simple visual parameters for objective assessment of arthroscopic skill. J Bone Joint Surg Am. 2012 Jul 3;94(13):e97.
75. Alvand A, Logishetty K, Middleton R, Khan T, Jackson WFM, Price AJ, Rees JL. Validating a global rating scale to monitor individual resident learning curves during arthroscopic knee meniscal repair. Arthroscopy. 2013 May;29(5):906-12.
76. Bayona S, Akhtar K, Gupte C, Emery RJH, Dodds AL, Bello F. Assessing performance in shoulder arthroscopy: the Imperial Global Arthroscopy Rating Scale (IGARS). J Bone Joint Surg Am. 2014 Jul 2;96(13):e112. Epub 2014 Jul 2.
77. Dwyer T, Slade Shantz J, Kulasegaram KM, Chahal J, Wasserstein D, Schachar R, Devitt B, Theodoropoulos J, Hodges B, Ogilvie-Harris D. Use of an objective structured assessment of technical skill after a sports medicine rotation. Arthroscopy. 2016 Dec;32(12):2572-2581.e3. Epub 2016 Jul 27.
78. Elliott MJ, Caprise PA, Henning AE, Kurtz CA, Sekiya JK. Diagnostic knee arthroscopy: a pilot study to evaluate surgical skills. Arthroscopy. 2012 Feb;28(2):218-24. Epub 2011 Oct 28.
79. Koehler RJ, Nicandri GT. Using the arthroscopic surgery skill evaluation tool as a pass-fail examination. J Bone Joint Surg Am. 2013 Dec 4;95(23):e1871-6.
80. Koehler RJ, Amsdell S, Arendt EA, Bisson LJ, Braman JP, Butler A, Cosgarea AJ, Harner CD, Garrett WE, Olson T, Warme WJ, Nicandri GT. The Arthroscopic Surgical Skill Evaluation Tool (ASSET). Am J Sports Med. 2013 Jun;41(6):1229-37. Epub 2013 Apr 2.
81. Middleton RM, Baldwin MJ, Akhtar K, Alvand A, Rees JL. Which global rating scale? A Comparison of the ASSET, BAKSSS, and IGARS for the assessment of simulated arthroscopic skills. J Bone Joint Surg Am. 2016 Jan 6;98(1):75-81.
82. Nwachukwu B, Gaudiani M, Hammann-Scala J, Ranawat A. A checklist intervention to assess resident diagnostic knee and shoulder arthroscopic efficiency. J Surg Educ. 2017 Jan-Feb;74(1):9-15. Epub 2016 Aug 23.
83. Olson T, Koehler R, Butler A, Amsdell S, Nicandri G. Is there a valid and reliable assessment of diagnostic knee arthroscopy skill? Clin Orthop Relat Res. 2013 May;471(5):1670-6. Epub 2012 Dec 20.
84. Pedowitz RA, Nicandri GT, Angelo RL, Ryu RKN, Gallagher AG. Objective assessment of knot-tying proficiency with the Fundamentals of Arthroscopic Surgery Training Program Workstation and Knot Tester. Arthroscopy. 2015 Oct;31(10):1872-9. Epub 2015 Aug 19.
85. Phillips L, Cheung JJH, Whelan DB, Murnaghan ML, Chahal J, Theodoropoulos J, Ogilvie-Harris D, Macniven I, Dwyer T. Validation of a dry model for assessing the performance of arthroscopic hip labral repair. Am J Sports Med. 2017 Jul;45(9):2125-30. Epub 2017 Mar 29.
86. Price AJ, Erturan G, Akhtar K, Judge A, Alvand A, Rees JL. Evidence-based surgical training in orthopaedics: how many arthroscopies of the knee are needed to achieve consultant level performance? Bone Joint J. 2015 Oct;97-B(10):1309-15.
87. Slade Shantz JA, Leiter JR, Collins JB, MacDonald PB. Validation of a global assessment of arthroscopic skills in a cadaveric knee model. Arthroscopy. 2013 Jan;29(1):106-12. Epub 2012 Nov 20.
88. Gallagher AG, Ryu RKN, Pedowitz RA, Henn P, Angelo RL. Inter-rater reliability for metrics scored in a binary fashion-performance assessment for an arthroscopic Bankart repair. Arthroscopy. 2018 Jul;34(7):2191-8. Epub 2018 May 2.
89. Hodgins JL, Veillette C, Biau D, Sonnadara R. The knee arthroscopy learning curve: quantitative assessment of surgical skills. Arthroscopy. 2014 May;30(5):613-21.
90. Hoyle AC, Whelton C, Umaar R, Funk L. Validation of a global rating scale for shoulder arthroscopy: a pilot study. Shoulder Elbow. 2012 Jan;4(1):16-21.
91. Koehler RJ, Goldblatt JP, Maloney MD, Voloshin I, Nicandri GT. Assessing diagnostic arthroscopy performance in the operating room using the Arthroscopic Surgery Skill Evaluation Tool (ASSET). Arthroscopy. 2015 Dec;31(12):2314-9.e2. Epub 2015 Aug 28.
92. Talbot CL, Holt EM, Gooding BWT, Tennent TD, Foden P. The Shoulder Objective Practical Assessment Tool: evaluation of a new tool assessing residents learning in diagnostic shoulder arthroscopy. Arthroscopy. 2015 Aug;31(8):1441-9. Epub 2015 Apr 22.
93. Anderson DD, Long S, Thomas GW, Putnam MD, Bechtold JE, Karam MD. Objective Structured Assessments of Technical Skills (OSATS) does not assess the quality of the surgical result effectively. Clin Orthop Relat Res. 2016 Apr;474(4):874-81.
94. Backstein D, Agnidis Z, Regehr G, Reznick R. The effectiveness of video feedback in the acquisition of orthopedic technical skills. Am J Surg. 2004 Mar;187(3):427-32.
95. Bergeson RK, Schwend RM, DeLucia T, Silva SR, Smith JE, Avilucea FR. How accurately do novice surgeons place thoracic pedicle screws with the free hand technique? Spine (Phila Pa 1976). 2008 Jul 1;33(15):E501-7.
96. Bernard JA, Dattilo JR, Srikumaran U, Zikria BA, Jain A, LaPorte DM. Reliability and validity of 3 methods of assessing orthopedic resident skill in shoulder surgery. J Surg Educ. 2016 Nov-Dec;73(6):1020-5. Epub 2016 Jun 3.
97. Burns GT, King BW, Holmes JR, Irwin TA. Evaluating internal fixation skills using surgical simulation. J Bone Joint Surg Am. 2017 Mar 1;99(5):e21.
98. MacEwan MJ, Dudek NL, Wood TJ, Gofton WT. Continued validation of the O-SCORE (Ottawa Surgical Competency Operating Room Evaluation): use in the simulated environment. Teach Learn Med. 2016;28(1):72-9.
99. Pedersen P, Palm H, Ringsted C, Konge L. Virtual-reality simulation to assess performance in hip fracture surgery. Acta Orthop. 2014 Aug;85(4):403-7. Epub 2014 Apr 30.
100. Putnam MD, Kinnucan E, Adams JE, Van Heest AE, Nuckley DJ, Shanedling J. On orthopedic surgical skill prediction—the limited value of traditional testing. J Surg Educ. 2015 May-Jun;72(3):458-70. Epub 2014 Dec 24.
101. Williams JF, Watson SL, Baker DK, Ponce BA, McGwin G, Gilbert SR, Khoury JG. Psychomotor testing for orthopedic residency applicants: a pilot study. J Surg Educ. 2017 Sep - Oct;74(5):820-7. Epub 2017 Mar 7.
102. Van Heest A, Putnam M, Agel J, Shanedling J, McPherson S, Schmitz C. Assessment of technical skills of orthopaedic surgery residents performing open carpal tunnel release surgery. J Bone Joint Surg Am. 2009 Dec;91(12):2811-7.
103. VanHeest A, Kuzel B, Agel J, Putnam M, Kalliainen L, Fletcher J. Objective structured assessment of technical skill in upper extremity surgery. J Hand Surg Am. 2012 Feb;37(2):332-7: 337.e1-4.
104. Beard JD, Marriott J, Purdie H, Crossley J. Assessing the surgical skills of trainees in the operating theatre: a prospective observational study of the methodology. Health Technol Assess. 2011 Jan;15(1):i-xxi, 1-162.
105. Davies RM, Hadfield-Law L, Turner PG. Development and evaluation of a new formative assessment of surgical performance. J Surg Educ. 2018 Sep - Oct;75(5):1309-16. Epub 2018 Mar 24.
106. Gofton WT, Dudek NL, Wood TJ, Balaa F, Hamstra SJ. The Ottawa Surgical Competency Operating Room Evaluation (O-SCORE): a tool to assess surgical competence. Acad Med. 2012 Oct;87(10):1401-7.
107. Hawkes DH, Harrison WJ. Critiquing operative fracture fixation: the development of an assessment tool. Eur J Orthop Surg Traumatol. 2017 Dec;27(8):1083-8. Epub 2017 Mar 23.
108. Hoffer MM, Hsu SC. Hand function in selection of orthopedics residents. Acad Med. 1990 Oct;65(10):661.
109. Marriott J, Purdie H, Crossley J, Beard JD. Evaluation of procedure-based assessment for assessing trainees’ skills in the operating theatre. Br J Surg. 2011 Mar;98(3):450-7. Epub 2010 Nov 24.
110. Martin KD, Patterson DP, Cameron KL. Arthroscopic training courses improve trainee arthroscopy skills: a simulation-based prospective trial. Arthroscopy. 2016 Nov;32(11):2228-32. Epub 2016 May 25.
111. Van Der Vleuten CPM. The assessment of professional competence: developments, research and practical implications. Adv Health Sci Educ Theory Pract. 1996 Jan;1(1):41-67.
112. Norcini J, Anderson B, Bollela V, Burch V, Costa MJ, Duvivier R, Galbraith R, Hays R, Kent A, Perrott V, Roberts T. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206-14.
113. van Hove PD, Tuijthof GJM, Verdaasdonk EGG, Stassen LPS, Dankelman J. Objective assessment of technical surgical skills. Br J Surg. 2010 Jul;97(7):972-87.
114. Angelo RL, Ryu RKN, Pedowitz RA, Gallagher AG. The Bankart performance metrics combined with a cadaveric shoulder create a precise and accurate assessment tool for measuring surgeon skill. Arthroscopy. 2015 Sep;31(9):1655-70. Epub 2015 Jul 31.
115. The Royal College of Obstetricians and Gynaecologists. OSATS. Accessed 15 April 2020. https://www.rcog.org.uk/en/careers-training/about-specialty-training-in-og/assessment-and-progression-through-training/workplace-based-assessments/osats/
116. The Royal College of Ophthalmologists. Objective assessment of surgical and technical skills (OSATS). Accessed June 2019. https://www.rcophth.ac.uk/curriculum/ost/assessments/workplace-based-assessments/objective-assessment-of-surgical-and-technical-skills-osats/
117. Tsagkataki M, Choudhary A. Mersey Deanery ophthalmology trainees’ views of the objective assessment of surgical and technical skills (OSATS) workplace-based assessment tool. Perspect Med Educ. 2013 Feb;2(1):21-7.
118. Rambani R, Viant W, Ward J, Mohsen A. Computer-assisted orthopedic training system for fracture fixation. J Surg Educ. 2013 May-Jun;70(3):304-8. Epub 2013 Feb 22.
119. Agha RA, Fowler AJ, Sevdalis N. The role of non-technical skills in surgery. Ann Med Surg (Lond). 2015 Oct 9;4(4):422-7.
120. Hunter AR, Baird EJ, Reed MR. Procedure-based assessments in trauma and orthopaedic training—the trainees’ perspective. Med Teach. 2015 May;37(5):444-9. Epub 2014 Sep 4.
121. Pitts D, Rowley DI, Sher JL. Assessment of performance in orthopaedic training. J Bone Joint Surg Br. 2005 Sep;87(9):1187-91.
122. Bartlett JD, Lawrence JE, Stewart ME, Nakano N, Khanduja V. Does virtual reality simulation have a role in training trauma and orthopaedic surgeons? Bone Joint J. 2018 May 1;100-B(5):559-65.


Copyright © 2020 The Authors. Published by The Journal of Bone and Joint Surgery, Incorporated. All rights reserved.