Do Skills Acquired from Training with a Wire Navigation Simulator Transfer to a Mock Operating Room Environment?

Long, Steven A. MS; Thomas, Geb PhD; Karam, Matthew D. MD; Anderson, Donald D. PhD

Clinical Orthopaedics and Related Research®: October 2019 - Volume 477 - Issue 10 - p 2189–2198
doi: 10.1097/CORR.0000000000000799

Background Skills training and simulation play an increasingly important role in orthopaedic surgical education. The intent of simulation is to improve performance in the operating room (OR), a property known as transfer validity. No prior studies have explored how simulator-based wire navigation training transfers to higher-level tasks, and little is known about the format in which wire navigation training should be delivered.

Questions/purposes (1) Which training methods (didactic content, deliberate practice, or proficiency-based practice) lead to the greatest improvement in performing a wire navigation task? (2) Does a resident’s performance using a wire navigation simulator correlate with his or her performance on a higher-level simulation task in a mock OR involving a C-arm, a radiopaque femur model, and a large soft tissue surrogate surrounding the femur?

Methods Fifty-five residents from four medical centers participated in this study over the course of 2 years. The residents were divided into three groups: traditional training (first-year residents from the University of Iowa, the University of Minnesota, and the Mayo Clinic), deliberate practice (first-year residents from the University of Nebraska and the University of Minnesota), and proficiency training (first-year residents from the University of Minnesota and the Mayo Clinic). Residents in each group received a didactic introduction covering the task of placing a wire to treat an intertrochanteric fracture; this alone was considered traditional training. Deliberate practice involved training on a radiation-free simulator that provided specific feedback throughout the practice sessions. Proficiency training used the same simulator to train specific components of wire navigation, such as finding the correct starting point, to a proficiency threshold before moving to assessment. The wire navigation simulator uses a camera system to track the wire and provide computer-generated fluoroscopy. After training, task performance was assessed in a mock OR, where residents from each group were evaluated on their use of fluoroscopy, total time, and tip-apex distance. Correlation analysis was performed to examine the relationship between resident performance on the simulator and in the mock OR.

Results Residents in the two simulation-based training groups had a lower tip-apex distance than those in the traditional training group (didactic training tip-apex distance: 24 ± 7 mm, 95% CI, 20–27 mm; deliberate practice tip-apex distance: 16 ± 5 mm, 95% CI, 13–19 mm, p = 0.001; proficiency training tip-apex distance: 15 ± 4 mm, 95% CI, 13–18 mm, p < 0.001). Residents in the proficiency training group used more images than those in the other groups (didactic training: 22 ± 12 images, p = 0.041; deliberate practice: 19 ± 8 images, p = 0.012; proficiency training: 31 ± 14 images). In the two simulation-based training groups, resident performance on the simulator, that is, tip-apex distance, image use, and overall time, was correlated with performance in the mock OR (r-square = 0.15 [p = 0.030], 0.61 [p < 0.001], and 0.43 [p < 0.001], respectively).

Conclusions As residency programs design their wire navigation curricula, emphasis should be placed on providing an environment that allows residents to engage in deliberate practice with immediate feedback about their performance. Simulators such as the one presented in this study offer a safe environment for residents to learn this key skill.

Level of Evidence Level II, therapeutic study.

S. A. Long, G. Thomas, M. D. Karam, D. D. Anderson, Department of Orthopedics and Rehabilitation, The University of Iowa, Iowa City, IA, USA

S. A. Long, D. D. Anderson, Department of Biomedical Engineering, University of Iowa, Iowa City, IA, USA

G. Thomas, D. D. Anderson, Department of Industrial and Systems Engineering, University of Iowa, Iowa City, IA, USA

D. D. Anderson, Department of Orthopedics and Rehabilitation, University of Iowa, 2181 Westlawn Building, Iowa City, IA 52242, USA, Email: don-anderson@uiowa.edu

The institution of one or more of the authors (SL, GT, MK, DA) received, during the study period, grant funding from The American Board of Orthopaedic Surgery and from the Agency for Healthcare Research and Quality (award # R18 HS022077 and # R18 HS025353).

All of the authors are partial owners of a company (Iowa Simulation Solutions LLC) that manufactures the simulator mentioned in this manuscript, but no financial payments were received during the period of this study.

All ICMJE Conflict of Interest Forms for authors and Clinical Orthopaedics and Related Research® editors and board members are on file with the publication and can be viewed on request.

Clinical Orthopaedics and Related Research® neither advocates nor endorses the use of any treatment, drug, or device. Readers are encouraged to always seek additional information, including FDA approval status, of any drug or device before clinical use.

Each author certifies that his institution approved the reporting of this investigation and that all investigations were conducted in conformity with ethical principles of research.

Received December 19, 2018

Accepted April 10, 2019

Online date: May 17, 2019


Introduction

Skills training and simulation play an increasingly important role in modern orthopaedic surgical education. Following a 2013 mandate by the American Board of Orthopaedic Surgery to include laboratory-based training in basic surgical skills for first-year residents, programs have moved to incorporate simulation into their surgical skills education for residents [3]. Early reports after this mandate examined a variety of simulation and training modalities [4, 6, 11-13, 15]. The vast majority of simulation-based studies in orthopaedics have focused on arthroscopic surgical skills [20]. Fluoroscopic wire navigation, the task of directing a surgical wire through bone under fluoroscopic guidance, is a different skill that is equally central to orthopaedic surgery. This task is a critical step in the operative management of hip fractures, making it an excellent application for training orthopaedic residents. Hip fractures are common, with roughly 300,000 occurring in the United States each year [5], and many early- and intermediate-level residents participate in the wire navigation portion of these procedures. Several meaningful metrics of surgical performance can be readily measured in the operating room (OR); these metrics have been shown to affect a patient's result after hip fracture surgery and to correlate with surgical experience and/or skill, two properties that make them appealing for simulator design. For instance, wire navigation time and tip-apex distance measured in the OR have been shown to correlate with surgical experience, and a tip-apex distance of less than 25 mm has been associated with a decreased likelihood of implant cutout [1, 22].

Most of the published studies on simulation in orthopaedics have focused on demonstrating construct validity, typically defined as the capability to differentiate between the performance of novices and experts on a simulated task. Demonstrating construct validity is an important first goal for task simulation, but the ultimate goal of training simulation is to improve surgical skills in the OR. Evidence that simulator training improves OR performance is known as transfer validity; if this could be demonstrated, it would be the strongest evidence of the benefit of training with a simulator. Given the number of uncontrolled variables in the OR, demonstrating transfer validity there is very challenging. An intermediate form of transfer validity would be to establish a link between surgical performance on a simulated task and performance in a higher-level assessment environment. Demonstrating this link would help establish the evidence needed to ultimately tie simulator performance to performance in the OR.

The benefit a resident derives from training with a simulator, and the magnitude of any effect transferred to the OR, may depend largely on the productive feedback and facilitation provided during that resident's extended practice with the simulator. Deliberate practice is a form of training that involves task repetition with direct and clear feedback that informs a trainee about ways to improve his or her performance [7, 8]. Proficiency-based training demarcates improvement by defining goals a trainee must attain, such as "reach at least the proficiency level (on two consecutive trials) before progressing to in vivo practice" [9]. Proficiency training has been shown to be an effective tool in teaching trainees a variety of surgical skills [2]. Maximizing the effectiveness of the training is likely to improve the ability to demonstrate the simulator's transfer validity. Understanding which form of training best maximizes this effect is therefore a key component of the overall goal of establishing transfer validity.

We therefore asked (1) Which training methods (didactic content, deliberate practice, or proficiency-based practice) lead to the greatest improvement in performing a wire navigation task? (2) Does a resident’s performance using a wire navigation simulator correlate with his or her performance on a higher-level simulation task in a mock OR involving a C-arm, a radiopaque femur model, and a large soft tissue surrogate surrounding the femur?


Patients and Methods

This was a prospective, institutional review board-approved study in which 55 postgraduate year 1 orthopaedic surgery residents from four residency programs in the Midwest Orthopaedic Surgical Skills Consortium were recruited to participate. Participants included residents from the University of Iowa, the University of Minnesota, the Mayo Clinic, and the University of Nebraska. Because only first-year residents were enrolled, multiple visits to each institution were necessary over a 2-year period to collect sufficient data. Residents were divided into three groups, each receiving a different method of training (Fig. 1). The traditional training group included residents from the University of Iowa (n = 6), the University of Minnesota (n = 7), and the Mayo Clinic (n = 10). The deliberate practice group included residents from the University of Nebraska (n = 4) and the University of Minnesota (n = 13). The proficiency training group included residents from the University of Minnesota (n = 4) and the Mayo Clinic (n = 11). We did not collect demographic data about age, gender, or participation in sports across these institutions because we did not believe these factors would affect baseline surgical performance. Instead, we assumed that because all residents were in their first year of residency and had no experience leading a case in the OR, their baseline performance would be consistent across institutions and training groups. Furthermore, each training group comprised residents from multiple institutions.

Fig. 1


Traditional Training

The traditional training group consisted of 23 residents. Residents received what was considered traditional training followed by a performance assessment. The traditional training began with a didactic presentation that described the methods for placing a guidewire to treat an intertrochanteric hip fracture, followed by a video from the American Board of Orthopaedic Surgery module on fluoroscopic knowledge and skills demonstrating the proper technique [16]. The didactic presentation concluded with a quiz that verified that each resident had gained basic knowledge of the wire navigation task.

After the didactic presentation, residents proceeded to a performance assessment in a mock OR (Fig. 2). A radiopaque left proximal femur (part # 1130-21-22, Sawbones, Vashon Island, WA, USA) was placed inside an artificial soft tissue envelope to simulate the anatomy of a patient with an intertrochanteric fracture. A C-arm provided fluoroscopic visualization of the bone and guidewire as the wire was drilled into the bone. Residents were instructed to place the guidewire in a center-center position while minimizing the tip-apex distance, the duration of wire navigation, and the number of fluoroscopic images requested. Residents could switch between AP and lateral images at their discretion, and the viewpoints of these images were standardized for all residents. Performance feedback was not provided during the task. The task was considered complete when the resident felt he or she had achieved an acceptable wire position. The final AP and lateral fluoroscopic images were recorded for later measurement of performance.

Fig. 2


Deliberate Practice

The deliberate practice group consisted of 17 residents. This group received the same didactic introduction as the traditional training group, followed by participation in a deliberate practice session using a wire navigation simulator (Fig. 3). The simulator uses a camera system to track an etched guidewire relative to a fixed Sawbones femur model [14]. The camera system sends images of the wire to a laptop, where the trajectory and depth of the wire in the bone are calculated. This allows residents to be presented with computer-generated fluoroscopic images of their wire as they practice.
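The simulator's rendering code is not published with this article; purely to illustrate the general idea, the following Python sketch projects a tracked three-dimensional wire tip and direction onto idealized AP and lateral image planes, which is one simple way a camera-tracked pose could drive computer-generated fluoroscopy. The function name and the coordinate-frame convention are assumptions, not details from the study.

```python
import numpy as np

def project_wire(tip_mm: np.ndarray, direction: np.ndarray):
    """Project a tracked wire (tip position in mm, direction vector),
    expressed in the femur's coordinate frame, onto idealized AP and
    lateral image planes.

    Assumed convention (hypothetical): x = medial-lateral,
    y = anterior-posterior, z = superior-inferior. The AP view discards
    y (depth), and the lateral view discards x.
    """
    direction = direction / np.linalg.norm(direction)
    ap_tip, ap_dir = tip_mm[[0, 2]], direction[[0, 2]]     # (x, z) on the AP image
    lat_tip, lat_dir = tip_mm[[1, 2]], direction[[1, 2]]   # (y, z) on the lateral image
    return (ap_tip, ap_dir), (lat_tip, lat_dir)

# Example: a wire tip partway up the neck, aimed toward the head apex.
ap, lat = project_wire(np.array([42.0, 5.0, 88.0]),
                       np.array([0.6, 0.1, 0.8]))
print("AP view tip/dir:", ap)
print("Lateral view tip/dir:", lat)
```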

Fig. 3

The training session for the deliberate practice group involved 30 minutes of practicing the hip wire navigation task on the simulator (Fig. 3). The simulator provided real-time feedback to the residents on their wire position relative to the ideal while they practiced navigating the wire (Fig. 4). The interface overlaid a generic target zone, shown in green, onto computer-generated AP and lateral images of the femur, giving residents immediate visual feedback on how their wire position compared with the ideal center-center position. Additionally, residents could see the tip-apex distance as they drove the wire into the bone. Each resident performed the practice module of the simulator a minimum of three times. After this, the residents guided the wire on the simulator once without any feedback, a trial designed as a transition between the simulated environment and the mock OR environment. After completing the deliberate practice session on the simulator, residents proceeded to the mock OR for performance assessment, following the same procedure used to evaluate the traditional training group.

Fig. 4


Proficiency Training

The proficiency training group consisted of 15 residents. This group received the traditional training, then proficiency-based training, and finally the mock OR performance assessment. The proficiency-based training consisted of a series of tasks implemented first in an online format and then on the wire navigation simulator. These tasks emphasized identifying visual cues and achieving wire positions believed to be essential skill elements for wire navigation. Residents were required to achieve specified levels of proficiency before they could advance to the next training exercise.

In the first set of proficiency-based tasks, residents were taken through a series of AP and lateral images of left proximal femurs in an online module. In these images, residents were asked to identify where the guidewire should ideally enter the femur and where the tip of the wire should terminate in the femoral head. When a resident clicked on the image, a feedback screen immediately appeared showing whether he or she had chosen the correct locations on the femur (Fig. 5). A circle fit to the femoral head helped identify the center of the femoral head, and lines outlining the femoral neck helped identify the center of the neck. Connecting these two points provided the trajectory needed to identify the proper start and endpoints of the wire for the procedure and achieve the center-center position. Residents reviewed 20 different fluoroscopic viewpoints. Because the AP and lateral fluoroscopic projections varied subtly, residents had to adjust their starting points and trajectories to each viewpoint, a frequent challenge in the actual surgical setting. Residents were required to correctly identify the start and endpoints on 80% of the images before moving to the next task.
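The exact scoring rules of the online module are not reported; as a rough sketch under stated assumptions (the pixel tolerance and function names are ours, not from the study), the 80% gate could be implemented as follows.

```python
import math

def click_is_correct(click_xy, ideal_xy, tol_px=15):
    """Return True if the resident's click lands within a pixel tolerance
    of the ideal start point or apex location (tolerance is assumed)."""
    return math.dist(click_xy, ideal_xy) <= tol_px

def passed_online_module(results, threshold=0.80):
    """results: one boolean per reviewed fluoroscopic viewpoint (the study
    used 20). The module is passed when at least 80% are correct."""
    return sum(results) / len(results) >= threshold

# Example: grade two clicks, then combine with 18 other simulated viewpoints.
graded = [click_is_correct((310, 242), (305, 240)),   # close enough -> True
          click_is_correct((422, 388), (450, 410))]   # too far -> False
results = graded + [True] * 16 + [False] * 2          # 17 of 20 correct
print(passed_online_module(results))                  # True
```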

Fig. 5

After completing the online module, residents were taken through a series of wire navigation subtasks on the simulator, an approach distinct from the simulator training in the deliberate practice group. These subtasks involved interpreting the computer-generated AP and lateral fluoroscopic images to place the guidewire on the Sawbones femur so that it had the correct starting point and the correct trajectory. This was intended to be an extension of the online training; however, it tested the resident's knowledge and ability to correctly move his or her hands in three dimensions. For each subtask, the resident was required to position the wire on the bone correctly, judged either on the wire starting point or on the wire trajectory, three times before moving on to the next task. A starting point was deemed correct if the wire tip was within 5 mm of the ideal placement. A wire trajectory was considered correct if the wire was angled within 3° of the ideal wire angle. The simulator automatically measured and collected this information (Fig. 6).
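The 5 mm and 3° criteria reduce to simple geometric checks. The sketch below is illustrative only (the helper names and example coordinates are ours); it compares a measured wire tip and axis against an ideal placement using a Euclidean distance and the angle between direction vectors.

```python
import numpy as np

def start_point_ok(wire_tip_mm, ideal_tip_mm, tol_mm=5.0):
    """Starting point counts as correct if the wire tip lies within
    5 mm of the ideal entry point (criterion stated in the study)."""
    diff = np.asarray(wire_tip_mm, float) - np.asarray(ideal_tip_mm, float)
    return np.linalg.norm(diff) <= tol_mm

def trajectory_ok(wire_dir, ideal_dir, tol_deg=3.0):
    """Trajectory counts as correct if the wire axis is within 3 degrees
    of the ideal axis (criterion stated in the study)."""
    a = np.asarray(wire_dir, float)
    b = np.asarray(ideal_dir, float)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    angle_deg = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle_deg <= tol_deg

# A resident advances after three correct attempts on a subtask.
attempts = [start_point_ok((41, 4, 87), (42, 5, 88)),   # ~1.7 mm off -> correct
            start_point_ok((48, 9, 92), (42, 5, 88)),   # ~8.2 mm off -> incorrect
            start_point_ok((43, 5, 89), (42, 5, 88)),   # correct
            start_point_ok((42, 6, 88), (42, 5, 88))]   # correct
print("advance:", sum(attempts) >= 3)                   # True
```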

Fig. 6 A-D

The physical Sawbones femur remained fixed on the simulator throughout the training; therefore, to simulate the different viewpoints from the online module, the virtual position of the femur was moved with each new attempt. Thus, residents were forced to rely entirely on the fluoroscopic images presented to them and to move their hand and guidewire accordingly. Once a resident completed all of the subtask training, he or she was given one final trial to navigate the wire into the bone. Residents in this group did not receive the general practice environment, including the wire positioning and tip-apex distance feedback, that was part of the training for the deliberate practice group. After completing the practice session, residents went into the mock OR and placed a guidewire with the same instructions provided to the other groups.


Primary and Secondary Outcomes

The primary outcome for this study was resident performance in the mock OR, which was used to compare the three training modalities. In all tasks, whether on the simulator or in the mock OR, residents were graded on their wire navigation time, the number of fluoroscopic images requested, and their tip-apex distance. In the mock OR, the tip-apex distance was measured using the method described by Johnson et al. [10]. The tip-apex distance measurements were made by two independent reviewers (LT, CR) who examined the final AP and lateral images the residents requested in the mock OR; the average of their independent measurements was recorded as the final tip-apex distance. This method of measuring the tip-apex distance has been shown to be reliable and reproducible between different users [21]. The tip-apex distance from the simulator trials was computed automatically by the simulator software. Additionally, for each trial that residents completed on the simulator, the wire tip's position and trajectory were stored for each image requested. This allowed a posttraining analysis to examine how residents directed their wire through the bone to reach the apex on the simulated model. The neck of the femur was divided into three regions: the starting point, the midsection, and the endpoint. These three regions were examined to see where residents from different training groups acquired their images and whether any differences existed (Fig. 7).
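For reference, the tip-apex distance as originally defined by Baumgaertner et al. [1] sums the apparent tip-to-apex distances on the AP and lateral radiographs after correcting each for magnification:

TAD = (X_AP × D_true / D_AP) + (X_lat × D_true / D_lat)

where X_AP and X_lat are the distances from the implant tip to the apex of the femoral head measured on the AP and lateral images, D_true is the known true diameter of the implant used for magnification correction, and D_AP and D_lat are its measured diameters on the respective images. The symbol names here are ours; because this study tracked a guidewire rather than a lag screw, the magnification reference would presumably be a feature of known size in the image.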

Fig. 7

A secondary outcome was the correlation between metrics of performance on the simulator and performance in the mock OR. For residents who received simulator training, performance during their final attempt at driving a wire into the femur was correlated with their paired assessment in the mock OR. Correlations for residents' use of time, fluoroscopy, and tip-apex distance were calculated between the two environments.


Statistical Analysis

To compare performance in the mock OR between groups, we performed a one-way ANOVA in SAS Enterprise (version 7.12; SAS Institute, Cary, NC, USA) to examine the variance among the three groups in tip-apex distance, use of fluoroscopy, and total time. Based on the results of this analysis, we ran t-tests to compare means for the variables with a significant amount of between-group variance. We also performed a one-way ANOVA to examine differences in the number of images acquired on the simulator in the starting point, midsection, and endpoint regions.
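The analysis was run in SAS; an equivalent sketch in Python is shown below for readers who want to reproduce the approach. The data arrays are hypothetical illustrations, not the study data; scipy's f_oneway and ttest_ind implement the same one-way ANOVA and two-sample t-test.

```python
import numpy as np
from scipy import stats

# Hypothetical tip-apex distances (mm) for the three training groups.
traditional = np.array([28, 21, 30, 18, 25, 27, 22, 19, 31, 24])
deliberate  = np.array([17, 14, 19, 12, 16, 18, 15, 13, 20, 16])
proficiency = np.array([16, 13, 18, 12, 15, 17, 14, 16, 15, 14])

# One-way ANOVA across the three groups.
f_stat, p_anova = stats.f_oneway(traditional, deliberate, proficiency)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")

# If the ANOVA indicates significant between-group variance, follow up
# with pairwise t-tests comparing means.
if p_anova < 0.05:
    for name, group in [("deliberate", deliberate), ("proficiency", proficiency)]:
        t_stat, p_t = stats.ttest_ind(traditional, group)
        print(f"traditional vs {name}: t = {t_stat:.2f}, p = {p_t:.4f}")
```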

To assess the relationship between resident performance on the simulator and in the mock OR, we ran a correlation procedure in SAS Enterprise to calculate the strength and significance of the relationships for tip-apex distance, overall time, and number of fluoroscopic images used.
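Likewise, the simulator-to-mock-OR correlations correspond to an ordinary Pearson/least-squares analysis on paired measurements. A minimal sketch with made-up paired data follows; scipy.stats.linregress reports the correlation coefficient and its p value, from which the r-square values quoted in the Results can be formed.

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements: each resident's final simulator trial
# versus the mock OR assessment (illustrative values only).
sim_images    = np.array([12, 20, 15, 30, 25, 18, 22, 27, 16, 24])
mockor_images = np.array([18, 26, 20, 38, 30, 22, 28, 35, 19, 31])

result = stats.linregress(sim_images, mockor_images)
r_square = result.rvalue ** 2
print(f"r-square = {r_square:.2f}, p = {result.pvalue:.4f}")
```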


Results

Residents in the deliberate practice and proficiency training groups had a lower tip-apex distance in the mock OR assessment than residents in the didactic training group (Fig. 8). Residents in the proficiency training group used more fluoroscopic images in the mock OR than residents in the deliberate practice and didactic training groups (Fig. 9). Residents in the proficiency group also took more time to place their wires in the mock OR than residents in the other groups (Fig. 10). During performances on the simulator, residents in the proficiency training group used more images in the entry region of the bone (11 ± 7 images) than residents in the deliberate practice group (4 ± 2 images; Table 1).

Fig. 8

Fig. 9

Fig. 10

Table 1

Performance on the simulator was found to correlate with performance in the mock OR (Table 2). Resident tip-apex distance on the simulator had a moderate correlation with their tip-apex distance in the mock OR (r-square = 0.15, p = 0.030). The number of images acquired on the simulator had a strong correlation with the number of images acquired in the mock OR (r-square = 0.61, p < 0.001). Resident overall time on the simulator had a strong correlation with their overall time in the mock OR (r-square = 0.43, p < 0.001).

Table 2


Discussion

Wire navigation is a fundamental skill in orthopaedics. It is applied in a variety of orthopaedic procedures, including treating hip fractures, repairing pediatric elbow fractures, placing iliosacral screws, and providing fixation for articular fracture reduction. Studies have shown that simulators can detect differences between novice and expert performance on hip wire navigation tasks. However, there is a paucity of data demonstrating how training on a wire navigation simulator transfers into improved performance on higher-level tasks. The goal of this work was to examine how different training modalities affected resident wire navigation performance in a mock OR and whether simulator performance correlated with performance in the mock OR.

We found that residents who practiced the wire navigation task on the simulator achieved a lower tip-apex distance in the mock OR than residents who simply learned about and watched a video of the procedure. Wire navigation is a complex task that involves coordinating hand movements with two-dimensional fluoroscopic images to control the three-dimensional position of a wire in a patient. This may explain why residents who practiced only the cognitive portion of the task did not perform as well as residents who practiced both the cognitive and psychomotor portions. As residency programs establish their wire navigation training curricula, this study demonstrates the importance of providing training that allows residents to practice and receive feedback on the physical skills required to drive a wire toward a given target, rather than limiting training to didactic and case-based discussions.

This study had several limitations. One is the small sample size of residents trained with the simulator. Although there were 55 participants in total, the two groups that received simulator training contained only 17 and 15 residents. One issue with a small sample size is that the differences detected between groups may not be representative of the larger population of orthopaedic residents. That seems unlikely here because the difference in tip-apex distance between groups was found with greater than 99% confidence. That said, the difference detected in image use between the proficiency training group and the other two groups may have been a result of the small sample size, given that this difference was found with only 95% confidence. A second limitation is that the bone used, both on the simulator and in the mock OR, did not include a fracture. We felt that reducing and maintaining a fracture reduction in the hip was a separate task from driving a wire through the center of the femoral neck and into the femoral head. Although fracture reduction is a key skill in orthopaedics, it was not the focus of this study nor of the simulator.

Another limitation is that this training focused only on the wire navigation portion of one surgical procedure, while many different surgical skills are required to be a competent orthopaedic surgeon. Our belief is that wire navigation is a core skill that, once learned, is likely to carry over to a variety of surgical procedures. Future studies may examine how learning hip wire navigation skills translates to other areas, such as finding the starting point for an intramedullary rod or fixing an elbow fracture. A final limitation may be the selection of residents who participated in this study. Different institutions may have residents who perform at a higher or lower level, and this may have influenced our results. However, given that each group had residents from multiple institutions, this seems unlikely. Further, all residents were in their first year of training and had no experience leading cases in the OR, so it is likely that they all entered the experiment with comparable levels of surgical skill.

The “see one, do one, teach one” training model is challenged by the desire to provide safe and reliable care to patients. An alternative simulation-based training model more closely supports the “first do no harm” mantra [23]. A new “learn, see, practice, prove, do” training model may more closely align the goals of increased patient safety with the need to cultivate the next generation of surgeons [19]. With this model, residents first learn about procedures, see them performed in the OR or in a video, practice them in a simulated environment, and demonstrate that they have achieved a certain level of proficiency in the task, all before first attempting the procedure in the OR. Although it makes sense that simulation training would improve performance compared with traditional didactic training, there is also a question of how to best implement the simulation training.

In this study, we explored the difference between two well-documented training methods: deliberate practice and proficiency training. A somewhat surprising result was the difference in performance between residents in the deliberate practice group and residents in the proficiency training group. Broadly speaking, residents in both groups improved their tip-apex distance compared with those who received the traditional training; however, the residents in the proficiency group used more fluoroscopic images and took more time to achieve this result than those in the deliberate practice group. We expected that residents in the proficiency training program would not only improve their tip-apex distance but also use fewer images and take less time to perform the procedure than those in the deliberate practice training group. One explanation may be that the residents in the proficiency training group developed a more structured technique for placing the guidewire. This idea is supported by the findings on how residents used their images on the simulator (Fig. 7). Residents in the proficiency training group used 38% of their images near the entry point, indicating they were following the procedure learned during the proficiency training, whereas residents in the deliberate practice group used only 26% of their images in this region. Although this was not the intended result of the training, it illustrates two strategies for placing the guidewire. Taking more images before the wire enters the bone could be beneficial for a patient with poor bone quality who may not withstand redirection of the guidewire well. On the other hand, it may lead to more images overall and more radiation exposure for the patient, because it may be harder to judge the wire trajectory so far from the target of the femoral head apex. It is clear that the style of training influenced the residents' performance and the strategies they used.

We found that a resident's performance on the simulator correlated with performance in the mock OR. This suggests that in future evaluations, the simulator could be used as the testing platform instead of a mock OR environment. This may be a valuable finding as residency programs look for ways to evaluate surgical competency in their residents. For instance, Canadian residency programs have begun switching to competency-based rotations, in which residents must demonstrate competency before completing a rotation [17, 18]. A simulation platform that has been shown to relate to testing in a mock OR environment could be used to test wire navigation competency before a resident completes a rotation.

This study demonstrated that residents who receive training on a wire navigation simulator can transfer their acquired skills to a mock OR. It also demonstrated that the simulator platform has the potential to be used as an assessment tool for wire navigation skills going forward. As programs look for ways to train their residents in wire navigation skills, simulators such as the one presented in this study may offer a new tool that can be used to safely and effectively teach residents this key skill. For this to be widely adopted by programs, additional data may be required to demonstrate that this training also transfers into the OR. Future studies will work to better understand current learning curves for wire navigation in the OR so that the influence of simulation training on a resident's performance can be demonstrated.


Acknowledgments

We thank the residency program directors Norman Turner, Ann Van Heest, and Matthew Mormino for hosting us and letting us work with their residents. We also thank Leah Taylor, Colleen Rink, Anna Rodriguez, Madison Chrisman, Marcus Tatum, and Oliver Stroh for their help in collecting data for this study.


References

1. Baumgaertner MR, Curtin SL, Lindskog DM, Keggi JM. The value of the tip-apex distance in predicting failure of fixation of peritrochanteric fractures of the hip. J Bone Joint Surg Am. 1995;77:1058–1064.
2. Bric J, Connolly M, Kastenmeier A, Goldblatt M, Gould JC. Proficiency training on a virtual reality robotic surgical skills curriculum. Surg Endosc. 2014;28:3343–3348.
3. Carpenter JE, Hurwitz SR, James MA, Jeffries JT, Marsh JL, Martin DF, Murray PM, Parsons BO, Pedowitz RA, Toolan BC, Heest AEV, Wongworawat MD. ABOS Surgical Skills Modules for PGY-1 Residents. Available at: https://www.abos.org/abos-surgical-skills-modules-for-pgy-1-residents.aspx. Accessed May 20, 2018.
4. Cecil J, Gupta A, Pirela-Cruz M. An advanced simulator for orthopedic surgical training. Int J Comput Assist Radiol Surg. 2018;13:305–319.
5. Centers for Disease Control and Prevention. Hip fractures among older adults. Available at: https://www.cdc.gov/homeandrecreationalsafety/falls/adulthipfx.html. Accessed May 20, 2018.
6. Christian MW, Griffith C, Schoonover C, Zerhusen T Jr., Coale M, O'Hara N, Henn RF 3rd, O'Toole RV, Sciadini M. Construct validation of a novel hip fracture fixation surgical simulator. J Am Acad Orthop Surg. 2018;26:689–697.
7. Ericsson KA, Krampe RT, Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychol Rev. 1993;100:363–406.
8. Ericsson KA. Acquisition and maintenance of medical expertise: a perspective from the expert-performance approach with deliberate practice. Acad Med. 2015;90:1471–1486.
9. Gallagher AG. Metric-based simulation training to proficiency in medical education: what it is and how to do it. Ulster Med J. 2012;81:107–113.
10. Johnson LJ, Cope MR, Shahrokhi S, Tamblyn P. Measuring tip-apex distance using a picture archiving and communication system (PACS). Injury. 2008;39:786–790.
11. Karam MD, Thomas GW, Koehler DM, Westerlind BO, Lafferty PM, Ohrt GT, Marsh JL, Van Heest AE, Anderson DD. Surgical coaching from head-mounted video in the training of fluoroscopically guided articular fracture surgery. J Bone Joint Surg Am. 2015;97:1031–1039.
12. Khanduja V, Lawrence JE, Audenaert E. Testing the construct validity of a virtual reality hip arthroscopy simulator. Arthroscopy. 2017;33:566–571.
13. Kho JY, Johns BD, Thomas GW, Karam MD, Marsh JL, Anderson DD. A hybrid reality radiation-free simulator for teaching wire navigation skills. J Orthop Trauma. 2015;29:e385–e390.
14. Long S, Thomas GW, Anderson DD. Designing an affordable wire navigation surgical simulator. J Med Device. 2016;10.
15. Lopez G, Martin DF, Wright R, Jung J, Hahn P, Jain N, Bracey DN, Gupta R. Construct validity for a cost-effective arthroscopic surgery simulator for resident education. J Am Acad Orthop Surg. 2016;24:886–894.
16. Marsh JL, Westerlind BO. Wire navigation into the femoral head. 2013. ABOS. Available at: https://abos.org/media/7102/1._wire_navigation_into_the_femoral_head.mp4. Accessed March 22, 2019.
17. Nousiainen MT, McQueen SA, Hall J, Kraemer W, Ferguson P, Marsh JL, Reznick RR, Reed MR, Sonnadara R. Resident education in orthopaedic trauma: the future role of competency-based medical education. Bone Joint J. 2016;98-B:1320–1325.
18. Nousiainen MT, Mironova P, Hynes M, Glover Takahashi S, Reznick R, Kraemer W, Alman B, Ferguson P, Safir O, Sonnadara R, Murnaghan J, Ogilvie-Harris D, Theodoropoulos J, Hall J, Syed K, Howard A, Ford M, Daniels T, Dwyer T, Veillette C, Wadey V, Narayanan U, Yee A, Whyne C. Eight-year outcomes of a competency-based residency training program in orthopedic surgery. Med Teach. 2018:1–13.
19. Sawyer T, White M, Zaveri P, Chang T, Ades A, French H, Anderson J, Auerbach M, Johnston L, Kessler D. Learn, see, practice, prove, do, maintain: An evidence-based pedagogical framework for procedural skill training in medicine. Academic Medicine. 2015;90:1025–1033.
20. Tay C, Khajuria A, Gupte C. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework. Int J Surg. 2014;12:626–633.
21. Taylor LK, Thomas GW, Karam MD, Kreiter CD, Anderson DD. Assessing wire navigation performance in the operating room. J Surg Educ. 2016;73:780–787.
22. Taylor LK, Thomas GW, Karam MD, Kreiter CD, Anderson DD. Developing an objective assessment of surgical performance from operating room video and surgical imagery. IISE Trans Healthc Syst Eng. 2018;8:110–116.
23. Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Acad Med. 2003;78:783–788.
© 2019 Lippincott Williams & Wilkins