Introduction
Orthopaedic training in Africa is focused on trauma surgery because of a large burden of injuries and limited resources1. Although this reduces exposure to elective surgery, procedures such as arthroscopy are becoming more common, and competency assessment is needed. Owing to limited employment opportunities in government hospitals, many surgeons in Southern Africa are forced into private practice2. Here, the distribution of trauma and elective surgery often mirrors high-resource settings, where arthroscopy of the knee and shoulder ranks among the 10 most common procedures for a general orthopaedic surgeon3. In this sector, an estimated 10 knee arthroscopies are performed per 10,000 population annually4. This creates a mismatch between training and preparation for surgical practice, specifically in elective procedures. Deficiencies in surgical education may be due to the increasing complexity of procedures and implants, reduced training time, diluted exposure to high-cost elective cases, a shortage of trainers in the clinical academic environment, and greater awareness of medicolegal implications and ethical issues5. Therefore, adequate training and assessment of skills such as basic arthroscopy are key to producing competent orthopaedic surgeons.
Traditionally, the surgical competency of trainees has been assessed by observation in the operating room, subjective end-of-rotation evaluations, and logbooks6. However, objective workplace-based assessment is needed to implement teaching strategies that are effective in a limited-resource setting. Learning curves in surgical skills are variable and largely related to the volume of procedures performed7-10. For arthroscopic surgery, training on models can improve technical skills before real-life surgery.
Unfortunately, there are no reports on arthroscopic training and assessment in limited-resource settings in Southern Africa. This knowledge is key to improving competency and allowing adequate preparation for future surgical practice. Therefore, the primary aim of this study was to assess the competency of orthopaedic trainees in our hospital. The secondary aim was to assess the interobserver reliability of the scoring system used.
Methods
A prospective observational cohort study was conducted to assess the competency of basic knee arthroscopy skills in postgraduate orthopaedic trainees at an urban Southern African university hospital. Trainees vary in their experience of elective orthopaedic surgery before entering the 4-year training program. Most rotate through our secondary-level hospitals for at least 2 years as orthopaedic medical officers before their specialist training. There, input is provided in the form of courses, tutorials, and supervised surgery, with close involvement of academic staff from the tertiary care facility. However, not all future trainees go through the same rotations at the secondary-level hospitals, and some are recruited from other provinces of the country without such a preparatory rotation. Our specialist training consists of 18 months of trauma orthopaedic surgery, 6 months of hand surgery, 6 months of paediatric orthopaedic surgery, and 18 months of elective reconstructive surgery. During this final rotation, 8 to 9 months are allocated to arthroscopic procedures of the knee (3 months) and shoulder (6 months). The residency arthroscopy curriculum has not changed over the past 4 years, and senior trainees have been through the same rotations and training as the current junior trainees. Arthroscopic cadaver courses are held approximately 4 times a year throughout training.
All 21 trainees who were part of the orthopaedic department at the time of this study were approached to participate. Exclusion criteria were clinical duties and annual leave on the day of assessment. Participants completed a questionnaire providing demographic data, hand dominance, year of postgraduate study, training hours received in specific arthroscopy skills, and the number of knee arthroscopies performed to that point. Averages were reported as the median (25th percentile, 75th percentile). Trainees in the first 2 years of training were defined as junior trainees, and trainees in their third year or later were defined as senior trainees.
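As an aside on the descriptive statistics, the sketch below shows one way the median (25th percentile, 75th percentile) summaries could be computed with numpy; the example values are the self-reported prior arthroscopies from Table I, and the function name is an illustrative assumption rather than part of the study's analysis.

```python
import numpy as np

def median_iqr(values):
    """Return the median with the 25th and 75th percentiles, as reported in the text."""
    return np.median(values), np.percentile(values, 25), np.percentile(values, 75)

# Self-reported knee arthroscopies before assessment (Table I, participants 1-16).
prior_scopes = [20, 0, 10, 2, 3, 5, 15, 0, 0, 10, 10, 32, 5, 10, 2, 0]
med, q1, q3 = median_iqr(prior_scopes)
print(f"median {med} ({q1}, {q3})")  # reproduces the reported 5 (1.5, 10) with numpy's default interpolation
```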
An information sheet was provided describing the standardized basic arthroscopy to be performed: a diagnostic knee arthroscopy on a dry silicone knee model (Sawbones; Washington, United States of America). A high-definition camera control unit and a 4-mm, 30° video arthroscope (Smith + Nephew; Watford, England) were provided with a standard arthroscopic instrument tray, which included a high-flow arthroscopic sheath, obturator, probe, and a variety of graspers and punches. The participants were then asked to locate and remove 2 foreign bodies within the knee during the assessment. They were blinded to one another and did not observe others performing the arthroscopy. During the arthroscopy, participants were assessed by 2 raters using the modified Basic Arthroscopic Knee Skill Scoring System (mBAKSSS)8. The raters were qualified orthopaedic surgeons from the same department, experienced in performing and teaching arthroscopic surgery. Both surgeons scored the performance simultaneously but were blinded to each other's marks.
The mBAKSSS consists of 9 items in a global rating-style questionnaire covering instrument handling, depth perception, bimanual dexterity, flow of operation, knowledge of instruments, efficiency, knowledge of procedure, autonomy, and quality of the final product. Each skill is assessed on a Likert scale with a minimum score of 1 and a maximum of 5 in each category8, giving a total score ranging from 9 to 45. Construct validity, inter-rater reliability, and internal consistency of the mBAKSSS have been shown in multiple studies11. The score has been validated for baseline skills and learning curves in various high-income countries with virtual simulators, cadavers, and dry models, but its translation into a clinical setting remains to be assessed11,12. There are no validated cutoff points as a pass-fail mark for the mBAKSSS. Previous studies have reported a median score of 20 in junior and 32 in senior trainees and have arbitrarily set the competency level at a score of 3013,14.
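As a concrete illustration, a minimal sketch of how a completed mBAKSSS sheet can be represented and totalled is shown below. The category names follow the list above; the dictionary layout, validation, and the per-category split in the example are illustrative assumptions, not part of the published instrument.

```python
# Sketch of an mBAKSSS score sheet: 9 categories, each rated 1-5,
# so the total ranges from 9 (all ones) to 45 (all fives).
MBAKSSS_CATEGORIES = [
    "instrument handling", "depth perception", "bimanual dexterity",
    "flow of operation", "knowledge of instruments", "efficiency",
    "knowledge of procedure", "autonomy", "quality of final product",
]

def total_mbaksss(ratings: dict) -> int:
    """Validate a single rater's sheet and return the total score."""
    for category in MBAKSSS_CATEGORIES:
        score = ratings[category]
        if not 1 <= score <= 5:
            raise ValueError(f"{category}: Likert score must be 1-5, got {score}")
    return sum(ratings[c] for c in MBAKSSS_CATEGORIES)

# Example: a hypothetical per-category split consistent with a total of 40
# (the score given by rater 1 to participant 11 in Table I).
example = dict(zip(MBAKSSS_CATEGORIES, [5, 5, 4, 4, 4, 5, 5, 4, 4]))
assert total_mbaksss(example) == 40
```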
Inter-rater reliability between the 2 raters was assessed using Cronbach's alpha and the intraclass correlation coefficient (ICC) with a 95% confidence interval (CI). No power analysis was performed for this. A Cronbach's alpha of >0.9 indicates excellent internal consistency, 0.8 to 0.9 good, 0.7 to 0.8 acceptable, 0.6 to 0.7 questionable, 0.5 to 0.6 poor, and <0.5 unacceptable. An ICC closer to 1 indicates higher inter-rater reliability. Full ethics approval was obtained before the study from the institutional review board (HREC #231/2017). No funding was received for this study.
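For transparency, a minimal sketch of this reliability analysis is given below, using the 2 raters' per-trainee totals from Table I. Cronbach's alpha is computed directly from its standard formula, and the ICC with its 95% CI is obtained from the open-source pingouin package; the long-format reshaping and variable names are illustrative assumptions, and the exact software used in the study is not specified here.

```python
# Sketch of the inter-rater reliability computation, assuming the two raters'
# total scores per trainee (Table I). Variable names are illustrative.
import numpy as np
import pandas as pd
import pingouin as pg  # provides intraclass_corr with a 95% CI

rater1 = np.array([24, 26, 28, 18, 25, 13, 14, 25, 38, 18, 40, 30, 43, 34, 35, 17])
rater2 = np.array([22, 29, 33, 19, 32, 11, 17, 29, 34, 23, 36, 35, 42, 27, 30, 24])

# Cronbach's alpha for k = 2 raters:
# alpha = k/(k-1) * (1 - sum of rater variances / variance of the summed scores)
scores = np.vstack([rater1, rater2])          # shape: (k raters, n trainees)
k = scores.shape[0]
alpha = k / (k - 1) * (1 - scores.var(axis=1, ddof=1).sum()
                       / scores.sum(axis=0).var(ddof=1))

# Intraclass correlation coefficients with 95% CIs from long-format data.
long_df = pd.DataFrame({
    "trainee": np.tile(np.arange(len(rater1)), k),
    "rater":   np.repeat(["rater1", "rater2"], len(rater1)),
    "score":   np.concatenate([rater1, rater2]),
})
icc = pg.intraclass_corr(data=long_df, targets="trainee", raters="rater", ratings="score")
print(f"Cronbach's alpha = {alpha:.2f}")
print(icc[["Type", "ICC", "CI95%"]])
```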
Results
A total of 16 (12 male) of the possible 21 trainees were included in this study. Those who did not participate were either performing clinical duties (3) or on annual leave (2).
Detailed demographic data and scores for each participant are summarized in Table I. The median age of the participants was 36 (34.8-37) years. There were 3 participants from each of the first 3 years of training and 7 from the fourth year (Table I). Before this assessment, participants reported an estimated median of 7 (2.8-12.8) hours of exposure to arthroscopy training. Although the training program offers all trainees similar rotations and training opportunities, there was reported variability in prerotation experience as medical officers. This included performing diagnostic arthroscopies before a procedure in the theatre, arthroscopy courses, use of simulators, and previous cadaver workshops. The group reported a median of 5 (1.5-10) arthroscopies performed on patients before the assessment.
TABLE I - Demographic data and scores for each participant*

Participant | Age (yr) | Sex | Year of Training | Hand Dominance | Hours of Training | No. of Arthroscopies | Rater 1 | Rater 2 | Average mBAKSSS
1 | 36 | M | 4 | Right | 5 | 20 | 24 | 22 | 23.0
2 | 37 | F | 3 | Right | 4 | 0 | 26 | 29 | 27.5
3 | 37 | M | 4 | Right | 10 | 10 | 28 | 33 | 30.5
4 | 38 | F | 4 | Right | 3 | 2 | 18 | 19 | 18.5
5 | 35 | F | 1 | Right | 3 | 3 | 25 | 32 | 28.5
6 | 42 | M | 4 | Right | 20 | 5 | 13 | 11 | 12.0
7 | 35 | M | 3 | Right | 15 | 15 | 14 | 17 | 15.5
8 | 34 | M | 2 | Right | 2 | 0 | 25 | 29 | 27.0
9 | 33 | F | 2 | Right | 30 | 0 | 38 | 34 | 36.0
10 | 32 | M | 1 | Right | 2 | 10 | 18 | 23 | 20.5
11 | 37 | M | 4 | Right | 9 | 10 | 40 | 36 | 38.0
12 | 35 | M | 3 | Right | 1 | 32 | 30 | 35 | 32.5
13 | 31 | M | 1 | Right | 12 | 5 | 43 | 42 | 42.5
14 | 38 | M | 2 | Right | 10 | 10 | 34 | 27 | 30.5
15 | 37 | M | 4 | Right | 20 | 2 | 35 | 30 | 32.5
16 | 36 | M | 4 | Left | 0 | 0 | 17 | 24 | 20.5
*Table I summarizes demographic data, previous experience, scores assessed by 2 raters, and average score of participants. F = female, M = male, and mBAKSSS = modified Basic Arthroscopic Knee Skill Scoring System.
The overall median mBAKSSS was 28.0 (20.3-32.5), with large variability ranging from 12.0 to 42.5. Notably, the 3 lowest scores (12, 15.5, 18.5) were obtained by senior trainees in their third or fourth year (Table I). The scores of the individual components of the mBAKSSS are shown in Fig. 1.
Fig. 1: Box and whisker plot depicting individual categories of the mBAKSSS for 16 trainees assessed. The average score for each component of the tool from the 2 raters was used in the analysis. A Likert scale rates these categories from 1 (worst) to 5 (best).
An assessment of the inter-rater reliability between the 2 raters is provided in Table II. The overall reliability was excellent, with a Cronbach's alpha of 0.91 and an intraclass correlation coefficient of 0.91 (95% CI 0.75-0.97). There was good to excellent correlation for all individual assessment points except autonomy, knowledge of instruments, and flow of operation and forward planning.
TABLE II - Inter-rater reliability

Category of mBAKSSS | Cronbach's Alpha | Intraclass Correlation | 95% CI*
Quality | 0.803 | 0.802 | 0.488 to 0.930
Autonomy | 0.709 | 0.715 | 0.184 to 0.900
Knowledge of specific procedure | 0.923 | 0.914 | 0.752 to 0.970
Efficiency | 0.819 | 0.828 | 0.478 to 0.943
Knowledge of instruments | 0.309 | 0.319 | −1.147 to 0.776
Flow of operation and forward planning | 0.756 | 0.754 | 0.291 to 0.916
Bimanual dexterity | 0.833 | 0.842 | 0.539 to 0.945
Depth perception | 0.863 | 0.863 | 0.619 to 0.952
Instrument handling | 0.950 | 0.953 | 0.865 to 0.984
Total | 0.910 | 0.914 | 0.750 to 0.970

*Table II summarizes the inter-rater reliability of the 2 raters, with an overall Cronbach's alpha of >0.9 (excellent) and good to excellent intraclass correlation for most areas of the score. CI = confidence interval for the intraclass correlation (lower and upper limits), and mBAKSSS = modified Basic Arthroscopic Knee Skill Scoring System.
Discussion
Our study assessed the competency of basic knee arthroscopy skills among trainees at a Southern African university hospital using the mBAKSSS, with excellent inter-rater reliability. The scores were comparable with those of other international training programs. There was large variability among trainee scores, with poor scores for 3 of the 10 senior trainees.
The overall median mBAKSSS in our study was 28.0, which is in line with other training centers and suggests that our trainees have skills comparable with those of their international counterparts8. However, there was large variability in reported training experience and test performance, with no apparent correlation between the two. Olson et al. reported similar variability in a group of senior trainees, with a score of 33 and a range of 18 to 35. Their junior trainees' scores were more homogeneous but lower, with a score of 20 and a range of 18 to 22 on the mBAKSSS. This suggests that there is inconsistency in how trainees access or experience the educational arthroscopy training offered13. After specialist training, we offer fellowship opportunities for those preparing for subspecialist practice. Yet, the disparity in skills and the poor performance of some senior trainees are concerning, and their progression into general orthopaedic practice is potentially unsafe. We currently have limited mechanisms to restrict the scope of practice for these individuals because our national orthopaedic specialist examination does not evaluate arthroscopy skills on a pass-fail basis. Here, workplace-based assessment can provide an important checkpoint13 and will be introduced into our training program in the near future as part of a national drive. These findings also highlight that our trainees acquire arthroscopy skills before their training program, in peripheral hospitals where less complex cases are managed. The environment at the tertiary-level hospital, where more complex cases are operated on by subspecialists, might therefore be less conducive to trainees acquiring basic arthroscopy skills in the operating room. Most arthroscopy exposure is limited to the final months of our training program; thus, there is little opportunity for repetition throughout training. Focused training of basic arthroscopy skills could be shifted to the peripheral hospitals to overcome some of these challenges.
To establish an adequate learning curve in knee arthroscopy, a high volume of surgery is needed. One study suggested that 150 to 200 knee arthroscopies should be performed, which approximates the volume of a fellowship-trained surgeon8. Another study proposed 35 knee arthroscopies to achieve an acceptable level of competency10,15. In diagnostic shoulder arthroscopy, around 52 logged cases are needed, increasing to 248 for more complex procedures such as Bankart repairs16. The number of previous arthroscopic procedures has been shown to correlate significantly with rating scores in both the knee and the shoulder17. In our study group, the number of self-reported arthroscopic procedures varied widely. Here, simulation could become an important component of skills development, as reported in other studies in low-resource settings where clinical exposure to elective arthroscopic procedures is reduced18-20.
This study has some limitations. The experience of the trainees was self-reported and could not be verified. A prospective study with objective recording of the trainees' exposure to both simulated and operating room environments would have provided more accurate documentation of surgical experience. Other factors that can improve 3-dimensional hand-eye coordination, such as regular video gaming, were not assessed in our study and could have influenced the mBAKSSS21.
Furthermore, the mBAKSSS assesses only the technical skills of arthroscopy. Other nontechnical skills that are essential for a safe and competent surgeon, such as surgical leadership, communication, problem solving, and planning, are excluded11. Use of a dry model might not reflect the true arthroscopy skills needed in patients and must be interpreted cautiously, but it does allow for a standardized and repeatable skills assessment as well as a "hands-off" approach by assessors without risk to patients or medicolegal implications. This study was performed at a single center and might not reflect other universities or hospitals with different resources or clinical setups. Although challenges might be similar, generalization of these results should be avoided. In addition, the study was underpowered for subgroup analysis (i.e., junior vs. senior trainees) because of the large variability of the scores. The trainees were not assessed longitudinally, so individual skills improvement over time could not be assessed.
Conclusion
Although the average basic arthroscopy skill competency of our trainees was adequate, there was great variability in skills. This calls for ongoing assessment and a re-evaluation of training exposure, especially the actual hands-on experience in skills training. Because our graduates are expected to perform basic knee arthroscopy, our program must define competency expectations for all trainees and create a curriculum and evaluation program that matches those goals. Ultimately, this should also lead to greater educational consistency among trainees. Furthermore, the mBAKSSS proved to be reliable in our study, but future work is required to confirm its construct validity in a clinical setting.
References
1. Meara JG, Leather AJM, Hagander L, Alkire BC, Alonso N, Ameh EA, Bickler SW, Conteh L, Dare AJ, Davies J, Merisier ED, El-Halabi S, Farmer PE, Gawande A, Gillies R, Greenberg SLM, Grimes CE, Gruen RL, Ismail EA, Kamara TB, Lavy C, Lundeg G, Mkandawire NC, Raykar NP, Riesel JN, Rodas E, Rose J, Roy N, Shrime MG, Sullivan R, Verguet S, Watters D, Weiser TG, Wilson IH, Yamey G, Yip W. Global Surgery 2030: evidence and solutions for achieving health, welfare, and economic development. Lancet. 2015;386(9993):569-624.
2. Dell AJ, Gray S, Fraser R, Held M, Dunn R. Orthopaedic surgeon density in South Africa. World J Surg. 2018;42(12):3849-55.
3. Kellam JF. The core competencies for general orthopaedic surgeons. J Bone Joint Surg Am. 2017;99(2):175-81.
4. Jameson SS, Dowen D, James P, Serrano-Pedraza I, Reed MR, Deehan DJ. The burden of arthroscopy of the knee. A contemporary analysis of data from the English NHS. J Bone Joint Surg Br. 2011;93(10):1327-1333.
5. Marais L, Dunn R. Teaching and training in orthopaedics. SA Orthop J. 2017;16(4).
6. Swanepoel S, Dunn R, Klopper J, Held M. The FC Orth(SA) final examination: how effective is the written component? SA Orthop J. 2018;17(3).
7. Hodgins JL, Veillette C, Biau D, Sonnadara R. The knee arthroscopy learning curve: quantitative assessment of surgical skills. Arthroscopy. 2014;30(5):613-21.
8. Insel A, Carofino B, Leger R, Arciero R, Mazzocca AD. The development of an objective model to assess arthroscopic performance. J Bone Joint Surg Am. 2009;91(9):2287-95.
9. Koehler RJ, Goldblatt JP, Maloney MD, Voloshin I, Nicandri GT. Assessing diagnostic arthroscopy performance in the operating room using the Arthroscopic Surgery Skill Evaluation Tool (ASSET). Arthroscopy. 2015;31(12):2314-2319.e2.
10. Stunt JJ, Kerkhoffs G, van Ooij B, Sierevelt N, Schafroth MU, van Dijk CN, Dragoo J, Tuijthof GJM. The suitability of global rating scales to monitor arthroscopic training progress. Int J Sports Exerc Med. 2016;2(2):2469-5718.
11. Velazquez-Pimentel D, Stewart E, Trockels A, Achan P, Akhtar K, Vaghela KR. Global rating scales for the assessment of arthroscopic surgical skills: a systematic review. Arthroscopy. 2020;36(4):1156-73.
12. Alvand A, Auplish S, Gill H, Rees J. Innate arthroscopic skills in medical students and variation in learning curves. J Bone Joint Surg Am. 2011;93(19):e115.
13. Olson T, Koehler R, Butler A, Amsdell S, Nicandri G. Is there a valid and reliable assessment of diagnostic knee arthroscopy skill? Clin Orthop Relat Res. 2013;471(5):1670-6.
14. Butler A, Olson T, Koehler R, Nicandri G. Topics in training: do the skills acquired by novice surgeons using anatomic dry models transfer effectively to the task of diagnostic knee arthroscopy performed on cadaveric specimens? J Bone Joint Surg Am. 2013;95(3):e15.
15. Koehler RJ, Amsdell S, Arendt EA, Bisson LJ, Bramen JP, Butler A, Cosgarea AJ, Harner CD, Garrett WE, Olson T, Warme WJ, Nicandri GT. The arthroscopic surgical skill evaluation tool (ASSET). Am J Sports Med. 2013;41(6):1229-37.
16. Middleton RM, Vo A, Ferguson J, Judge A, Alvand A, Price AJ, Rees JL. Can surgical trainees achieve arthroscopic competence at the end of training programs? A cross-sectional study highlighting the impact of working time directives. Arthroscopy. 2017;33(6):1151-8.
17. Middleton RM, Baldwin MJ, Akhtar K, Alvand A, Rees JL. Which global rating scale? A comparison of the ASSET, BAKSSS, and IGARS for the assessment of simulated arthroscopic skills. J Bone Joint Surg Am. 2016;98(1):75-81.
18. Boutefnouchet T, Laios T. Transfer of arthroscopic skills from computer simulation training to the operating theatre: a review of evidence from two randomised controlled studies. SICOT J. 2016;2:4.
19. Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL. Transferring simulated arthroscopic skills to the operating theatre. A randomised blinded study. J Bone Joint Surg Br. 2008;90(4):494-9.
20. Dunn C, Held M, Laubscher M, Nortje M, Roche S, Dunn R. Orthopaedic surgical training exposure at a South African academic hospital: is the experience diverse and in depth? SA Orthop J. 2022;21(1):22-8.
21. Gupta A, Lawendy B, Goldenberg MG, Grober E, Lee JY, Perlis N. Can video games enhance surgical skills acquisition for medical students? A systematic review. Surgery. 2021;169(4):821-9.