Orthopaedic surgery has been one of the most difficult residencies to obtain, leaving a considerable number of fourth-year medical students unmatched each year. In 2012, a total of 1046 applicants (845 of them fourth-year medical students in the United States) vied for 682 residency positions in orthopaedic surgery. Among US applicants, 641 candidates matched and 240 went unmatched. As a result, questions have arisen regarding what course unmatched fourth-year medical students should take to increase their likelihood of matching the following year.
Conventional options for the unmatched applicant have included a preliminary year in general surgery, during which the applicant can rotate through the institution’s orthopaedic surgery department and exhibit the aptitude and qualities desired by program leadership. Another route has been a year spent conducting orthopaedic research, in either a basic science laboratory or a clinical setting. Mentors counsel unmatched applicants that these options may strengthen their applications and increase the likelihood that program directors will perceive them as committed to the field. Other factors, including scores on the United States Medical Licensing Examination (USMLE), letters of recommendation, and a program’s history of accepting reapplicants, also may play important roles in the reapplication process. An understanding of program preferences would allow counselors to better assist reapplicants in making successful career choices.
We therefore sought to determine (1) whether programs are receptive to interviewing and accepting reapplicants; (2) whether programs view a preliminary surgical internship versus a year of orthopaedic research as preferable; (3) how program leaders perceive the importance of USMLE scores, letters of recommendation, and Alpha Omega Alpha (AOA) membership; and (4) whether there is a difference between academic and nonacademic programs in their evaluation of reapplicants.
Materials and Methods
Between December 5, 2009 and January 5, 2010, we administered an anonymous 19-question web-based survey (Appendix 1) to the 151 orthopaedic residency program directors and chairpersons of Accreditation Council for Graduate Medical Education (ACGME)-accredited orthopaedic surgery residency programs listed in the Fellowship and Residency Electronic Interactive Database (FREIDA Online) of the American Medical Association. No programs were excluded from this study. We contacted all programs to obtain accurate contact information for the residency program director (RPD) and chairperson.
We used an open-source, Internet-based survey software application (DADOS-Survey, Duke University, Durham, NC, USA) to distribute the survey. DADOS-Survey, previously described by Shah et al., was designed to promote and report results of Internet-based surveys in compliance with the Checklist for Reporting Results of Internet E-Surveys (CHERRIES) reporting guidelines.
To determine the target cohort, we collected the names and email addresses of each survey recipient before administration of the survey. To ensure anonymity of the respondents, no identifying information was collected during the survey administration; Internet trackers, such as cookies, which serve as unique identifiers to computers, and Internet Protocol (IP) address checks, which prevent potential duplicate entries from the same computer, were not enabled.
Ninety-one of the 151 programs (60%) responded to the survey. Of these, 29 responses (32%) were from chairpersons, 42 (47%) were from RPDs, 19 (21%) were from individuals serving dual roles, and one respondent (1%) did not specify his or her role. Fifty-eight programs (64%) described themselves as academic, 24 (26%) were university affiliated, eight (9%) were community based, and one (1%) was military based. The mean program size was 4.9 residents per year (range, 2-12). All survey data were collected in January 2010.
To provide a general sense of differences of opinion between academic and nonacademic programs, we compared the responses of these two groups using the unpaired Student’s t-test.
Results

Of the 91 responding residency programs, only 10 (11%) stated that they often interviewed unmatched applicants (Table 1), whereas 32 programs (35%) stated that they never or rarely offered such applicants an interview. A majority of programs (56%) never or rarely reinterviewed a candidate who had already interviewed at that institution during a previous match cycle (Table 2).
Of the 91 responding residency programs, 68 (75%) agreed or strongly agreed that an unmatched applicant should do a preliminary surgery internship year. Fifty-nine programs (65%) agreed or strongly agreed that they were more likely to interview a previously unmatched applicant who pursued such an internship year. All programs were more likely to favor applicants pursuing a preliminary surgery year at their own institution (Table 3).
Seventy-one of the 91 respondents (78%) ranked new letters of recommendation as important or extremely important when considering reapplicants (Fig. 1). Academic residency programs placed more weight (p = 0.006) on new letters than did nonacademic residency programs. When asked about the importance of USMLE scores, 92% of respondents stated that Step 1 scores were important or extremely important; 84 respondents (92%) said the same of Step 2 Clinical Knowledge (CK) scores, and 84% said so of Step 2 Clinical Skills (CS) scores. When asked to identify minimum requirements, 34 of the 91 respondents (37%) reported a minimum Step 1 score greater than 220 to interview reapplicants (Table 4). For Step 2 CK, a score greater than 220 was the cutoff for 49 respondents (54%); only 15 respondents (17%) said their programs would interview reapplicants with lower scores.
Our data show that academic institutions were more likely (p = 0.039) to view a research year as favorable (20% strongly agree) compared with nonacademic institutions (3% strongly agree). Of the 36 responding programs that agreed or strongly agreed that an unmatched applicant should pursue a year of orthopaedic research (Table 5), 32 (89%) were academic or university-affiliated residencies. Of the 39 respondents who agreed or strongly agreed that they were more likely to interview an applicant who pursued research, 35 (90%) represented academic or university-affiliated residencies. The percentage of institutions that strongly agreed to interviewing reapplicants who spent a research year at their respective institutions tended to be greater (p = 0.28) for academic programs (48%) than for nonacademic programs (33%).
Differences were seen in the relative value placed on USMLE board scores when comparing academic and nonacademic residency programs (Table 6). Academic residencies were more likely than nonacademic programs to emphasize a score higher than 220 on Step 1 (p = 0.049) and Step 2 (p = 0.045). AOA distinction also was deemed more important (p = 0.039) by academic residencies than by nonacademic residencies.
Discussion

Obtaining a residency position in orthopaedic surgery has become increasingly difficult. The primary objectives of this study were to gain insight into how residency program leaders view unmatched applicants; to determine whether program leaders prefer a surgical internship or dedicated research experience; to understand the weight placed on standardized test scores, letters of recommendation, and AOA membership; and to ascertain differences in how academic and nonacademic programs evaluate reapplicants.
Study limitations included the following. First, there was a risk of duplicate survey replies because anonymity was protected; to ensure the integrity of the data, we monitored for multiple entries with identical responses, and none was found. Second, the retrospective nature of the survey may have introduced reporting and recall bias, including favoritism or prejudice by programs toward applicants pursuing either a preliminary year of training or research; these data, however, were central to our aim of determining whether such preferences existed. Third, program leaders were asked to self-label their programs as academic or nonacademic, and leaders may have differed in which traits they believed determined this distinction. Fourth, recommendations for unmatched students are based solely on an amalgamation of individual opinions; the data do not provide evidence of the success of reapplication via the various pathways. Fifth, we assumed that all responses accurately reflected the policies and biases of the entire program, not just of the responding individual. Sixth, the data reflect the opinions of an individual at one particular time; a change in leadership, staff, or program mission could alter respondent opinions and, therefore, the results.
Academic and nonacademic residencies reported an almost identical mean number of reapplicants in their programs: 1.45 (range, 0-9) versus 1.46 (range, 0-5), respectively (Table 7). However, academic and nonacademic programs reported a difference in how often they accepted previously unmatched students (Fig. 2). No academic residency program stated that it annually accepted previously unmatched students, and only one of 33 nonacademic programs (3%) did so. When asked whether they accepted reapplicants, six of 58 academic residency programs (10%) stated that they often accepted previously unmatched applicants, compared with five of 33 nonacademic residencies (15%).
We evaluated whether preference was given to research over a general surgery preliminary year. Dirschl et al. [5, 6] studied predictors of residency proficiency in orthopaedics among 1006 applicants and 20 residents. They found that although academic performance correlated with proficiency in orthopaedics throughout residency, a number of other important factors also correlated, including the number of publications. Thus, a greater number of publications may increase an applicant’s attractiveness to residency programs. Our data showed a preference for a year in general surgery, and a majority of programs that allowed preliminary surgery residents to rotate through orthopaedics weighed this in their decision to interview and accept reapplicants. This opportunity was considered more important for placement at an academic or university-affiliated residency program than at a community-based program. If the reapplicant was interested in a specific residency program, respondents agreed that it was important to do the preliminary year at that program.
Our data suggest that, in the opinion of the 91 respondent programs, academic institutions were more likely to interview applicants who spent a year doing research at their establishments than applicants who had spent a year at other academic institutions or nonacademic centers. Furthermore, almost 30% of nonacademic respondents stated that they viewed doing research at a nonacademic institution as a less valuable experience and possibly a poor use of the postgraduate year.
We asked whether quantitative indices, such as standardized examination scores, were emphasized when programs considered a reapplication. Our results suggested that USMLE Step 1 scores are highly valued. Although applicants and counselors might have believed that USMLE Step 1 scores were far more important than Step 2 Clinical Knowledge scores, our results indicated that this belief is false, especially for the reapplicant. Other studies have examined the relationship between USMLE Step 1 and/or Step 2 scores, American Board of Orthopaedic Surgery (ABOS) Part I scores, and Orthopaedic In-Training Examination (OITE) scores. Crawford et al. reported that residents with USMLE Step 1 scores greater than 220 had higher OITE scores and were less likely to fail Parts 1 and 2 of the ABOS. Dougherty et al. found that USMLE Step 1 scores correlated with ABOS Part 1 and OITE scores. Black et al. showed that higher USMLE Step 2 scores correlated with better performance on the OITE, whereas USMLE Step 1 scores did not. These findings are consistent with our results regarding the importance of USMLE Steps 1 and 2. Our respondents reported that 220 was the cutoff score most programs used, perhaps because programs attempt to relate this score to an applicant’s potential performance on future OITE and ABOS examinations.
Bias was possible during the review and interview process, although we did not evaluate this question. Quintero et al. suggested that similarities in personality between the applicant and the interviewing clinician may lead to more favorable rankings and reviews. Considering the prejudice a reapplicant faces as a result of being unmatched the prior year, it is important to recognize that such biases may exist. In the current study, 19 programs stated that a poor interview was the primary reason applicants failed to match on reapplication (Fig. 3).
Four points may be taken from the current study when making recommendations to reapplicants. First, most of the responding programs indicated that reapplication was not widely accepted; among programs that did consider reapplication, the reapplicant had a slight advantage at nonacademic programs. Second, at a practical level, USMLE Step 1 and Step 2 (Clinical Knowledge and Clinical Skills) scores were particularly emphasized by academic programs. Third, performance in Postgraduate Year 1 was an important consideration, and a general surgery preliminary year was strongly preferred over a year of research; if research was pursued, however, it was better spent at an academic or university-affiliated program, and most institutions favored candidates who had completed the research year at their own institution. Fourth, new letters of recommendation were extremely important, reflecting the reapplicant’s newly acquired skills and increased competencies.
Acknowledgments

We thank the participants in this survey for their time and input. We thank Research on Research in the Department of Surgery at Duke University Medical Center for their assistance with survey administration and statistical analysis.
References

2. Bernstein AD, Jazrawi LM, Elbeshbeshy B, Della Valle CJ, Zuckerman JD. An analysis of orthopaedic residency selection criteria. Bull Hosp Jt Dis. 2002;61:49-57.
3. Black KP, Abzug JM, Chinchilli VM. Orthopaedic in-training examination scores: a correlation with USMLE results. J Bone Joint Surg Am. 2006;88:671-676. doi:10.2106/JBJS.C.01184
4. Crawford CH III, Nyland J, Roberts CS, Johnson JR. Relationship among United States Medical Licensing Step I, orthopedic in-training, subjective clinical performance evaluations, and American Board of Orthopedic Surgery examination scores: a 12-year review of an orthopedic surgery residency program. J Surg Educ. 2010;67:71-78. doi:10.1016/j.jsurg.2009.12.006
5. Dirschl DR, Campion ER, Gilliam K. Resident selection and predictors of performance: can we be evidence based? Clin Orthop Relat Res. 2006;449:44-49.
6. Dirschl DR, Dahners LE, Adams GL, Crouch JH, Wilson FC. Correlating selection criteria with subsequent performance as residents. Clin Orthop Relat Res. 2002;399:265-271. doi:10.1097/00003086-200206000-00034
7. Dougherty PJ, Walter N, Schilling P, Najibi S, Herkowitz H. Do scores of the USMLE Step 1 and OITE correlate with the ABOS Part 1 certifying examination? A multicenter study. Clin Orthop Relat Res. 2010;468:2797-2802. doi:10.1007/s11999-010-1327-3
9. Quintero AJ, Segal LS, King TS, Black KP. The personal interview: assessing the potential for personality similarity to bias the selection of orthopaedic residents. Acad Med. 2009;84:1364-1372. doi:10.1097/ACM.0b013e3181b6a9af
10. Shah A, Jacobs DO, Martins H, Harker M, Menezes A, McCready M, Pietrobon R. DADOS-Survey: an open-source application for CHERRIES-compliant Web surveys. BMC Med Inform Decis Mak. 2006;6:34. doi:10.1186/1472-6947-6-34
Appendix 1. Questionnaire Sent to US Orthopaedics Residency Program Directors and Department Chairpersons Regarding Unmatched Applicants
© 2013 Lippincott Williams & Wilkins