How Should Unmatched Orthopaedic Surgery Applicants Proceed?

Amin, Nirav H., MD1; Jakoi, Andre M., MD1; Cerynik, Douglas L., MD, MBA1; Kumar, Neil S., MD, MBA1; Johanson, Norman, MD1,a

Clinical Orthopaedics and Related Research: February 2013 - Volume 471 - Issue 2 - p 672–679
doi: 10.1007/s11999-012-2471-8
Clinical Research

Background Obtaining an orthopaedic surgery residency is competitive. Advisors must understand what factors may help unmatched candidates reapply successfully.

Questions/purposes We determined (1) the attitude of leaders of orthopaedic surgery residency programs toward interviewing unmatched students; (2) whether a surgical internship or a research year is preferred in considering reapplicants; (3) the importance of United States Medical Licensing Examination (USMLE) scores, recommendations, and Alpha Omega Alpha (AOA) membership; and (4) whether academic and nonacademic programs evaluate reapplicants differently.

Methods We sent an anonymous 19-question survey to 151 Accreditation Council for Graduate Medical Education (ACGME)-accredited orthopaedic surgery residency programs in five waves, 1 week apart (December 5, 2009-January 5, 2010). Investigators were blinded to the respondents’ identities.

Results Ninety-one of the 151 programs (60%) responded. Sixty-eight of the 91 programs (75%) stated they rarely accept unmatched applicants. Sixty-eight programs (75%) agreed an unmatched applicant should do a surgery internship for 1 year. Of the 36 programs that recommended a research year, 32 were academic programs. Academic programs were more likely than nonacademic programs to view as important new recommendations (85% versus 67%), minimum scores of 220 on Step I (67% versus 49%) and Step II (64% versus 36%), and AOA membership (85% versus 67%).

Conclusions By completing a surgical internship, unmatched students may increase their chances of matching. Students considering academic programs should ensure their academic record meets certain benchmarks and may consider a research year but risk limiting their acceptance to academic programs.

1 Department of Orthopaedic Surgery, Drexel University College of Medicine, Philadelphia, PA 19102, USA

a e-mail: norman.johanson@drexelmed.edu

Received: February 20, 2012 / Accepted: June 22, 2012 / Published online: July 24, 2012

The institution of one of the authors (NJ) has received funding from the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH) (Grant 1 UL1 RR024128-01), and NIH Roadmap for Medical Research. The content of this paper is solely the responsibility of the authors and does not necessarily represent the official view of NCRR or NIH. Information on NCRR is available at http://www.ncrr.nih.gov/. Information on Re-engineering the Clinical Research Enterprise can be obtained from http://nihroadmap.nih.gov/clinicalresearch/overview-translational.asp.

All ICMJE Conflict of Interest Forms for authors and Clinical Orthopaedics and Related Research editors and board members are on file with the publication and can be viewed on request.

Each author certifies that his or her institution approved the protocol for this investigation, that all investigations were conducted in conformity with ethical principles of research, and that informed consent for participation in the study was obtained.

Introduction

Orthopaedic surgery has been one of the most difficult residencies to obtain, resulting in a considerable number of unmatched fourth-year medical students annually. In 2012, a total of 1046 applicants (845 fourth-year medical students in the United States) vied for 682 residency positions in orthopaedic surgery [8]. Among US applicants, 641 candidates matched and 240 went unmatched [8]. As a result, there have been questions regarding what course unmatched fourth-year medical students should take to increase their likelihood of matching the following year.

Conventional options for the unmatched applicant have included a preliminary year in general surgery, during which the applicant can rotate through the institution’s orthopaedic surgery department and demonstrate the aptitude and qualities desired by program leadership. Another route has been a year spent conducting orthopaedic research, either in the basic science laboratory or in a clinical setting. Mentors counsel unmatched applicants that these options may strengthen their applications and increase the likelihood that program directors will perceive them as committed to the field. Other factors, including scores on the United States Medical Licensing Examination (USMLE), letters of recommendation, and a program’s history of accepting reapplicants, also may play important roles in the reapplication process. An understanding of program preferences would allow counselors to better assist reapplicants in making successful career choices.

We therefore sought to determine (1) whether programs are receptive to interviewing and accepting reapplicants; (2) whether programs view a preliminary surgical internship versus a year of orthopaedic research as preferable; (3) how program leaders perceive the importance of USMLE scores, letters of recommendation, and Alpha Omega Alpha (AOA) membership; and (4) whether there is a difference between academic and nonacademic programs in their evaluation of reapplicants.

Materials and Methods

Between December 5, 2009 and January 5, 2010, we administered an anonymous 19-question web-based survey (Appendix 1) to the program directors and chairpersons of the 151 Accreditation Council for Graduate Medical Education (ACGME)-accredited orthopaedic surgery residency programs listed in the Fellowship and Residency Electronic Interactive Database (FREIDA Online) of the American Medical Association [1]. No programs were excluded. We contacted all programs to obtain accurate contact information for the residency program director (RPD) and chairperson.

We used an open-source, Internet-based survey software application (DADOS-Survey, Duke University, Durham, NC, USA) to distribute the survey. DADOS-Survey, previously described by Shah et al. [10], was designed to conduct Internet-based surveys and report their results in compliance with the Checklist for Reporting Results of Internet E-Surveys (CHERRIES) reporting guidelines.

To determine the target cohort, we collected the names and email addresses of each survey recipient before administration of the survey. To ensure anonymity of the respondents, no identifying information was collected during the survey administration; Internet trackers, such as cookies, which serve as unique identifiers to computers, and Internet Protocol (IP) address checks, which prevent potential duplicate entries from the same computer, were not enabled.

The usability and technical functionality of the final survey were confirmed to be in accordance with established benchmarks for DADOS-Survey. Once we verified the contact information, we sent an introductory email to each survey recipient stating the name, institution, and email address of the principal investigator and the purpose of the study. No advertisements or incentives were included in the email. The introductory paragraph contained a Uniform Resource Locator (URL) link that allowed recipients to access the survey anonymously (there were no login requirements). Because the investigators were blinded not only to the survey responses but also to the identities of the responding institutions, we could not track who had already responded. Therefore, to gather as many responses as possible in a blinded manner, the entire study sample received the identical survey five separate times, in waves 1 week apart. Each mailing explained the study purpose and blinding methodology, why the subject might receive the same survey multiple times even after responding, and that the subject should respond only once. The URL linked participants to a website containing only the survey. Because there were only eight survey questions per page, randomization of question order and adaptive questioning were not necessary. Once a participant completed the survey, JavaScript in DADOS-Survey checked for unanswered questions. DADOS-Survey automatically collected, encrypted, and stored survey responses on a password-protected server. Owing to the brevity of the survey, no review step or timestamp was implemented (that is, before submission, subjects were not taken to a separate page asking them to review their responses before finalizing submission).

Ninety-one of the 151 programs (60%) responded to the survey. Of these, 29 responses (32%) were from chairpersons, 42 (47%) were from RPDs, 19 (21%) were from individuals serving dual roles, and one respondent (1%) did not specify his or her role. Fifty-eight programs (64%) described themselves as academic, 24 (26%) were university affiliated, eight (9%) were community based, and one (1%) was military based. The mean program size was 4.9 residents per year (range, 2-12). All survey data were collected in January 2010.

To provide a general sense of differences of opinion between academic and nonacademic programs, we compared the responses of these two groups using the unpaired Student’s t-test.
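For illustration only, the comparison described above can be sketched in a few lines of Python. This is a minimal sketch under assumed conditions, not the analysis code used in the study: it assumes responses were coded on a 5-point Likert scale (1 = strongly disagree to 5 = strongly agree) and uses hypothetical data.

```python
# Minimal sketch of the academic vs nonacademic comparison (unpaired Student's t-test).
# Assumptions (not from the paper): responses coded on a 5-point Likert scale;
# the two lists below are hypothetical illustrative data, not the study's responses.
from scipy import stats

academic_responses = [5, 4, 4, 5, 3, 4, 5, 4]      # hypothetical Likert ratings
nonacademic_responses = [3, 2, 4, 3, 3, 2, 4]      # hypothetical Likert ratings

# Unpaired (independent-samples) Student's t-test, as named in the Methods.
t_stat, p_value = stats.ttest_ind(academic_responses, nonacademic_responses)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```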

Results

Of the 91 responding residency programs, only 10 (11%) stated they often interviewed unmatched applicants (Table 1), whereas 32 programs (35%) stated they never or rarely offered an interview to unmatched applicants. A majority of programs (56%) never or rarely reinterviewed a candidate who had already interviewed at that institution during a previous match cycle (Table 2).

Table 1

Table 2

Of the 91 responding residency programs, 68 (75%) agreed or strongly agreed that an unmatched applicant should do a preliminary surgery internship year. Fifty-nine programs (65%) agreed or strongly agreed that they were more likely to interview a previously unmatched applicant who had pursued such an internship year. Programs of all types were more likely to favor applicants who pursued the preliminary surgery year at their own institution (Table 3).

Table 3

Seventy-one of the 91 respondents (78%) ranked new letters of recommendation as important or extremely important in considering reapplicants (Fig. 1). Academic residency programs placed more weight (p = 0.006) on new letters than did nonacademic residency programs. When asked about the importance of USMLE scores, 92% of respondents stated that Step 1 scores were important or extremely important, and 84 respondents (92%) stated that Step 2 Clinical Knowledge (CK) scores were important or extremely important. Likewise, 84% stated that Step 2 Clinical Skills (CS) scores were important or extremely important. When asked to identify minimum requirements, 34 of the 91 respondents (37%) stated that their minimum Step 1 score for interviewing reapplicants was greater than 220 (Table 4). For Step 2 CK, a score greater than 220 was used as a cutoff by 49 respondents (54%); only 15 respondents (17%) said their programs would interview reapplicants with lower scores.

Fig. 1

Table 4

Academic institutions were more likely (p = 0.039) to view a research year favorably (20% strongly agreed) than nonacademic institutions (3% strongly agreed). Of the 36 responding programs that agreed or strongly agreed that an unmatched applicant should pursue a year of orthopaedic research (Table 5), 32 (89%) were academic or university-affiliated residencies. Of the 39 respondents who agreed or strongly agreed that they were more likely to interview an applicant who had pursued research, 35 (90%) represented academic or university-affiliated residencies. The percentage of institutions that strongly agreed they would interview reapplicants who had spent a research year at their own institution tended to be greater (p = 0.28) for academic programs (48%) than for nonacademic programs (33%).

Table 5

Differences were seen in the relative value placed on USMLE scores between academic and nonacademic residency programs (Table 6). Academic residencies were more likely than nonacademic programs to emphasize a score higher than 220 on Step 1 (p = 0.049) and Step 2 (p = 0.045). AOA distinction also was deemed more important (p = 0.039) by academic residencies than by nonacademic residencies.

Table 6

Discussion

Obtaining a residency position in orthopaedic surgery has become increasingly difficult. The primary objectives of this study were to gain insight into how residency program leaders view unmatched applicants; to determine whether program leaders prefer a surgical internship or dedicated research experience; to understand the weight placed on standardized test scores, letters of recommendation, and AOA membership; and to ascertain differences in how academic and nonacademic programs evaluate reapplicants.

Study limitations included the following. First, there was a risk of duplicated survey replies resulting from the protection of anonymity. To ensure the integrity of the data, we monitored the data set for multiple entries with identical responses; none was found. Second, the retrospective nature of the survey may have introduced reporting and recall bias; bias also might have resulted from favoritism or prejudice by programs toward applicants who pursued either a preliminary year of training or research. These data, however, were key to our aim of determining whether such biases or preferences existed. Third, program leaders were asked to label their own programs as academic or nonacademic, and leaders may have had varying opinions about which traits determine this distinction. Fourth, our recommendations for unmatched students are based solely on an amalgamation of individual opinions; the data do not provide evidence of the success of reapplication via the various pathways. Fifth, we assumed that all responses accurately reflect the policies and biases of the entire program, not just of the responding individual. Sixth, the data reflect the opinions of an individual at one particular time; a change in leadership, staff, or program mission could change respondent opinions and, therefore, the statistics collected.
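As an illustration of the duplicate-entry screen described in the first limitation, the following minimal sketch flags submissions whose answers to every question are identical. The file name and column layout are hypothetical and this is not the authors' actual procedure.

```python
# Illustrative screen for potentially duplicated survey entries; the file name
# and column naming convention (q1..q19) are assumptions, not from the study.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # one row per submission
question_cols = [c for c in responses.columns if c.startswith("q")]

# Flag every row whose answers to all questions match another row exactly.
duplicates = responses[responses.duplicated(subset=question_cols, keep=False)]
print(f"{len(duplicates)} potentially duplicated entries found")
```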

Academic and nonacademic residencies reported an almost identical mean number of reapplicants in their programs, 1.45 (range, 0-9) versus 1.46 (range, 0-5), respectively (Table 7). However, academic and nonacademic programs reported a difference in how often they accept previously unmatched students (Fig. 2). No academic residency program stated that it annually accepted previously unmatched students, and only one of 33 nonacademic programs (3%) stated that it did. When asked whether they accepted reapplicants, six of 58 academic residency programs (10%) stated that they often accepted previously unmatched applicants, compared with five of 33 nonacademic residencies (15%).

Table 7

Fig. 2

We evaluated whether preference was given to research over a general surgery preliminary year. Dirschl et al. [5, 6] studied predictors of residency proficiency in orthopaedics among 1006 applicants and 20 residents. They found that although academic performance correlated with proficiency in orthopaedics throughout residency, a number of other factors also correlated, including the number of publications. Thus, a larger number of publications may make an applicant more attractive to residency programs. Our data showed a preference for a year in general surgery, and a majority of programs that allowed preliminary surgery residents to rotate through orthopaedics weighed this in their decision to interview and accept reapplicants. This opportunity was considered more important for placement at an academic or university-affiliated residency program than at a community-based program. If the reapplicant was interested in a specific residency program, respondents agreed that it was important to do the preliminary year at that program.

Our data suggest that, in the opinion of the 91 responding programs, academic institutions were more likely to interview applicants who had spent a year doing research at their own institutions than applicants who had spent that year at other academic institutions or at nonacademic centers. Furthermore, almost 30% of nonacademic respondents stated that they viewed doing research at a nonacademic institution as a less valuable experience and possibly a poor use of the postgraduate year.

We asked whether quantitative indices, such as standardized examination scores, were emphasized when programs consider a reapplication. Our results suggest that USMLE Step 1 scores are highly valued. Although applicants and counselors may have believed that USMLE Step 1 scores were far more important than Step 2 Clinical Knowledge scores [2], our results indicate that this is not the case, especially for the reapplicant. Other studies have examined the relationships among USMLE Step 1 and/or Step 2 scores, American Board of Orthopaedic Surgery (ABOS) Part I scores, and Orthopaedic In-Training Examination (OITE) scores. Crawford et al. [4] reported that residents with USMLE Step 1 scores greater than 220 had higher OITE scores and were less likely to fail Parts 1 and 2 of the ABOS. Dougherty et al. [7] found that USMLE Step 1 scores correlated with ABOS Part 1 and OITE scores. Black et al. [3] showed that higher USMLE Step 2 scores correlated with better performance on the OITE, whereas USMLE Step 1 scores did not. These findings are consistent with our results regarding the importance of USMLE Steps 1 and 2. Our respondents reported that 220 was the cutoff score most programs used for applicants, a threshold programs may use as a proxy for the applicant’s potential performance on future OITE and ABOS examinations.

Bias was possible during the review and interview process, although we did not evaluate this question. Quintero et al. [9] suggested that similarities in personality between the applicant and the interviewing clinician may have led to more favorable rankings and reviews. Considering the prejudice the reapplicant faces as a result of being unmatched the prior year, it is important to recognize that biases may exist. In the current study, 19 programs stated that a poor interview was the primary reason applicants failed to match on reapplication (Fig. 3).

Fig. 3

Four points from the current study may be considered when making recommendations to reapplicants. First, most of the responding programs indicated that reapplication was not widely accepted; among programs that did consider reapplication, the reapplicant had a slight advantage when applying to nonacademic programs. Second, at a practical level, USMLE Step 1 and Step 2 Clinical Knowledge and Clinical Skills scores were particularly emphasized by academic programs. Third, performance in Postgraduate Year 1 was an important consideration, and a general surgery preliminary year was strongly preferred over a year of research; if research was pursued, however, the year was better spent at an academic or university-affiliated program, and most institutions favored candidates who had completed the research year at their own institutions. Fourth, new letters of recommendation were extremely important, reflecting the reapplicant’s newly acquired skills and increased competencies.

Acknowledgments

We thank the participants in this survey for their time and input. We thank Research on Research in the Department of Surgery at Duke University Medical Center for their assistance with survey administration and statistical analysis.

References

1. American Medical Association. Program Selection Criteria. 2009. Available at: https://freida.ama-assn.org/Freida/user/viewProgramSearch.do. Accessed August 7, 2009.
2. Bernstein AD, Jazrawi LM, Elbeshbeshy B, Della Valle CJ, Zuckerman JD. An analysis of orthopaedic residency selection criteria. Bull Hosp Jt Dis. 2002;61:49-57.
3. Black KP, Abzug JM, Chinchilli VM. Orthopaedic in-training examination scores: a correlation with USMLE results. J Bone Joint Surg Am. 2006;88:671-676. doi:10.2106/JBJS.C.01184.
4. Crawford CH III, Nyland J, Roberts CS, Johnson JR. Relationship among United States Medical Licensing Step I, orthopedic in-training, subjective clinical performance evaluations, and American Board of Orthopedic Surgery examination scores: a 12-year review of an orthopedic surgery residency program. J Surg Educ. 2010;67:71-78. doi:10.1016/j.jsurg.2009.12.006.
5. Dirschl DR, Campion ER, Gilliam K. Resident selection and predictors of performance: can we be evidence based? Clin Orthop Relat Res. 2006;449:44-49.
6. Dirschl DR, Dahners LE, Adams GL, Crouch JH, Wilson FC. Correlating selection criteria with subsequent performance as residents. Clin Orthop Relat Res. 2002;399:265-271. doi:10.1097/00003086-200206000-00034.
7. Dougherty PJ, Walter N, Schilling P, Najibi S, Herkowitz H. Do scores of the USMLE Step 1 and OITE correlate with the ABOS Part 1 certifying examination? A multicenter study. Clin Orthop Relat Res. 2010;468:2797-2802. doi:10.1007/s11999-010-1327-3.
8. National Resident Matching Program. Results and Data: 2012 Main Residency Match. Washington, DC: National Resident Matching Program. Available at: http://www.nrmp.org/data/resultsanddata2012.pdf. Accessed May 24, 2012.
9. Quintero AJ, Segal LS, King TS, Black KP. The personal interview: assessing the potential for personality similarity to bias the selection of orthopaedic residents. Acad Med. 2009;84:1364-1372. doi:10.1097/ACM.0b013e3181b6a9af.
10. Shah A, Jacobs DO, Martins H, Harker M, Menezes A, McCready M, Pietrobon R. DADOS-Survey: an open-source application for CHERRIES-compliant Web surveys. BMC Med Inform Decis Mak. 2006;6:34. doi:10.1186/1472-6947-6-34.

Appendix 1. Questionnaire Sent to US Orthopaedic Residency Program Directors and Department Chairpersons Regarding Unmatched Applicants


© 2013 Lippincott Williams & Wilkins