Innovation Reports

An Electronic Interview Tracking Tool to Guide Medical Students Through the Match: Improvements in Advising and Match Outcomes

Frayha, Neda MD; Raczek, John; Lo, Julia MSPH; Martinez, Joseph MD; Parker, Donna MD

doi: 10.1097/ACM.0000000000002522

Abstract

Problem

Medical students are applying to ever-increasing numbers of U.S. residency programs.1 In turn, residency program directors are being inundated with applications and must attempt to discern which students are truly interested in their programs. There is a need for more data to inform this process and to allow its participants to make more strategic, evidence-based choices.

The National Resident Matching Program (NRMP) reports that U.S. seniors match successfully at rates of approximately 92% to 95%, but students know there has been a rapid increase in the number of active applicants in the Match over the past 10 years.1 Further, for students who do not match, there are few options in the Supplemental Offer and Acceptance Program, as many specialties fill all of their slots through the Match. In the 2017 Match, the overall fill rate for programs was 96%, and just 48.4% of U.S. seniors matched into their first-choice program.1

Data to help medical students understand their own level of competitiveness or identify programs to which they should apply are limited. In the absence of such data, medical students have responded to national Match statistics by steadily increasing their numbers of applications with the hope of securing enough interview offers to create a rank order list of 12 programs, the recommended length to confer a statistically strong chance of matching.2 From 2002 to 2017, the average length of matched U.S. seniors’ rank order lists increased from 7.96 to 12.14 programs.2

Program directors are beginning to respond to the increase in applications by adding specialty-specific requirements. For example, emergency medicine programs required applicants to participate in the standardized video interview piloted by the Association of American Medical Colleges (AAMC) in 2018. Some orthopedic surgery programs require a video interview. Otorhinolaryngology programs have required customized personal statements since 2015. Orthopedic surgery, emergency medicine, and plastic surgery programs all have specific forms for letters of recommendation.

These specialty-specific requirements for applications add extra tasks for fourth-year students to complete during the application season of July through September. Further, the interview season now extends from October through January. As a result, the residency application process encroaches significantly on the fourth year of medical school, with fewer months available for dedicated learning, and places an increased financial burden on students.

AAMC research shows there is a point of diminishing returns at which submitting additional applications does not improve an applicant’s likelihood of matching to a residency program.3 Therefore, helping students identify and apply to the programs most likely to offer them interviews is a logical way to both decrease application numbers and increase the return on each application, thereby contributing to Match success rates. The NRMP reports that about 90% of medical schools would find program-specific information (e.g., preferred class rank, United States Medical Licensing Examination [USMLE] score ranges or cutoffs, requirements of honors grades) helpful in advising students, but less than 50% of programs are willing to share this information.4

In this Innovation Report, we describe and share preliminary outcomes of the electronic interview tracking tool we created to improve student advising on the residency interview process at the University of Maryland School of Medicine (UMSOM). This tool enables us to collect data on which programs UMSOM students have applied to, interviewed at, and matched to, as well as to parse those data by class rank, Alpha Omega Alpha Honor Medical Society (AOA) status, and USMLE scores. Using this advising tool, we are able to help students target programs more likely to interview them and diminish students’ perceptions that they need to apply to more programs to earn enough interviews. Having access to these school-specific data, as opposed to aggregate national data only, is particularly helpful in advising students because these data factor in the reputation and perceived quality of the UMSOM and our students.

Approach

The UMSOM, which has three student affairs deans (N.F., J.M., D.P.; representing two full-time equivalents in the Office of Student Affairs [OSA]) and an average class size of 160 students, uses a long-standing, homegrown student and curriculum management system called MedScope. A team of developers in the Office of Medical Education expands this system as new needs arise. In academic year 2014–2015 (2015 Match year), under guidance from the OSA, MedScope was expanded to provide a tool to allow students to track their residency application interviews within the online student portfolio. Over subsequent years, iterative developments have improved this electronic interview tracking tool’s ease of use and data analytics.

To facilitate the interview tracking process, students’ residency application data are downloaded from the Electronic Residency Application Service (ERAS) and converted into a format for upload into MedScope shortly after the ERAS opening date of September 15. This prepopulates each student’s portfolio with their list of residency programs, without the need for duplicate data entry. Students may add additional programs to their application list in MedScope after the initial prepopulation. MedScope also contains a master file of all programs to which UMSOM medical students have applied and/or matched in the preceding 10 years so that long-term trends in application and Match data can be analyzed.
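The prepopulation step can be thought of as a small extract-transform-load pass over the ERAS export. A minimal sketch in Python, assuming a hypothetical CSV layout with `student_id` and `program_name` columns (the actual ERAS export format and MedScope schema are not described in this report):

```python
import csv
import io
from collections import defaultdict

def prepopulate_portfolios(eras_csv_text):
    """Group an ERAS application export by student so that each student's
    portfolio can be seeded with their program list without duplicate
    data entry. Column names are hypothetical."""
    portfolios = defaultdict(list)
    reader = csv.DictReader(io.StringIO(eras_csv_text))
    for row in reader:
        portfolios[row["student_id"]].append(row["program_name"])
    return dict(portfolios)

# Synthetic example export (not real ERAS data)
sample = """student_id,program_name
S001,Internal Medicine - Hospital A
S001,Internal Medicine - Hospital B
S002,Pediatrics - Hospital C
"""
print(prepopulate_portfolios(sample))
```

Students can then append late additions to the prepopulated list rather than re-entering every program by hand.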

Within the interview tracking tool, students may provide some or all of the following details for each program to which they have applied: callback status (defined as initial contact from a residency program, with choices of “waiting for callback,” “interview offered,” “interview rejected,” “interview waitlisted,” or “no callback”); callback date; interview status (with choices of “accepted,” “declined,” or “accepted and canceled”); interview date; and optional comments. The OSA sends out periodic e-mail reminders encouraging students to keep their data updated, because the OSA deans use these data in real time to help all students throughout the application and interview process.
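The fields above imply a simple per-program data model. A sketch in Python, with the status choices taken verbatim from the tool's description but the class and field names invented for illustration (MedScope's actual implementation is not described here):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class CallbackStatus(Enum):
    """Initial contact from a residency program."""
    WAITING = "waiting for callback"
    INTERVIEW_OFFERED = "interview offered"
    INTERVIEW_REJECTED = "interview rejected"
    INTERVIEW_WAITLISTED = "interview waitlisted"
    NO_CALLBACK = "no callback"

class InterviewStatus(Enum):
    ACCEPTED = "accepted"
    DECLINED = "declined"
    ACCEPTED_AND_CANCELED = "accepted and canceled"

@dataclass
class ApplicationEntry:
    """One row of a student's tracking list; all details are optional,
    matching the tool's 'some or all of the following' behavior."""
    program: str
    callback_status: Optional[CallbackStatus] = None
    callback_date: Optional[str] = None
    interview_status: Optional[InterviewStatus] = None
    interview_date: Optional[str] = None
    comments: str = ""

entry = ApplicationEntry(
    program="Hypothetical IM Program",
    callback_status=CallbackStatus.INTERVIEW_OFFERED,
)
print(entry)
```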

The report view of the data used by the OSA deans includes the data fields described above, as well as individual students’ USMLE scores (Step 1, Step 2 Clinical Knowledge, Step 2 Clinical Skills), class rank (top 10%, upper third, middle third, lower third), AOA status, and enrollment in dual-degree programs. (For an example report view, see Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A614.) The OSA deans can easily filter and sort the entire dataset by student, program, specialty, preliminary/nonpreliminary, callback status and date, and interview status and date. Viewing the data by specialty provides historical interview and Match statistics for UMSOM graduates since the 2015 Match year. To protect student confidentiality, individual students’ data are not shared with other students or outside the OSA except with UMSOM residency program directors who coadvise students regarding their residency applications.

In viewing these data in real time, the OSA deans can identify students with few interview offers who may be at risk of not matching and collaborate with clinical department faculty on potential backup plans for these students. By monitoring the timing of interview offers, the deans are able to counsel students regarding when it is appropriate to reach out to residency programs from which they have not heard.

The deans are also able to look for individual programs’ patterns of preferences regarding class rank, USMLE Step 1 scores, and other criteria (e.g., AOA status), based on which students receive interview offers and which do not. These patterns help inform student advising in future years by enabling the deans to provide specific examples of “safe” residency programs that may be within a given student’s reach, and to encourage students to apply to “reach” programs (i.e., programs that typically extend interview offers to students with more competitive credentials), so that students apply to an appropriate range of programs. The OSA may share data on these patterns upon request with UMSOM residency program directors so that they may better advise future students as well.
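One way to surface such patterns is to compare, for each program, the credentials of applicants who did and did not receive interview offers. A toy Python sketch using a hypothetical function name and synthetic USMLE Step 1 data (the deans' actual analysis is broader, also weighing class rank and AOA status):

```python
def apparent_step1_floor(records):
    """Given (step1_score, got_interview) pairs for one program across
    past applicants, return the lowest score that received an offer --
    a rough proxy for the program's apparent score preference.
    Purely illustrative logic; not the OSA's actual method."""
    offered = [score for score, got in records if got]
    return min(offered) if offered else None

# Synthetic history for one hypothetical program
history = [(255, True), (248, True), (241, False), (260, True), (235, False)]
print(apparent_step1_floor(history))
```

A floor like this can help label a program as a likely "safe" or "reach" option for a given student's credentials.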

To conduct a preliminary evaluation of the impact of our electronic interview tracking tool on our school-specific Match outcomes during the first three years of use (Match years 2015, 2016, and 2017), we analyzed aggregate data from the UMSOM and annual data published by the NRMP1,5,6 and ERAS7 using piecewise linear regression with a combined model, allowing for separate slopes at 2015. We also reviewed AAMC Medical School Graduation Questionnaire (GQ) data on recent graduates’ satisfaction with the career advising and mentoring they had received in medical school.8–10 All analyses were performed using Stata 15 (StataCorp, College Station, Texas). This research was deemed exempt by the University of Maryland, Baltimore’s Institutional Review Board in December 2017.
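The piecewise model can be expressed as a single regression with an extra slope term that switches on at the knot year. A sketch in Python with NumPy (the authors used Stata 15; the data below are synthetic and the coefficients illustrative only):

```python
import numpy as np

def piecewise_fit(years, rates, knot=2015):
    """Fit y = b0 + b1*(t - knot) + b2*max(t - knot, 0), a combined
    model that allows the slope to change at the knot year, as in the
    report's piecewise linear regression. Returns (pre, post) slopes."""
    t = np.asarray(years, dtype=float) - knot
    X = np.column_stack([np.ones_like(t), t, np.maximum(t, 0.0)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(rates, dtype=float), rcond=None)
    pre_slope = beta[1]
    post_slope = beta[1] + beta[2]
    return pre_slope, post_slope

years = [2012, 2013, 2014, 2015, 2016, 2017]
rates = [87.0, 88.0, 89.0, 90.0, 93.0, 96.0]  # synthetic, not UMSOM data
pre, post = piecewise_fit(years, rates)
print(f"slope before 2015: {pre:.2f}, after: {post:.2f}")
```

A statistically significant difference between the two slopes is what distinguishes a post-launch change in trend from the continuation of an existing one.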

Outcomes

In Match years 2015, 2016, and 2017, respectively, 86% (n = 135/157), 87% (n = 138/159), and 94% (n = 151/161) of fourth-year UMSOM students participated actively in the electronic interview tracking tool, updating their individual lists of programs and information about callbacks and interview offers, rejections, and wait lists on an ongoing basis. Each year, more than 3,000 new data points were added to the tracking tool for the OSA deans to monitor, analyze, and refer to while guiding students through the interview process.

Following implementation of the electronic interview tracking tool, the UMSOM’s Match rate increased in 2016 and 2017 by an average of 3% per year (P = .02; 95% CI: 1.5% to 5.2%), while the national Match rate for all U.S. medical schools decreased slightly during this time period (P ≤ .001) (Figure 1).1 In the OSA’s anonymous post-Match survey, the percentage of students reporting a match in one of their top three ranked programs increased from 81% (n = 66/81) in 2014, before implementation of the tool, to 90% (n = 95/106) in 2017 (P = .10; 95% CI: −3.6% to 11%). National data from the NRMP show a decreasing trend from 2014 to 2017,1,5,6 and there was a statistically significant difference between UMSOM and national outcomes (P ≤ .001) (Figure 1). The overall caliber of programs to which UMSOM students matched remained high; it did not decrease after the launch of the tracking tool.

Figure 1:
Comparison of national (U.S.) and University of Maryland School of Medicine (UMSOM) outcomes of the National Resident Matching Program’s (NRMP’s) Main Residency Match, before and after the launch of the UMSOM electronic interview tracking tool in 2014–2015 (2015 Match year). The dashed lines represent Match rates; the solid lines represent the percentage of students self-reporting a match in the top three programs of their rank order lists. Sources: NRMP1,5,6 and UMSOM internal data.

ERAS data show that the national average number of applications per student rose from 2012 to 2017,7 while the average application count by specialty among UMSOM students remained consistent at approximately 30 since the launch of the tracking tool (Figure 2). In addition, GQ data show a marked increase in the percentage of UMSOM students rating the school’s career advising and mentoring as “very useful” following the launch of the tracking tool and in comparison with national data during the same time period (Figure 3).8–10

Figure 2:
Comparison of national (U.S.) and University of Maryland School of Medicine (UMSOM) average numbers of residency program applications submitted, before and after the launch of the UMSOM electronic interview tracking tool in 2014–2015 (2015 Match year). The gray squares represent the national average number of applications per applicant, and the black circles represent the UMSOM’s average number of applications per applicant by specialty. Sources: Electronic Residency Application Service (ERAS) national data aggregated by applicant7 and UMSOM-specific reports with data aggregated by specialty (unpublished).
Figure 3:
Comparison of the percentages of national (U.S.) and University of Maryland School of Medicine (UMSOM) graduates responding to the Association of American Medical Colleges’ (AAMC’s) Medical School Graduation Questionnaire (GQ) who selected “very useful” to describe their medical school’s advising and mentoring in specialty choice and career planning, before and after the launch of the UMSOM electronic interview tracking tool in 2014–2015 (2015 Match year). Source: AAMC GQ.8–10

Next Steps

The creation and launch of an electronic interview tracking tool at the UMSOM has allowed the OSA deans to gather and parse data on residency interview offers for fourth-year students. Using this tool to provide students with customized examples of potentially attainable programs to which they should consider applying has led to an improved advising experience for the OSA deans. Use of the tool has also been associated with increases in (1) the UMSOM’s Match rate, (2) the percentage of students reporting a match in their top three ranked programs, and (3) student-reported usefulness of the career advising process—all without a significant change in the average number of applications per student.

Limitations of this analysis include the use of this electronic interview tracking tool and application of its data by a single medical school. In addition, changes in Match rates from year to year may be due to a wide variety of factors; as such, true causation between the interview tracking tool’s launch and subsequent improvements in the UMSOM’s Match metrics cannot be proven. However, the trend is encouraging and, at minimum, illustrates a positive association.

Next steps for the electronic interview tracking tool include continuing to accumulate data; improving data analytics, such as breaking down outcomes by specialty; investigating possible downstream effects of the tool, such as impact on application costs, amount of travel during the interview season, and student anxiety; increasing buy-in from UMSOM residency program directors regarding their use of the tool’s data; and collaborating with other institutions interested in developing their own frameworks. At the UMSOM, the tool’s database will become more robust over time. Data will be analyzed critically each year, with a goal of identifying programs within each specialty for which future students at different academic tiers may be well suited.

Ideally, this electronic interview tracking tool could be implemented at multiple schools, enabling more institutions to advise their students in a precise, data-driven manner. If our preliminary data hold true on a larger scale, the use of school- and program-specific data in this way could eventually have an impact on the number of applications per student throughout the United States. This could lead to a more reasonable application strategy for all applicants, reduce the number of applications that program directors must review, and improve the fit between applicants and residency programs. It could also decrease the amount of time, expense, and stress surrounding the application process for students, student affairs deans, and residency program directors alike—and even restore time within the fourth year of medical school for more teaching and learning at what is one of the most important transition points in a future physician’s career.

Acknowledgments:

The authors wish to thank Sara Menso, MBA, MS, and Ursula Goldman for their assistance with the manuscript.

References

1. National Resident Matching Program. Results and Data: 2017 Main Residency Match. 2017. Washington, DC: National Resident Matching Program; http://www.nrmp.org/wp-content/uploads/2017/06/Main-Match-Results-and-Data-2017.pdf. Accessed October 11, 2018.
2. National Resident Matching Program. Impact of length of rank order list on Match results: 2002–2017 Main Residency Match. http://www.nrmp.org/wp-content/uploads/2017/06/Impact-of-Length-of-Rank-Order-List-on-Match-Results-2017-Main-Match.pdf. Updated March 10, 2017. Accessed October 11, 2018.
3. Association of American Medical Colleges. Apply smart: New data to consider. https://students-residents.aamc.org/applying-residency/article/apply-smart-data-consider. Accessed October 11, 2018.
4. Signer MM. How competitive is the Match? Presented at: Learn Serve Lead 2016: The AAMC Annual Meeting; November 13, 2016; Seattle, WA. http://www.nrmp.org/wp-content/uploads/2016/11/Signer-AAMC-Annual-Meeting-2016.pdf. Accessed October 11, 2018.
5. National Resident Matching Program. Results and Data: 2016 Main Residency Match. 2016. Washington, DC: National Resident Matching Program; http://www.nrmp.org/wp-content/uploads/2016/04/Main-Match-Results-and-Data-2016.pdf. Accessed October 11, 2018.
6. National Resident Matching Program. Results and Data: 2015 Main Residency Match. 2015. Washington, DC: National Resident Matching Program; http://www.nrmp.org/wp-content/uploads/2015/05/Main-Match-Results-and-Data-2015_final.pdf. Accessed October 11, 2018.
7. Association of American Medical Colleges. Electronic Residency Application Service, ACGME residency: 2013–2017, residency specialties summary. https://www.aamc.org/download/359232/data/all.pdf. Published 2017. Accessed October 11, 2018.
8. Association of American Medical Colleges. Medical School Graduation Questionnaire: Individual School Report, 2017: University of Maryland School of Medicine. 2017. Washington, DC: Association of American Medical Colleges.
9. Association of American Medical Colleges. Medical School Graduation Questionnaire: Individual School Report, 2016: University of Maryland School of Medicine. 2016. Washington, DC: Association of American Medical Colleges.
10. Association of American Medical Colleges. Graduation Questionnaire (GQ): All schools summary reports, 2016 and 2017. https://www.aamc.org/data/gq. Accessed November 5, 2018.

Supplemental Digital Content

Copyright © 2018 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the Association of American Medical Colleges