KEY POINTS
- Question: Is there a way to reduce the burden of interviewing for anesthesiology residency?
- Findings: The applicant being from the same state as the desired residency program is an important predictor of matching.
- Meaning: Applicants and programs would benefit from smart recruiting that includes measures to reduce time and cost, specifically interviewing fewer applicants with a negligible probability (odds) of matching.
The US residency application, interview, and match processes are costly (eg, a mean of $211,000 per emergency medicine residency program and $5000 per emergency medicine interviewee1 and a median of >$7500 per orthopedic interviewee2), time-intensive, and administratively inefficient.3,4 In family medicine, the major drivers of this cost are lost billable faculty time and administrative costs.5 Surveys of US residency programs show that selection of interviewees is driven mostly by examination scores.6 For interviewees in otolaryngology, geography is a major determinant of where one matches.7
Using unique interview data from US anesthesiology programs, our objective was to quantify the relative importance of an applicant being from the same state as a program on the number of interviews needed to match. If the number of interviews needed per applicant could be reduced, by dropping programs where the interviewee has a negligible probability of matching, the cost and time savings would benefit both programs and interviewees. Our second aim was to quantify the relative importance of an applicant being from the same state in terms of examination scores, a factor of special importance to programs.6
METHODS
The study was reviewed by the University of Iowa Institutional Review Board (IRB, 201909708). Before submission for review, interviewee and residency program data were gathered from 4 software platforms (Thalamus, Doximity, ACGME, and US News and World Report) and deidentified before analysis (described below). The University of Iowa IRB determined that the study of these deidentified data does not meet the regulatory definition of human subjects research and therefore requires neither review by the IRB nor written consent from interviewees or programs. Thalamus is a cloud-based, graduate medical education interview scheduling and management platform (SJ MedConnect, Inc dba ThalamusGME, Santa Clara, CA; https://thalamusgme.com/). Each author other than F.D. (University of Iowa) is a shareholder of Thalamus.
Preprocessing by Thalamus
Interviewee data were collected from applicants to anesthesiology residency programs who scheduled their interviews via Thalamus. Program data were collected from the Accreditation Council for Graduate Medical Education (ACGME),8 Doximity,9 US News and World Report (USNWR),10 and Thalamus. Figure 1 shows the progressive change in sample size with preprocessing.
Figure 1. A graph of the sample sizes referenced throughout this article. Some records with missing data were excluded. Each section of the graph reports an n, which corresponds to the number of complete records available for that stage of the analysis. GME indicates Graduate Medical Education; NRMP, National Residency Matching Program; USMLE, US Medical Licensing Exam; USNWR, US News and World Report.
The period of study comprised 4 consecutive anesthesiology residency recruitment cycles, 2015 to 2018, including 1300 interviewees (n1) and 32 residency programs (n2) located in 31 US states. Although 38 anesthesiology residency programs used Thalamus during the period of study, we excluded the 6 programs that did not use Thalamus for all 4 years. As reported by the National Residency Matching Program (NRMP),11 4720 US seniors matched into anesthesiology during the period of this study. Our cohort included the 1300 interviewees who matched to 1 of the 32 programs during the 4 recruitment years, representing 27.5% of the matched applicant pool in this specialty. Although nearly all anesthesiology candidates use Thalamus, this study focused on the programs using Thalamus and the candidates they matched.
The data used for each interviewee included medical school, current address, US Medical Licensing Exam (USMLE) Step 1 and Step 2 clinical knowledge (CK) scores, and Alpha Omega Alpha (AOA) status. The datum of “current address” refers to the address at which an interviewee resides at the time of application submission to the Electronic Residency Application Service (ERAS). This field may overlap with what the interviewee lists as their “permanent address.”
Program-specific means were calculated using the data from the 1300 interviewees (n1). The USNWR 2019 Medical School Rankings Research Score ranks medical schools and assigns a numerical value to each rank, rank 1 being the highest-ranked school. As a proxy for medical school prestige, the mean of those scores was calculated among the interviewees who matched at each of the 32 programs (n2). The residency program address used was that listed within the ACGME online Accreditation Data System (ADS).8
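As a minimal illustration of this aggregation step, the following R sketch computes program-level means (the analyses were run in R; see Statistical Analyses below). The data frame and column names (matched_interviewees, program_id, usnwr_score) are illustrative placeholders, not the study's actual schema.

```r
# Sketch: program-level means of the USNWR research score among matched
# interviewees; data frame and column names are illustrative placeholders.
library(dplyr)

program_means <- matched_interviewees %>%
  group_by(program_id) %>%                                 # 1 row per program (n2 = 32)
  summarise(mean_usnwr = mean(usnwr_score, na.rm = TRUE))  # proxy for school prestige
```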
Each of the 1300 interviewees interviewed during only a single recruitment year, for a postgraduate year (PGY) 1 or PGY2 position. Interviewees have a single user account in Thalamus, defined by their unique Association of American Medical Colleges (AAMC) identification (ID) number. Our database query can search across interview seasons; no interviewees studied had interview invitations that extended across seasons. The number of PGY2 matches is smaller and therefore challenging to compare with PGY1 positions. Programs may also interview and/or assess PGY2 applicants differently, given that the intern year provides an additional metric and more data points by which to assess the candidate.
These interviewees had a collective 5706 interviews (n3) at 1 or more of the 32 programs (Figure 1). The programs to which these interviewees matched were determined from the following publicly available resources: program trainee lists, published medical school match lists, interviewees’ Doximity profiles, LinkedIn profiles, and/or Google searches of names with terms such as “anesthesiology resident” or similar. Identifying program features were removed from the dataset. The AAMC ID number and any other identifying personal information of interviewees were then discarded.
The interviewee’s current address was an optional data field in Thalamus. Excluding the interviewees without a known current address resulted in 967 interviewees who had 4089 interviews (n4). Geocoding of interviewees (current address) and residency programs (listed address on the ACGME website) was completed through Google’s secure geocoding application programming interface.12 Reverse geocoding also allowed for confirmation and/or addition of the interviewees’ state field (ie, extrapolated from the address, including postal code).
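For illustration, a minimal sketch of this geocoding step using the ggmap package12 follows; the addresses and the API key are placeholders, and the exact calls used in the study may differ.

```r
# Sketch of forward and reverse geocoding with ggmap (Kahle & Wickham);
# the addresses and API key are illustrative placeholders.
library(ggmap)

register_google(key = "YOUR_API_KEY")  # hypothetical key; required by current ggmap releases

# Forward geocode an interviewee's current address and a program's ACGME-listed address
interviewee_xy <- geocode("123 Example St, Iowa City, IA 52242", source = "google")
program_xy <- geocode("200 Hawkins Dr, Iowa City, IA 52242", source = "google")

# Reverse geocode to confirm or fill in the state field from the coordinates
revgeocode(c(interviewee_xy$lon, interviewee_xy$lat), output = "address")
```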
The proportion of interviewees who matched at each program and were designated as members of AOA was calculated. Among the 3047 interviews of the 604 interviewees (n5) for whom USMLE scores and same-state status were known, the mean USMLE Step 1 and Step 2 CK scores among those who matched at each program were calculated. For each interviewee, the difference of these variables from the program mean was calculated (“interviewee-program difference”). See Supplemental Digital Content, Section A, https://links.lww.com/AA/D139, for further detail on variable definitions.
Data from programs that lacked USNWR scores were excluded from the last multivariable analysis (see below), resulting in a final sample of 234 interviewees who had 1104 interviews (n6) and matched at 1 of the 32 programs (Figure 1).
Statistical Analyses of Deidentified Data for Current Study
R version 3.4.3 (2017-11-30) and several open-source R libraries were used to complete the analyses. For the complete list of libraries used, see Supplemental Digital Content, Section B (Appendix of R Packages), https://links.lww.com/AA/D139.
A first step was choosing an appropriate method to measure the geographic relation of interviewees’ and residency programs’ addresses. We compared the logged distance between the interviewee’s current address and the residency program’s address, whether the interviewee and program shared the same state, whether they shared the same US Census Bureau geographic Division, and whether they shared the same Region (Supplemental Digital Content, Sections C and K, https://links.lww.com/AA/D139). To make these comparisons, we separately regressed matching tendency on each of these 4 measures of geographic relation, controlling for variables of interviewee and program differences (n6). The full table of regressions can be found in Supplemental Digital Content, Section D, https://links.lww.com/AA/D139. The coefficient estimates and standard errors for the geographic variables are listed in Table 1.
Table 1. Coefficient Estimates for 4 Measures of Geographic Relation From 4 Linear Regressions

| | Natural Logarithm (Distance, Miles) | Same State | Same Division | Same Region |
| --- | --- | --- | --- | --- |
| Estimate of coefficient | −0.95 | 1.28 | 1.19 | 1.16 |
| t statistic | −5.06 | 5.48 | 4.87 | 5.45 |

Same state, division, and region are dichotomous variables. Same state is in bold because its observed t statistic is the largest and because same-state data are available to program directors from the address of the interviewee. Division and Region are from the US Census Bureau (Supplemental Digital Content, Section C, https://links.lww.com/AA/D139). Confidence intervals for the estimates are provided in Supplemental Digital Content, Section M, https://links.lww.com/AA/D139.
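To make the comparison behind Table 1 concrete, the following hedged R sketch fits 1 model per geographic measure with a common set of controls. The data frame `dat` and all column names are illustrative assumptions, not the study's actual code, which is described in the Supplemental Digital Content.

```r
# Hedged sketch: one fit per geographic measure with common controls.
# The data frame `dat` and all column names are illustrative assumptions.
geo_vars <- c("log_distance", "same_state", "same_division", "same_region")

fits <- lapply(geo_vars, function(v) {
  f <- reformulate(c(v, "usmle_diff", "usnwr_diff", "aoa", "interview_count"),
                   response = "matched")
  glm(f, family = binomial, data = dat)
})
names(fits) <- geo_vars

# Pull each geographic coefficient and its test statistic, as summarized in Table 1
t(sapply(geo_vars, function(v)
  summary(fits[[v]])$coefficients[v, c("Estimate", "z value")]))
```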
We tested the relative efficacy of geographic relations related to same state, specifically neighbors (ie, a bordering state), neighbors’ neighbors (ie, a state bordering a bordering state), and neighbors’ neighbors’ neighbors (one border farther). Holding all controlling variables constant, we substituted in each of the 4 geographic variables of interest, 1 at a time, comparing effect sizes and impact on the model. To illustrate the geographically attributable effect on match probability, we aggregated match rates by edge distance. That is, we took the mean interview-match rate for each state for same state, neighboring states, and so on (excluding earlier orders). Figure 2 shows the percentage of matches that came from each neighboring-state grouping when comparing the interviewee’s current address to the residency location. The neighboring-state groupings are explained in Figure 3.
Figure 2. The rate of matching for same state, neighboring state, neighbors’ neighbors, and neighbors’ neighbors’ neighbors. The light gray lines each represent a single state. The colored, bold lines show trends aggregated by region. The observed univariate percentage of an interview resulting in a match drops as the geographic distance between an interviewee and a program increases. The figure shows unadjusted percentages obtained from the 4089 interviews (Figure 1).
Figure 3. A neighborhood ordering for the state of Kansas: immediate neighbors, neighbors’ neighbors, and neighbors’ neighbors’ neighbors. Kansas is shown as an example because of its central location in the continental US and the consequent clear visual pattern of its neighbor orderings.
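The neighbor orderings in Figure 3 amount to shortest-path distances, in state-border crossings, on a graph of state adjacencies. A minimal R sketch follows; the adjacency list is a small Kansas-centered fragment for illustration, not the full US map.

```r
# Sketch: neighbor order as breadth-first distance over state adjacencies.
# The adjacency list is a small illustrative fragment, not the full US map.
adj <- list(
  KS = c("NE", "MO", "OK", "CO"),
  NE = c("KS", "MO", "IA", "SD", "WY", "CO"),
  MO = c("KS", "NE", "IA", "IL", "KY", "TN", "AR", "OK"),
  OK = c("KS", "MO", "AR", "TX", "NM", "CO"),
  CO = c("KS", "NE", "OK", "WY", "UT", "NM", "AZ")
)

neighbor_order <- function(origin, adj) {
  dist <- setNames(rep(Inf, length(adj)), names(adj))
  dist[origin] <- 0  # 0 = same state
  frontier <- origin
  while (length(frontier) > 0) {
    # states adjacent to the frontier that have not yet been reached
    nxt <- setdiff(unlist(adj[frontier], use.names = FALSE),
                   names(dist)[is.finite(dist)])
    nxt <- intersect(nxt, names(adj))  # skip states absent from the fragment
    dist[nxt] <- dist[frontier[1]] + 1
    frontier <- nxt
  }
  dist  # 1 = neighbor, 2 = neighbors' neighbor, 3 = neighbors' neighbors' neighbor
}

neighbor_order("KS", adj)
```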
A 2 × 2 contingency table was produced by interview, classifying whether the interview resulted in a match and whether the interviewee was from the same state as the program (n4). The association was analyzed by χ2 test (Supplemental Digital Content, Section E, https://links.lww.com/AA/D139).
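A minimal sketch of this test, assuming a data frame `dat` with columns `same_state` and `matched` (illustrative names):

```r
# Sketch of the 2 x 2 association test; column names are illustrative.
tab <- table(same_state = dat$same_state, matched = dat$matched)
chisq.test(tab)  # the Results section reports chi-squared = 186.66, df = 1
```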
A general linear model was created with the dependent variable matched, a binary indicator of whether the interview resulted in a match, using same state as the geographic variable (Supplemental Digital Content, Section F, https://links.lww.com/AA/D139). A logistic link function (binomial family) was used. Standard errors were calculated using robust sandwich estimators (Table 2; Supplemental Digital Content, Section G, https://links.lww.com/AA/D139). The effect size of same state was estimated by comparison with the variables of interview count and USMLE score (see Supplemental Digital Content, Section J, https://links.lww.com/AA/D139). Confidence intervals for the ratios of odds ratios were calculated using the delta method, which accounts for the covariance of estimates from the nonlinear model (Supplemental Digital Content, Section H, https://links.lww.com/AA/D139). These 2 confidence intervals were Bonferroni-adjusted for the 2 simultaneous intervals; a minimal code sketch of these steps follows Table 2.
Table 2. Key Coefficients of the Logistic Regression Reported as the Estimated OR

| | OR | SW Lower Bound | SW Upper Bound | P |
| --- | --- | --- | --- | --- |
| Difference of USMLE score from program mean | 1.037 | 1.021 | 1.054 | <.0001 |
| Same state | 4.291 | 2.914 | 6.320 | <.0001 |
| Interview count | 0.792 | 0.728 | 0.863 | <.0001 |

SW lower and upper bounds are reported in the Results section because of the sensitivity of logistic regression to heteroscedasticity in the data. Analytical standard errors are provided in Supplemental Digital Content, Section L, https://links.lww.com/AA/D139. Each coefficient is P < .0001. The ratio 4.291/1.037 equals 4.14, as in the Results section. Similarly, the ratio 4.563/0.82 equals 5.57, again the value in the Results. Standard errors were calculated using SW to give confidence intervals robust to heteroscedasticity in the regression errors (Supplemental Digital Content, Section K, https://links.lww.com/AA/D139). The lower and upper bounds are 95% confidence intervals. Bold values represent the primary comparison of values.
Abbreviations: OR, odds ratio; SW, sandwich estimators; USMLE, US Medical Licensing Exam.
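As a minimal illustration of the modeling steps described above, the following hedged R sketch shows the logistic fit, sandwich standard errors, and a delta-method interval for a ratio of odds ratios. The model formula and all variable names are illustrative assumptions; the study's exact specification is in the Supplemental Digital Content.

```r
# Hedged sketch of the logistic model with sandwich (robust) standard errors
# and a delta-method interval for a ratio of odds ratios. All variable names
# are illustrative assumptions.
library(sandwich)  # vcovHC: heteroscedasticity-robust covariance
library(lmtest)    # coeftest: coefficient tests with a supplied covariance
library(msm)       # deltamethod: SEs for nonlinear functions of estimates

fit <- glm(matched ~ same_state + usmle_diff + interview_count + usnwr_diff +
             aoa + region,
           family = binomial, data = dat)

V <- vcovHC(fit, type = "HC0")  # sandwich covariance matrix
coeftest(fit, vcov. = V)        # robust tests; exp(coef(fit)) gives ORs as in Table 2

# Ratio of odds ratios, eg, same state vs 1 USMLE point: exp(b2 - b3), where
# same_state and usmle_diff are the 2nd and 3rd coefficients of coef(fit)
est <- exp(coef(fit)["same_state"] - coef(fit)["usmle_diff"])
se <- deltamethod(~ exp(x2 - x3), coef(fit), V)  # x2, x3 index coefficient positions

# Bonferroni adjustment for 2 simultaneous intervals: 97.5% confidence each
est + c(-1, 1) * qnorm(1 - 0.025 / 2) * se
```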
The data used for these analyses included all available records for anesthesiology interviews provided by Thalamus. In the Limitations section, we compare the Bonferroni-adjusted confidence intervals’ widths with differences of important magnitudes.
RESULTS
In this cohort of anesthesiology interviewees, 23.4% claimed in-state residence, based on the program address and the interviewee’s current address. Same state was an important predictor of matching at an anesthesiology residency program (Figure 2). This was illustrated by univariate χ2 test (χ2 = 186.66, df = 1, P < .0001) and by multivariable analyses showing the effect of same state to be reliably present and consistent in magnitude and direction among regions (Table 3; Supplemental Digital Content, Section I, https://links.lww.com/AA/D139).
Table 3. Consistent Impact of Same-State Versus Out-of-State on the Logistic Model Predictions for Each Region

| Same State | ACGME Region | Match Probability | Probability Given ±1 SD of Constants |
| --- | --- | --- | --- |
| No | Midwest | 0.086 | 0.049–0.149 |
| Yes | Midwest | 0.371 | 0.242–0.522 |
| No | Northeast | 0.075 | 0.042–0.130 |
| Yes | Northeast | 0.335 | 0.214–0.482 |
| No | South | 0.143 | 0.083–0.236 |
| Yes | South | 0.510 | 0.360–0.658 |
| No | West | 0.177 | 0.104–0.285 |
| Yes | West | 0.573 | 0.421–0.713 |

In addition to varying same state and region in predicting match probability, the last column shows the impact of varying the average USMLE difference, USNWR rank, AOA, and interview count by ±1 SD of their means. “Match probability” shows the prediction for mean interviewee characteristics. For example, comparing the first and second rows, being in the same state increased the estimated chance of matching. However, the table also shows that there are differences in the estimated probabilities among the regions.
Abbreviations: ACGME, Accreditation Council for Graduate Medical Education; AOA, Alpha Omega Alpha; SD, standard deviation; USMLE, US Medical Licensing Exam; USNWR, US News and World Report.
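Predictions like those in Table 3 can be generated by evaluating the fitted model on a grid of covariate values. A sketch under the same illustrative naming assumptions as above:

```r
# Sketch: predicted match probabilities at mean covariates while varying
# same state and region, as in Table 3. Column names are illustrative; the
# last column of Table 3 repeats this with the constants shifted by +/- 1 SD.
grid <- expand.grid(
  same_state = c(0, 1),
  region = c("Midwest", "Northeast", "South", "West"),
  usmle_diff = mean(dat$usmle_diff),
  usnwr_diff = mean(dat$usnwr_diff),
  aoa = mean(dat$aoa),
  interview_count = mean(dat$interview_count)
)
grid$match_probability <- predict(fit, newdata = grid, type = "response")
grid
```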
We quantified the value of this effect in relation to the number of interviews and USMLE scores using ratios of the estimated odds ratios. See Supplemental Digital Content, Sections J and K, https://links.lww.com/AA/D139, for summary information on the controlling variables of the logistic regression. An interviewee living in the same state as the interviewing program would have 5.42 fewer total interviews (97.5% confidence interval, 3.02–7.81). The 5.42-interview effect was comparable to the mean interview count of 6.01 per interviewee, with 4 interviews at the 25th percentile and 8 at the 75th percentile. Measured another way, the same-state effect was equivalent to approximately a 4.14-point difference of USMLE score from the program’s mean (97.5% confidence interval, 2.34–5.94 USMLE points). For comparison, the USMLE score difference from the program’s admitted mean ranged from −10.37 points at the 25th percentile to 5.01 points at the 75th percentile. Thus, an interviewee from the same state as the program has greater odds of matching: an interviewee with a 4.14-point higher USMLE score receives approximately the same increase in the odds of matching as an interviewee who is from the same state. For details, see Supplemental Digital Content, Section H, https://links.lww.com/AA/D139.
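Concretely, the points equivalence quoted above is the ratio of the 2 estimated odds ratios in Table 2:

\[
\frac{\widehat{\mathrm{OR}}_{\text{same state}}}{\widehat{\mathrm{OR}}_{\text{USMLE, per point}}} = \frac{4.291}{1.037} \approx 4.14
\]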
Twenty-six of the 50 US states have more than 1 anesthesiology program. We analyzed whether the affiliated medical school was the driver of the effect attributed to same state. Affiliated medical school is an ACGME-defined term describing the relationship of medical schools to the sponsoring institutions (which oversee graduate medical education). Inclusion of a Boolean indicator of whether the interviewee attended the affiliated medical school did not improve our model; same state remained significant (P < .0001), whereas affiliated medical school was not (P = .40).
DISCUSSION
This study represents a novel use of interview data from multiple anesthesiology departments to quantify the effect of an applicant being from the same state on the probability of matching at a residency program. While this topic has previously been studied using data on where applicants matched,7 our study includes data on where applicants interviewed but did not match. The potential time and cost savings to programs and interviewees come from focusing on this latter group: applicants who interviewed but did not match.
Logically, applicants with more interviews have a smaller probability of matching at any 1 program, and that was known previously. In addition, that being from the same state is a strong predictor is unsurprising. Our multivariable model (Figure 2) shows the useful, novel result that, beyond the same state, neighboring states add negligible predictive value for any region. This implies that traditionally held beliefs about regionality in some specialties13,14 did not hold as strong an influence in this cohort as previously thought.
Our analysis also provided a quantitative effect of USMLE Step 1 and Step 2 CK scores on the number of interviews needed to match. Because programs’ primary selection for interviewing has historically been based on USMLE scores6 (although USMLE Step 1 will be pass/fail by January 2022), our results show that early determination of same-state versus out-of-state status would add value to programs when making invitation lists. Given the high costs of interviewing to both applicants and programs, reducing total interviews by approximately 4 per applicant is a significant and specific recommendation for all stakeholders. Per-season costs for matched applicants typically exceed $10,000 per individual,2,15–17 with the NRMP reporting that matched applicants rank 10–11 programs.18 Residency programs have reported spending approximately $150,000–$200,000 per recruitment cycle,3,19 with NRMP data showing average rank order lists exceeding 70 applicants during the past 5 years.18 Our long-term objective is to increase program and applicant knowledge for match success, with data-driven improvement practices leading to a less costly and stressful application season.
Comparison of Our Results With Previous Studies
Although we are unaware of a previous study that incorporated interview data, studies in other specialties have illustrated associations of geography and examination scores with residency match outcomes. For emergency medicine, postmatch surveys indicated the importance of program location to applicants.20 For orthopedics, being from the same medical school (ie, state) was an important determinant of matching to a program.21 Gastroenterology trainees have been shown to match at training programs in their home state.13
Regarding anesthesiology recruitment and the match, small studies have examined the impact of interview timing,22 the commitment statement,23 and social media.24,25 Our study is unique in the specialty, incorporates the largest set of applicant data, and quantifies this effect. In addition, for anesthesiology, the validity of same-state as an important variable is known for the job decision immediately after training.26,27 This underscores the importance of smart recruitment for these residency positions.
Limitations
In using the “current address” field provided by interviewees, we may have under- or overestimated the effect of same state. Future applications of this analysis would benefit from elaboration on the relationship among an applicant’s permanent address, current address, and location of medical school to further clarify the state-match relationship.
Geographic ordering has weaknesses. Defining relations by state boundary may not account for actual distance or the feasibility of travel by air or land. For example, though Nashville, Tennessee, and Little Rock, Arkansas, are capitals of neighboring states, there is no direct flight between them, and driving takes more than 5 hours. This might explain why same state had a larger effect on interviews needed to match than region did. Future studies could examine a lagged model of how travel via airport hubs impacts recruitment.
Our sample included records for all interviews among Thalamus’s participating programs, but not all interviews for each interviewee. The 97.5% confidence interval for the value of the interviewee being from the same state, expressed in interviews, was 3.02 to 7.81 interviews; its width is 4.79 interviews (7.81 − 3.02). That width was similar to the difference between the 25th and 75th percentiles of interview counts in our data, 4.0 interviews (see Results). This correspondence of widths demonstrates the impact of same state at the given sample size. However, the difference of 4.0 interviews is based only on the available subset of interviewees’ interviews, and the correspondence could change with a more complete dataset. If interviewees had additional, unaccounted-for interviews, the impact of same state would grow beyond the effect currently measured (ie, a more complete dataset of all interviews attended by a candidate across all programs would likely show an even stronger effect of same state).
Our study analyzed allopathic applicants and programs, as osteopathic anesthesiology programs were not yet incorporated into the ACGME system. The recent merger of the American Osteopathic Association with the ACGME will likely affect this relationship as more applicants enter the pool. The study of non-US seniors in further analyses will also be important. Using our approach to analyze larger datasets, and those from other specialties, has the potential to provide predictions for individual applicants and programs.
Our article examined how to model the relationship between interviews and matching, specifically showing that the relevant effect of geography can reasonably be represented as being from the same state. Future studies can investigate how to mitigate candidates’ applying to many more programs than needed to match, and the minimum number of interviews needed to ensure that candidates match and programs fill with a specified probability. As the economic effects of not matching are substantial and diversity in recruitment is important, further data are needed on behavior change by both applicants and programs.
CONCLUSIONS
Our analysis of anesthesiology residency recruitment using previously unstudied interview data shows that same-state locality is a viable predictor of residency matching and should be strongly considered when evaluating whether to interview an applicant.
DISCLOSURES
Name: Ephy R. Love, MS.
Contribution: This author helped conceive the study idea, design and conduct the study, collect and analyze the data, and write the article.
Conflicts of Interest: E. R. Love is a minority shareholder of Thalamus.
Name: Franklin Dexter, MD, PhD, FASA.
Contribution: This author helped design the study, analyze the data, and write the article.
Conflicts of Interest: The Division of Management Consulting of the University of Iowa’s Department of Anesthesia provides consultations to hospitals, individuals, and corporations, including Thalamus. F. Dexter receives no funds personally other than salary and allowable expense reimbursements from the University of Iowa and has tenure with no incentive program. His family and he have no financial holdings in any company related to his study, other than indirectly through mutual funds for retirement. Income from the Division’s consulting work is used to fund Division research.
Name: Jason I. Reminick, MD, MBA, MS.
Contribution: This author helped conceive the study idea; design, supervise, and conduct the study; collect and analyze the data; and write the article.
Conflicts of Interest: J. I. Reminick is a cofounder, majority owner, and executive leader of Thalamus.
Name: Joseph A. Sanford, MD.
Contribution: This author helped design the study, analyze the data, and write the article.
Conflicts of Interest: J. A. Sanford is a minority shareholder of Thalamus.
Name: Suzanne Karan, MD, FASA.
Contribution: This author helped design the study, collect and analyze the data, and write the article.
Conflicts of Interest: S. Karan is a cofounder, majority owner, and executive leader of Thalamus.
This manuscript was handled by: Edward C. Nemergut, MD.
References
1. Van Dermark JT, Wald DA, Corker JR, Reid DG. Financial implications of the emergency medicine interview process. AEM Educ Train. 2017;1:60–69.
2. Fogel HA, Finkler ES, Wu K, Schiff AP, Nystrom LM. The economic burden of orthopedic surgery residency interviews on applicants. Iowa Orthop J. 2016;36:26–30.
3. Gardner AK, Grantcharov T, Dunkin BJ. The science of selection: using best practices from industry to improve success in surgery training. J Surg Educ. 2018;75:278–285.
4. Callaway P, Melhado T, Walling A, Groskurth J. Financial and time burdens for medical students interviewing for residency. Fam Med. 2017;49:137–140.
5. Nilsen K, Callaway P, Phillips JP, Walling A. How much do family medicine residency programs spend on resident recruitment? A CERA study. Fam Med. 2019;51:405–412.
6. Hartman ND, Lefebvre CW, Manthey DE. A narrative review of the evidence supporting factors used by residency program directors to select applicants for interviews. J Grad Med Educ. 2019;11:268–273.
7. Johnson AP, Svider PF, Folbe AJ, et al. An evaluation of geographic trends in the otolaryngology residency match: home is where the heart is. JAMA Otolaryngol Head Neck Surg. 2015;141:424–428.
8. Accreditation Council for Graduate Medical Education (ACGME) - Public Page. Available at: https://apps.acgme.org/ads/public/. Accessed June 16, 2018, and October 16, 2018.
9. Doximity Residency Navigator 2018–2019. Available at: https://residency.doximity.com/. Accessed June 16, 2018. Survey methodology available at: https://s3.amazonaws.com/s3.doximity.com/mediakit/Doximity_Residency_Navigator_Survey_Methodology.pdf. Accessed June 16, 2018.
10. US News and World Report 2019 Best Medical Schools. Available at: https://www.usnews.com/best-graduate-schools/top-medical-schools. Accessed June 16, 2018, and October 1, 2018.
11. NRMP Program Results 2015–2019 Main Residency Match®. 2018. Available at: https://mk0nrmp3oyqui6wqfm.kinstacdn.com/wp-content/uploads/2019/04/Program-Result-2015-2019.pdf. Accessed June 30, 2020.
12. Kahle D, Wickham H. ggmap: spatial visualization with ggplot2. R J. 2013;5:144–161.
13. Atsawarungruangkit A, Chenbhanich J, Phupitakphol T, Dickstein G. Landing a GI fellowship: the match and the map. Dig Dis Sci. 2018;63:605–609.
14. Gebhard GM, Hauser LJ, Dally MJ, Weitzenkamp DA, Cabrera-Muffly C. Do otolaryngology residency applicants relocate for training? Laryngoscope. 2016;126:829–833.
15. Camp CL, Sousa PL, Hanssen AD, et al. The cost of getting into orthopedic residency: analysis of applicant demographics, expenditures, and the value of away rotations. J Surg Educ. 2016;73:886–891.
16. Blackshaw AM, Watson SC, Bush JS. The cost and burden of the residency match in emergency medicine. West J Emerg Med. 2017;18:169–173.
17. Agarwal N, Choi PA, Okonkwo DO, Barrow DL, Friedlander RM. Financial burden associated with the residency match in neurological surgery. J Neurosurg. 2017;126:184–190.
19. Brummond A, Sefcik S, Halvorsen AJ, et al. Resident recruitment costs: a national survey of internal medicine program directors. Am J Med. 2013;126:646–653.
20. DeSantis M, Marco CA. Emergency medicine residency selection: factors influencing candidate decisions. Acad Emerg Med. 2005;12:559–561.
21. Cox RM, Sobel AD, Biercevicz A, Eberson CP, Mulcahey MK. Geographic trends in the orthopedic surgery residency match. J Grad Med Educ. 2018;10:423–428.
22. Avasarala S, Thompson E, Whitehouse S, Drake S. Assessing correlation of residency applicants’ interview dates with likelihood of matching. South Med J. 2018;111:83–86.
23. Moran KR, Schell RM, Smith KA, et al. Do you really mean it? Assessing the strength, frequency, and reliability of applicant commitment statements during the anesthesiology residency match. Anesth Analg. 2019;129:847–854.
24. Smith BB, Long TR, Tooley AA, Doherty JA, Billings HA, Dozois EJ. Impact of doximity residency navigator on graduate medical education recruitment. Mayo Clin Proc Innov Qual Outcomes. 2018;2:113–118.
25. Renew JR, Ladlie B, Gorlin A, Long T. The impact of social media on anesthesia resident recruitment. J Educ Perioper Med. 2019;21:E632.
26. Dexter F, De Oliveira GS Jr, McCarthy RJ. First job search of residents in the United States: a survey of anesthesiology trainees’ interest in academic positions in cities distant from previous residences. A A Case Rep. 2016;6:34–38.
27. Wachtel RE, Dexter F. Training rotations at hospitals as a recruitment tool for certified registered nurse anesthetists. AANA J. 2012;80:S45–S48.