
Research Reports

Applicant Reactions to the AAMC Standardized Video Interview During the 2018 Application Cycle

Deiorio, Nicole M. MD; Jarou, Zachary J. MD; Alker, Ashely MD, MPH; Bird, Steven B. MD; Druck, Jeffrey MD; Gallahue, Fiona E. MD; Hiller, Katherine M. MD, MPH; Karl, Erin MD; Pierce, Ava E. MD; Fletcher, Laura MA; Dunleavy, Dana PhD

doi: 10.1097/ACM.0000000000002842


Physicians need more than academic knowledge and technical skills to be successful in practice; they also need strong interpersonal skills, emotional intelligence, teamwork skills, and professionalism, among other competencies.1–5 However, the current residency selection process emphasizes academic metrics in assessing applicants, which may underemphasize behavioral competencies and inadvertently signal that academics matter more than other competencies. In response, the residency community has called for new selection tools that assess a broader array of the competencies required for success in residency.6–9 Such tools may help residency programs decide whom to invite to in-person interviews and may help widen the pool of applicants invited. They may also help balance the emphasis on United States Medical Licensing Examination (USMLE) Step exam scores.

To address this gap in the residency selection process, the Association of American Medical Colleges (AAMC) in 2016 introduced the Standardized Video Interview (SVI), an innovative tool that could be a useful supplement when used alongside USMLE Step exam scores and other application materials to select applicants for in-person interviews. The AAMC SVI (www.aamc.org/svi) is an asynchronous online video interview that presents applicants with 6 questions designed to measure 2 Accreditation Council for Graduate Medical Education competencies: interpersonal and communication skills and professionalism2 (renamed “knowledge of professional behavior” for the SVI). An example SVI question is: “Describe a situation in which you were successful in communicating a difficult message. How did you communicate the message? What was the outcome?”

The AAMC partnered with the academic emergency medicine (EM) community to test this new selection tool. The SVI was available for pilot operational use in selection decision making for the entering class of 2018 (Electronic Residency Application Service [ERAS] 2018 cycle) and was administered at no cost to applicants. The SVI was required by EM, but applicants were not required to complete the SVI to submit an ERAS application. Individual EM residency programs decided whether to incorporate the SVI into their selection process.

Applicants’ attitudes about an assessment, particularly its perceived fairness, may affect their performance on the assessment, attitudes toward the sponsoring organization, well-being, and likelihood of accepting a future job offer.10–13 The residency selection process is a highly stressful period that requires applicants to invest considerable money and time. In this context, we think it is important to study applicant reactions to the SVI, given concerns about well-being in the current medical training and practice environment. In addition, information about applicant reactions may be used to improve the applicant experience and attitudes toward the AAMC and programs that require the SVI.

This study used 2 surveys to examine applicants’ reactions to the SVI in its first year of operational use. Research from the employment domain suggests that applicants have more negative reactions to technology-mediated interviews than in-person interviews.14 They also have more negative reactions to highly structured interviews.15,16 We hypothesized that applicants would have a neutral-to-positive experience with procedural aspects of the SVI. We also hypothesized that applicants would have generally negative reactions to the addition of a technology-mediated and highly structured assessment to the residency selection process. Given concerns in the EM community about potential for bias and burden on applicants, we also conducted exploratory analyses to investigate whether applicants’ attitudes about the SVI differed by demographic group and/or applicant type, as well as whether there were self-reported changes in application strategy/behavior due to the SVI.

Method

All individuals who self-classified as EM applicants applying in the ERAS 2018 cycle were instructed to complete the online SVI in summer 2017. Applicants who completed the SVI in standard (nonaccommodated) conditions logged into the SVI site, where they were given up to 30 seconds to read each written question and prepare a response; they were then given up to 3 minutes to respond to each question. Responses were recorded by webcam through the interview platform (HireVue, South Jordan, Utah). The response to each question was scored on a rubric ranging from a low of 1 (rudimentary) to a high of 5 (exemplary). Ratings for the 6 questions were summed to create an SVI total score (range 6–30).
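To make the scoring arithmetic concrete, the short sketch below shows how 6 rubric ratings combine into a total score. It is an illustration only, not the AAMC's scoring code, and the function name is ours.

def svi_total(ratings):
    """Sum 6 per-question ratings, each scored 1 (rudimentary) to 5 (exemplary)."""
    if len(ratings) != 6 or any(not 1 <= r <= 5 for r in ratings):
        raise ValueError("expected 6 ratings between 1 and 5")
    return sum(ratings)  # possible totals range from 6 to 30

print(svi_total([3, 4, 2, 5, 3, 4]))  # 21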

All 3,532 applicants who completed the SVI were invited to participate in 2 voluntary online Verint surveys (Verint Systems, Melville, New York) about their experience preparing for and taking the SVI. The survey questions were based on Hausknecht and colleagues’ framework12 of applicant reactions. They were modified to the EM context based on feedback from a team of subject matter experts in EM (students, residents, program directors, faculty, clerkship directors) and SVI staff.

Each survey was reviewed by the AAMC Human Subjects Research Protection Program and determined to be exempt by the institutional review board of the American Institutes for Research. Individuals provided explicit consent for their data to be used, and, when possible, their responses were linked to demographic information that was collected for research purposes when they completed the SVI.

Survey 1

Immediately after completing the SVI in summer 2017, applicants were invited to complete an online survey evaluating the SVI. No reminder emails were sent to nonresponders. The survey took approximately 5 minutes to complete and included 4 questions about preparation for the SVI (not included in this analysis17) and 7 questions about general reactions to procedural aspects of taking the SVI. Applicants answered questions using 5-point Likert-type scales (e.g., 1 = strongly disagree to 5 = strongly agree), multiple-choice items, and write-in responses.

Survey 2

Applicants who completed the SVI were invited to participate in the second online survey via email after SVI total scores were released in fall 2017. The survey was open in October and November and took less than 15 minutes to complete. Reminder emails were sent to nonresponders. The survey included 29 questions about perceptions of the current residency selection process, the SVI experience and total scores, and the future of the selection process. Applicants answered questions using 5-point Likert-type scales (e.g., 1 = strongly disagree to 5 = strongly agree), multiple-choice items, and write-in responses. Survey 2 responses were linked to SVI scores for applicants who provided email addresses on the survey; only the responses that could be linked to SVI scores were retained for analyses in this study. Responses from 1,401 applicants were excluded from analyses of survey 2 data because of inconsistent email addresses in the survey and in ERAS.

Statistical analyses

The unit of analysis was the individual applicant. All analyses were conducted using SPSS version 19 (IBM Corp., Armonk, New York). Descriptive statistics were computed, including counts, percentages, means, and standard deviations (SDs). Reactions were compared across demographic groups and by SVI total score and USMLE Step 1 score using t tests and ANOVAs. We also examined potential moderators using hierarchical regression.
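The analyses were run in SPSS; purely as an illustration of the same kinds of comparisons, the sketch below uses Python with pandas and SciPy. The file name and column names are hypothetical, and the snippet is not the authors' analysis code.

import pandas as pd
from scipy import stats

df = pd.read_csv("survey2_linked.csv")  # hypothetical analysis file

# Descriptive statistics: counts, means, SDs, and percentages
print(df["svi_total"].agg(["count", "mean", "std"]))
print(df["applicant_type"].value_counts(normalize=True).mul(100).round(1))

# t test comparing a reaction item between two demographic groups
men = df.loc[df["gender"] == "M", "satisfaction_item"].dropna()
women = df.loc[df["gender"] == "F", "satisfaction_item"].dropna()
print(stats.ttest_ind(men, women, equal_var=False))  # Welch's t test

# One-way ANOVA across applicant types (e.g., US-MD, DO, US-IMG, non-US-IMG)
groups = [g["satisfaction_item"].dropna() for _, g in df.groupby("applicant_type")]
print(stats.f_oneway(*groups))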

Results

Survey 1

Applicants.

For survey 1, 82.3% (2,906/3,532) of applicants responded to at least 1 question. As shown in Table 1, the mean SVI total and USMLE Step 1 scores and the demographic composition of the survey 1 and overall SVI samples were similar. Of applicants who responded to survey 1 and could be matched with demographic data, the majority were white (59.4%; 889/1,412), male (64.4%; 964/1,496), and attendees of U.S. MD-granting institutions (US-MDs) (56.3%; 842/1,491).

Table 1: Characteristics of Final Survey and Overall AAMC Standardized Video Interview (SVI) Samples by Demographic Group

Reactions to procedural aspects of the SVI.

The majority of applicants (88.5%; 2,439/2,755) agreed or strongly agreed (hereafter agreed) the SVI instructions were clear (Table 2). While 79.5% (2,175/2,735) agreed they had sufficient time to respond to the interview questions, 49.0% (1,347/2,749) agreed they had enough time to read and prepare answers for questions. Almost half were satisfied with the SVI preparation materials provided by the AAMC (45.9%; 1,262/2,748).

Table 2: Applicant Reactions and Experiences Taking the AAMC Standardized Video Interview (SVI), Survey 1

A majority of applicants (66.9%; 1,851/2,765) agreed that the SVI content was related to the types of activities they perceived as required of residents, but only 31.4% (869/2,764) agreed that the SVI would help program directors conduct a more holistic evaluation of applicants. Overall, 38.1% (1,045/2,765) were satisfied with the SVI, 29.5% (809/2,765) were neutral, and 32.4% (886/2,765) were not satisfied. There were no differences in applicants’ reactions to procedural aspects of the SVI by race/ethnicity. Women were slightly more satisfied with the preparation materials and the SVI overall, and men were more likely to agree that they had enough time to prepare for interview questions (data not shown). Compared with US-MDs, attendees of DO-granting medical schools, U.S. citizen attendees of international medical schools, and non-U.S. citizen attendees of international medical schools often had slightly more positive responses to the procedural aspects of the SVI.

Survey 2

Applicants.

All applicants received their SVI total scores before being invited to participate in survey 2, as described above. The response rate was 58.7% (2,074/3,532). As shown in Table 1 (and Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A701), the mean SVI total score and USMLE Step 1 score and the demographic composition of the survey 2 and overall SVI samples were similar. The majority of applicants who responded to survey 2 were white (61.1%; 1,264/1,947), male (63.5%; 1,317/2,067), and US-MDs (64.0%; 1,327/2,066).

Perceptions of the current residency selection process.

The majority of applicants (71.3%; 1,461/2,048) were satisfied or very satisfied (hereafter satisfied) with the information currently available to program directors to use in deciding whom to invite to in-person interviews (Table 3). Between 70% and 90% of applicants reported that they were satisfied with the information about their interpersonal and communication skills and knowledge of professional behavior provided by personal statements, letters of evaluation, the Medical School Performance Evaluation, the electronic Standardized Letter of Evaluation (eSLOE), and the in-person interview (Supplemental Digital Appendix 2 at http://links.lww.com/ACADMED/A701). Satisfaction with the current selection process differed slightly by race/ethnicity and applicant type, with black, Hispanic, and non-US-MD applicants reporting slightly less positive reactions to the current selection process. There were no differences by gender (data not shown).

Table 3: Applicant Perceptions of the AAMC Standardized Video Interview (SVI), Survey 2

Applicants’ satisfaction with information provided by the SVI was associated with SVI total score. Applicants with higher SVI scores were neutral or more satisfied with the information provided about their interpersonal and communication skills and knowledge of professional behavior. For example, as shown in Figure 1, 40% of applicants who scored in the top quartile of SVI scores reported being satisfied with the information provided about their interpersonal and communication skills compared with 3% of applicants who scored in the bottom quartile.

Figure 1: Satisfaction with information about interpersonal and communication skills provided by the AAMC Standardized Video Interview (SVI) as reported by applicants (n = 1,956)a responding to survey 2, by SVI total score quartile.b Survey 2 was administered in fall 2017, about 1 month after SVI scores were released. The pattern of results for the same question about knowledge of professional behavior was similar (Supplemental Digital Appendix 2 at http://links.lww.com/ACADMED/A701). Abbreviation: AAMC indicates Association of American Medical Colleges. aSurvey participants were applicants who indicated interest in applying to emergency medicine residency programs for the Electronic Residency Application Service 2018 cycle and completed the SVI in summer 2017. bSVI total scores could range from 6 to 30. SVI cutoff scores for each quartile in this analysis were as follows: < 25th percentile = 6 to 16 (n = 366); 25th to 49th percentile = 17 to 18 (n = 429); 50th to 75th percentile = 19 to 21 (n = 731); > 75th percentile = 22 to 30 (n = 424).

More than half of applicants were satisfied with program directors’ use of USMLE Step 1 and Step 2 Clinical Knowledge (CK) scores as filters in the residency selection process, with 55.1% (1,072/1,947) and 64.1% (1,247/1,944) reporting satisfaction, respectively (Supplemental Digital Appendix 2 at http://links.lww.com/ACADMED/A701). Between 22.5% and 26.5% of applicants reported they were satisfied with program directors’ use of Alpha Omega Alpha Honor Medical Society membership, Gold Humanism Honor Society membership, and medical school attended as filters. Only 10.0% (192/1,923) reported they were satisfied with the use of the SVI as a filter. This percentage increased slightly to 14.9% (168/1,132) when considering just those whose SVI total scores were at or above the 50th percentile. Satisfaction with the use of filters in the current selection process differed slightly by race/ethnicity and applicant type, with black, Hispanic, and non-US-MD applicants reporting slightly less positive reactions to the use of filters in the current selection process. There were no differences by gender (data not shown).

Reactions to and experiences taking the SVI.

Less than one-quarter of applicants agreed the SVI gave them an opportunity to describe their interpersonal and communication skills (20.1%; 388/1,928) or knowledge of professional behavior (19.9%; 384/1,926). Only 18.6% (360/1,936) agreed the content of SVI questions was related to the types of activities they perceived as required of residents. (The same question appeared on survey 1, where 66.9% [1,851/2,765] of applicants agreed that the content was relevant.)

Half of applicants (50.0%; 941/1,883) agreed they were able to answer SVI questions by describing their past experiences, and 46.9% (881/1,878) agreed they were able to answer SVI questions by drawing on their training and experience to describe what they would or should do in a hypothetical situation.

Perceptions of SVI scores.

Applicants’ beliefs about whether SVI total scores accurately reflected their levels of interpersonal and communication skills and knowledge of professional behavior were associated with their scores. Applicants who scored at or above the 50th percentile of SVI scores reported that SVI scores more accurately reflected their level of the competencies (mean [SD] = 2.16 [1.07]) compared with those who scored in the bottom half of SVI scores (mean [SD] = 1.23 [0.56]; t(1057.22) = −18.72; P < .001). Applicants who scored in the bottom half of SVI scores also were more dissatisfied with their SVI scores (mean [SD] = 1.33 [0.71]) compared with those who scored in the top half (mean [SD] = 2.68 [1.13]; t(1014.21) = −24.01; P < .001).
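The fractional degrees of freedom reported above (e.g., t(1057.22)) are characteristic of a Welch-type t test, which does not assume equal group variances. As a hedged illustration, the sketch below computes a Welch t statistic and Welch–Satterthwaite degrees of freedom from group summary statistics; the group sizes are hypothetical, so the output will not reproduce the published values.

from math import sqrt

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom."""
    v1, v2 = s1**2 / n1, s2**2 / n2
    t = (m1 - m2) / sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1**2 / (n1 - 1) + v2**2 / (n2 - 1))
    return t, df

# Reported means/SDs for bottom- vs top-half scorers, with hypothetical group sizes
print(welch_t(1.23, 0.56, 950, 2.16, 1.07, 1000))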

The majority of applicants (84.9%; 1,610/1,897) reported that their SVI score did not affect the number of applications they submitted to EM programs during the ERAS 2018 cycle.

As shown in Figure 2, the relationship between SVI total scores and applicants’ beliefs that the SVI could be used to balance the use of academic metrics in deciding whom to invite for in-person interviews was moderated by Step 1 scores. There was a positive relationship between SVI total scores and the belief that SVI scores could be used to balance the use of academic metrics; this relationship was stronger for applicants with Step 1 scores below the 50th percentile than for applicants with Step 1 scores at or above the 50th percentile. Applicants with lower Step 1 scores and higher SVI scores had the most positive attitudes about the SVI’s potential to balance the use of academic metrics.

Figure 2: The interaction between USMLE Step 1 score (possible range 1–300) and AAMC Standardized Video Interview (SVI) total score (possible range 6–30) influencing perceptions of the SVI. Applicants (n = 1,653)a who responded to survey 2 in fall 2017, about 1 month after their SVI scores were released, answered the following question using a 5-point Likert-type scale (1 = strongly disagree to 5 = strongly agree): “The Standardized Video Interview provides information about nonacademic qualifications that may help balance the use of academic metrics in deciding whom to invite for in-person interviews.” The results were significant (ΔR² = 0.015; F(3, 1649) = 172.76; P < .001). Abbreviations: USMLE indicates United States Medical Licensing Examination; AAMC, Association of American Medical Colleges. aSurvey participants were applicants who indicated interest in applying to emergency medicine residency programs for the Electronic Residency Application Service 2018 cycle and completed the SVI in summer 2017.
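As a rough sketch of the moderation analysis summarized in Figure 2 (not the authors' SPSS syntax), the snippet below fits a hierarchical regression in Python with statsmodels: main effects are entered first, then the Step 1 × SVI interaction, and the change in R² (with an F test on the added term) indexes moderation. The data file and variable names are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey2_linked.csv")  # hypothetical analysis file

# Step 1: main effects only; step 2: add the interaction term
m1 = smf.ols("balance_item ~ step1_score + svi_total", data=df).fit()
m2 = smf.ols("balance_item ~ step1_score * svi_total", data=df).fit()

delta_r2 = m2.rsquared - m1.rsquared
f_test = m2.compare_f_test(m1)  # (F statistic, p value, df difference)
print(round(delta_r2, 3), f_test)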

Discussion

As we hypothesized, applicants had a generally positive or neutral experience with procedural aspects of the SVI but generally had negative reactions to the SVI itself. However, the extent of the negative reactions was associated with (1) the aspect of the SVI being studied and (2) applicants’ SVI scores.

We found that applicants were satisfied with most procedural aspects of the SVI. They reported that instructions were clear and that they had sufficient time to respond to questions; however, some indicated that they did not have enough time to read the questions and prepare answers. Most reported being able to draw on past experiences to provide specific examples or to offer hypothetical responses explaining what they would or should do in a given situation, although additional instructions may be needed to encourage applicants to give hypothetical responses to scenarios they have not directly encountered during training. These findings are important from a program evaluation perspective and suggest that SVI procedures and preparation materials were easy to understand and considered fair by most applicants.

It is important to note that the SVI only included questions that were rated as relevant by EM program directors and faculty.18 There were contradictory findings about applicants’ perceptions of the relevance of the SVI questions to residency, however: In survey 2, less than 20% of applicants agreed that SVI content was related to the activities of residents, compared with over 60% in survey 1. This swing toward more negative attitudes may reflect misremembering of SVI content, changes in attitudes about the SVI over time, and/or greater exposure to residency training (through clerkship rotations) between surveys 1 and 2; the discrepancy may also have been influenced by applicants’ knowledge of their SVI scores before the second survey. In light of this discrepancy, more research is needed to understand applicants’ perceptions of the face validity of the SVI.

Our findings suggest that applicants were largely satisfied with the information already available about their interpersonal and communication skills and knowledge of professional behavior in the residency selection process. This finding was not surprising because EM program directors place a great deal of emphasis on the eSLOE, which is intended to provide information about behavioral competencies.19 Surprisingly, applicants also reported high levels of satisfaction with the use of USMLE Step 1 and Step 2 CK scores as filters, given that student representatives, medical schools, and programs have called for changing the residency selection process to de-emphasize the role of Step 1 scores and broaden the competencies assessed during selection.6 Because applicants were asked to reflect on the current residency selection process and the SVI simultaneously, these findings may reflect dissatisfaction with the SVI more than satisfaction with the current selection process.

The majority of applicants were not satisfied with the information provided by the SVI about their interpersonal and communication skills and knowledge of professional behavior or the possible use of the SVI as a filter. However, about 20% of applicants had neutral reactions to the SVI. Overall, applicants also were generally dissatisfied with their SVI total scores, and this effect was stronger for applicants with low scores.

Low levels of satisfaction with the SVI are predictable given research conducted in the employment domain indicating that applicants report low levels of satisfaction with technology-mediated14 and highly structured interviews.15,16 These negative reactions may result from feelings of lack of control and an inability to personalize or tell one’s story in a highly structured context. Applicants’ feelings of control could also have been diminished because program directors’ use of the SVI in selection decisions was unclear. Applicants and advisors rely on past experience and the National Resident Matching Program’s Program Director Survey (administered every 2 years) for information about how data will be used in the selection process.20 This information was not available for the SVI for the ERAS 2018 cycle, which could have contributed to heightened anxiety and feelings of diminished control. In addition, while the AAMC’s SVI preparation materials were well received by applicants, there was limited information available for students and their advisors about how to prepare for the SVI.

Applicants also may have felt that, compared with information provided by the SVI, information provided by current tools (such as the eSLOE and in-person interviews) better reflected their proficiency in interpersonal and communication skills and knowledge of professional behavior because that information was gathered over a longer period of time and based on in-person interactions. In the eSLOE, for example, ratings are based on observation over a 4-week period, compared with the 18 minutes of the SVI, likely giving the eSLOE more face validity as an assessment of these competencies.

Limitations and future directions

There were several limitations to this study. Timing of the survey administrations and self-selection could have affected the generalizability of results. Applicants who participated in survey 2, which occurred 2 to 4 months after they completed the SVI, may not have had accurate memories of their SVI experience, and/or their responses could have been affected by their SVI scores and feedback from other applicants. Additionally, only 58.7% of SVI applicants replied to survey 2; these applicants may have held different attitudes than all SVI applicants. However, respondents to both surveys were similar with respect to demographics, SVI scores, and Step 1 scores. The limited overlap in questions asked in both surveys provides a small window into how applicants’ opinions of the SVI changed. Finally, because of concerns about applicants’ time constraints, the surveys were relatively short.

Future research should explore applicant reactions to the SVI in more detail, using qualitative methods such as interviews or focus groups, linking some survey questions to EM specifically, and/or expanding to study different types of reactions.15 In addition, the AAMC and partner organizations should conduct ongoing research to study applicant reactions and experiences taking the SVI to explore whether reactions change over time and to identify process improvements that could make the experience easier to navigate and more positive for applicants. Information about how program directors use SVI data and about the correlation between SVI scores and performance in residency may also change applicant perceptions of this tool in the future. More broadly, research on how test preparation affects performance on the SVI and the correlation between SVI scores and trainee performance in residency (e.g., milestone assessments, peer ratings, clinical competency committee ratings) is needed to understand the utility of the SVI.

Conclusions

Findings from the first operational administration of the SVI suggest that most applicants were skeptical of its ability to assess interpersonal and communication skills and knowledge of professional behavior and of its potential to add value to the residency selection process. SVI scores were associated with these reactions: applicants with higher SVI scores had slightly less negative reactions than those with lower scores. Applicants reported generally positive reactions to many procedural aspects of the SVI, suggesting that preparation materials and instructions for taking the SVI were clear and easy to use. Applicant acceptance and appreciation of the SVI will be critical to its adoption by the graduate medical education community.

Acknowledgments:

The authors wish to acknowledge members of the Emergency Medicine Standardized Video Interview working group: Ashely Alker, MD, MPH (University of California, San Diego School of Medicine); Andra Blomkalns, MD (Stanford University School of Medicine); Steve Bird, MD (University of Massachusetts Medical School); Mary Calderone Hass, MD (University of Michigan Health System); Nicole Deiorio, MD (Virginia Commonwealth University School of Medicine); Ramnick Dhaliwal, MD, JD (Hennepin County Medical Center); Fiona Gallahue, MD (University of Washington); H. Gene Hern, MD (Highland Hospital); Yolanda Haywood, MD (George Washington University School of Medicine and Health Sciences); Katherine Hiller, MD, MPH (University of Arizona College of Medicine–Tucson); Zach Jarou, MD (University of Chicago Medical Center); Rahul Patwari, MD (Rush University Medical Center); Christopher Woleben, MD (Virginia Commonwealth University School of Medicine); and Richard Wolfe, MD (Harvard Medical School/Beth Israel Deaconess Medical Center).

References

1. Emanuel EJ, Gudbranson E. Does medicine overemphasize IQ? JAMA. 2018;319:651–652.
2. Holmboe ES, Edgar L, Hamstra S; Accreditation Council for Graduate Medical Education. The milestones guidebook. Version 2016. http://www.acgme.org/Portals/0/MilestonesGuidebook.pdf?ver=2016-05-31-113245-103. Accessed April 12, 2019.
3. Association of American Medical Colleges. Core Entrustable Professional Activities for Entering Residency: Curriculum developer’s guide. https://store.aamc.org/core-entrustable-professional-activities-for-entering-residency.html. Accessed July 26, 2019.
4. Levinson W, Roter DL, Mullooly JP, Dull VT, Frankel RM. Physician-patient communication. The relationship with malpractice claims among primary care physicians and surgeons. JAMA. 1997;277:553–559.
5. Anhang Price R, Elliott MN, Zaslavsky AM, et al. Examining the role of patient experience surveys in measuring health care quality. Med Care Res Rev. 2014;71:522–554.
6. Prober CG, Kolars JC, First LR, Melnick DE. A plea to reassess the role of United States Medical Licensing Examination Step 1 scores in residency selection. Acad Med. 2016;91:12–15.
7. Bohm KC, Van Heest T, Gioe TJ, Agel J, Johnson TC, Van Heest A. Assessment of moral reasoning skills in the orthopaedic surgery resident applicant. J Bone Joint Surg Am. 2014;96:e151.
8. Harfmann KL, Zirwas MJ. Can performance in medical school predict performance in residency? A compilation and review of correlative studies. J Am Acad Dermatol. 2011;65:1010–1022.e2.
9. Stohl HE, Hueppchen NA, Bienstock JL. Can medical school performance predict residency performance? Resident selection and predictors of successful performance in obstetrics and gynecology. J Grad Med Educ. 2010;2:322–326.
10. Chan D, Schmitt N, DeShon RP, Clause CS, Delbridge K. Reactions to cognitive ability tests: The relationships between race, test performance, face validity perceptions, and test-taking motivation. J Appl Psychol. 1997;82:300–310.
11. Chan D, Schmitt N, Sacco JM, DeShon RP. Understanding pretest and posttest reactions to cognitive ability and personality tests. J Appl Psychol. 1998;83:471–485.
12. Hausknecht JP, Day DV, Thomas SC. Applicant reactions to selection procedures: An updated model and meta-analysis. Pers Psychol. 2004;57:639–683.
13. Truxillo DM, Bauer TN, McCarthy JM. Applicant fairness reactions to the selection process. In: Cropanzano RS, Ambrose ML, eds. The Oxford Handbook of Justice in the Workplace. Oxford, UK: Oxford University Press; 2015:621–640.
14. Blacksmith N, Willford JC, Behrend TS. Technology in the employment interview: A meta-analysis and future research agenda. Pers Assess Decis. 2016;2(1):article 2.
15. Campion MA, Palmer DK, Campion JE. A review of structure in the selection interview. Pers Psychol. 1997;50:655–702.
16. Levashina J, Hartwell CJ, Morgeson FP, Campion MA. The structured employment interview: Narrative and quantitative review of the research literature. Pers Psychol. 2014;67:241–293.
17. Jarou Z, Karl E, Alker A, et al. Factors affecting Standardized Video Interview performance: Preparation elements and the testing environment. EM Resident website. https://www.emra.org/emresident/article/svi-study-results. Published April 17, 2018. Accessed May 30, 2019.
18. Bird SB, Hern HG, Blomkalns A, et al. Innovation in residency selection: The AAMC Standardized Video Interview. Acad Med. 2019;94:1489–1497.
19. Gallahue FE, Hiller KM, Bird S, et al. The AAMC Standardized Video Interview: Reactions and use by residency programs during the 2018 application cycle. Acad Med. 2019;94:1506–1512.
20. National Resident Matching Program. Results of the 2016 NRMP Program Director Survey. http://www.nrmp.org/wp-content/uploads/2016/09/NRMP-2016-Program-Director-Survey.pdf. Published June 2016. Accessed April 12, 2019.

Supplemental Digital Content

Copyright © 2019 by the Association of American Medical Colleges