Research Reports

The AAMC Standardized Video Interview: Reactions and Use by Residency Programs During the 2018 Application Cycle

Gallahue, Fiona E. MD; Hiller, Katherine M. MD, MPH; Bird, Steven B. MD; Calderone Haas, Mary Rose MD; Deiorio, Nicole M. MD; Hern, H. Gene MS, MD; Jarou, Zachary J. MD; Pierce, Ava MD; Geiger, Thomas MA; Fletcher, Laura MA

doi: 10.1097/ACM.0000000000002714

In recent years, there has been a significant increase in the number of applications submitted to residency programs. The mean number of applications per program reached 1,025.7 in 2018, up from 862.2 in 2013.1 This influx has exerted tremendous stress on residency programs because of the number of applications they must review.2 Historically, residency programs have relied on academic metrics, such as United States Medical Licensing Examination (USMLE) Step exam scores and clerkship grades, to assess applicants’ qualifications for in-person interviews.3 Although these metrics play an important role, members of the residency community have expressed a desire for instruments capable of assessing applicants using a more holistic approach.4,5

A 2016 Association of American Medical Colleges (AAMC) survey revealed that program directors are least satisfied with the information available about applicants’ interpersonal and communication skills and professionalism.6 Furthermore, program directors indicated that a lack of reliable information about these attributes is a critical deficiency in the resident selection process, perhaps because these Accreditation Council for Graduate Medical Education (ACGME) competencies are important to success not only as a resident but also later as an independent physician.7 To address these concerns, the AAMC developed and tested a new tool, the AAMC Standardized Video Interview (SVI), with the goal of providing residency programs with standardized, valid, and reliable data on applicants’ interpersonal and communication skills and professionalism to help balance the use of academic metrics in selection decisions, such as which applicants to invite to in-person interviews and how to order applicants on rank lists.

The SVI is an asynchronous online video interview that presents residency applicants with six questions designed to measure their interpersonal and communication skills and knowledge of professional behavior. The applicant has up to three minutes to respond verbally to each question, and responses are recorded by the applicant’s computer webcam. Trained human raters score the video responses, resulting in an SVI total score summarizing the applicant’s performance. In summer 2017, applicants interested in applying to emergency medicine (EM) were asked to complete the SVI as part of their application in an operational pilot in the Electronic Residency Application Service (ERAS) 2018 cycle. The overall SVI participation rate was 84% (3,532 completed/4,229 invited); among applicants who ultimately applied to EM, 85% (3,469/4,060) completed the SVI.8 EM residency programs that agreed to the SVI terms and conditions were granted access to SVI total scores and videos for use in the ERAS 2018 cycle. The EM community and the AAMC partnered to evaluate programs’ use of the SVI.

Research from the employment domain suggests that users may have more negative reactions to technology-mediated interviews than in-person interviews9; however, research on employer reactions to new selection technology is limited.10 It is important to study user reactions to new tools to assess perceived interest and added value and to identify strategies to improve communication, training, and policies. To that end, we conducted two studies to evaluate how residency programs perceived and used the SVI total scores and videos during the ERAS 2018 cycle. In Study 1, we surveyed program directors about their use of and reactions to the SVI. In Study 2, we examined programs’ usage of videos in the selection process. Given that this was the first year of operational use for the SVI, we anticipated that users would have mixed reactions to the SVI. Further, in light of the AAMC’s recommendation not to overemphasize SVI total scores,11 we hypothesized that programs would be cautious as they introduced these new scores into the selection process.

Method

The program director survey was reviewed by the AAMC Human Subjects Research Protection Program. It was determined to be exempt by the institutional review board of the American Institutes for Research because its purpose was to evaluate and improve an operational tool. Program directors consented to share their deidentified survey responses before completing the survey. Applicants consented to share their SVI scores on completing the SVI and consented to share all other applicant information when submitting the ERAS application.

Study 1: Program director survey

Participants.

Program directors from 175 ACGME-accredited EM residency programs that participated in the SVI program were invited to complete an online survey about their experience using the SVI during the ERAS 2018 cycle. These EM programs, which had agreed to the SVI terms and conditions and were provided access to their applicants’ SVI total scores and videos, represented 85% (175/205) of the EM programs invited to participate in the SVI program.

Survey overview.

The AAMC SVI staff collaborated with a small working group from the medical education community to develop a short survey to assess user (i.e., program) reactions to the SVI following the first year of operational use. The working group was composed of residency program directors, medical educators, and residents.

Multiple drafts of the survey were developed and reviewed by the survey working group. The final survey included 27 questions to gauge user reactions to SVI total scores and videos. It included four main topics: use of SVI total scores, use of SVI videos, perceptions of SVI resources, and future use of the SVI. Survey respondents answered questions using five-point Likert-type scales (e.g., 1 = strongly disagree to 5 = strongly agree), multiple-choice responses, and write-in responses. The survey took approximately 5 to 10 minutes to complete. (The survey questions are included, with program responses, in Supplemental Digital Appendices 1–4 at http://links.lww.com/ACADMED/A657.)

The survey was administered through Verint Enterprise Feedback Management (Verint Systems, Melville, New York), a survey platform used by the AAMC. The survey was open November 6–27, 2017. Program directors received an initial invitation via email and up to three reminder emails before the survey close date.

Study 2: Program usage of SVI videos

The 175 EM residency programs that agreed to the SVI terms and conditions had the opportunity to view SVI videos for their applicants. These could be accessed through the AAMC Program Director’s Workstation (PDWS), a dashboard that allows programs to view residency applications. The PDWS collected and stored metadata on the number of videos viewed by program users.

Video response views were recorded at both the program and individual PDWS user level. Views were recorded when a PDWS user loaded an individual video response (e.g., an applicant’s response to question 2 of 6) and clicked play. Individual views were counted regardless of how long the user viewed the video (e.g., five seconds vs three minutes). Each applicant who participated in the SVI had a maximum of six video responses that could be viewed.
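
To make these counting rules concrete, the following is a minimal sketch in Python using hypothetical log records; the actual PDWS/Splunk log schema, field names, and event structure are not public, so everything below the comment header is an assumption for illustration.

```python
# A minimal sketch of the view-counting rules described above, using
# hypothetical log records (the real PDWS/Splunk schema is assumed here).
# A view is logged when a PDWS user loads a video response and clicks play,
# regardless of watch duration. Repeat plays of the same response by the
# same program are collapsed to one viewed response, matching the
# "responses viewed per program" counts reported in Study 2.
from collections import defaultdict

# Hypothetical play events: (program_id, user_id, applicant_id, question_no)
play_events = [
    ("prog_A", "user_1", "appl_9", 2),
    ("prog_A", "user_1", "appl_9", 2),  # replay: same response, counted once
    ("prog_A", "user_2", "appl_9", 5),
    ("prog_B", "user_3", "appl_7", 1),
]

responses_viewed = defaultdict(set)   # program -> {(applicant, question)}
applicants_viewed = defaultdict(set)  # program -> {applicant}
for program, _user, applicant, question in play_events:
    assert 1 <= question <= 6         # each applicant has at most 6 responses
    responses_viewed[program].add((applicant, question))
    applicants_viewed[program].add(applicant)

print({p: len(v) for p, v in responses_viewed.items()})   # {'prog_A': 2, 'prog_B': 1}
print({p: len(a) for p, a in applicants_viewed.items()})  # {'prog_A': 1, 'prog_B': 1}
```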

Data analysis

The unit of analysis in both studies was the individual program. We included only one survey response per program as only one survey link was administered to each program. Survey responses were merged with existing AAMC data about responding programs’ characteristics (e.g., region, setting, number of residents) from the 2016 Match season, as those were the latest data available at the time of analysis in January 2018. National-level data on geographic region, setting, and program size were accessed using an existing AAMC Excel database containing data provided voluntarily by programs (n = 183).

Video response view data were obtained using Splunk (Splunk Inc., San Francisco, California), a machine data platform that recorded program activity in the PDWS system. We included video response views at the program level. These program-level data could include views by multiple PDWS users. Video response views are presented as medians to mitigate the potential influence of outliers.

Applicant-level demographic data, SVI total scores, and Step 1 scores were collected from the AAMC ERAS system. Demographic data, entered by applicants when completing the ERAS application during the 2018 ERAS cycle, included race/ethnicity and applicant type (i.e., attendee of a U.S. MD-granting medical school [US-MD]; U.S. citizen attendee of an international medical school [US-IMG]; non-U.S. citizen attendee of an international medical school [FMG]; and attendee of a DO-granting medical school [DO]). These data were used to examine potential differences in program SVI video response views across applicant demographic groups. We compared applicants with one or more video response views versus those with zero views across applicant demographic groups. We expected small-to-medium differences in video response views by demographic group.

Applicants’ SVI total scores and scores for first attempts on the USMLE Step 1 exam were included in the analysis to explore differences in program video response views across key performance metrics. Possible SVI total scores range from 6 to 30, with higher scores indicating higher proficiency on the targeted competencies (interpersonal and communication skills and knowledge of professionalism) as evaluated by SVI raters.
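
For concreteness, the sketch below assembles a total score as described here and in the Figure 2 caption (six questions, each rated on a five-point scale, summed to a 6–30 total); the individual ratings are invented values for illustration only.

```python
# Sketch of SVI total score assembly: six questions, each rated 1-5 by
# trained raters, summed to a total between 6 and 30. The ratings below
# are made-up values for illustration.
question_ratings = [4, 3, 5, 4, 3, 4]  # one rating per SVI question

assert len(question_ratings) == 6
assert all(1 <= r <= 5 for r in question_ratings)

svi_total = sum(question_ratings)  # possible range: 6 (all 1s) to 30 (all 5s)
print(svi_total)                   # 23
```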

All analyses were conducted using SPSS for Windows version 19.0 (IBM Corp., Armonk, New York) or Microsoft Excel version 2013 (Microsoft Corp., Redmond, Washington). Descriptive statistics, including frequencies, percentages, means, medians, and standard deviations (SDs), were computed. Cohen’s h and t tests were used to evaluate group comparisons in program views of SVI videos. Cohen’s h is a statistical method of estimating the size of the difference in proportions from independent populations. It is a measure of practical effect and can be interpreted as follows: 0.20 = small effect size, 0.50 = medium effect size, and 0.80 = large effect size.12
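
As a reference for the group comparisons reported below, here is a short sketch of Cohen’s h and its conventional benchmarks; the two proportions passed in at the end are arbitrary illustrative values, not study data.

```python
# Cohen's h for the difference between two independent proportions:
# h = 2*arcsin(sqrt(p1)) - 2*arcsin(sqrt(p2)).
import math

def cohens_h(p1: float, p2: float) -> float:
    """Effect size for a difference in proportions from independent groups."""
    return 2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2))

def interpret(h: float) -> str:
    """Conventional benchmarks: 0.20 small, 0.50 medium, 0.80 large."""
    h = abs(h)
    if h >= 0.80:
        return "large"
    if h >= 0.50:
        return "medium"
    if h >= 0.20:
        return "small"
    return "negligible"

# Arbitrary illustrative proportions
h = cohens_h(0.50, 0.30)
print(f"h = {h:.2f} ({interpret(h)})")  # h = 0.41 (small)
```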

Results

Study 1: Program director survey

Survey respondents.

A total of 125 programs responded to the program director survey (125/175; 71% response rate). The majority of respondents were program directors (115/125; 92%), and more than half had spent less than five years in their current role (71/125; 57%). Responding programs were geographically representative of the EM residency program cohort as a whole. Fifty-one percent (64/125) were university-based programs, 12% (15/125) were community-based programs, and 35% (44/125) were community-based, university-affiliated programs. This is similar to the EM national average of 49% (89/183) university-based programs, 10% (18/183) community-based programs, and 38% (70/183) community-based, university-affiliated programs. Responding programs had an average of 11 first-year residents and 37 total residents, compared with the EM program national average of 10 first-year residents and 34 total residents.

Use of the SVI.

Approximately half of the responding programs considered SVI total scores in the selection process for the ERAS 2018 cycle (67/125; 54%). The most common reported use of the scores was as a tiebreaker between applicants with similar profiles (26/67; 39%) (Table 1). More than two-thirds of programs reported that SVI scores were not important in deciding whom to invite to an in-person interview (85/122; 70%). Most programs did not take missing SVI scores into consideration in making selection decisions, instead focusing on other aspects of the application (78/82; 95%), and the majority did not plan to ask applicants why they did not take the SVI when conducting in-person interviews (106/122; 88%) (Supplemental Digital Appendices 1 and 2 at http://links.lww.com/ACADMED/A657).

Table 1: Reported Use of AAMC Standardized Video Interview (SVI) Total Scores by 125 Emergency Medicine Residency Programs During the ERAS 2018 Cycle

As shown in Table 1, the methods used by programs to infer meaning from SVI total scores varied considerably. The most common process was watching a sample of videos with different SVI total scores (41/124; 33%), followed by using the SVI score distribution and percentile rank tables (36/124; 29%) and comparing scores with other relevant application information (e.g., electronic Standardized Letter of Evaluation, Medical Student Performance Evaluation, personal statement) (34/124; 27%).

Approximately half of the responding programs did not consider SVI scores at any point in the selection process (58/125; 46%). Common reasons for this were uncertainty about how to incorporate SVI scores into the selection process (36/58; 62%) and waiting for additional research on the utility of SVI scores before incorporating them into the selection process (33/58; 57%).

Of the programs that watched video responses, a majority reported watching videos out of curiosity (71/89; 80%) and to understand the range of SVI scores (56/89; 63%). Of the programs that did not watch videos, over two-thirds indicated that they did not have time to watch videos (23/33; 70%).

Reactions to SVI total scores.

Approximately one-third of the programs that used SVI total scores in the ERAS 2018 cycle agreed that the scores contributed unique information to the selection process (18/62; 29%) and helped them compare interpersonal and communication skills and professionalism between applicants from different medical schools (20/61; 33%). More than one-third agreed that SVI scores were easy to use (24/61; 39%) (Supplemental Digital Appendix 3 at http://links.lww.com/ACADMED/A657).

As shown in Figure 1, more than half of the responding programs reported being at least somewhat likely to use SVI scores (55/97; 57%) and videos (52/99; 53%) as part of the application process in the ERAS 2019 cycle. Additionally, about two-thirds of programs that used the SVI in the ERAS 2018 cycle (39/62; 63%) indicated that they would be at least somewhat likely to recommend the SVI to other residency faculty. Approximately one-third of responding programs indicated that they would like SVI percentile ranks included as an enhancement to the PDWS (33/93; 36%), and about one-quarter would like the ability to filter applicants by SVI scores (25/93; 27%) (Supplemental Digital Appendices 1 and 4 at http://links.lww.com/ACADMED/A657).

Figure 1: Likelihood of using AAMC Standardized Video Interview (SVI) total scores and videos as part of the residency selection process in the ERAS 2019 cycle, as reported by emergency medicine (EM) residency programs responding to the AAMC program director survey in November 2017. The responding EM programs were among those that were granted access to SVI total scores and videos during the ERAS 2018 cycle. For this figure: SVI scores, n = 97; SVI videos, n = 99. Abbreviations: AAMC indicates Association of American Medical Colleges; ERAS, Electronic Residency Application Service.

Study 2: Program usage of SVI videos

The median number of video responses viewed per program was 111. Programs viewed at least 1 video response for a median of 77 applicants. The median number of video responses viewed for a single applicant was 1 (range, 1–6). In total, 50% (10,417/20,814) of available video responses were watched by the 175 programs with access to SVI videos.

As shown in Table 2, 2,912 (82.4%) of the 3,532 applicants who completed the SVI had one or more video response views. There were no differences in the proportions of views by gender or race/ethnicity. However, programs were more likely to view video responses from US-MDs (1,906/2,062; 92.4%) compared with DOs (706/915; 77.2%; h = 0.44), US-IMGs (198/320; 61.9%; h = 0.77), and FMGs (100/220; 45.5%; h = 1.10). This order maps closely to the percentages of US-MDs (17,740/18,818; 94.3%), DOs (3,771/4,617; 81.7%), US-IMGs (2,900/5,075; 57.1%), and FMGs (3,962/7,067; 56.1%) who matched to residency programs across all specialties in the ERAS 2018 cycle.13

Table 2: Emergency Medicine Residency Program Applicants Who Had One or More SVI Video Response Views (n = 2,912), Group Differences by Demographic Category, ERAS 2018 Cycle
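
The reported effect sizes can be reproduced from the view proportions given above, with US-MDs as the reference group; the short check below uses Cohen’s h as defined in the Method section.

```python
# Recomputing the Cohen's h values reported above from the view
# proportions in Table 2, with US-MD applicants as the reference group.
import math

def cohens_h(p1, p2):
    return 2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2))

p_usmd = 1906 / 2062  # 92.4% of US-MDs had one or more views
for label, viewed, total in [("DO", 706, 915),
                             ("US-IMG", 198, 320),
                             ("FMG", 100, 220)]:
    print(f"US-MD vs {label}: h = {cohens_h(p_usmd, viewed / total):.2f}")
# US-MD vs DO: h = 0.44
# US-MD vs US-IMG: h = 0.77
# US-MD vs FMG: h = 1.10
```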

The correlation between video response views per applicant and SVI total scores was not statistically significant, r(2905) = 0.01, P = .493. The correlation between video response views per applicant and USMLE Step 1 scores was significant, r(2622) = 0.19, P < .001. Figure 2 displays the distribution of median number of video response views across SVI total scores, and Figure 3 displays the distribution of median number of video response views across Step 1 scores. Applicants with one or more views had slightly higher SVI total scores (mean [SD] = 19.3 [3.0]) compared with applicants with zero views (mean [SD] = 18.2 [3.3]; t(3530) = 8.15, P < .001). Applicants with one or more views had higher mean Step 1 scores (mean [SD] = 229.3 [17.0]) compared with applicants with zero views (mean [SD] = 218.4 [22.8]; t(2975) = 10.83, P < .001).
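
The SVI score comparison can likewise be checked from the summary statistics above. The sketch below assumes a pooled-variance two-sample t test (the specific variant is not stated in the text) and takes the group sizes from Study 2: 2,912 applicants with one or more views and 3,532 − 2,912 = 620 with zero views.

```python
# Reconstructing the reported t statistic for SVI total scores from the
# published summary statistics alone. Assumes a pooled-variance two-sample
# t test; the small gap from the reported t(3530) = 8.15 reflects rounding
# of the published means and SDs.
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic with pooled variance."""
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

t = pooled_t(19.3, 3.0, 2912, 18.2, 3.3, 620)  # >= 1 view vs zero views
print(f"t({2912 + 620 - 2}) = {t:.2f}")        # t(3530) = 8.14
```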

Figure 2: AAMC Standardized Video Interview (SVI) total score distribution by median number of views of applicant video responses. Video responses could be viewed by the 175 emergency medicine residency programs granted access to SVI total scores and videos for consideration in their resident selection processes in the ERAS 2018 cycle. SVI total scores are derived from applicants’ video responses to six questions, each rated on a five-point scale. Ratings for each question are summed to create a total score ranging from 6 to 30, with higher scores indicating higher proficiency in interpersonal and communication skills and knowledge of professionalism, as evaluated by SVI raters. A total of 3,532 applicants completed the SVI, and one or more video responses were viewed for 2,912 of those applicants. SVI total scores are presented for cell sizes of 5 or greater; in this figure, the cell sizes range from 15 to 404 applicants. Data are presented as medians rather than means to control for outliers. Abbreviations: AAMC indicates Association of American Medical Colleges; ERAS, Electronic Residency Application Service.
Figure 3: USMLE Step 1 score distribution by median number of views of applicant video responses. Video responses could be viewed by the 175 emergency medicine residency programs granted access to AAMC Standardized Video Interview (SVI) total scores and videos for consideration in their resident selection processes in the ERAS 2018 cycle. A total of 3,532 applicants completed the SVI, and one or more video responses were viewed for 2,912 of those applicants. USMLE Step 1 scores for applicants in this study ranged from 181 to 264. USMLE Step 1 scores are presented for cell sizes of 5 or greater; in this figure, cell sizes range from 5 to 75 applicants. Data are presented as medians rather than means to control for outliers. Abbreviations: AAMC indicates Association of American Medical Colleges; USMLE, United States Medical Licensing Examination.

Discussion

These studies represent a first attempt to collect baseline user reactions to the SVI from residency programs that had access to SVI total scores and video responses in the ERAS 2018 cycle. When implementing a new assessment, it is critical to study user reactions, as they may directly affect the assessment’s adoption rate and can be used to identify ways to improve communication, training, and policies. We used two studies to explore user reactions: one surveyed SVI users about their reactions to the SVI, and the other analyzed SVI video usage data.

Overall, we found that programs were cautious in how they used SVI total scores. For example, although approximately half of the programs that responded to the survey considered the scores in the selection process, most programs reported that the SVI was not important when deciding whom to invite to in-person interviews. Moreover, most programs ignored missing scores and did not plan to ask applicants why they did not take the SVI during in-person interviews. These findings suggest that programs used the SVI cautiously with respect to selection decisions, which is consistent with the AAMC’s recommendations on how programs should use SVI scores.11 The most commonly reported reasons for watching video responses—out of curiosity and to understand the range of scores—also support the idea that programs were using the SVI experimentally and for research purposes rather than to make selection decisions in the ERAS 2018 cycle. Programs may become more likely to incorporate SVI total scores into their selection decisions as they become more familiar with the SVI and develop a better understanding of the meaning of SVI data.

Many programs felt they needed additional research on the utility of SVI total scores before incorporating them into the selection process. A best practice in assessment research is to conduct predictive validity research to establish support for a new tool’s ability to accurately predict desired outcomes.14 The purposes for which programs reported viewing SVI videos and the pattern in which programs looked at scores (as shown in Figure 2) suggest that they were trying to better understand the meaning of SVI scores by watching video responses representing the full range of scores. This finding was supported in the survey analysis, as nearly two-thirds of programs reported having watched SVI videos to get a better feel for the range of scores. Although the AAMC has established initial evidence of validity for the SVI,8 studies exploring the correlation between SVI scores and in-person interview scores, as well as the ability of the SVI to predict performance in residency, would be important next steps; such studies would provide additional evidence of the validity of SVI scores and could facilitate their use during the selection process.

Among programs that considered SVI total scores, there were divided opinions on the utility of the scores, the general ease of use, and the ability to compare competencies between applicants. Mixed reactions are not surprising; this was the first time programs were provided SVI information during the selection process. Reactions to the SVI’s ease of use and the utility of scores will be important to monitor in future surveys, as research has shown that adoption of selection technology can be influenced by perceptions of usefulness and ease of use.9,15 Despite the mixed reactions, slightly more than half of programs indicated that they were at least somewhat likely to use SVI scores next year, and about two-thirds of programs that used the SVI scores in the ERAS 2018 cycle indicated that they were at least somewhat likely to recommend the SVI to other faculty. Thus, it appears that programs have interest in learning more about how SVI data might be incorporated into their selection processes in the future.

The video usage analysis revealed that programs were more likely to watch video responses for applicants with higher Step 1 scores, and that they disproportionately viewed video responses for US-MDs compared with other applicant types. These findings are consistent with past survey results indicating that most programs use filters based on scores or applicant characteristics to reduce their applicant pools.6 One development of note is the AAMC’s intention to add filters for SVI total scores in the ERAS 2019 cycle. It will be useful to monitor how programs use SVI filters in relation to academic data. For example, if programs use SVI filters to balance USMLE Step exam scores and lower their initial screening thresholds, applicants with higher SVI scores (indicating higher perceived interpersonal and communication skills and knowledge of professionalism) may be considered for in-person interviews. Alternatively, if programs add the SVI as another screen along with Step exam scores, it could result in programs filtering in applicants who have both high Step exam scores and high SVI scores. Although the AAMC’s original intention was for programs to use SVI scores to balance academic data, adding additional nonacademic information when evaluating candidates may result in more well-rounded applicants being invited to in-person interviews.

There are several limitations to our study. First, only the 85% of EM residency programs that agreed to the SVI terms and conditions and had access to SVI total scores and videos were invited to participate in the program director survey. This sampling method was necessary given that the purpose of the research was to collect user reactions to the SVI. However, it may be useful to collect information from programs that opted not to use SVI scores and videos as their perspectives on the potential utility of the SVI are important and will likely differ from those of programs that participated in the survey. Second, the rating scale we used to assess programs’ likelihood of using and recommending the tool included one negative option and four options that leaned positive. This may have resulted in slightly skewed reactions given the disproportionate number of positive response options. Future surveys may include a binary option that asks for endorsement on a yes/no scale or use a scale that balances positive- and negative-leaning response options. Collecting qualitative data via open-ended questions could also add value by allowing programs to expand on their reactions. For example, programs could further elaborate on how they used SVI scores and video responses and their rationale for considering or not considering the SVI data in their selection decisions. Third, the video usage data do not reflect the duration of each individual video view. For example, a view duration of five seconds held the same weight as a view duration extending to the end of the entire response. As a result, the quality of the individual video views cannot be compared. The inclusion of a view duration metric would have provided insight on the extent to which all video views can be considered equal with respect to the content viewed by programs.

Conclusions

The SVI is a potentially viable selection instrument that provides information on applicants’ interpersonal and communication skills and knowledge of professionalism. Our survey results indicated that programs used the SVI cautiously in their selection processes, which is consistent with the AAMC’s recommendations on how to incorporate SVI total scores. Data from the two studies suggest that programs are interested in learning more about the meaning of SVI scores, although reactions on the utility and ease of use of the scores were mixed. Slight majorities of programs indicated that they are at least somewhat likely to use the SVI and to recommend it to other faculty in the future, but there was little overall enthusiasm for incorporating SVI data into selection decisions at this juncture. From the perspective of programs, then, the SVI may be a useful addition to residency selection, but additional evidence, particularly on the predictive validity of SVI scores, will be needed to establish the value of the SVI as a tool for program directors and their selection processes. Finally, the results of our studies help expand the literature on user reactions to new selection technology and assessments while also providing a baseline for user interest in the SVI. Future efforts to survey SVI users during each ERAS cycle will be critical for the AAMC to gauge improvements in user acceptance and familiarity with the SVI.

Acknowledgments:

The authors wish to acknowledge members of the Emergency Medicine Standardized Video Interview (EMSVI) working group: Ashely Alker, MD (University of California, San Diego); Andra Blomkalns, MD (Stanford University School of Medicine); Steven B. Bird, MD (University of Massachusetts Medical School); Mary Rose Calderone Haas, MD (University of Michigan); Nicole M. Deiorio, MD (Virginia Commonwealth University School of Medicine); Ramnick Dhaliwal, MD (Hennepin County Medical Center); Fiona E. Gallahue, MD (University of Washington); H. Gene Hern, MD (University of California, San Francisco and Highland Hospital); Yolanda Haywood, MD (George Washington University School of Medicine and Health Sciences); Katherine M. Hiller, MD, MPH (University of Arizona College of Medicine–Tucson); Zachary J. Jarou, MD (University of Chicago); Rahul Patwari, MD (Rush University Medical Center); Christopher Woleben, MD (Virginia Commonwealth University School of Medicine); and Richard Wolfe, MD (Harvard Medical School and Beth Israel Deaconess Medical Center).

The authors would also like to acknowledge AAMC staff members who provided guidance on survey and manuscript development: Dana Dunleavy, PhD, Rebecca Fraser, PhD, and B. Renee Overton, MBA.

References

1. Association of American Medical Colleges. Electronic Residency Application Service: All applicant data, ERAS 2014–2018: Residency specialties summary. https://www.aamc.org/download/359236/data/all.pdf. Published 2017. Accessed January 31, 2019.
2. Weissbart SJ, Kim SJ, Feinn RS, Stock JA. Relationship between the number of residency applications and the yearly match rate: Time to start thinking about an application limit? J Grad Med Educ. 2015;7:81–85.
3. Green M, Jones P, Thomas JX Jr. Selection criteria for residency: Results of a national program directors survey. Acad Med. 2009;84:362–367.
4. Bandiera G, Abrahams C, Ruetalo M, Hanson MD, Nickell L, Spadafora S. Identifying and promoting best practices in residency application and selection in a complex academic health network. Acad Med. 2015;90:1594–1601.
5. Prober CG, Kolars JC, First LR, Melnick DE. A plea to reassess the role of United States Medical Licensing Examination Step 1 scores in residency selection. Acad Med. 2016;91:12–15.
6. Dunleavy D, Geiger T, Overton R, Prescott J. Results of the 2016 Program Directors Survey: Current Practices in Residency Selection. Washington, DC: Association of American Medical Colleges; 2016. https://store.aamc.org/downloadable/download/sample/sample_id/180/. Accessed July 29, 2019.
7. Nasca TJ, Philibert I, Brigham T, Flynn TC. The next GME accreditation system—Rationale and benefits. N Engl J Med. 2012;366:1051–1056.
8. Bird SB, Hern HG, Blomkalns A, et al. Innovation in residency selection: The AAMC Standardized Video Interview. Acad Med. 2019;94:1489–1497.
9. Blacksmith N, Willford JC, Behrend TS. Technology in the employment interview: A meta-analysis and future research agenda. Pers Assess Decis. 2016;2:12–20.
10. Oostrom JK, van der Linden D, Born MP, van der Molen HT. New technology in personnel selection: How recruiter characteristics affect the adoption of new selection technology. Comput Human Behav. 2013;29:2404–2415.
11. Association of American Medical Colleges. Using AAMC Standardized Video Interview scores in residency selection: A resource guide. August 2017. Personal communication with emergency medicine residency program directors by AAMC SVI staff [unpublished].
12. Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: L. Erlbaum Associates; 1988.
13. National Resident Matching Program. Results and data: 2018 Main Residency Match. http://www.nrmp.org/wp-content/uploads/2018/04/Main-Match-Result-and-Data-2018.pdf. Accessed January 31, 2019.
14. Society for Industrial and Organizational Psychology. Principles for the validation and use of personnel selection procedures. http://www.siop.org/_principles/principles.pdf. Published 2003. Accessed January 31, 2019.
15. Brenner FS, Ortner TM, Fay D. Asynchronous video interviewing as a new technology in personnel selection: The applicant’s point of view. Front Psychol. 2016;7:863.

Copyright © 2019 by the Association of American Medical Colleges