Invited Commentaries

Vive la Différence: The Freedom and Inherent Responsibilities When Designing and Implementing Multiple Mini-Interviews

Reiter, Harold MD, MEd, FRCPC; Eva, Kevin PhD

doi: 10.1097/ACM.0000000000002042

In April 2003, roughly 40 people squeezed around a long boardroom table at a hotel in Quebec City, Canada. These admissions deans and admissions officers of the 17 Canadian medical schools, along with a handful of student representatives, were conducting their daylong business meeting during the annual conference of the Association of Faculties of Medicine of Canada. One of us (H.R.), newly minted as chair of admissions at the Michael G. DeGroote School of Medicine at McMaster University, was honored with time to deliver two presentations. The first went well enough. Sharing the reliability and early validity data from research on a new assessment method, the multiple mini-interview (MMI),1 led to the expected reaction: a mixture of curiosity and skepticism. The reaction to the second presentation was another matter entirely. Audaciously, H.R. suggested that admissions interviews could be nationalized. Medical schools would be considered interview centers and use a nationally standardized MMI, and virtually all applicants across the country would be offered one interview slot at a school reasonably close to their home. Applicants’ interview scores would be shared, with each school using the scores as it saw fit to maintain its unique institutional character and values. The subsequent discussion was a superb example of how to thoroughly and politely demolish a proposal.

In the years since, the emerging data—including the data shared in the intriguing paper by Henderson et al2 in this issue—have made it clear that the people assembled around that table in Quebec City were entirely correct. The idea of a nationalized MMI was completely misplaced. The answers to some simple questions about MMIs, which we drew from a cursory hunt through the 117 articles on “multiple mini-interview” listed in PubMed as of September 2017 and share below, amply illustrate the folly of the suggestion. As we argue in this Invited Commentary, the design and implementation of locally conducted MMIs should reflect local needs. Individual institutions, given the freedom to exercise their values in constructing local MMIs, carry the responsibility to ensure the validity of their local assessment tools.

Simple Questions About MMIs

What do MMIs measure?

Commonly, the intent is to measure some construct of personal competence. A published factor analysis suggested that MMI scores collapse into two factors, one attributed to oral communication skill and another that had less clear grounding in any one personal competency.3 Using item response theory and differential item functioning analysis, Roberts et al4 concluded that their MMI measured “entry-level reasoning skills in professionalism.” The lack of conclusiveness regarding what MMIs measure5 leaves patterns of correlations as an important pathway of exploration, both to try to further address this question and to determine if the scores arising from MMIs might reflect bias toward or away from any particular group of applicants.

Do MMIs correlate with personality tests?

According to one study,6 there is no correlation at all. Two other studies3,7 reported a correlation with extraversion alone. Yet another study8 reported consistent correlations with both extraversion and conscientiousness, as well as an inconsistent correlation with agreeableness.

Do MMIs correlate with academic measures?

Could there be a negative association, as speculated by Henderson et al,2 between “study habits and other attributes associated with achieving a high GPA [grade point average]” and the “skills required for performance on the MMI, a fast-paced series of interactions”? As noted by Henderson et al, according to four separate studies,2,9–11 the answer is yes—but four different studies6,12–14 indicate the answer is no.

Do MMIs correlate with demographic variables?

There are multiple MMI studies published on the diversity indices of sex, age, race and ethnicity, socioeconomic status (SES), and rurality. As with GPA, there is variability in correlation of MMI results with indices of diversity, some of which were noted by Henderson et al.2 For sex, multiple studies found that women significantly outperformed men,15,16 while others found zero or negligible differences.12,17–23 For age, multiple studies found that older applicants received significantly higher MMI scores than younger applicants,12,18 while others found zero or negligible differences.17,19–23 For race and ethnicity, four studies9,10,17,20 found no significant differences in MMI scores; however, the picture is less clear for indigenous applicants, with one very small study suggesting no difference24 and a larger study demonstrating significantly lower scores.12 For SES, one study found that candidates with lower SES scored less well on an MMI than those with higher SES,10 while four other studies found no difference in scores.12,21,25,26 For rurality, one study found that graduates of rural high schools fared less well on an MMI compared with other applicants,27 while another found that size of community of origin had no impact.12

What Does It All Mean?

When faced with inconsistent results, it is sometimes possible to discern which ones do not fit the general pattern and which ones may be misleading because they were derived from studies with serious methodological limitations. It is also sometimes possible to conclude that too few studies have been done to allow a general pattern to emerge. It is even possible to conclude that, despite the rigor and skills of the researchers involved, we as a community might simply be getting it wrong.

There is, however, another possibility. What if all of the studies are right? What if an MMI run in one specific format, at one specific school, using one specific content set, and conducted with one specific applicant pool measures a construct that is more personal than academic, whereas a different MMI with its own format, institution, content, and applicant pool measures a construct that is more academic than personal? What if the specific details of one MMI administration yield better or worse promotion of different indices of diversity than those of another? What if variables related to the interviewers—like those offered by Henderson et al,2 when writing that “interviewers at [traditional interview] schools were not blinded to the [American Medical College Application System] application, so they may have had knowledge of applicants’ [self-identified disadvantaged] status which, in turn, may have influenced their interview scores”—play a modifying role in the relationships between MMI scores and any of the variables listed above? With an impressive sample size and even more impressive effort to gather data across multiple schools from multiple admissions cycles, Henderson et al have the opportunity to extend their research by checking within their existing data to see just how consistent the relationships are even within their consortium of five California public medical schools. This would create the potential to start looking at what additional factors might be influential as well as provide the reassurance of replication, a research feature that is of particular value when effects are small and not grounded in theory that allows a priori prediction.

Regardless of the outcome of such additional analyses, the inconsistent and at times contrary results of MMI studies need not repudiate the MMI’s value; rather, there is another way of thinking about the appropriate implementation of MMIs alongside other assessment measures. The Association of American Medical Colleges (AAMC) has described the use of an admissions assessment “toolbox,” reflecting the reality that no one assessment tool can provide all the desired information for purposes of selection.28 The AAMC has also defined Core Competencies for Entering Medical Students that are felt to be important to medical schools,29 and the assessment of some of these competencies might be facilitated through the use of centrally generated data like GPA, Medical College Admission Test (MCAT) score, and/or results of a situational judgment test. At present, though, none of the centrally available data provide desirable information specific to subsets of medical schools, such as whether applicants have qualities to become providers of care to inner-city underserved or rural populations, or to become medical researchers, experts in public health, or leaders in health care system change. For competencies that cannot reasonably be measured centrally, the onus falls on individual schools to develop, implement, and maintain quality assurance on tools that fill the gaps. As schools do so, it is important to bear in mind that the MMI is not an assessment tool but rather an assessment method; it is a format for assessment that can be generated and implemented in an infinite number of ways, not a test that is administered with the exact same blueprint and structure each and every time. An MMI is thus equivalent to a multiple-choice test, not to the MCAT exam.

From this perspective, MMIs should be thought of as ideally being highly individualized to the mission of a specific school (or small band of like-minded schools) and its typical applicant pool. Although such a stance may seem to place considerable onus on individual schools, this responsibility is the reality for every school that chooses to use any locally developed form of assessment, such as an objective structured clinical examination, personal interview, or end-of-course exam. The greater the local influence, the greater the responsibility for quality assurance that falls to local practitioners. Each school that recognizes its unique needs, identifies gaps not filled by nationally available data, and decides to use an MMI also accepts the necessity of determining the overall test reliability, predictive validity, and resource and diversity implications of the admissions decisions made through its selection processes. Those results might never be published in peer-reviewed journals or elucidate any national patterns in what MMIs measure, but they are crucial for local consumption nonetheless. Redundancy can be reduced by combining forces to synchronize interviews across schools with similar ideologies and geographical proximity, as has been done by the three purely Francophone schools in Quebec, Canada.30

In Conclusion

Between schools or subsets of schools, there may be differences in MMI construction, content, and format, and as a by-product, differences in study results. In fact, we would argue there should be such differences, precisely because schools have different priorities arising from their own local needs. Heterogeneity in the application of local assessment methodologies need not be a weakness but, rather, may reflect the ability of schools to modify best practices to meet their specific needs. Vive la différence!

References

1. Eva KW, Rosenfeld J, Reiter HI, Norman GR. An admissions OSCE: The multiple mini-interview. Med Educ. 2004;38:314–326.
2. Henderson MC, Kelly CJ, Griffin E, et al. Medical school applicant characteristics associated with performance in multiple mini-interviews versus traditional interviews: A multi-institutional study. Acad Med. 2018;93:1029–1034.
3. Oliver T, Hecker K, Hausdorf PA, Conlon P. Validating MMI scores: Are we measuring multiple attributes? Adv Health Sci Educ Theory Pract. 2014;19:379–392.
4. Roberts C, Zoanetti N, Rothnie I. Validating a multiple mini-interview question bank assessing entry-level reasoning skills in candidates for graduate-entry medicine and dentistry programmes. Med Educ. 2009;43:350–359.
5. Knorr M, Hissbach J. Multiple mini-interviews: Same concept, different approaches. Med Educ. 2014;48:1157–1175.
6. Kulasegaram K, Reiter HI, Wiesner W, Hackett RD, Norman GR. Non-association between Neo-5 personality tests and multiple mini-interview. Adv Health Sci Educ Theory Pract. 2010;15:415–423.
7. Jerant A, Griffin E, Rainwater J, et al. Does applicant personality influence multiple mini-interview performance and medical school acceptance offers? Acad Med. 2012;87:1250–1259.
8. Griffin B, Wilson I. Associations between the big five personality factors and multiple mini-interviews. Adv Health Sci Educ Theory Pract. 2012;17:377–388.
9. Terregino CA, McConnell M, Reiter HI. The effect of differential weighting of academics, experiences, and competencies measured by multiple mini interview (MMI) on race and ethnicity of cohorts accepted to one medical school. Acad Med. 2015;90:1651–1657.
10. Jerant A, Fancher T, Fenton JJ, et al. How medical school applicant race, ethnicity, and socioeconomic status relate to multiple mini-interview-based admissions outcomes: Findings from one medical school. Acad Med. 2015;90:1667–1674.
11. Eva KW, Reiter HI, Rosenfeld J, Norman GR. The ability of the multiple mini-interview to predict preclerkship performance in medical school. Acad Med. 2004;79(10 suppl):S40–S42.
12. Reiter HI, Lockyer J, Ziola B, Courneya CA, Eva K; Canadian Multiple Mini-Interview Research Alliance (CaMMIRA). Should efforts in favor of medical student diversity be focused during admissions or farther upstream? Acad Med. 2012;87:443–448.
13. Hecker K, Donnon T, Fuentealba C, et al. Assessment of applicants to the veterinary curriculum using a multiple mini-interview method. J Vet Med Educ. 2009;36:166–173.
14. Oyler DR, Smith KM, Elson EC, Bush H, Cook AM. Incorporating multiple mini-interviews in the postgraduate year 1 pharmacy residency program selection process. Am J Health Syst Pharm. 2014;71:297–304.
15. Ross M, Walker I, Cooke L, et al. Are female applicants rated higher than males on the multiple mini-interview? Findings from the University of Calgary. Acad Med. 2017;92:841–846.
16. Barbour ME, Sandy JR. Multiple mini interviews for selection of dental students: Influence of gender and starting station. J Dent Educ. 2014;78:589–596.
17. Pau A, Chen YS, Lee VK, Sow CF, Alwis R. What does the multiple mini interview have to offer over the panel interview? Med Educ Online. 2016;21:29874.
18. Traynor M, Galanouli D, Roberts M, Leonard L, Gale T. Identifying applicants suitable to a career in nursing: A value-based approach to undergraduate selection. J Adv Nurs. 2017;73:1443–1454.
19. Gale J, Ooms A, Grant R, Paget K, Marks-Maran D. Student nurse selection and predictability of academic success: The multiple mini interview project. Nurse Educ Today. 2016;40:123–127.
20. Cox WC, McLaughlin JE, Singer D, Lewis M, Dinkins MM. Development and assessment of the multiple mini-interview in a school of pharmacy admissions model. Am J Pharm Educ. 2015;79:53.
21. Kelly ME, Dowell J, Husbands A, et al. The fairness, predictive validity and acceptability of multiple mini interview in an internationally diverse student population—A mixed methods study. BMC Med Educ. 2014;14:267.
22. O’Brien A, Harvey J, Shannon M, Lewis K, Valencia O. A comparison of multiple mini-interviews and structured interviews in a UK setting. Med Teach. 2011;33:397–402.
23. Hofmeister M, Lockyer J, Crutcher R. The multiple mini-interview for selection of international medical graduates into family medicine residency education. Med Educ. 2009;43:573–579.
24. Moreau K, Reiter H, Eva KW. Comparison of aboriginal and nonaboriginal applicants for admissions on the multiple mini-interview using aboriginal and nonaboriginal interviewers. Teach Learn Med. 2006;18:58–61.
25. Griffin B, Hu W. The interaction of socio-economic status and gender in widening participation in medicine. Med Educ. 2015;49:103–113.
26. Taylor CA, Green KE, Spruce A. Evaluation of the effect of socio-economic status on performance in a multiple mini interview for admission to medical school. Med Teach. 2015;37:59–63.
27. Raghavan M, Martin BD, Burnett M, et al. Multiple mini-interview scores of medical school applicants with and without rural attributes. Rural Remote Health. 2013;13:2362.
28. Kirch DG. A word from the president: A new paradigm for evaluating future physicians. AAMC Reporter. May 2015. https://www.aamc.org/about/leadership/kirch-word-from-president/431902/word.html. Accessed October 26, 2017.
29. Association of American Medical Colleges. Core competencies for entering medical students. https://www.aamc.org/admissions/admissionslifecycle/409090/competencies.html. Accessed October 26, 2017.
30. Université de Sherbrooke. The doctoral program in medicine. Your 2018 admission process: A 5-step journey [in French]. https://www.usherbrooke.ca/doctorat-medecine/admission/mem/. Accessed October 26, 2017.
Copyright © 2017 by the Association of American Medical Colleges