Scholarly Perspectives

Standardization in the MSPE: Key Tensions for Learners, Schools, and Residency Programs

Hauer, Karen E. MD, PhD; Giang, Daniel MD; Kapp, Meghan E. MD; Sterling, Robert MD

doi: 10.1097/ACM.0000000000003290

Abstract

The Medical Student Performance Evaluation (MSPE), a summary of the academic and professional performance of students during their undergraduate medical education (UME) that provides salient information to residency programs during the selection process, faces persistent criticisms of heterogeneity and obscurity. Specifically, MSPEs do not always provide the same information about students, especially students from different schools, and important information is not always easy to find or interpret. To address these concerns, a key guiding principle from the Recommendations for Revising the MSPE Task Force of the Association of American Medical Colleges (AAMC) was to achieve “a level of standardization and transparency that facilitates the residency selection process.”1 Early experience with the new MSPE format shows progress toward this aim; most schools have adopted at least some components of the newly recommended format. Despite this effort, debates continue about the merits and mechanisms of operationalizing standardization, and some have called for even greater standardization—specifying additional required information for all students across all schools.2 In this article, members of the AAMC Recommendations for Revising the MSPE Task Force (hereafter, the MSPE Task Force), a collaboration of educators of medical students and residents who produced the new recommendations, explore the tensions inherent in the pursuit of standardizing the MSPE.

Defining Standardization

Standardization entails establishing a defined expectation and comparing something against it. Across industries, benefits of standardization include clearer procedures and defined benchmarks or metrics for performance. Standardization can promote adherence to guidelines that articulate optimal, evidence-based approaches to accomplishing tasks or conducting work. Ultimately, standardization represents a strategy that can enable the achievement of defined goals and the improvement of quality. In the educational context, rigorous, standardized assessment can constitute an important mechanism to ensure the accountability of the system for all learners, including those with varied backgrounds and socioeconomic resources.3 The widespread adoption of competency-based medical education across the continuum from UME to graduate medical education (GME) presents an opportunity to standardize the language used to characterize performance.4

In the context of the MSPE, standardization aligns with the current purpose of the document: to serve as a record of each student’s performance rather than a letter of recommendation. The MSPE should document performance to support inferences about each student’s potential.5 Historically, the striking lack of standardization in the MSPE has vexed readers and threatened the value of the document. MSPE length, format, and content have varied widely, and many readers have characterized large portions of the MSPE as unhelpful.6 Faculty members from individual schools have employed highly distinct language and ranking methods that have required readers, often residency program directors (PDs), to interpret coded language and calculate their own inexact comparisons across schools.7 Another concern is that some schools have used the MSPE as a verbose letter of recommendation while simultaneously withholding or obscuring information about some students’ standing within the class or their academic struggles.2 (Consequently, standardized letters of recommendation for student applicants to residency programs in highly competitive specialties have been gaining popularity.8–10) Early experience with the newly recommended MSPE format1 suggests not only that standardization has improved but also that satisfaction has increased among MSPE authors and end users alike.11 Yet, some PDs remain dissatisfied with ongoing variability in aspects of the MSPE across applicants from different schools.7

Standardization is a process goal in service to the outcome goal of an informative MSPE

Medical schools and residency programs share the goal of achieving an MSPE that is clear, useful, and readable. The desired outcome is the presentation of useful information to document a student’s performance and inform residency selection decisions. Standardizing the MSPE format, which will benefit readers and the faculty and staff who prepare the document, may be a means to this outcome. MSPE standardization entails both consistency in the format or structure of the document and similarity in the included content (e.g., descriptions of students and their academic record). While the advantages of consistency and standardization may seem to be a foregone conclusion, they raise questions deserving of more consideration.

Defining the work of the MSPE Task Force

As a culminating record of a student’s medical school experience, the MSPE summarizes 4 or more years of learning. Consistency of content across MSPEs aligns with the call for “standardization of learning outcomes and individualization of the learning process.”6 The MSPE must balance, on one hand, evidence from each medical school that it has fulfilled its obligation to ensure that all students have achieved a level of competence justifying advancement to GME with, on the other hand, information about each individual student’s trajectory and accomplishments. Providing all this information in a standard format is widely regarded as a strategy to enable efficient information uptake by selection committee members. The MSPE Task Force defined the boundaries of its work around the format and general content of the MSPE, not curricula within schools, and it recognized that variations in objectives, competencies, curricula, and assessment strategies appropriately exist to reflect each school’s unique mission and values.

In this article, we discuss key tensions for the MSPE related to standardization affecting students, medical schools, and residency programs. We have identified 5 tensions that include arguments in favor of, as well as threats to, standardization: (1) presenting each student’s individual characteristics and strengths in a way that is relevant, while also working with a standard format and providing standard content; (2) showcasing school-specific curricular strengths while also demonstrating standard evidence of readiness for internship; (3) defining and achieving the right amount of standardization so that the MSPE provides useful information, adds value to the residency selection process, and is efficient to read and understand; (4) balancing reporting with advocacy; and (5) maintaining standardization over time, especially given the tendency for the MSPE format and content to drift across schools. Here, members of the AAMC MSPE Task Force share how the group debated and reconciled these tensions. Table 1 provides various stakeholders’ perspectives on key tensions with the MSPE.

Table 1:
Stakeholders’ Perspectives on Standardizing the Medical Student Performance Evaluation (MSPE)

Key Tensions

Tension 1: Presenting each student’s individual characteristics and strengths in a way that is relevant, while also working with a standard format and providing standard content

The MSPE serves the dual purposes of characterizing each student as a unique individual and presenting comparative data, such as grades or a rank where available, to enable the assessment of each student relative to others. The MSPE should highlight each student’s individual characteristics and relative strengths to demonstrate fit for a particular residency program. The MSPE Task Force extensively debated how to standardize—for each student across schools—the format of the 3 noteworthy characteristics (i.e., the 3 bulleted items that highlight an applicant’s salient experiences and attributes).1 The task force recognized that for some schools, the new format would mean greatly abridging the detailed descriptions of students’ backgrounds, which members deemed too lengthy to be useful to residency programs as part of the MSPE. The task force concluded that additional student background information was better suited to individual letters of recommendation or personal statements. Showcasing aspects of a student’s background and experiences in a more standardized way can level the playing field and increase equity in how students are presented in the MSPE across schools—but only if readers read and value this information for the residency selection process.

MSPE standardization may constitute a threat to students, and the potential threat depends upon how standardization is accomplished. The MSPE process should contribute to a fair experience in the transition from UME to GME for all learners. The risk of bias is an important consideration. An MSPE focusing solely on grades, scores, and rank can seemingly promote a meritocracy but, in reality, may disadvantage students from more diverse backgrounds or those who have overcome unusual challenges to reach and/or complete medical school. For example, students may come from under-resourced communities, be the first in their family to attend college, or have experienced illness during medical school. These examples of students who have “traveled a greater distance” highlight the complex definition of fairness,12 which encompasses both equity (comparability of learning opportunities and assessment in the context of students’ unique backgrounds) and equality (similar consideration of similar metrics for all learners).

A highly standardized MSPE that emphasizes academic metrics available for all students threatens holistic selection procedures put into place by residency programs.13 Highly standardized reports of grades and scores can result in the neglect or even exclusion of other MSPE content, such as narrative descriptions of performance or the context of a particular student’s performance. If the MSPE becomes a long version of the transcript, duplicating a list of grades and scores, it loses value. Hence, the MSPE Task Force maintained its commitment to promoting the inclusion of readily interpretable performance data, along with high-yield, succinct narrative information.

Standardized presentation of metrics is useful to readers only when a range of ratings is assigned across a class. Grade inflation, which is common in core clerkships, threatens standardization because most or all students may receive the highest ratings.14,15 Variable definitions of grades (What is an A or honors?), variable awarding of those grades to different percentages of the class using different criteria (Who earns an A/honors and how?), and variable presentation of grades (Is an A the same as highest honors?) challenge residency selection committee members who must compare students across schools.11 Means of countering this variability in the MSPE include using well-designed rating forms with behavior-based anchors to characterize levels of performance and providing faculty and resident supervisors with training on how to use and interpret the rating scales. Further, capping the number of students who receive top grades or providing a defined list of summary adjectives may help ensure a spread of ratings within a class. Importantly, however, this use of normative rather than criterion-based ratings compares students with one another rather than with an absolute standard. An unintended consequence of presenting grades that compare students with their classmates is the heightened concern, especially among students, that pressure to earn top marks is detrimental to student well-being, peer collaboration, and the learning environment.16–18

Another risk, however, is overstandardization; that is, every MSPE documenting the same achievement of the same learning outcomes at the same level, such that the document appears identical across applicants. For example, if achievement of the national core entrustable professional activities were to become a widespread graduation requirement, then documenting this accomplishment would be standard in all MSPEs. The only exception would be the very small number of students who fail to achieve as expected and who might then be deemed unmatchable by programs. The move to competency-based and programmatic assessment systems, already adopted by some schools, deemphasizes norm-referenced performance ratings such as grades, class ranks, or honor society designations in favor of determining whether learners have met expectations and emphasizing learning.19,20 An MSPE that does not provide academic or professional information to distinguish students does not enable residency programs to discern each student’s affinity for certain career paths or how that applicant would fit into a particular program.

Tension 2: Showcasing school-specific curricular strengths while also demonstrating standard evidence of readiness for internship

The MSPE can showcase a school’s particular curricular strengths to inform residency programs about graduates’ fit for a residency, and it must also demonstrate that each student has met the school’s standard evidence of readiness for internship. PDs of residency programs with certain missions, such as caring for a particular patient population, seek evidence in the MSPE to guide them in finding applicants who are the best fit for their program.

Overstandardization may eliminate information about a school’s curricular strengths that illustrates how candidates’ individual training backgrounds prepare them for particular residency programs. Absent this information, residency PDs rely on information that may be less consistently available or less reliable (e.g., national reputation, colleagues’ knowledge) to learn about graduates of particular schools. The MSPE Task Force debated the need to balance information about school-specific curricula with residency PDs’ desire to avoid reading excessive information that is not specific to a particular applicant. The task force settled on the recommendation to colocate information about the medical school’s curriculum and assessment practices in a final one-page appendix to the document and to place comparative information about a student relative to peers within the body of the MSPE itself.

Tension 3: Defining and achieving the right amount of standardization so that the MSPE provides useful information, adds value to the residency selection process, and is efficient to read and understand

Residency program leaders want the MSPE to include valuable, actionable information, yet they also want it to be easy to read and understand. For residency PDs, benefits of a standardized format include more efficient reading and searching. Standardizing the structure of the MSPE facilitates finding information quickly across a large number of lengthy documents. Consistent metrics presented in a uniform format enable the comparison of students within schools and potentially across schools. Standardization in the MSPE yields welcome efficiency not only for readers of the document but also for author teams. Faculty authors, as well as the staff who support them in compiling information about each student’s performance, benefit from the efficiencies inherent in using a template to prepare and organize the document. A standardized format also facilitates using technology within, and potentially across, schools to automate MSPE production.

Lack of usability is both a threat to and a critique of the MSPE. Some residency PDs do not actually read it or do not read it early enough in the residency selection process to guide decisions about extending interview invitations.21,22 A solution to this challenge may entail creating an electronically searchable MSPE available through the Electronic Residency Application Service (ERAS) to enable residency selection committee members to go directly to specific sections. Of course, the countervailing risk is that some sections deemed less useful for comparing applicants might go unused, despite the time and effort put into compiling them.

Tension 4: Balancing reporting with advocacy

Medical schools report information in the MSPE to provide residency programs with an honest summary of each student’s background, experiences, and performance; however, medical schools also feel obligated to ensure student success in the Match. Some medical school educators may be reluctant to follow the recommendations for standardized reporting of students’ performance concerns. They may feel compelled to advocate for students to help them match into the specialties or programs they desire, which in turn demonstrates the school’s success to external stakeholders and accreditors.23

Notably, the MSPE has the potential to convey relevant information about medical students’ professionalism. Relaying such information is important since unprofessional behaviors during medical school are associated with later disciplinary actions by medical boards.24 The MSPE, though, has fallen short in reporting honest information about professional behavior.25 Medical school educators realize that disclosing a student’s unprofessional activity in the MSPE might result in that student not matching to any program. Ideally, however, such a disclosure would allow a residency program to support the trainee’s development toward demonstrating competence in professionalism. Standardized reporting of professionalism lapses in the MSPE, coupled with resources and strategies for residency programs to support resident trainees’ professionalism development, could facilitate matching while enabling programs to receive and act upon transparent information about professionalism. This targeted education or training could reassure medical staff and medical boards of the physician’s professionalism.

Tension 5: Maintaining standardization over time, especially given the tendency for the MSPE format and content to drift across schools

Incentives or requirements may help maintain MSPE standardization over time. Incentives to adhere to recommendations depend upon the availability of guidelines and templates demonstrating the preferred format. The use of data mining to produce individualized school-based reports from a common platform would facilitate information display and sharing. In addition, feedback from residency programs to medical schools about which components are useful and which are less helpful can drive future decisions about MSPE content. Technological solutions can serve to mandate adherence to standard formatting requirements. For example, an MSPE template in ERAS would require authors to follow prescribed content domains and word count limits. Importantly, ongoing partnership and dialogue between UME and GME about the transition to residency might help sustain the commitment of both to use a standard format and build trust across the educational continuum.

Threats to maintaining a standardized MSPE format arise from the absence of supporting systems and from changes in leadership. Standardization is more beneficial when more schools participate, but, thus far, uptake of past recommendations has varied.26 Members of the MSPE Task Force identified multiple examples in which turnover among medical school leaders led to decay in adherence to a standard MSPE format; new leaders are often eager to try a new approach. Systems that operationalize the recommendations remove the temptation to introduce unneeded variability, even across schools and changes in administration.

Concluding Remarks

Promoting standardization of the MSPE remains a laudable goal, but until the UME and GME communities resolve underlying differences in understanding the purpose of the MSPE, meeting all stakeholders’ goals will be difficult. Schools experience pressure from residency programs to rank or classify students—and competing pressure from students and accreditation bodies to ensure students’ success in the Match.

The MSPE and licensing exam scores

Without the detailed comparative performance information currently in the MSPE (e.g., rank, grades), United States Medical Licensing Examination (USMLE) scores have carried greater weight in the application process.21 Educators have previously questioned the use of USMLE scores for a purpose not originally intended, and indeed, while this article was in production, the National Board of Medical Examiners announced that USMLE Step 1 would transition to pass/fail score reporting.27 In the absence of an MSPE that provides useful, readily available information to compare students, and given the upcoming change in Step 1 score reporting, it is unclear whether the USMLE Step 2 Clinical Knowledge (CK) score will become the metric used to compare applicants or whether other changes to the Match will prompt new methods of applicant selection. Step 2 CK, like Step 1, provides limited information about students’ communication skills, professionalism, or life experiences, all of which shape much of a physician’s practice. The MSPE offers an opportunity to highlight students’ outstanding abilities in competency domains beyond medical knowledge (e.g., service, communication) to counterbalance the historical overreliance on USMLE scores.

Notably, the MSPE Task Force ultimately concluded that the MSPE itself can improve but cannot alone solve the problems associated with the UME-to-GME transition. Additional solutions may entail more nuanced assessment methods in medical school, combined with changes to the residency application and match procedures.

Going forward

Development of a comprehensive strategy to address challenges with the current UME–GME transition could enhance the Match process and achieve better fit between applicants and residency programs.28 In the meantime, the standardized MSPE may balance the need for consistent information across students and schools, useful for comparing applicants, with a description of the unique characteristics of each student. Maximizing the value of descriptive information about each student through high-quality narratives, based on direct observation of the student’s performance over time, holds promise for enhancing the usefulness of the MSPE. These narratives correlate with other measures of performance, but more study is needed to determine how, or whether, they can feasibly be used to select residents and predict their future performance.29

Students need educators to develop, and consistently communicate throughout medical school, what information will be included in the MSPE and how it will be presented. Expectations should be clear up front because they serve as one mechanism to guide students’ learning and their pursuit of extracurricular activities. The adage that assessment drives learning holds true in medical school; students choose to focus more time on the exams or clerkship activities that they perceive are most likely to earn them high grades and opportunities for competitive residencies.30,31

Looking to the future, the MSPE should be part of the solution to the larger problem of the transition from UME to GME. Ongoing efforts are needed to monitor the effects of implementing the MSPE recommendations and the use of all components of the application, including letters of recommendation. Educators should remain vigilant: with greater standardization, anything that is not part of the standardized format risks being deemphasized by students, schools, and residency programs. The goals of accurate information sharing and transparency in the MSPE—for individual students, schools, and residency programs—will contribute to a residency selection process that is fair, equitable, and trustworthy.

Acknowledgments:

The authors acknowledge the members of the Medical Student Performance Evaluation Task Force and staff of the Association of American Medical Colleges: Lee Jones, MD (Chair), Amy Adams, Deborah Clements, MD, John Graneto, DO, MEd, Cynda Johnson, MD, MBA, Cecile Maas, Hilit Machaber, MD, and Geoffrey Young, PhD.

References

1. Association of American Medical Colleges. Recommendations for Revising the Medical Student Performance Evaluation (MSPE). https://www.aamc.org/download/470400/data/mspe-recommendations.pdf. Published May 2017. Accessed February 28, 2020.
2. Andolsek KM. Improving the medical student performance evaluation to facilitate resident selection. Acad Med. 2016;91:1475–1479.
3. Murray E, Gruppen L, Catton P, Hays R, Woolliscroft JO. The accountability of clinical education: Its definition and assessment. Med Educ. 2000;34:871–879.
4. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645.
5. American Educational Research Association (AERA), American Psychological Association (APA), National Council on Measurement in Education (NCME). Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association; 2014.
6. Swide C, Lasater K, Dillman D. Perceived predictive value of the Medical Student Performance Evaluation (MSPE) in anesthesiology resident selection. J Clin Anesth. 2009;21:38–43.
7. Boysen Osborn M, Mattson J, Yanuck J, et al. Ranking practice variability in the Medical Student Performance Evaluation: So bad, it’s “good”. Acad Med. 2016;91:1540–1545.
8. Wang RF, Zhang M, Alloo A, Stasko T, Miller JE, Kaffenberger JA. Characterization of the 2016-2017 dermatology standardized letter of recommendation. J Clin Aesthet Dermatol. 2018;11:26–29.
9. Bajwa NM, Yudkowsky R, Belli D, Vu NV, Park YS. Validity evidence for a residency admissions standardized assessment letter for pediatrics. Teach Learn Med. 2018;30:173–183.
10. Friedman R, Fang CH, Hasbun J, et al. Use of standardized letters of recommendation for otolaryngology head and neck surgery residency and the impact of gender. Laryngoscope. 2017;127:2738–2745.
11. Hook L, Salami AC, Diaz T, Friend KE, Fathalizadeh A, Joshi ART. The revised 2017 MSPE: Better, but not “outstanding”. J Surg Educ. 2018:e107–e111.
12. Colbert CY, French JC, Herring ME, Dannefer EF. Fairness: The hidden challenge for competency-based postgraduate medical education programs. Perspect Med Educ. 2017;6:347–355.
13. Conrad SS, Addams AN, Young GH. Holistic review in medical school admissions and selection: A strategic, mission-driven response to shifting societal needs. Acad Med. 2016;91:1472–1474.
14. Fazio SB, Papp KK, Torre DM, Defer TM. Grade inflation in the internal medicine clerkship: A national survey. Teach Learn Med. 2013;25:71–76.
15. Bowen RE, Grant WJ, Schenarts KD. The sum is greater than its parts: Clinical evaluations and grade inflation in the surgery clerkship. Am J Surg. 2015;209:760–764.
16. Bullock JL, Lai CJ, Lockspeiser T, et al. In pursuit of honors: A multi-institutional study of students’ perceptions of clerkship evaluation and grading. Acad Med. 2019;94(11 suppl):S48–S56.
17. Slavin SJ, Schindler DL, Chibnall JT. Medical student mental health 3.0: Improving student wellness through curricular changes. Acad Med. 2014;89:573–577.
18. Reed DA, Shanafelt TD, Satele DW, et al. Relationship of pass/fail grading and curriculum structure with well-being among preclinical medical students: A multi-institutional study. Acad Med. 2011;86:1367–1373.
19. Holmboe ES, Sherbino J, Long DM, Swing SR, Frank JR. The role of assessment in competency-based medical education. Med Teach. 2010;32:676–682.
20. van der Vleuten CP, Schuwirth LW, Driessen EW, et al. A model for programmatic assessment fit for purpose. Med Teach. 2012;34:205–214.
21. National Resident Matching Program, Data Release and Research Committee. Results of the 2018 NRMP Program Director Survey. https://www.nrmp.org/wp-content/uploads/2018/07/NRMP-2018-Program-Director-Survey-for-WWW.pdf. Published June 2018. Accessed February 28, 2020.
22. Puscas L. Viewpoint from a program director: They can’t all walk on water. J Grad Med Educ. 2016;8:314–316.
23. Liaison Committee on Medical Education. Functions and Structure of a Medical School: Standards for Accreditation of Medical Education Programs Leading to the MD Degree. http://lcme.org/publications/#All. Published March 2019. Accessed February 13, 2020.
24. Papadakis MA, Teherani A, Banach MA, et al. Disciplinary action by medical boards and prior behavior in medical school. N Engl J Med. 2005;353:2673–2682.
25. Shea JA, O’Grady E, Wagner BR, Morris JB, Morrison G. Professionalism in clerkships: An analysis of MSPE commentary. Acad Med. 2008;83(10 suppl):S1–S4.
26. Boysen-Osborn M, Yanuck J, Mattson J, et al. Who to interview? Low adherence by U.S. medical schools to medical student performance evaluation format makes resident selection difficult. West J Emerg Med. 2017;18:50–55.
27. National Board of Medical Examiners. United States Medical Licensing Examination. USMLE program announces future policy changes. https://www.nbme.org/news/index.html#policy-changes-USMLE-Step-Examinations. Published February 18, 2020. Accessed February 28, 2020.
28. Summary Report and Preliminary Recommendations from the Invitational Conference on USMLE Scoring (InCUS), March 11-12, 2019. https://www.usmle.org/pdfs/incus/InCUS_summary_report.pdf. Accessed February 17, 2020.
29. Hatala R, Sawatsky AP, Dudek N, Ginsburg S, Cook DA. Using in-training evaluation report (ITER) qualitative comments to assess medical students and residents: A systematic review. Acad Med. 2017;92:868–879.
30. Scott IM. Beyond ‘driving’: The relationship between assessment, performance and learning. Med Educ. 2020;54:54–59.
31. Bennett CR, Mawhood N, Platt MJ. The impact of medical school assessment on preparedness for practice. Med Teach. 2019;41:112–114.

Reference cited in Table 1 only

32. Irby DM, Cooke M, O’Brien BC. Calls for reform of medical education by the Carnegie Foundation for the Advancement of Teaching: 1910 and 2010. Acad Med. 2010;85:220–227.
Copyright © 2020 by the Association of American Medical Colleges