Innovation Reports

Reporting Achievement of Medical Student Milestones to Residency Program Directors: An Educational Handover

Sozener, Cemal B. MD; Lypson, Monica L. MD, MHPE; House, Joseph B. MD; Hopson, Laura R. MD; Dooley-Hash, Suzanne L. MD; Hauff, Samantha MD; Eddy, Mary MD; Fischer, Jonathan P. MPH; Santen, Sally A. MD, PhD

Author Information
doi: 10.1097/ACM.0000000000000953

Abstract

Problem

Medical education has undergone a paradigm shift toward competency-based education. This transition within graduate medical education (GME) began with the Accreditation Council for Graduate Medical Education (ACGME) competencies in 1999; the ACGME Outcomes Project in 2002 further fostered the shift. In 2009, the move toward competency-based education gained further momentum with the Milestone Project, which created specialty-specific milestones that mark progress from medical school graduation through residency and into practice (see Appendix 1 for examples from emergency medicine [EM]).

The Liaison Committee on Medical Education requires MD-granting medical schools in the United States to define educational objectives for their students, and many schools have adopted the objectives outlined in the six core competency domains advanced by the ACGME. Until recently, however, the expectations for graduates from U.S. MD-granting schools have not been clearly articulated. The recent publication of expected “level-one milestone achievement” for new residents has begun to fill that gap, and the 2013 publication of Core Entrustable Professional Activities for Entering Residency (CEPAER)1 has further formalized the expected performance of interns.

As the ACGME has moved to competency outcomes and the assessment of specialty-specific milestones, evidence is emerging that medical school graduates have not consistently met level 1 milestones prior to entering residency.2 Not only has the medical education community not clearly documented competency during the transition from undergraduate medical education (UME) to GME, but the community also has not clarified where the responsibility for ensuring level 1 competency of graduates falls.

Currently, the primary document—outside of letters of recommendation and the medical school transcript—used to transmit detailed information on student performance from medical schools to residencies is the medical student performance evaluation (MSPE; formerly the dean’s letter). For some schools this document includes information about each student’s general competency; for many others, however, the traditional MSPE does not provide the level of detail needed to determine the milestone levels of entering interns. Further, this traditional letter, which has undergone several iterations based on guidelines from the Association of American Medical Colleges,3 often requires substantial effort to complete. Despite that effort, the content and quality of these letters vary widely, which in turn limits their utility. In addition, some residency program directors (PDs) perceive the MSPE as unreliable because key information on student competency may not be communicated through this document.3–5 Finally, the MSPE is completed early in the fourth year, almost a year before graduates start internship.

We therefore proposed a post-Match milestone-based MSPE (mMSPE) documenting students’ attainment of key competencies that medical schools can use to hand graduates over to residencies. We conducted this feasibility study to assess milestones for graduating medical students entering EM residencies, to create a post-Match mMSPE for each graduate, and to determine whether the mMSPE could provide the receiving residency PD with useful, detailed performance data on the trainee. This innovation—the creation of a unique mMSPE for each graduating, EM-residency-bound student—may offer a path forward by facilitating the medical school’s responsibility for assessing the competency of its graduates and communicating their competency levels during a UME-GME educational handover.

Approach

Leaders from the University of Michigan Medical School (i.e., clerkship [J.B.H., C.B.S.], residency [L.R.H., S.H.], and boot camp [S.L.D.-H.] PDs, plus two assistant deans [M.L.L., S.A.S.]) formed the ad hoc Emergency Medicine Medical Student Milestone Competency Committee (hereafter, simply the Committee) to complete this project. The goals were (1) to develop a post-Match mMSPE for University of Michigan medical students matching into EM residencies, (2) to provide that document to the residency PD where each student matched, and (3) to request feedback from the PDs at these programs on the utility of the mMSPE. In the spring of 2013, representatives of the Committee approached each of the seven graduating students who matched into EM residencies (not at the University of Michigan) and gained their consent to participate in this pilot study, which the University of Michigan institutional review board determined to be exempt.

In April 2013, the Committee reviewed the medical school’s clinical curriculum, identified existing assessments that provided competency-based performance data on students, and mapped these assessments to the EM milestones6 (three-hour time commitment). The first source of performance data was the mandatory fourth-year EM clerkship assessment, which provided both numerical data (clinical assessment scores) and written data (the perceived strengths and weaknesses noted by supervising faculty and residents) on each student. The second source was the Comprehensive Clinical Exam, a mandatory multistation, multimodality summative clinical examination involving standardized patients that is administered at the end of the third year. The third source of assessment was an EM boot camp elective that most 2013 graduates (n = 4) entering an EM residency chose to take late in their fourth year. Finally, the Committee used other available sources, such as USMLE scores. The Committee did not include assessment data from subinternships because students chose among different non-EM disciplines.
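To make the mapping step concrete, below is a minimal sketch in Python of how such an assessment-to-milestone map could be represented. This is a hypothetical structure, not the Committee’s actual instrument: milestone codes PC6, PC12, and SBP1 and their assessment status come from this report, while all other entries and source names are illustrative.

```python
# Hypothetical assessment-to-milestone map. The PC6, PC12, and SBP1 entries
# reflect details reported in this study; every other entry is illustrative.
MILESTONE_SOURCES: dict[str, list[str]] = {
    "PC12": ["EM boot camp"],  # ultrasound: assessed only in the boot camp elective
    "PC6": [],                 # observation/reassessment: taught, not consistently assessed
    "SBP1": [],                # patient safety: neither taught nor assessed
    "ICS1": ["EM clerkship", "Comprehensive Clinical Exam"],  # illustrative entry
    # ...entries for the remaining EM milestones...
}

def unmapped_milestones(mapping: dict[str, list[str]]) -> list[str]:
    """Return milestone codes with no mapped assessment source,
    flagging curricular/assessment gaps for committee review."""
    return [code for code, sources in mapping.items() if not sources]

print(unmapped_milestones(MILESTONE_SOURCES))  # ['PC6', 'SBP1']
```

A map like this makes gaps visible before any student data are reviewed, which is how a school could discover, as we did, that some milestones cannot be consistently assessed from the existing curriculum.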

Next, as a group, the Committee reviewed all of the assessment data for each student (five-hour time commitment). Committee members mapped assessment data to the EM milestones6 and, using a milestone assessment rubric (Appendix 1), assigned milestone achievement levels. In cases of conflicting data, the Committee used the most recent assessment data, anticipating improvement throughout the year; on the rare occasions when data were ambiguous, the Committee made decisions by consensus. Finally, the Committee generated an mMSPE for each student (two-hour time commitment; see Appendix 2 for an example). The Committee offered each letter to the corresponding student to review; the students reacted favorably, and no changes were made. In July 2013, after graduation, each letter was sent to the respective graduate’s PD along with a survey requesting feedback to help the Committee determine the utility of the mMSPE.
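As a hedged illustration of the rule for conflicting data (prefer the most recent assessment), the following sketch shows one way to encode it; the class and function names, dates, and levels are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Observation:
    """One milestone-level judgment derived from a single assessment."""
    milestone: str   # e.g., "PC12"
    level: int       # milestone level supported by this assessment
    observed: date   # date the assessment occurred

def assign_level(observations: list[Observation]) -> int | None:
    """Resolve conflicting data by taking the most recent assessment,
    anticipating improvement throughout the year; return None when a
    milestone was never assessed."""
    if not observations:
        return None
    return max(observations, key=lambda o: o.observed).level

# Example: an October clerkship rating conflicts with an April boot camp
# rating; the most recent (April) level is assigned.
ratings = [Observation("PC12", 1, date(2012, 10, 15)),
           Observation("PC12", 2, date(2013, 4, 20))]
assert assign_level(ratings) == 2
```

Ambiguous cases that this simple recency rule cannot settle would still fall to committee consensus, as they did in the pilot.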

Outcomes

In our feasibility pilot, we created a novel post-Match mMSPE and provided an assessment of graduates’ milestone competency level. Using existing assessments, we measured nearly all milestones for all seven graduates entering EM residencies. We could not consistently assess 3 of the 23 milestones: ultrasound (PC12) was taught and assessed for only the four students taking the EM boot camp; observation/reassessment (PC6) was taught but not consistently assessed; and patient safety (SBP1) was neither taught nor assessed. All graduating students met level 1 or level 2 milestones. Our innovation—using existing assessments to communicate vital, competency-based information about medical school graduates to residency programs during “educational handovers”—proved effective in a small feasibility pilot.

The seven students entering EM who did not match at our own institution represented six distinct EM residency programs. We obtained responses from five of the six PDs (83%). None of the responding PDs stated that they currently use the traditional MSPE to customize training for their incoming interns. The majority (n = 4; 80%) thought the proposed assessment provided new information not available on the traditional MSPE, and one concluded that the proposed letter would allow for early intervention for areas of weakness. All responding PDs felt that the proposed assessment would be useful for all incoming interns.

Advantages of the mMSPE

Currently, information on student milestone competency is not clearly communicated between medical schools and residency programs as part of an educational handover. In response to this gap, many groups, especially in GME, have called for the documentation of trainee performance during transitions across the medical education continuum. To begin to address this lack of information on incoming trainees, medical schools should provide competency assessment as part of an educational handover.

The mMSPE we propose provides a solution. First, although this “second dean’s letter” does not affect residency placement (because it is distributed after the Match occurs), it does provide PDs with a more accurate and up-to-date view of the capacities of the new interns. This information allows the PDs to tailor training to the strengths and weaknesses of their incoming class, which, in turn, affords the opportunity to address any weaknesses before problems arise.

Additionally, if more medical schools were to provide such letters, the letters could serve to standardize assessment and outcomes between schools and receiving residency programs. This standardization could address the disconnect that exists between medical schools and residencies regarding how interns should meet level 1 milestones.2 Residency programs are ultimately responsible for ensuring that their learners obtain the skills necessary for independent practice, but interns are expected to achieve at least level 1 milestone competency before beginning residency. Some GME institutions have instituted baseline assessments to verify the readiness of new interns.7 Thus, medical school clerkship and subinternship directors must increasingly take responsibility for assessing level 1 milestone attainment in their students. The mMSPE provides a means of communicating these competency-based assessments.

Limitations

Although the mMSPE is feasible and likely beneficial, we note two limitations. First, this pilot study was conducted at a single institution with robust assessment processes already in place, which allowed milestone attainment to be determined easily. As such, generating the mMSPE did not pose an undue burden or create time constraints; however, at medical schools without existing systems for measuring competency attainment, significant resources may be required to produce this level of assessment. Whether such schools will be interested in establishing these processes de novo is unknown. Second, medical school deans or other leaders may write mostly positive MSPEs that withhold negative information about students in order to bolster the students’ chances of a successful Match.8 Because the mMSPE may provide a more objective assessment of students, some educators may worry that the mMSPE and the traditional MSPE will appear contradictory. Such concerns could result in initial unwillingness to generate the more objective, post-Match letters; however, they could also prompt a healthy discussion within the medical school to reconcile differences between the two documents.

We developed the mMSPE for a specific purpose and do not intend for it to address many of the perceived issues with the current MSPE. Residency leaders have expressed the need for more objective information during the process of interviewing potential program matriculants.4,8–10 Because the mMSPE is intended to be sent after students have matched, it does not address this problem. These letters might supply much of the information lacking in MSPEs, but further reforms to the current MSPE (e.g., aligning the MSPE to the CEPAER or milestones; using a standardized letter of evaluation to communicate specific details) are still necessary.

Next Steps

For this initial feasibility study, the Committee was able to incorporate data from existing assessments into milestone assessments for each student. The Committee’s current work aims to align assessments throughout the curriculum more closely with the milestones. For example, during the 2014–2015 academic year we worked to align our EM boot camp assessments with specific EM milestones, and we are considering aligning the mandatory EM clerkship with milestones as well. As the medical school’s curriculum is transformed, we will align competency assessments in tandem. We also plan to engage other PDs, schools, and specialties in this process and to evaluate the results so as to enhance feasibility and reliability.

Medical education is at a crossroads. Competency-based assessment has become increasingly widespread, and for this movement to succeed, medical schools must take an active role in using competency-based assessments and reliably communicating the resulting information to residency programs as part of a standardized educational handover. An effective competency-based assessment tool is needed. The mMSPE provides a feasible method for sharing information about learner competency between medical schools and residency programs, and, if implemented widely across specialties and locations, it could yield significant benefits for medical education.

As a post-Match communiqué, the mMSPE serves the important purpose of giving PDs an unbiased assessment of their incoming interns, which they can use to customize early educational programs that account for the strengths and deficiencies of their trainees. Ideally, the mMSPE would also serve as a tool for identifying, as early as possible, other issues that may not be highlighted in a trainee’s MSPE. Next steps for this project include determining the widespread usefulness (generalizability) of, and interest in, such a tool among EM (and other specialty) PDs.

Acknowledgments: The authors would like to thank Heather Wourman, Barbara Blackstone, and Dawn Ambs for their invaluable administrative support on this project.

References

1. Association of American Medical Colleges (AAMC). Core entrustable professional activities for entering residency. May 28, 2014. https://www.mededportal.org/icollaborative/resource/887. Accessed August 20, 2015.
2. Santen SA, Rademacher N, Heron SL, Khandelwal S, Hauff S, Hopson L. How competent are emergency medicine interns for level 1 milestones: Who is responsible? Acad Emerg Med. 2013;20:736–739.
3. Association of American Medical Colleges. A guide to the preparation of the medical student performance evaluation. 2002. https://www.aamc.org/download/64496/data/mspeguide.pdf. Accessed August 6, 2015.
4. Kiefer CS, Colletti JE, Bellolio MF, et al. The “good” dean’s letter. Acad Med. 2010;85:1705–1708.
5. Swide C, Lasater K, Dillman D. Perceived predictive value of the medical student performance evaluation (MSPE) in anesthesiology resident selection. J Clin Anesth. 2009;21:38–43.
6. Beeson MS, Carter WA, Christopher TA, et al. Emergency medicine milestones. J Grad Med Educ. 2013;5:5–13.
7. Lypson ML, Frohna JG, Gruppen LD, Woolliscroft JO. Assessing residents’ competencies at baseline: Identifying the gaps. Acad Med. 2004;79:564–570.
8. Edmond M, Roberson M, Hasan N. The dishonest dean’s letter: An analysis of 532 dean’s letters from 99 U.S. medical schools. Acad Med. 1999;74:1033–1035.
9. Naidich JB, Lee JY, Hansen EC, Smith LG. The meaning of excellence. Acad Radiol. 2007;14:1121–1126.
10. Shea JA, O’Grady E, Morrison G, Wagner BR, Morris JB. Medical student performance evaluations in 2005: An improvement over the former dean’s letter? Acad Med. 2008;83:284–291.

Appendix 1

Explanation of Emergency Medicine (EM) Milestones

Sources of assessment data (i.e., Assessment):

  • EM clinical rotations (Clerkship) at (1) University of Michigan and (2) Hurley Medical Center
  • EM Boot Camp
  • Applicable medical school curricular grades (Med School)
  • Comprehensive Clinical Exam 4 performance data (CCA4)
  • Advanced Medical Therapeutics (Adv. Med. Therapeutic)
  • Standardized patient interaction (SPI)

Appendix 2

Milestones-Based Medical Student Performance Evaluation (mMSPE) or Supplemental Dean’s Letter for a Representative University of Michigan Student Entering an Emergency Medicine (EM) Residency

Sources of assessment data (i.e., Assessment) incorporated into this letter (mMSPE):

  • EM clinical rotations (Clerkship) at (1) University of Michigan and (2) Hurley Medical Center
  • EM Boot Camp
  • Applicable medical school curricular grades (Med School)
  • Comprehensive Clinical Exam 4 performance data (CCA4)
  • Advanced Medical Therapeutics (Adv. Med. Therapeutic)
  • Standardized patient interaction (SPI)

Scores and competencies listed below correspond to the EM Milestone Levels described in Appendix 1.

Copyright © 2016 by the Association of American Medical Colleges