Webster’s Dictionary defines “recommendation” as “A formal letter that explains why a person is appropriate and qualified for a particular job, school, etc.” and “the act of saying that someone or something is good and deserves to be chosen.”1 Standard narrative letters of recommendation (NLORs) for residency applicants typically succeed at the latter, but rarely provide a comprehensive and specialty-specific comparative assessment.
NLORs are plagued by myriad limitations,2–6 which are summarized by Lee et al7:
In these supportive narrative letters of recommendation, authors advocate for the applicant, de-emphasize or omit negative findings, and are in effect recommending the applicant rather than providing an objective evaluation of actual performance.
The widely variable content and inherent lack of adequate comparisons with other applicants in NLORs lead to inevitable “apples to oranges” comparisons.
To confront these issues, the Council of Emergency Medicine Residency Directors (CORD) developed a standardized letter of evaluation (SLOE), an assessment tool for residency applicants to emergency medicine (EM) built on three basic tenets.2 First, the SLOE is time-efficient both to write and to review—the template enables the author to write a clear and concise synopsis and the reader to evaluate it. Second, the SLOE is standardized—authors answer the same essential questions for each applicant to EM residency, allowing residency program directors (PDs) to more easily compare competing applicants. Third, the SLOE is discriminating—the template conveys normative performance data in specific competencies important to the clinical practice of EM, explicitly stating the relative capacity of each applicant as compared with peers.
The SLOE provides applicant information that is essential to PDs’ decision making, otherwise missing from the Electronic Residency Application Service (ERAS) application. The objective of this Commentary is to share EM’s experience with the SLOE, thereby advancing the conversation of how it may benefit other postgraduate training programs.
The SLOE Instrument
The current ERAS application falls short in providing PDs an understanding of the noncognitive personality attributes that are so important in defining an applicant’s clinical performance. The Medical Student Performance Evaluation (MSPE) attempts to provide this information, but our experience is that this component has been largely ineffective. The complexities of clerkship training and the sheer number of students are significant obstacles to collecting this information in the accurate and standardized fashion necessary to impact PDs’ decision making about which applicants to interview. Practices such as having students complete or edit the section of the MSPE that addresses these attributes not only support this premise but also demonstrate a lack of understanding of proper assessment methodology that would make this information useful.
On the other hand, the goal of a completed SLOE is to “paint a picture” of the applicant that accurately summarizes their capacity for EM training. It attempts to garner the most important academic markers and personal characteristics critical to success in EM, and package these assessments into a concise, standardized, discriminating format.
The instrument (Supplemental Digital Appendix 1, http://links.lww.com/ACADMED/A387) contains four components:
- Background information: institution- and evaluator-specific factors which place the individual SLOE in perspective.
- Qualifications for EM relative to peers: seven competency-focused questions which require the evaluator to place the candidate into one of three tiers of EM-bound students.
- Global or summary assessment: a ranking of the candidate in comparison with all EM-bound students known to the evaluator.
- An open narrative section limited to 250 words: a platform to discuss essential noncognitive attributes such as conscientiousness, resilience, intellectual curiosity, leadership, and special interests that are important in predicting future success as a resident and physician. Additionally, this narrative section is used to clarify any issues in the SLOE or the applicant’s candidacy.
The SLOE is an EM-specific assessment, written by EM faculty based on the student’s clinical performance in the department. It is not meant to include performance parameters available elsewhere in the ERAS application (e.g., basic science grades, United States Medical Licensing Examination [USMLE] scores). The SLOE is a powerful tool; competent authorship enhances its utility. PDs reviewing SLOEs determine the value of the SLOE based on a number of factors found within it: the length of time for which the author has known the applicant, the amount of firsthand clinical experience that the author has had with the applicant, and the experience and reputation of the author. Knowing the faculty member who wrote the SLOE increases its value in reviewers’ decision making.8 This relationship has been demonstrated in other specialties’ letters of recommendation.3,5,7,9 It is important to note that as EM educators become more experienced in program leadership, they become more discriminating in their assessments. Not surprisingly, PDs are considered among the most discriminating of SLOE authors.
There is one equally discriminating author: a team of EM faculty working together to author a “group SLOE.”10 The SLOE was originally designed for use by an individual faculty member to assess a student. Over time, as the utility of the SLOE has become apparent, programs have gradually begun to author group SLOEs, which aim to reflect the departmental perspective on individual students. Thus, a SLOE which compiles multiple faculty members’ experience into a unified departmental perspective reduces personal bias and has developed into an important tool in the assessment of students. These group SLOEs are generally written by several members of the residency program leadership, such as the PD, associate/assistant PDs, and clerkship directors. Ideally, the group meets to review clinical evaluations and incorporates direct feedback from all faculty and residents who served in clinical supervisory roles with the applicant, providing a consensus evaluation of each student in comparison with his or her colleagues. When asked if group SLOEs represent their perspective, 85.7% of surveyed participating authors agreed that they did.11 Elective “away” EM rotations are now completed by the vast majority of students applying to the specialty. Our experience has been that EM programs expect to see a group SLOE from these rotations as part of the student’s residency application. In fact, a survey of EM PDs in 2013 revealed that 85.3% of programs provide a group SLOE for applicants who have completed an EM rotation in their department.8 Ultimately, the group SLOE evaluates the comparative performance of all rotating students at a clinical site and explicitly separates the students in that academic cohort into four categories of performance, providing a “global assessment” of the student compared with his or her peers.
The validity, and thus the value, of a SLOE as a summative work-based assessment is determined by its ability to provide an unbiased, comparative, global picture based on “pixelating” (i.e., multiple clinical observations, varying circumstances, multiple occasions). Perhaps this is one of the reasons that group SLOEs are the preferred format by programs evaluating applicants.8 It is also the reason that the value of SLOEs with regard to authorship is generally ranked in the following order from most to least valuable:
- Single-institution group SLOE,
- Single-author SLOE from senior faculty,
- Single-author SLOE from junior faculty.
The ideal EM application includes several group SLOEs from different institutions that are concordant in their assessment of the applicant. These are also likely the most valid.
The Value of the SLOE in the EM Postgraduate Application Process
As a summative work-based assessment, the SLOE has proven superior to NLORs in a number of important ways. It decreases writing time for authors and decreases reviewing time for readers.12–14 The SLOE is easier than the NLOR to interpret—there is a high interrater reliability, regardless of the level of experience of the interpreter.12 The SLOE also has greater depth in evaluating the applicant, as it is largely based on “extended direct contact” in the clinical setting.10
Most important, the SLOE is predictive. The global rating, provided near the end of the assessment, is one of the factors most highly correlated with resident performance in our core competencies.14 On the basis of these successes, EM has embraced the SLOE as it has become a crucial component of every EM-bound student’s application.2,8
Identifying competencies important to medical practice and then selecting applicants according to an assessment of those competencies has been studied in the United Kingdom. This approach is more predictive of physicians’ performance in medical practice compared with traditional methods of selection.15
As program faculty, authors of the SLOE are in the position of having competing priorities. On one hand, they have a responsibility to support the students of their institution in the residency application process, often serving as advisors and mentors. On the other, as leaders in EM education, they also have a responsibility to their national peers to provide an honest assessment of applicants. SLOEs are thus designed to reflect a balanced perspective that CORD actively strives to maintain. There is a pervasive sense among the EM education community that this balanced perspective is not seen in the MSPE.
Not all programs seek the same qualities in an applicant. Some programs have an academic orientation, while others are focused on community practice. In addition, the relative importance of commitment, work ethic, special interests, leadership, and many other noncognitive factors varies from program to program. By providing a comparative assessment including distinguishing attributes, the SLOE optimizes the ability to determine “fit” between applicant and program.
Two recent studies evaluate the degree to which the SLOE has been adopted in EM and the priority that EM PDs place on the SLOE, revealing its importance in the application process. In 2013, the SLOE made up 58.1% of all letters written in support of EM residency applications.10 The SLOE has become a prerequisite for a successful application to EM, with 60.7% of programs requiring one or more SLOEs to be considered for an interview, and 36.7% recommending one.9 The value of the SLOE in decision making is underscored by the finding that 92.7% of EM PDs rank the SLOE as the most important factor in determining whom to interview; the MSPE ranks sixth behind, among other things, USMLE scores and clerkship grades.
Limitations of the SLOE
Despite its many advantages over other methods of applicant evaluation, the SLOE has a number of limitations as well. Although the SLOE template is standardized, there is no standardization for what constitutes a group SLOE or for the decision-making process in preparing a group SLOE. Equally important, an assessment is only as good as the quality of the information that goes into its creation. No study to date has evaluated the consistency or quality of this information.
Periodically, questions are raised regarding how to use the bottom-third ranking in the SLOE. This ranking is used less often than the middle and upper thirds,10 likely because of a number of factors including (1) grade inflation, (2) lower-performing candidates who are counseled and follow through on selecting a different career path, and (3) a highly competitive pool that is skewed toward high performance. A common follow-up question is, “Doesn’t a lower third score disadvantage the student?” Lower third scores from multiple authors do limit a student’s chances to match in EM. Then again, it is in the best interest of competitive applicants, residencies, and ultimately patients that acceptance to postgraduate training be based on an honest assessment of an applicant’s performance.
Future Directions: Specialty-Specific SLOEs
Other specialties could benefit similarly from their own specific SLOEs. There is both a growing interest from other fields5,7,13 and a documented need.16 We believe there is value in each specialty developing a unique SLOE template based on an assessment of clinical skills and unique attributes important to successful practice of that specialty. Examples might include fine motor skills, attention to detail, ability to “think on one’s feet,” or capacity to relate to children. As a work-based assessment written by clinical faculty in the specialty for other programs within the specialty, the SLOE provides a broader understanding of an applicant’s clinical performance, including medical decision making and noncognitive abilities pertinent to the specialty. SLOE authors are in an ideal position to reflect on the qualities and personal attributes so critical to success in clinical practice. A specialty-specific perspective is not currently represented in most ERAS applications and would be a welcome addition toward the goal of “holistic review.”
Acknowledgments: Special thanks to John H. Schatzer, PhD, and Larry Gruppen, PhD, for their support, guidance, and editorial review of this manuscript.
1. Merriam-Webster Dictionary. Recommendation. http://www.merriam-webster.com/dictionary/recommendation. Accessed July 7, 2016.
2. Keim SM, Rein JA, Chisholm C, et al. A standardized letter of recommendation for residency application. Acad Emerg Med. 1999;6:1141–1146.
3. Shultz K, Mahabir RC, Song J, Verheyden CN. Evaluation of the current perspectives on letters of recommendation for residency applicants among plastic surgery program directors. Plast Surg Int. 2012. http://dx.doi.org/10.1155/2012/728981. Accessed July 7, 2016.
4. Stedman JM, Hatch JP, Schoenfeld LS. Letters of recommendation for the predoctoral internship in medical schools and other settings: Do they enhance decision making in the selection process? J Clin Psychol Med Settings. 2009;16:339–345.
5. Fortune JB. The content and value of letters of recommendation in the resident candidate evaluative process. Curr Surg. 2002;59:79–83.
6. Dirschl DR, Adams GL. Reliability in evaluating letters of recommendation. Acad Med. 2000;75:1029.
7. Lee AG, Golnik KC, Oetting TA, et al. Re-engineering the resident applicant selection process in ophthalmology: A literature review and recommendations for improvement. Surv Ophthalmol. 2008;53:164–176.
8. Love JN, Smith J, Weizberg M, et al.; SLOR Task Force. Council of Emergency Medicine Residency Directors’ standardized letter of recommendation: The program director’s perspective. Acad Emerg Med. 2014;21:680–687.
9. Kaffenberger BH, Kaffenberger JA, Zirwas MJ. Academic dermatologists’ views on the value of residency letters of recommendation. J Am Acad Dermatol. 2014;71:395–396.
10. Love JN, Deiorio NM, Ronan-Bentle S, et al.; SLOR Task Force. Characterization of the Council of Emergency Medicine Residency Directors’ standardized letter of recommendation in 2011–2012. Acad Emerg Med. 2013;20:926–932.
11. Hegarty CB, Lane DR, Love JN, et al. Council of Emergency Medicine Residency Directors standardized letter of recommendation writers’ questionnaire. J Grad Med Educ. 2014;6:301–306.
12. Girzadas DV Jr, Harwood RC, Dearie J, Garrett S. A comparison of standardized and narrative letters of recommendation. Acad Emerg Med. 1998;5:1101–1104.
13. Perkins JN, Liang C, McFann K, Abaza MM, Streubel SO, Prager JD. Standardized letter of recommendation for otolaryngology residency selection. Laryngoscope. 2013;123:123–133.
14. Bhat R, Manish G, Goyal N, et al. Predictors of success in emergency medicine residency: A multicenter study. West J Emerg Med. 2014;15(5.1). http://escholarship.org/uc/item/2bg5s1kr. Accessed July 7, 2016.
15. Patterson F, Ferguson E, Norfolk T, Lane P. A new selection system to recruit general practice registrars: Preliminary findings from a validation study. BMJ. 2005;330:711–714.
16. DeZee KJ, Thomas MR, Mintz M, Durning SJ. Letters of recommendation: Rating, writing, and reading by clerkship directors of internal medicine. Teach Learn Med. 2009;21:153–158.