More About USMLE Step 1 Scoring

Weissman, Sidney H. MD

doi: 10.1097/ACM.0000000000002928
Letters to the Editor

Clinical professor of psychiatry and behavioral science, Northwestern University Feinberg School of Medicine, Chicago, Illinois.

Disclosures: None reported.


To the Editor:

The March 2019 issue of Academic Medicine contains an editorial and three Invited Commentaries that discuss the uses of the United States Medical Licensing Examination (USMLE) Step 1 exam. The authors give specific attention to how residency program directors (PDs) use Step 1 exam scores to determine which senior medical students to invite for interviews. Some believe that PDs should not use Step 1, a standardized examination, for this purpose. As a former PD in the 1970s, 1980s, and 2000s, I disagree.

Chen and colleagues1 address the issue from the medical student’s perspective. They urge that Step 1 grading become pass/fail, since some medical students see the exam as producing unneeded stress. Andolsek2 offers proposals that would alter the entire structure of Step 1, which Sklar3 supports in his editorial. Sklar agrees with her call for major changes, including grading Step 1 as pass/fail, developing new assessment tools, and creating new mechanisms for students to negotiate their transition from medical school to residency. Katsufrakis and Chaudhry,4 however, urge caution in altering Step 1, pointing out its current value and warning that changes might cause unintended consequences.

Proponents of converting Step 1 grading to pass/fail advise substituting a national standardized holistic student assessment process for numerical scores. Katsufrakis and Chaudhry4 note that this proposal might not ensure fair standardized assessments of students’ performances. ten Cate and Regehr5 point out that all student assessments by faculty are subjective. Students with near-identical medical student performance evaluations and letters of recommendation—from the same school but prepared by different faculty—might not be fairly evaluated because of the assessors’ subjectivity. Additionally, residency programs would be unable to holistically assess hundreds of applicants for interviews in the few weeks available each fall.

The numerically graded Step 1 provides PDs with a standardized assessment tool for determining whom to invite for residency interviews. It gives residency programs the same kind of information in assessing applicants that the SAT and ACT give colleges, the GRE gives graduate schools, the LSAT gives law schools, and the MCAT gives medical schools. If Step 1 is made pass/fail, PDs would need to develop new screening tools. For example, they might use the national ranking of an applicant’s medical school. More likely, pressure would build to develop a new standardized exam, a “Graduate Medicine Admission Test.” If accomplished, this would place us back where we started.


References

1. Chen DR, Priest KC, Batten JN, Fragoso LE, Reinfeld BI, Laitman BM. Student perspectives on the “Step 1 climate” in preclinical medical education. Acad Med. 2019;94:302–304.
2. Andolsek KM. One small step for Step 1. Acad Med. 2019;94:309–313.
3. Sklar DP. Matchmaker, matchmaker, make me a match: Is there a better way? Acad Med. 2019;94:295–297.
4. Katsufrakis PJ, Chaudhry HJ. Improving residency selection requires close study and better understanding of stakeholder needs. Acad Med. 2019;94:305–308.
5. ten Cate O, Regehr G. The power of subjectivity in the assessment of medical trainees. Acad Med. 2019;94:333–337.
Copyright © 2019 by the Association of American Medical Colleges