Step Up—Not On—The Step 2 Clinical Skills Exam: Directors of Clinical Skills Courses (DOCS) Oppose Ending Step 2 CS

David J. Ecker, MD; Felise B. Milan, MD; Todd Cassese, MD; Jeanne M. Farnan, MD, MHPE; Wendy S. Madigosky, MD, MSPH; F. Stanford Massie Jr, MD; Paul Mendez, MD; Sharon Obadia, DO; Robin K. Ovitsh, MD; Ronald Silvestri, MD; Toshiko Uchida, MD; Michelle Daniel, MD, MHPE

doi: 10.1097/ACM.0000000000001874
Perspectives

Recently, a student-initiated movement to end the United States Medical Licensing Examination Step 2 Clinical Skills and the Comprehensive Osteopathic Medical Licensing Examination Level 2-Performance Evaluation has gained momentum. These are the only national licensing examinations designed to assess clinical skills competence in the stepwise process through which physicians gain licensure and certification. Therefore, the movement to end these examinations and the ensuing debate merit careful consideration. The authors, elected representatives of the Directors of Clinical Skills Courses, an organization comprising clinical skills educators in the United States and beyond, believe abolishing the national clinical skills examinations would have a major negative impact on the clinical skills training of medical students, and that forfeiting a national clinical skills competency standard has the potential to diminish the quality of care provided to patients. In this Perspective, the authors offer important additional background information, outline key concerns regarding the consequences of ending these national clinical skills examinations, and provide recommendations for moving forward: reducing the costs for students, exploring alternatives, increasing the value and transparency of the current examinations, recognizing and enhancing the strengths of the current examinations, and engaging in a national dialogue about the issue.

D.J. Ecker is assistant professor of medicine, assistant director of education, Hospital Medicine Group, and director, Integrated Clinicians Course, University of Colorado School of Medicine, Aurora, Colorado, and chair, Advocacy and Advancement Subcommittee, Directors of Clinical Skills Courses (DOCS); ORCID: http://orcid.org/0000-0002-1530-0079.

F.B. Milan is professor of medicine and director, Ruth L. Gottesman Clinical Skills Center and Introduction to Clinical Medicine Program, Albert Einstein College of Medicine, Bronx, New York, and president, Directors of Clinical Skills Courses (DOCS).

T. Cassese is associate professor of medical science and director, Clinical Arts and Sciences Course, Frank H. Netter MD School of Medicine, Quinnipiac University, North Haven, Connecticut, and president-elect, Directors of Clinical Skills Courses (DOCS).

J.M. Farnan is assistant dean, Curricular Innovation and Evaluation, associate professor of medicine, and director, Clinical Skills Education, University of Chicago Pritzker School of Medicine, Chicago, Illinois, and secretary, Directors of Clinical Skills Courses (DOCS); ORCID: http://orcid.org/0000-0002-1138-9416.

W.S. Madigosky is associate professor of family medicine and director, Foundations of Doctoring Curriculum, University of Colorado School of Medicine, Aurora, Colorado, and chair, Nominations Subcommittee, Directors of Clinical Skills Courses (DOCS); ORCID: http://orcid.org/0000-0003-0714-4114.

F.S. Massie Jr is professor of medicine, director, Introduction to Clinical Medicine Curriculum, and director, Clinical Skills Scholars Program, University of Alabama School of Medicine, Birmingham, Alabama, and past president (2014–2015), Directors of Clinical Skills Courses (DOCS).

P. Mendez is associate dean, Clinical Curriculum, associate professor of medicine, and director, Clinical Skills Program, University of Miami Miller School of Medicine, Miami, Florida, and representative, Southern Group on Educational Affairs, Directors of Clinical Skills Courses (DOCS).

S. Obadia is associate dean, Clinical Education and Services, associate professor of internal medicine, and codirector, Medical Skills Courses, A.T. Still University, School of Osteopathic Medicine, Mesa, Arizona, and chair, Program Planning Subcommittee, Directors of Clinical Skills Courses (DOCS).

R.K. Ovitsh is assistant dean, Clinical Competencies, and assistant professor of pediatrics, State University of New York Downstate School of Medicine, Brooklyn, New York, and representative, Northeast Group on Educational Affairs, Directors of Clinical Skills Courses (DOCS).

R. Silvestri is assistant professor of medicine and site director, Practice of Medicine Clinical Skills Course, Harvard Medical School, Boston, Massachusetts, and chair, Research Subcommittee, Directors of Clinical Skills Courses (DOCS).

T. Uchida is associate professor of medicine and medical education and director, Clinical Skills Education, Northwestern University Feinberg School of Medicine, Chicago, Illinois, and treasurer, Directors of Clinical Skills Courses (DOCS).

M. Daniel is assistant dean, Curriculum, and assistant professor of emergency medicine and learning and health sciences, University of Michigan Medical School, Ann Arbor, Michigan, and past president (2015–2016), Directors of Clinical Skills Courses (DOCS); ORCID: http://orcid.org/0000-0001-8961-7119.

Editor’s Note: An Invited Commentary by W.P. Burdick, J.R. Boulet, and K.E. LeBlanc appears on pages 690–692.

Funding/Support: None reported.

Other disclosures: David J. Ecker, MD, served as a United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills (CS) Pilot Standard Setting Panelist in Philadelphia, Pennsylvania, August 4–5, 2016. Jeanne M. Farnan, MD, MHPE, served as a USMLE 2 CS Pilot Standard Setting Panelist in Philadelphia, Pennsylvania, February 22–24, 2017. Sharon Obadia, DO, served as a Comprehensive Osteopathic Medical Licensing Examination-USA Level 2-Performance Examination Standard Setting Panelist for the Humanistic Domain in Conshohocken, Pennsylvania, on October 2–3, 2015.

Ethical approval: Reported as not applicable.

Correspondence should be addressed to David J. Ecker, Mail Stop F782, 12401 E. 17th Ave., Aurora, CO 80045; telephone: (720) 848-4289; e-mail: david.ecker@ucdenver.edu.

Recently, a student-initiated movement to end both the United States Medical Licensing Examination Step 2 Clinical Skills (USMLE Step 2 CS) and the Comprehensive Osteopathic Medical Licensing Examination Level 2-Performance Evaluation (COMLEX Level 2-PE) has gained momentum. These performance-based examinations require senior medical students to interview and examine standardized patients (SPs) at testing sites across the country. In this Perspective, we, the authors and elected representatives of the Directors of Clinical Skills Courses (DOCS), express concerns about eliminating these examinations as a national standard, a step we feel could weaken the clinical skills of medical graduates and ultimately threaten patient safety and public welfare.


The Movement to End Step 2 CS

In early 2016, a group of Massachusetts medical students—citing their displeasure with the cost of the USMLE Step 2 CS and expressing doubts about its necessity—initiated a movement to end the examination.1 Since that time, the Michigan State Medical Society, Massachusetts Medical Society, and the American Medical Association (AMA) Student Section have all voted in favor of ending the USMLE Step 2 CS and COMLEX Level 2-PE. On June 15, 2016, the AMA House of Delegates adopted a resolution calling for the AMA to work with the Federation of State Medical Boards, the National Board of Medical Examiners (NBME), the National Board of Osteopathic Medical Examiners (NBOME), state medical societies, and other key stakeholders to transition the USMLE Step 2 CS and COMLEX Level 2-PE to school-administered clinical skills examinations (Box 1).2 As of January 2017, more than 17,500 students and physicians had signed a petition stating, “[W]e strongly believe eliminating the national clinical skills exam for U.S. medical graduates reduces unnecessary costs in the education process without negatively affecting patient care.”1 The petition goes on to describe the examination’s expense, poor value, and lack of efficacy.


Box 1. American Medical Association (AMA) Resolution Regarding the Abolition of the United States Medical Licensing Examination Step 2 Clinical Skills


Although the USMLE Step 2 CS and COMLEX Level 2-PE each represent just one component of the stepwise process through which physicians gain licensure and certification, they are the only national licensing examinations (NLEs) designed to assess clinical skills competence. The movement to end these examinations and the related debate, therefore, merit careful consideration of the potential consequences.

As a representative body of clinical skills educators in the United States and beyond, the DOCS organization has a keen interest in this national debate. DOCS was founded in 2011 with the purpose of building a cohesive and productive international alliance of educators who direct, teach, and/or support clinical skills courses and the assessment of medical students’ clinical skills. DOCS has more than 400 members, representing all but five MD-granting medical schools accredited by the Liaison Committee on Medical Education (LCME), many DO-granting medical schools accredited by the Commission on Osteopathic College Accreditation (COCA), and several international medical schools.

The DOCS executive council, on behalf of the DOCS members, recognizes that the movement to end the national clinical skills examinations raises valid concerns. We do not, however, support ending the USMLE Step 2 CS or COMLEX Level 2-PE, and we believe the recommendation to move testing to individual medical schools is deeply flawed. Here we offer important additional background information, outline the potential consequences of ending these national clinical skills examinations, and provide recommendations for moving forward.


Main Contentions of the Movement to End Step 2 CS

As noted above, the main contentions raised by the movement to end the USMLE Step 2 CS are the cost of the NLEs and doubts about their value and impact. The USMLE Step 2 CS costs $1,280 and the COMLEX Level 2-PE costs $1,295 per student. Collectively, the costs to examinees exceed $30 million annually, not including travel expenses and time away from school. Students who fail twice are arguably at greatest risk of performing below a minimum competency standard. Lehman and Guercio3 estimate that the cost to identify a single examinee who fails USMLE Step 2 CS on back-to-back attempts is $1.1 million and conclude that the examination is “a poor value proposition.” Those advocating the abolition of national clinical skills examinations further highlight the lack of data to support a causal link between the examinations and patient outcomes; they propose that local medical schools, the majority of which already conduct their own clinical skills examinations, may more efficiently assess clinical skills.1


Potential Consequences of Ending National Clinical Skills Examinations

Devaluing clinical skills in medical education

In 2004 and 2005, the NLEs in the United States expanded to include clinical skills assessments; the NBME added the USMLE Step 2 CS, and the NBOME added the COMLEX Level 2-PE. These changes occurred in response to growing concerns that graduates of U.S. medical schools lacked essential skills in communication, physical examination, written documentation, and clinical reasoning.4–6 These concerns, voiced by the medical education community, were echoed loudly in the lay press and persist to the present day.7,8 In addition to ensuring that graduates of U.S. medical schools entering residency programs demonstrate competency in basic clinical skills, the implementation of the national clinical skills examinations drove curricular reform to improve clinical skills instruction and assessment at the local level.

Prior to these national clinical skills examinations, medical students’ basic clinical skills were never or rarely observed.9,10 Medical schools tended to focus on medical knowledge assessment and often neglected appropriate methods for the assessment of clinical skills.11 Even the AMA noted the potential value of national clinical skills examinations by stating that “while licensure examinations should not dictate medical school curriculum, it is evident that the content and structure of those examinations do directly influence medical student education and evaluation.”12

The USMLE Step 2 CS and COMLEX Level 2-PE examinations have unquestionably altered the landscape of clinical skills education in the United States.13 In 2005, just one year after the implementation of the national clinical skills examinations, the Clerkship Directors in Internal Medicine surveyed 109 clinical skills course directors.14 Of the 88 respondents, 40% reported increased emphasis or curricular time devoted to clinical skills education. Furthermore, 39% of these directors reported significant changes to their course content, and 45% reported altering their course objectives.14 These percentages have steadily increased over time, and the presence of the national clinical skills examinations continues to influence the appropriate emphasis on clinical skills education in undergraduate medical education. In 2015–2016, 133 of 142 (94%) LCME-accredited medical schools administered a final comprehensive objective structured clinical examination (OSCE) compared with 94 of 126 (75%) schools in 2003–2004.15,16

Ending the USMLE Step 2 CS and COMLEX Level 2-PE risks devaluing clinical skills, potentially undermining emphasis on clinical skills education in favor of a return to an excessive focus on knowledge-based assessments. Further, in the absence of these examinations, schools may have less incentive to allocate adequate resources to clinical skills education, which by its very nature is resource intensive. Sending such a message is particularly concerning in light of calls to improve education and assessment related to clinical skills.17 Poor communication skills are consistently among the most prevalent complaints against physicians and are reported more frequently than complaints about the quality of care.18 Furthermore, assessment drives learning, stimulates study effort, assists with learner buy-in of important concepts, and “expands professional horizons.”19–22 Finally, we feel that continuing the commitment to clinical skills evaluation as a part of medical licensure demonstrates that the values of physicians align with public concerns.


Forfeiting a national standard and generalizability

Graduation from an accredited MD- or DO-granting medical school is not a viable substitute for passing an NLE devoted expressly to clinical skills assessment. In 2016, only 108 of the 133 (81%) LCME-accredited medical schools with a required comprehensive OSCE mandated that their students pass their local examination in order to graduate.15 Further, we feel that the variability of passing standards among the schools that do have local clinical skills examination graduation requirements raises significant concern about their use as viable substitutes for NLEs. Additionally, many schools continue to employ norm- rather than criterion-referenced standard setting, whereby the passing score is determined by comparing examinees with one another rather than against a competency standard.23

Creating reliable criterion-referenced grading standards is challenging and time consuming. Even if each medical school could develop these locally, doing so would most certainly still result in countless different passing standards across the country. In fact, a systematic review examining OSCE checklists used to assess the communication skills of undergraduate medical students found that the heterogeneity in assessment rubrics and a lack of agreement among reviewers makes comparison of student competence within and across institutions difficult.24

Medical schools offer diverse curricula and often use locally produced OSCEs to assess unique and innovative aspects of their curricula. This need for local OSCE variability presents another barrier to the feasibility of all medical schools administering comparable final comprehensive OSCEs.

For these reasons, many of those calling for locally administered clinical skills examinations as replacements for the NLEs advocate oversight of local examinations by an authoritative body. To enable useful comparisons among examinees from various institutions, supervisory/advisory committees would need to be formed with agreed-upon common exam standards, objectives, structures, and assessment strategies. Ensuring that every accredited medical school used these resources to assess student competence would be an arduous and costly endeavor. The importance of local exam oversight in the absence of a national examination cannot be overstated given that institutions are often hesitant to fail learners with low performance.25


Threatening robust examination psychometrics

The proposed transfer of responsibility for clinical skills examinations to medical schools raises significant concerns about local assessment quality, rigor, reliability, validity, and security. To approach the psychometric quality of the currently administered national clinical skills examinations, the majority of accredited medical schools would need to make substantial changes to their clinical skills examinations. First, most schools would need to increase the resources dedicated to training and evaluating SPs to ensure utmost consistency. Second, most schools would need to modify and validate their evaluation rubrics and improve interrater reliability.26 Third, schools would need to develop a large number of clinical cases that would allow for each student to be assessed across multiple stations with diverse clinical scenarios in an attempt to improve exam reliability.26 Fourth, schools would need dedicated psychometricians to perform robust analyses of their examinations’ reliability and validity. Psychometricians with expertise in performance-based assessments capable of performing generalizability analysis, the current standard to establish the reliability of OSCEs, are rare. Psychometric analysis and data management could be regionalized or shared among medical schools, but this would require administration of the same OSCE at each school. Finally, each school would need to guarantee the security of its examinations, which would require systems and personnel to ensure strict adherence to rigorous safeguards and procedures.


Increasing costs for examinees

A major driving force behind the movement to end USMLE Step 2 CS and COMLEX Level 2-PE is the financial burden on students. Currently, no available evidence indicates that the local delivery of comprehensive clinical skills examinations would decrease costs to examinees beyond the elimination of travel expenses. Medical schools would need to ensure that their clinical skills centers have adequate staff, facilities, and technology to accommodate these improved examinations with the aforementioned necessary modifications, or schools would need to arrange for their students to be tested elsewhere. In fact, decentralization, requiring national oversight of over 140 different tests, would surely increase the overall costs. As a result, schools would likely pass on their increased expenses in the form of higher tuition or additional student fees.


Failing to protect the public

To date, the patient and the public voice have been notably absent in this debate. Given that physicians have approximately 100,000 patient encounters over a career, society should place a high value on identifying students who fail to meet minimum competency standards. The aforementioned estimate of $1.1 million to identify a single student who fails on two attempts was based on the original passing rate of 98% for first-time examinees and 91% for repeat examinees,3 which is no longer current. At present, the first-time pass rates for examinees from U.S. and Canadian medical schools on the USMLE Step 2 CS and COMLEX Level 2-PE are, respectively, 96% and 94%.27,28 These are comparable to the 96% first-time pass rates for USMLE Step 1 and Step 2 Clinical Knowledge.29,30 Opponents of Step 2 CS also question the ability of the examination to predict future performance and argue that public confidence in national clinical skills examinations may be unfounded. Studies by the NBME probing this issue used outdated data from 2005.31,32 Both examinations have been significantly modified since that time, so any correlation between current exam performance and future practice is unknown.
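The sensitivity of this per-detection cost to pass rates can be illustrated with a back-of-the-envelope model (our own sketch under stated assumptions, not Lehman and Guercio's published calculation): with per-examinee fee $f$, first-time pass rate $p_1$, and repeat pass rate $p_2$, the fees collected per examinee identified as failing twice are approximately

```latex
% Illustrative model only (not the published calculation):
% N examinees each pay fee f; the fraction (1 - p_1) who fail pay again;
% a fraction (1 - p_1)(1 - p_2) fail on both attempts.
\[
\text{cost per twice-failing examinee}
  \;\approx\; \frac{N f \,[\,1 + (1 - p_1)\,]}{N (1 - p_1)(1 - p_2)}
  \;=\; \frac{f \,[\,1 + (1 - p_1)\,]}{(1 - p_1)(1 - p_2)}
\]
% With f = \$1{,}280, p_1 = 0.98, and p_2 = 0.91:
% 1{,}280 \times 1.02 \,/\, (0.02 \times 0.09) \approx \$725{,}000,
% the same order of magnitude as the published \$1.1 million estimate.
```

Because the denominator is the product of two small failure rates, even modest shifts in pass rates move the per-detection cost by hundreds of thousands of dollars, which is why the figure is so striking yet so sensitive to the underlying data.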


Recommendations

While we are deeply concerned about the potentially negative consequences of abolishing the USMLE Step 2 CS and COMLEX Level 2-PE examinations, we concede that reform of the examinations could address some of the concerns expressed by those who are in favor of ending them. Some recommendations for going forward are as follows: reducing the costs for students, exploring alternatives, increasing the value and transparency of the current examinations, recognizing and enhancing the strengths of the current examinations, and engaging in a national dialogue about the issue (List 1).


List 1. Summary of Possible Consequences of Abandoning National Clinical Skills Examinations and of Key Recommendations

Potential Consequences

  • Devaluing clinical skills in medical education
  • Forfeiting a national standard and generalizability
  • Threatening robust examination psychometrics
  • Increasing costs for examinees
  • Failing to protect the public

Key Recommendations

  • Reduce the total cost of national licensing examinations for students
  • Explore alternate assessment methods
  • Increase value to examinees, medical schools, and residency programs
  • Increase transparency
  • Recognize and enhance the strengths of the current examinations
  • Engage key stakeholders in a national dialogue

Reduce the total cost of NLEs for students

Medical student debt is increasing at an alarming rate. All parts of the NLEs, including but not limited to the national clinical skills examinations, contribute to this financial burden. We believe a national dialogue amongst all stakeholders is needed to determine how to best reduce these costs.

Swanson and Roberts33 predict that emphasis on NLEs will only grow as the number and diversity of medical schools increase. If major goals of NLEs are to enhance standards in medical schools, to screen out doctors with significant deficits, and to ensure national competency standards to protect patients, we contend that the United States need not abolish national clinical skills examinations but, rather, work to contain costs and initiate a broader dialogue on whether students should bear the sole burden of the costs of licensure.

While the current number of clinical skills testing sites may represent a financially optimized system from the perspectives of the NBME and NBOME, it is clearly suboptimal with regard to travel expenses for students. A cost–benefit analysis of opening more sites that includes the views of all parties should be undertaken and the results disseminated.


Explore alternate assessment methods

Innovative assessment methods with the potential to improve the value of the current national clinical skills examinations should be further explored. Sequential, multistage, or flexi-level testing has the potential to make more effective use of testing resources.33,34 In this type of system, examinees who will clearly pass are quickly identified through a screening examination, and then resources can be better expended by focusing on examinees whose performance is close to the passing score.

Workplace-based assessments (WBAs) also hold promise, and recent research in graduate medical education suggests that aligning WBAs with the concepts of entrustable professional activities (EPAs) and supervisory scales improves their validity and reliability.35 Regrettably, medical students undergo direct observation only infrequently, and significant resources would need to be invested at the local level to broadly implement reliable WBAs. In addition, the results of WBAs are inherently linked to the clinical contexts in which they are obtained, which limits their generalizability and, therefore, their value in high-stakes, summative decisions.33 A combination of formative WBAs and a summative national clinical skills examination that provides an independent assessment of whether a graduate is achieving EPAs would be an ideal to strive for; this combination would provide more meaningful feedback to students and residency programs alike.


Increase value to examinees, medical schools, and residency programs

Interestingly, we are aware of no calls to end the national medical knowledge examinations despite similar costs to students and pass rates. Perhaps this is because the medical knowledge examinations provide additional information of value to examinees, medical schools, and residency programs—primarily in the form of quantitative data that can be used to stratify students.36 In contrast, the national clinical skills examinations provide third parties with only a verification of examinees’ minimal competency and cannot be used to rank passing examinees’ relative clinical skills.

We believe the NBME and NBOME should institute tiered, domain-specific scoring systems that are reported to third parties and examinees alike. While the psychometrics may not be robust enough to rank-order students, as is done with the national medical knowledge examinations, less reliability is needed to divide the examinees into tertiles or quartiles. As the data allow, based on the standard error, this scoring system could provide students, residency programs, and licensing authorities with information beyond minimal competency, such as an examinee’s areas of notable strength or domains in need of improvement. This information might also provide a counterbalance to the considerable emphasis placed by residency programs on national medical knowledge exam scores, such as USMLE Step 1.37


Increase transparency

The NBME and NBOME should be credited for their rigorous and robust efforts to create standardized, reliable, fair assessments of student performance. To maintain the integrity of these examinations naturally requires some degree of discretion or confidentiality about techniques and content; however, increasing the transparency of the specific objectives and assessment standards of the national clinical skills examinations might further increase their perceived value for institutions and students. To that end, the NBME and NBOME should consider distributing more detailed feedback to examinees’ medical schools. Providing schools with domain-specific aggregate student data would enable schools to assess the efficacy of their curricula against a national standard. Providing such information could stimulate a mutually beneficial cycle of local clinical skills curriculum review, innovation, quality improvement, and enhanced assessment.13,38


Recognize and enhance the strengths of the current examinations

Critics of these examinations should remember that the NBME has engaged in continuous evaluation, revision, and improvement of the USMLE Step 2 CS since its inception. This evaluation and improvement cycle, including retraining of SPs, revising requirements for the postencounter note, and redesigning the assessment of communication and interpersonal skills, has served to strengthen the examination. Most recently, the NBME has added clinical images, synthetic models, and simulators to patient encounters. As the NBME considers further enhancements, studies are needed to evaluate the effect of these changes on the predictive power of the examination, and we believe it is incumbent on the NBME and NBOME to spearhead this effort.


Engage key stakeholders in a national dialogue

We believe the NBME and NBOME would benefit from engaging the medical community in a national dialogue about standards for minimally acceptable competency and the role of USMLE Step 2 CS and COMLEX Level 2-PE. Key stakeholders should include, but are not limited to, medical students, medical educators, medical schools, the LCME, the COCA, the Educational Commission for Foreign Medical Graduates, the Accreditation Council for Graduate Medical Education (ACGME), the Federation of State Medical Boards, the Association of American Medical Colleges (AAMC), the AMA, the American Osteopathic Association, and the public. There is a wealth of knowledge to be gained from the ACGME’s Milestone Project39 and the AAMC’s Core Entrustable Professional Activities Pilot,40 so the participation of these two organizations would be invaluable. A national meeting could serve as the perfect springboard for initial discussion and for the formation of workgroups. We also encourage ongoing dialogue in the medical literature and invite others’ perspectives.


In Summary

In summary, we do not support the arguments used in the calls to end the national clinical skills examinations. We believe abolishing these examinations would have major negative consequences for the clinical skills training of medical students and would risk forfeiting a national clinical skills competency standard, both of which could, in turn, diminish the quality of care provided to patients. We recommend instead several steps to improve the impact of the national clinical skills examinations and their usefulness to students, medical schools, and third parties.


References

1. End Step 2 CS. Join the national movement to end Step 2 CS. http://endstep2cs.com. Accessed May 25, 2017.
2. American Medical Association. Proceedings of the 2016 Annual Meeting of the House of Delegates. https://policysearch.ama-assn.org/policyfinder/detail/Clinical%20Skills%20Assessment%20During%20Medical%20School%20D-295.988?uri=%2FAMADoc%2Fdirectives.xml-0–876.xml. Accessed June 27, 2017.
3. Lehman EP 4th, Guercio JR. The Step 2 Clinical Skills exam—A poor value proposition. N Engl J Med. 2013;368:889–891.
4. Ramsey PG, Curtis JR, Paauw DS, Carline JD, Wenrich MD. History-taking and preventive medicine skills among primary care physicians: An assessment using standardized patients. Am J Med. 1998;104:152–158.
5. Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees. A comparison of diagnostic proficiency. JAMA. 1997;278:717–722.
6. Mangione S, Burdick WP, Peitzman SJ. Physical diagnosis skills of physicians in training: A focused assessment. Acad Emerg Med. 1995;2:622–629.
7. Joshi N. Doctor, shut up and listen. N Y Times. January 4, 2015. https://www.nytimes.com/2015/01/05/opinion/doctor-shut-up-and-listen.html?_r=0. Accessed May 25, 2017.
8. Boodman SG. Kaiser Health News. What a doctor may miss by reaching for the MRI first. Washington Post. May 19, 2014. https://www.washingtonpost.com/national/health-science/what-a-doctor-may-miss-when-he-reaches-for-the-mri-first/2014/05/19/50ce45a8-c19c-11e3-b574-f8748871856a_story.html?utm_term=.c9c367778a52. Accessed May 25, 2017.
9. Holmboe ES. Faculty and the observation of trainees’ clinical skills: Problems and opportunities. Acad Med. 2004;79:16–22.
10. Howley LD, Wilson WG. Direct observation of students during clerkship rotations: A multiyear descriptive study. Acad Med. 2004;79:276–280.
11. Kassebaum DG, Eaglen RH. Shortcomings in the evaluation of students’ clinical skills and behaviors in medical school. Acad Med. 1999;74:842–849.
13. Hauer KE, Teherani A, Kerr KM, O’Sullivan PS, Irby DM. Impact of the United States Medical Licensing Examination Step 2 Clinical Skills exam on medical school clinical skills assessment. Acad Med. 2006;81(10 suppl):S13–S16.
14. Gilliland WR, La Rochelle J, Hawkins R, et al. Changes in clinical skills education resulting from the introduction of the USMLE Step 2 Clinical Skills (CS) examination. Med Teach. 2008;30:325–327.
15. Association of American Medical Colleges. Number of medical schools requiring final SP/OSCE examination. https://www.aamc.org/initiatives/cir/406426/9.html. Accessed May 25, 2017.
16. Barzansky B, Etzel SI. Educational programs in US medical schools, 2003–2004. JAMA. 2004;292:1025–1031.
17. Chen HC, van den Broek WE, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90:431–436.
18. Davignon P, Young A, Johnson D. Medical board complaints against physicians due to communication: Analysis of North Carolina medical board data, 2002–2012. J Med Regul. 2014;100:28–31.
19. Raupach T, Brown J, Anders S, Hasenfuss G, Harendza S. Summative assessments are more powerful drivers of student learning than resource intensive teaching formats. BMC Med. 2013;11:61.
20. Newble D. Revisiting “The effect of assessments and examinations on the learning of medical students.” Med Educ. 2016;50:498–501.
21. Kerdijk W, Tio RA, Mulder BF, Cohen-Schotanus J. Cumulative assessment: Strategic choices to influence students’ study effort. BMC Med Educ. 2013;13:172.
22. Ben-David MF. The role of assessment in expanding professional horizons. Med Teach. 2000;22:472–477.
23. Personal communication with the Directors of Clinical Skills Courses executive committee members, December 16, 2016.
24. Setyonugroho W, Kennedy KM, Kropmans TJ. Reliability and validity of OSCE checklists used to assess the communication skills of undergraduate medical students: A systematic review [published online ahead of print June 27, 2015]. Patient Educ Couns. doi: 10.1016/j.pec.2015.06.004.
25. Guerrasio J, Furfari KA, Rosenthal LD, Nogar CL, Wray KW, Aagaard EM. Failure to fail: The institutional perspective. Med Teach. 2014;36:799–803.
26. Brannick MT, Erol-Korkmaz HT, Prewett M. A systematic review of the reliability of objective structured clinical examination scores. Med Educ. 2011;45:1181–1189.
27. Federation of State Medical Boards and National Board of Medical Examiners. United States Medical Licensing Examination 2015 performance data Step 2 CS. http://www.usmle.org/performance-data/default.aspx#2015_step-2-cs. Accessed May 25, 2017.
28. National Board of Osteopathic Medical Examiners, Inc. COMLEX-USA Level 2-Performance Evaluation (PE) examination: Post-examination FAQ. https://www.nbome.org/docs/COMLEX_L2PE_Post_Exam_FAQ.pdf. Published March 2017. Accessed May 25, 2017.
29. Federation of State Medical Boards and National Board of Medical Examiners. United States Medical Licensing Examination 2015 performance data Step 1. http://www.usmle.org/performance-data/default.aspx#2015_step-1. Accessed May 25, 2017.
30. Federation of State Medical Boards and National Board of Medical Examiners. United States Medical Licensing Examination 2015 performance data Step 2 CK. http://www.usmle.org/performance-data/default.aspx#2015_step-2-ck. Accessed May 25, 2017.
31. Winward ML, Lipner RS, Johnston MM, Cuddy MM, Clauser BE. The relationship between communication scores from the USMLE Step 2 Clinical Skills examination and communication ratings for first-year internal medicine residents. Acad Med. 2013;88:693–698.
32. Cuddy MM, Winward ML, Johnston MM, Lipner RS, Clauser BE. Evaluating validity evidence for USMLE Step 2 Clinical Skills data gathering and data interpretation scores: Does performance predict history-taking and physical examination ratings for first-year internal medicine residents? Acad Med. 2016;91:133–139.
33. Swanson DB, Roberts TE. Trends in national licensing examinations in medicine. Med Educ. 2016;50:101–114.
34. Pell G, Fuller R, Homer M, Roberts T. Advancing the objective structured clinical examination: Sequential testing in theory and practice. Med Educ. 2013;47:569–577.
35. Weller JM, Castanelli DJ, Chen Y, Jolly B. Making robust assessments of specialist trainees’ workplace performance. Br J Anaesth. 2017;118:207–214.
36. Alvin MD. The USMLE Step 2 CS: Time for a change. Med Teach. 2016;38:854–856.
37. Green M, Jones P, Thomas JX Jr. Selection criteria for residency: Results of a national program directors survey. Acad Med. 2009;84:362–367.
38. Yudkowsky R, Park YS, Hyderi A, Bordage G. Characteristics and implications of diagnostic justification scores based on the new patient note format of the USMLE Step 2 CS exam. Acad Med. 2015;90(11 suppl):S56–S62.
39. Holmboe ES, Call S, Ficalora RD. Milestones and competency-based medical education in internal medicine. JAMA Intern Med. 2016;176:1601–1602.
40. Lomis K, Amiel JM, Ryan MS, et al; AAMC Core EPAs for Entering Residency Pilot Team. Implementing an entrustable professional activities framework in undergraduate medical education: Early lessons from the AAMC Core Entrustable Professional Activities for Entering Residency pilot. Acad Med. 2017;92:765–770.
© 2018 by the Association of American Medical Colleges