Realism Challenges in Documentation Components of Objective Structured Clinical Examinations

Jayakumar, Kishore L.

doi: 10.1097/ACM.0000000000001997
Letters to the Editor

MD/MBA candidate, Perelman School of Medicine/Wharton School, University of Pennsylvania, Philadelphia, Pennsylvania

Disclosures: None reported.


To the Editor:

Objective structured clinical examinations (OSCEs) have become nearly universal in undergraduate medical education. In 2016, 133 medical schools required students to complete a final institutional OSCE, and 108 schools required a passing grade for graduation.1 OSCEs often include documentation components to allow the evaluation of students’ competency in writing patient notes. When evaluating OSCE documentation—for either summative or formative assessment—medical schools should consider and address two challenges relating to consistency and generalizability.

First, students writing a patient note may find it unclear how they should handle discrepancies between the intended diagnosis and the physical exam findings. For example, for an intended diagnosis of asthma, the simulated encounter commonly features a 20-year-old patient who presents with episodic wheezing, shortness of breath, chest tightness, and a nonproductive cough, all of which are exacerbated by tobacco smoke. Lung auscultation, however, paradoxically reveals good air movement with no wheezing or crackles because this patient is, in actuality, a healthy actor. This inconsistency, which extends to visual findings (skin, mouth, eyes, nose, and ears), cardiac murmurs, abnormal reflexes, and many other signs, is especially problematic because OSCEs (including the United States Medical Licensing Examination [USMLE] Step 2 Clinical Skills exam) expect students to list pertinent positive and negative physical exam findings to support a proposed diagnosis.2

Second, most OSCE documentation bears only superficial resemblance to inpatient notes because OSCE patients usually present to “first contact” settings such as primary care clinics or emergency departments.2,3 Specifically, OSCE documentation does not test important skills involved in admitting a patient (e.g., reviewing past medical records), in caring for an admitted patient (e.g., tracking a patient’s daily clinical progress and laboratory results), or in discharging a patient (e.g., writing a discharge summary). The deemphasis of these skills is particularly concerning since postgraduate year 1 residents practice primarily in inpatient settings.

Institutions that incorporate documentation aspects into OSCEs should address these two challenges to realism to improve their internal validity and generalizability. To minimize inconsistency, medical schools could supplement OSCEs with multimedia equipment, such as monitors and headphones that simulate relevant visual and auscultatory physical exam findings. The USMLE Step 1 and Step 2 Clinical Knowledge exams already employ this strategy for certain questions. To optimize applicability to inpatient settings, OSCEs should include common inpatient scenarios, such as responding to night calls or synthesizing a discharge summary from progress notes.

Kishore L. Jayakumar

MD/MBA candidate, Perelman School of Medicine/Wharton School, University of Pennsylvania, Philadelphia, Pennsylvania


References
1. Association of American Medical Colleges. Number of medical schools requiring final SP/OSCE examination: 2011–2012 through 2015–2016. Published 2017. Accessed September 8, 2017.
2. United States Medical Licensing Examination. Step 2 CS. Published 2017. Accessed September 8, 2017.
3. Pugh D, Smee S. Guidelines for the Development of Objective Structured Clinical Examination (OSCE) Cases. Ottawa, Ontario, Canada: Medical Council of Canada; 2013.
© 2018 by the Association of American Medical Colleges