Scholarly Perspectives

The COVID-19 Pandemic as an Imperative to Advance Medical Student Assessment: Three Areas for Change

Hauer, Karen E. MD, PhD; Lockspeiser, Tai M. MD, MHPE; Chen, H. Carrie MD, PhD

doi: 10.1097/ACM.0000000000003764


The COVID-19 pandemic has upended clinical practice and all aspects of medical education. Despite major challenges to classroom and clinical learning, however, the need to continue to train physicians to meet the health care needs of patients and communities is greater than ever. The educational community's early responses to this disruption have been swift. Discussion boards and listservs are replete with ideas for converting curricula to online learning, administering exams remotely, adjusting clinical rotations for students and residents, and engaging learners in telehealth.1 Because this disruptive crisis is unprecedented and was largely unanticipated, medical schools, residency programs, and clinical care delivery systems have not had their usual timeframes to plan educational changes. The aim has been “speed over elegance” and, understandably, many solutions are not perfect.2

Unfortunately, as education programs continue to debate how to respond to the crisis, discussions tend to focus on adapting the inputs, or the delivery of an acceptable curriculum, rather than on ensuring outputs, or desired learner outcomes. Questions are arising regarding the boundaries of acceptable adaptations. How much clinical workplace learning, if any, should be replaced by virtual experiences? Residents and fellows who are assigned to help cover other services may be asked to practice new skills, but at what point are they practicing beyond their qualifications? Redeployment of clinical teachers to meet care needs prompts questions about the adequacy of trainee supervision by faculty who are practicing outside their usual scope of practice. Students eager to contribute to the response to COVID-19, and medical centers seeking to bolster their workforce, have asked whether students can enter the workforce early. The core concern underlying these questions is the educational community’s ability to assure readiness for practice, with heightened urgency due to pandemic conditions.

The current disruption to medical training and clinical practice thus presents a unique obligation to focus on outcomes and catalyze desirable change in competency assessment. Evidence that the pandemic may be an ongoing challenge for months or years to come should compel educators to take a long-range view and a systems approach when addressing immediate needs, to identify lessons learned that can extend beyond the pandemic.3,4 This strategic thinking entails innovating in directions the medical education community has long wanted to go, and implementing approaches that are better aligned with current educational theory and evidence. Conditions created by the pandemic invite us to tackle barriers that have rooted us to traditional practices that exist because “that is the way things have always been done,” and to emerge with better, more robust educational systems. In this article, we discuss 3 opportunities for innovation and improvement in learner assessment that are aligned with competency-based medical education (CBME) and that we believe can make enduring contributions: focusing on outcomes, broadening the assessment toolbox, and improving the undergraduate medical education-to-graduate medical education (UME-to-GME) transition.

The Imperative to Focus on Outcomes

First, current experience with the effects of the COVID-19 pandemic highlights the need to accelerate the transformation of medical education from a purely time-based model to one that operationalizes outcomes and the principles of CBME. These principles demand clearly stated, standardized outcomes that all learners must achieve, along with tailored learning experiences based on individual learner needs as well as programmatic assessment.5,6 Despite the common endorsement and adoption of competency frameworks, competency language, and learning objectives, almost all medical schools and residency programs in the United States currently base achievement on time rather than on measurement and documentation of expected learner outcomes. Before and in the midst of the pandemic, a few medical schools have taken the bold step of graduating students early to strengthen the workforce,7 and medical schools in the United Kingdom (UK) have received permission to do so for graduates who have achieved competence.8 These decisions for early graduation must be supported by robust longitudinal data collection to ensure attainment of competence.9 Some educators have asked themselves what evidence they would use to know that their students have achieved readiness for GME. Viewing time as a proxy rather than the outcome itself, educators can recognize that many students have achieved the skills they need months before the usual graduation date. The question then arises: why not graduate students early next year as well? Creating clearer medical school outcomes, and robust assessment methods that provide evidence of achievement of those outcomes, can shift the emphasis to graduating students when they are ready for residency, rather than at a specified time point. To enable this shift, students and faculty must have a shared understanding of the essential facets of competence and the purposes of assessments.
The Association of American Medical Colleges’ (AAMC’s) Core Entrustable Professional Activities for Entering Residency (EPAs) represent one example of a step toward clearly defined minimum outcomes for medical school, and they have been variably adopted.10 Whether outcomes are shared or individually defined by a particular school, schools must be able to gather the data necessary to determine achievement. Educators must then hold students, their teachers, and school leaders accountable for student achievement of those outcomes through robust performance data collection and ongoing progress monitoring, and by graduating only those students for whom there is evidence of outcome achievement.

Similarly, with restrictions on student placements in clinical settings due to the COVID-19 pandemic, educators now debate how much time is needed to ensure that students have met the requirements for each core clerkship. Learning experiences in core clerkships are mostly time based, with a certain number of weeks in each specialty. This focus on time creates challenges for modifying clerkships in response to conditions created by the pandemic. In contrast, clear outcomes required of each clerkship, with robust assessment systems, would enable determination of whether individual students have achieved those outcomes or whether additional time or adjusted experiences are needed to meet those goals. Defining the types of patients students need to see and the core knowledge and skills they need to demonstrate for each core clerkship, as well as a robust system of assessing and tracking completion, would yield more opportunity for flexibility in learning experiences. One example of the benefits of more individualized learning experiences commonly arises within longitudinal integrated clerkships (LICs); for example, students receiving more individualized clerkship experiences at Harvard and the University of British Columbia achieved the same or better outcomes than students in traditional block clerkships.11–13 The experiences of students in LICs, and potentially all students, could be adapted to the changing environment of the pandemic through opportunities to meet certain requirements for one field with experiences in a different field (e.g., seeing a pediatric patient with a family medicine preceptor), as recently endorsed by the Liaison Committee on Medical Education.14 Coupled with flexibility to alter schedules, this approach can meet students’ individual learning needs.
By defining core learning experiences based on competencies and focusing on specific learning outcomes for each clerkship, educators can optimize the time students have in medical school even if that time is different from before COVID-19.

Broadening the Assessment Toolbox

Second is the responsibility to broaden our toolbox of assessment methods. For example, medical educators have traditionally relied heavily on certain standardized assessments, such as United States Medical Licensing Examination (USMLE) Step 1 and Step 2 CK and clerkship subject examinations, to assess students’ medical knowledge achievement and readiness for the next phase of medical training. These medical knowledge metrics are less available, or available on a different schedule, due to testing center closures that limit learners’ access to computer-based testing. A way forward is illustrated by the decisions of some undergraduate colleges to make standardized testing optional rather than required due to the pandemic, and to emphasize other aspects of college applications.15 The situation is further complicated in medical training by licensing requirements; as such, any adjustments to the licensing schedule require the buy-in of not only education programs but also state licensing boards. Though not currently available for licensing examinations, remote administration of medical school examinations, including the National Board of Medical Examiners (NBME) subject (“shelf”) examinations, has met with mixed success. Remote video proctoring is flexible, with students able to take examinations at home.16 However, technical requirements and lack of distraction-free environments may present barriers for some learners and exacerbate inequities. In addition, questions persist about exam security with remote proctoring. Information about students’ medical knowledge achievement will still be needed and could be captured more flexibly through confirmation of minimum competence on clerkship examinations or on one, but not necessarily both, of the Step examinations before residency, given the high correlation among scores on these examinations.17

Ultimately, the ability to individualize learning experiences and advance students based on achievement of outcomes necessitates assessments that can provide a well-rounded picture of learner strengths and weaknesses, and evidence of learner readiness.18 To meet these goals, medical educators need to rely on assessments beyond existing standardized or “objective” knowledge exams. Objective structured clinical examinations (OSCEs), now the norm in UME, can be further developed and emphasized to enable competency-based assessment to inform progression and provide competency-based feedback to learners.19–21 As schools temporarily change grading schemes to adjust for shortened or altered clinical placements, a variety of new tools that enable students to demonstrate their learning within and across disciplines may emerge. Innovative examples, such as workplace-based assessment tools that capture narrative assessment of how a learner participated in telehealth patient visits or documentation of their volunteerism to support communities at risk of infection, should be embraced. Educators can identify, implement, and adapt assessment tools as needed to measure achievement of different outcomes. For example, virtual technology can enable remote assessment through oral examinations, standardized patient televisits, or simulations.22 However, students may justifiably perceive that remotely administered OSCEs simulating telehealth encounters23 assess telehealth skills the curriculum has not taught while failing to assess important physical examination skills. Workplace-based assessments using EPAs can provide multiple measures of performance that could be aggregated by advancement committees to make judgments about overall learner achievement.24 Online portfolios or dashboards can facilitate gathering varied and sufficient assessment data to assure that all outcomes have been met.
Accelerated attention to the incorporation of artificial intelligence and learning analytics in medical education can support adaptive learning that is responsive to each student’s learning edge.3 Lastly, with supportive faculty coaching, students can and must be active partners both in learning and in gathering assessment data about their own performance. In the UK, increased reliance on learner self-assessment contributes to competence determination.25 As medical educators in the United States adapt, clear communications to learners about expectations and empowerment of students to be proactive in self-assessing their learning needs and seeking feedback will be critical.

Improving the Transition to Residency

Third, the disruption to medical education caused by the pandemic highlights multiple issues in the transition from UME to GME; most pertinent to this article is the limited trust between UME and GME programs, reflected in GME programs’ lack of confidence in evidence of learner readiness. Improved precision of communication from UME about learner competence can encourage trust in program-based performance data from medical schools. GME programs’ overreliance on specific performance measures will be upended by the disrupted availability of USMLE scores, of “audition” rotations that provide firsthand knowledge of students’ performance, and of proxy measures such as the number of specialty rotations completed. Medical schools have asked for, and the AAMC is exploring, the addition of template Medical Student Performance Evaluation language explaining curricular adjustments due to COVID-19, to help GME programs understand the unique circumstances at each school and better interpret changes in student assessments and grading. This increased transparency and uniformity of reporting should be embraced not just during the pandemic but as an ongoing expectation, and expanded to include more details of program-specific assessments of student performance and documented achievement of competence. In addition, schools that invest in examination of students’ achievement of graduation milestones signaling readiness to provide care at the GME level can use these individualized data in partnership with GME programs to enable early transition to residency training.26,27 UME and GME must not see each other as opposing sides, but rather as relay partners in the continuum of medical training, especially in this time in which collaboration is more necessary than ever.

GME programs already know they must reconsider what data to use for selection. Earlier this year, the USMLE program announced plans to change Step 1 score reporting from 3-digit numeric scores to pass/fail outcomes only, to take effect no earlier than January 2022.28 Changes wrought by COVID-19 have accelerated this push to think beyond current practices. For instance, the Council of Residency Directors in Emergency Medicine (EM) has encouraged its GME programs to relax the 2020–2021 application cycle requirements for standardized letters of evaluation from away rotations and for the number of EM rotations completed.29 Accordingly, GME programs may need to seek new types of information and place trust in information from a broader range of sources. The experience in the 2020–2021 application cycle will challenge previous assumptions and processes, requiring reexamination of what student data residency programs should use moving forward and how these data are used and prioritized. Rather than depend primarily on class rank or USMLE scores to predict student success in residency, GME programs could use a constellation of outcomes data including clinical performance ratings, OSCE performance, and portfolio or dashboard records. Together, these data constitute evidence of student growth over time (rather than the unrealistic expectation that students will achieve “top 5%” status early in training), readiness for transition, and capacity for success in the residency program regardless of differences in the students’ curricular contexts.

A Call to Action

The COVID-19 pandemic will continue to present an ongoing challenge to health, health care, and medical education. When location, time, and familiar assessment benchmarks are not available as organizing principles for education, and the usual, comfortable sequence of learning and assessment events is not possible, educators are forced (or freed up) to use other methods to determine whether learners have achieved competence. The scale of this crisis requires all education programs as well as larger national and international educational organizations to rethink systems and modify usual procedures. The variable impact of the pandemic by region and its uncertain duration will continue to require flexible responses that can address local conditions and adapt over time. In this context, the medical educational system faces increased accountability and an imperative to use best evidence and medical education frameworks to optimize assessment of learner competence. By focusing on the outcomes of training, defined through the competencies essential for practice, and by rethinking what constitutes useful, meaningful assessment data, educators can design flexible educational systems while ensuring outcomes can be achieved and measured using an adaptable set of tools. Rapidly implemented responses from educators during this crisis can constitute a series of pilot experiments or PDSA (plan–do–study–act) cycles on a path forward to produce enduring improvements to medical education.


The authors wish to thank Robert Englander, MD, MPH, for his helpful feedback on an earlier draft of the manuscript.


1. Woolliscroft JO. Innovation in response to the COVID-19 pandemic crisis. Acad Med. 2020;95:1140–1142.
2. Renjen P. The heart of resilient leadership: Responding to COVID-19. Deloitte Insights. Published March 16, 2020. Accessed September 17, 2020.
3. Goh P-S, Sandars J. A vision of the use of technology in medical education after the COVID-19 pandemic. MedEdPublish. Published March 26, 2020. Accessed September 17, 2020.
4. Yang DY, Cheng SY, Wang SZ, et al. Preparedness of medical education in China: Lessons from the COVID-19 outbreak. Med Teach. 2020;42:787–790.
5. Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645.
6. Van Melle E, Frank JR, Holmboe ES, Dagnone D, Stockley D, Sherbino J; International Competency-Based Medical Education Collaborators. A core components framework for evaluating implementation of competency-based medical education programs. Acad Med. 2019;94:1002–1009.
7. Early Graduation at NYU Grossman School of Medicine Sends New Doctors to Join COVID-19 Fight. NYU Langone News. Published April 5, 2020. Accessed September 17, 2020.
8. Harvey A. Covid-19: Medical schools given powers to graduate final year students early to help NHS. BMJ. 2020;368:m1227.
9. Gruppen LD, ten Cate O, Lingard LA, Teunissen PW, Kogan JR. Enhanced requirements for assessment in a competency-based, time-variable medical education system. Acad Med. 2018;93(3 suppl):S17–S21.
10. Obeso V, Brown D, Aiyer M, et al, eds. Core EPAs for Entering Residency Pilot Program. Toolkits for the 13 Core Entrustable Professional Activities for Entering Residency. Published 2017. Accessed September 17, 2020.
11. Hudson JN, Poncelet AN, Weston KM, Bushnell JA, Farmer EA. Longitudinal integrated clerkships. Med Teach. 2017;39:7–13.
12. Walters L, Greenhill J, Richards J, et al. Outcomes of longitudinal integrated clinical placements for students, clinicians and society. Med Educ. 2012;46:1028–1041.
13. Gentles JQ. Why wait until residency? Competency-based education in longitudinal integrated clerkships. Can J Surg. 2017;60:64–65.
14. Liaison Committee on Medical Education. LCME update on medical students, patients, and COVID-19: Approaches to the clinical curriculum. Published March 20, 2020. Accessed September 17, 2020.
15. University of California Admissions. UC’s response on admissions to COVID-19. Accessed September 17, 2020.
16. Murphy B. Medical school assessment during COVID-19: Shelf exams go remote. American Medical Association. Published April 23, 2020. Accessed September 17, 2020.
17. Monteiro KA, George P, Dollase R, Dumenco L. Predicting United States Medical Licensure Examination Step 2 clinical knowledge scores from previous academic indicators. Adv Med Educ Pract. 2017;8:385–391.
18. Carraccio C, Englander R, Van Melle E, et al.; International Competency-Based Medical Education Collaborators. Advancing competency-based medical education: A charter for clinician-educators. Acad Med. 2016;91:645–649.
19. Hauer KE, Teherani A, Kerr KM, O’Sullivan PS, Irby DM. Impact of the United States Medical Licensing Examination Step 2 Clinical Skills exam on medical school clinical skills assessment. Acad Med. 2006;81(10 suppl):S13–S16.
20. Association of American Medical Colleges. Curriculum reports. Assessment methods at US and Canadian medical schools. Accessed September 17, 2020.
21. Patricio MF, Juliao M, Fareleira F, Carneiro AV. Is the OSCE a feasible tool to assess competencies in undergraduate medical education? Med Teach. 2013;35:503–514.
22. Dedeilia A, Sotiropoulos MG, Hanrahan JG, Janga D, Dedeilias P, Sideris M. Medical and surgical education challenges and innovations in the COVID-19 era: A systematic review. In Vivo. 2020;34(3 suppl):1603–1611.
23. Lara S, Foster CW, Hawks M, Montgomery M. Remote assessment of clinical skills during COVID-19: A virtual, high-stakes, summative pediatric objective structured clinical examination. Acad Pediatr. 2020;20:760–761.
24. Murray E, Gruppen L, Catton P, Hays R, Woolliscroft JO. The accountability of clinical education: Its definition and assessment. Med Educ. 2000;34:871–879.
25. Rimmer A. Covid-19: Health Education England shares advice for trainees. BMJ. 2020;369:m1635.
26. Andrews JS, Bale JF Jr, Soep JB, et al.; EPAC Study Group. Education in Pediatrics Across the Continuum (EPAC): First steps toward realizing the dream of competency-based education. Acad Med. 2018;93:414–420.
27. Flotte TR, Larkin AC, Fischer MA, et al. Accelerated graduation and the deployment of new physicians during the COVID-19 pandemic. Acad Med. 2020;95:1492–1494.
28. United States Medical Licensing Examination. Invitational conference on USMLE scoring. Accessed September 17, 2020.
29. Council of Residency Directors in Emergency Medicine. Consensus statement regarding SLOEs and away rotations from the Advising Students Committee in Emergency Medicine. Accessed September 17, 2020.
Copyright © 2020 by the Association of American Medical Colleges