
Research Reports

The Consequences of Step 2 Clinical Skills Examination Discontinuation for Medical Schools and Sustainability Plans for Clinical Skills Assessment

Phillips, Abigail MD1; Hauer, Karen E. MD, PhD2; Chen, H. Carrie MD, PhD3; Wray, Alisa MD, MAEd4; Watanaskul, Sarah5; Boscardin, Christy K. PhD6

Academic Medicine 98(6):717–722, June 2023. DOI: 10.1097/ACM.0000000000005138

Abstract

Purpose 

Comprehensive clinical skills examinations using standardized patients are widely used to assess multiple physician competencies. However, these exams are resource intensive. With the discontinuation of the Step 2 Clinical Skills (CS) exam in 2021, how medical schools will change their approaches to comprehensive clinical skills exams is unknown. This study explores school responses to this change and future directions of comprehensive clinical skills exams using the program sustainability framework.

Method 

This cross-sectional, descriptive study surveyed medical school curriculum deans at 150 Liaison Committee on Medical Education–accredited U.S. medical schools from September to October 2021. The 30-question survey included questions about medical school and participant role, current comprehensive clinical skills exams, sustainability dimensions, and challenges and future directions. Descriptive statistics were used to characterize responses, and content analysis was used to identify themes in the open-ended responses.

Results 

Educators at 75 of 150 institutions (50%) responded. Sixty-three respondents (84%) reported conducting a comprehensive clinical skills exam. The comprehensive clinical skills exam assessed readiness for graduation (51 [81%]), provided feedback for students (49 [78%]), evaluated curricula (38 [60%]), provided information for medical student performance evaluation or communication with residency (10 [16%]), and assessed other factors (6 [10%]), including preparation for Step 2 CS in the past and readiness for advancement to fourth year of medical school (multiple responses were allowed). Factors facilitating sustainability included sufficient funding to continue the exam (55 [87%]) and the belief that clinical skills assessment in medical school is now more important after discontinuation of the Step 2 CS exam (55 [87%]). Challenges to sustainability included organizational capacity and limited interinstitutional collaboration.

Conclusions 

Educators remain committed to the purpose of comprehensive clinical skills exams. Adapting to changed licensing requirements while sustaining clinical skills exams enables innovation and improvement in assessment of clinical competence.

Medical education serves the public through preparation of physicians equipped with the skills, knowledge, and attitudes to provide high-quality, patient-centered care. Competency-based medical education emphasizes defining and measuring outcomes of training and using assessment tools that generate defensible assessment data about learners and their progress toward competence.1 Comprehensive clinical skills examinations using standardized patients (SPs) are a type of objective structured clinical examination (OSCE) that is widely used as an assessment tool to gather data on multiple physician competencies during encounters with a range of patient problems.2–5 However, the use of comprehensive clinical skills exams for high-stakes assessment is resource intensive and logistically challenging.6–10 These exams require a large number of cases with highly trained SPs to achieve reliable scores.11,12 Thus, educators question the feasibility of rigorous and sound comprehensive clinical skills exams within medical schools.

The United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills (CS) exam served as the national, high-stakes, rigorous clinical skills exam required for licensure in the United States from 2005 to 2020. Schools responded to the introduction of the Step 2 CS exam by adding or augmenting in-house clinical skills exams, in part to prepare students for the licensing exam.13 Despite providing assurance to the public about trainees’ achievement of minimum expectations, the Step 2 CS exam had shortcomings, including the cost and travel required for exam administration, lack of feedback to trainees on their performance, and limited validity evidence, such as relationships to other variables.14,15 In the face of these challenges and the ongoing COVID-19 pandemic, on January 26, 2021, the Federation of State Medical Boards and National Board of Medical Examiners, cosponsors of the USMLE, announced the discontinuation of the Step 2 CS licensing requirement.16 The consequences of this decision for medical schools and their approaches to clinical skills assessment are unknown.

This major change in licensing requirements could generate a range of responses from schools. The discontinuation of the Step 2 CS exam could increase pressure for schools to maintain high-quality comprehensive clinical skills exams to fill the gap. Alternatively, schools that were using in-house exams as preparation for the Step 2 CS exam could feel liberated to allocate time and resources elsewhere. A useful lens through which to understand the future direction of medical school comprehensive clinical skills exams, and the factors that will shape decisions about them, is the program sustainability framework by Schell et al.17 Sustainability addresses the ability of programs to maintain programming and achieve desired outcomes over time. Schell et al17 conceptualized sustainability as the existence of structures to leverage resources to implement and maintain programming activities. This conceptual framework comprises 8 organizational and contextual domains that are critical to building capacity for maintaining programs, such as comprehensive clinical skills exams (Table 1). These domains cluster into 3 categories: (1) current state of clinical skills exams: resources and structures currently in place for comprehensive clinical skills exams, including environmental support, partnerships, program evaluation, and communication; (2) plans for adaptation, including program adaptation and strategic planning; and (3) resources to enact adaptation plans, including organizational capacity and funding. In applying this framework to the sustainability assessment of comprehensive clinical skills exams, we grouped the domains into the 3 categories described above and combined the domains of program evaluation and communication, given the significant overlap between the 2 domains, such as in communications with stakeholders about a program’s efficacy. In Table 1, we illustrate how we applied the resulting 7 domains of sustainability to study comprehensive clinical skills exams.

Table 1:
Application of the Sustainability Framework to Medical School Comprehensive Clinical Skills Exams17

Because comprehensive clinical skills exams are resource intensive, we used this program sustainability framework as an approach to understand how medical schools are responding to the discontinuation of the Step 2 CS exam. The aims of this study are to explore (1) the current state of comprehensive clinical skills exams at U.S. medical schools, (2) future plans for comprehensive clinical skills exams at schools, and (3) the influence of Step 2 CS exam discontinuation on comprehensive clinical skills exams.

Method

Design

We conducted a cross-sectional, descriptive study by surveying medical school curriculum deans and designees at 150 Liaison Committee on Medical Education (LCME)–accredited U.S. medical schools from September to October 2021. The institutional review board at the University of California, San Francisco, approved the study.

Participants and setting

Investigators extracted names and email addresses for medical school curriculum deans for all U.S. medical schools from publicly available websites and verified them through personal knowledge, review of institution websites, or email with the institution. The survey invitation asked the recipient dean to complete the survey or forward it to another faculty member qualified to respond for the institution. The invitation specified that only 1 survey be completed per institution.

Data collection

We followed a rigorous survey development protocol.18 On the basis of previous literature investigating the landscape of clinical skills exams,3 we developed and mapped survey items to components of the program sustainability framework and then conducted pilot testing of the draft survey and cognitive interviews with 6 pilot participants at 3 institutions for clarity, timing, and completeness before administration at all sites. The 6 pilot test participants were directors of clinical skills courses and not eligible for study participation. On the basis of pilot testing, we made minor language updates for clarity and added answer choices for several questions. Our 30-item survey (Supplemental Digital Appendix 1 at https://links.lww.com/ACADMED/B376) included 3 items eliciting descriptive information about the medical school and participant role, 4 items eliciting descriptive information about the school’s current comprehensive clinical skills exam (e.g., number of stations, length of time), 8 multiple-choice items, 5 five-point Likert-type scale items, 6 yes or no items that targeted sustainability dimensions, 2 follow-up free response questions, and 2 open-ended response items soliciting information on challenges and future directions of comprehensive clinical skills exams during the next 5 years. For questions about the current comprehensive clinical skills exam, we directed participants within the survey to answer based on the exam administered before any temporary COVID-19 modifications.

We distributed the survey through Qualtrics, an online survey software system. Eligible participants received an email invitation; nonresponders received up to 3 follow-up emails.

Data analysis

We used descriptive statistics to characterize responses and Fisher exact test or Wilcoxon/2-sample t test procedures, as appropriate, for univariate analyses to characterize and compare demographic information for respondent and nonrespondent schools. We performed all statistical analyses using SPSS software, version 27 (IBM Corp) with a 2-sided α = .05.
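As a minimal illustration of the univariate responder versus nonresponder comparisons described above, the sketch below shows analogous tests written in Python with scipy rather than SPSS. The public-school counts mirror those reported later in the Results; the class-size values are hypothetical placeholders, not study data.

```python
# Illustrative sketch only: univariate responder vs nonresponder comparisons
# analogous to those described in Data Analysis (the study itself used SPSS).
from scipy import stats

# Categorical characteristic (public vs private ownership):
# Fisher exact test on a 2 x 2 contingency table.
contingency = [[47, 28],   # responders: public, private (47 of 75 public)
               [45, 30]]   # nonresponders: public, private (45 of 75 public)
odds_ratio, p_ownership = stats.fisher_exact(contingency)

# Continuous characteristic (e.g., class size): 2-sample t test or
# Wilcoxon rank-sum test, depending on distributional assumptions.
responder_class_size = [150, 120, 180, 90, 200]        # hypothetical values
nonresponder_class_size = [140, 160, 100, 170, 130]    # hypothetical values
t_stat, p_ttest = stats.ttest_ind(responder_class_size, nonresponder_class_size)
w_stat, p_wilcoxon = stats.ranksums(responder_class_size, nonresponder_class_size)

ALPHA = 0.05  # 2-sided significance threshold, as in the study
print(f"Fisher exact p = {p_ownership:.3f}")
print(f"t test p = {p_ttest:.3f}; Wilcoxon rank-sum p = {p_wilcoxon:.3f}")
```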

For responses to the 2 open-ended items, we conducted content analysis to identify themes.19 Three investigators (A.P., S.W., and C.B.) reviewed responses for each open-ended question and independently identified concepts in the data to generate initial codes. Through an iterative process, these investigators generated and revised a preliminary codebook based on discussions and additional data review. We compared responses (both in developing codes and applying codes to the data) through discussion to derive the final coding scheme. Each coder recorded themes and reflections based on coding and further abstracted and synthesized the themes through team discussions.

Results

Demographic characteristics of the respondents

Educators at 75 of 150 institutions (50%) responded to the survey. The survey results appear in Supplemental Digital Appendix 2 at https://links.lww.com/ACADMED/B376. Of the 75 respondents, 47 (63%) were from public institutions, similar to the proportion of public schools (45 of 75 [60%]) among nonresponders. Compared with nonresponder schools, responder schools were more often located in the West and Southwest and slightly less often in other regions (Table 2). Of the respondents, 46 (61%) identified as curriculum deans, 14 (19%) as directors of clinical skills, 3 (4%) as assessment and/or evaluation deans, 1 (1%) as director of assessment, and 11 (15%) as holding other education positions. We present the results in 3 sustainability categories: (1) current state of comprehensive clinical skills exams, (2) plans for adaptation, and (3) resources to enact adaptation plans.

Table 2:
Characteristics of Responder vs Nonresponder Schools in a Survey of U.S. Medical School Deans About Step 2 Clinical Skills Exam Discontinuation, 2021

Current state of comprehensive clinical skills exams

Of the 75 respondents, 63 (84%) reported conducting a comprehensive clinical skills exam in the third or fourth year of medical school as a summative assessment. Respondents reported that their institutional comprehensive clinical skills exam had been administered for a mean (SD) of 13.4 (6.8) years (range, 0–28 years). The mean (SD) number of stations was 8 (5), and the mean (SD) duration per SP encounter was 17.6 (5) minutes.

The comprehensive clinical skills exam currently serves the following purposes at respondents’ institutions (respondents could select more than 1 choice and provide free response): assessment of readiness for graduation (51 [81%]), identification of students for remediation (51 [81%]), feedback for students (49 [78%]), curriculum evaluation (38 [60%]), information for medical student performance evaluation or communication with residency (10 [16%]), and other (6 [10%]). Other purposes included preparation for the Step 2 CS exam and readiness for advancement to the fourth year or return to clinical work after research or dual degree time.

Respondents perceived that their comprehensive clinical skills exam aligned well with their overall clinical skills curriculum, with 55 (87%) responding with extremely, very, or moderately well. In addition, they endorsed their comprehensive clinical skills exams as effective in assessing clinical skills (55 [88%] extremely, very, or moderately effective) and providing accurate evidence of students’ clinical skills (56 [89%] extremely, very, or moderately accurate). They found the information from their comprehensive clinical skills exam not quite as helpful for making curricular changes, with 40 (63%) responding with extremely, very, or moderately helpful. Fifty-two respondents (83%) reported conducting continuous improvement (program evaluation) of their comprehensive clinical skills exam.

Plans for adaptation

All respondents currently administering a comprehensive clinical skills exam (59 of 63 answered this question) reported it was at least moderately important to continue the exam, and 51 (81%) reported it was very or extremely important. Fifty-five respondents (87% of respondents who reported currently administering a comprehensive clinical skills exam) believed that conducting a clinical skills exam in medical school is more important after the Step 2 CS exam discontinuation. Fifty-six respondents (89%) administering current comprehensive clinical skills exams planned to continue the exam for at least the next 3 years. These respondents cited the following reasons for continuing the exam (respondents could select more than 1 choice and could also provide free response): alignment with curriculum (53 [84%]), program evaluation (43 [68%]), preparation for LCME accreditation (21 [33%]), alignment with other institutions’ comprehensive clinical skills assessment approach (8 [13%]), funding availability (2 [3%]), and other (19 [30%]). The common topics for other were identifying students for remediation, providing feedback to students, and assessing readiness for residency or graduation. Many respondents highlighted schools’ expanded accountability for ensuring clinical skills competence in the absence of a national exam.

Resources to enact adaptation plans

Content analysis of open-ended items revealed challenges facing clinical skills exam programs and potential strategies for sustainability. Five thematic areas characterize these challenges and sustainability opportunities: (1) absence of national guidelines and resources, (2) assessment challenges, (3) resource limitations, (4) lack of buy-in, and (5) SP challenges. The theme of national guidelines and resources addressed the importance of standardization across institutions in the absence of both current national guidelines and resources and consensus on core clinical competencies. Assessment-related challenges included limited local institutional knowledge and expertise to develop comprehensive clinical skills exams and limited validity evidence linking local clinical skills exam performance to future clinical performance in residency. Although 55 respondents (87%) reported having institutional funding to continue their school’s comprehensive clinical skills exam for at least the next 3 years, respondents cited resource limitations, including funding, space, faculty time, and curricular time, that threatened program sustainability. Respondents noted that some institutions will have access to more funding than others, potentially creating unequal program quality. After Step 2 CS exam discontinuation, respondents anticipated challenges with maintaining buy-in among students, faculty, and leadership for clinical skills programs without the comprehensive clinical skills exam serving as preparation for the Step 2 CS exam. Challenges with SPs included availability, training, and retention.

Participants envisioned multiple future directions for comprehensive clinical skills exams. Many identified the discontinuation of the Step 2 CS exam as an impetus for innovative changes to comprehensive clinical skills exam content and format at schools; envisioned changes included adding assessments of communication, procedural skills, clinical reasoning, interprofessional team skills, skills in working with patients from diverse backgrounds, ethics, and professionalism. Some respondents advocated for other methods of standardization by reimplementing a national exam; defining a standard set of competencies; creating a national repository of resources for faculty development, evaluation strategies, and clinical cases; or standardizing performance reporting to provide meaningful information for residency selection.

Aside from national collaboration, some respondents advocated for regional consortia that could collaborate to develop test resources, help with exam implementation and logistics, and share student performance data across institutions to aid in benchmarking. Sixteen respondents (25%) reported collaborating on case development with other schools, 15 (24%) on SP training, and 7 (11%) on sharing space to administer the exam. Reported benefits of collaboration were sharing creative ideas for case development, format, scoring, and grading strategies (21 respondents [33%]), developing professional relationships with colleagues (18 [29%]), collaborative research (17 [27%]), shared patient trainer or trainer resources (13 [21%]), and cost savings (4 [6%]). Thirteen (21%) planned to increase collaboration with other schools due to the discontinuation of the Step 2 CS exam, and 18 (29%) were considering collaborating.

Future of comprehensive clinical skills exam for institutions with no current exam

For the 12 institutions (16% of respondents) not currently administering a comprehensive clinical skills exam, reasons (respondents could cite multiple reasons) were administering another type of assessment (8 [67%]), such as a series of mini-OSCEs, clerkship OSCEs, or a formative exam; the Step 2 CS exam had served that purpose (5 [42%]); current assessment methods are adequate (5 [42%]); reliability and validity concerns about clinical skills exams (3 [25%]); and inadequate funding (1 [8%]). One (8%) reported having definite plans to implement a comprehensive clinical skills exam in the future, 7 (58%) were considering it, and 4 (33%) had no plans.

Discussion

With the discontinuation of the Step 2 CS exam, the majority of medical schools in this study plan to continue their comprehensive clinical skills exam. Most respondents view the comprehensive clinical skills exam as more important now than when the Step 2 CS exam was administered. By viewing decisions related to continuing a school-based comprehensive clinical skills exam through a program sustainability framework using 7 domains, we found several facilitators that support the sustainability of comprehensive clinical skills exams. For example, within the environmental support domain, a strong purpose for the exam, namely providing information about graduation milestones and identifying students needing remediation, facilitates commitment to the exam. Among the domains related to resources and the ability to enact adaptation plans, sufficient funding to continue the comprehensive clinical skills exam was highlighted as essential.

As schools adapt their approach to clinical skills assessment with the discontinuation of the Step 2 CS exam, our findings within the sustainability domains of program adaptation and strategic planning highlight the balance between seeking national guidelines and standardized resources while also valuing independence to focus on school-specific priorities and innovations. Despite sharing common aims to train future physicians, all medical schools have unique focus areas.20 Furthermore, medical schools, unlike graduate medical education specialties, do not share competencies or even the same competency framework. Although the Association of American Medical Colleges Physician Competency Reference Set and core entrustable professional activities (EPAs) provide sets of standards, neither was mandated nor adopted by all schools.21 Lessons learned from a collaborative effort on the development and implementation of core EPAs may serve as a guide for future development of national resources.22 National standards for clinical skills assessment could provide guidance on the content for comprehensive clinical skills exams by highlighting any areas of consensus around the core clinical skills expected at the time of graduation, augmenting the current efforts on the core EPAs. Additionally, the development of national standards may aid schools that are currently not administering comprehensive clinical skills exams in their program adaptation and strategic planning as future external pressure for accountability potentially increases. Schools could use existing exams to ensure that their current programs of assessment provide evidence of clinical skills competence or start administering comprehensive exams.

Although we found strong commitment to sustaining comprehensive clinical skills exams, multiple concerns arose about limited resources, local expertise, and uneven resources across schools. These concerns suggest a need for national discussions about equity, about the potential benefits of standards as described above, and about resource sharing so that certain schools are not disadvantaged. In terms of resource sharing, multiple schools expressed interest in forming or deepening collaborations across schools, a finding that highlights the importance of the organizational capacity domain of the program sustainability framework.23 These institutional collaborations are a mechanism to fill gaps in institutional expertise or capacity in case development or SP training and to reduce financial burden.24,25 Collaborations, such as the American Medical Association Accelerating Change in Medical Education consortium and regional SP consortia in California and the Mid-Atlantic, can serve as possible prototypes for clinical skills exam partnerships.26

Lastly, the discontinuation of the Step 2 CS exam at the national licensure level highlights key challenges with comprehensive clinical skills exams, including cost burden on the students, resource requirements and logistical challenges of conducting exams, and debate around the validity of the exams. However, the commitment to sustain comprehensive clinical skills exams at medical schools despite the elimination at the national licensure level illustrates the perceived utility and value of comprehensive clinical skills exams as part of programs of assessment in schools. Ironically, many of the challenges faced by the national licensure exam can potentially be mitigated at the local level by reducing or eliminating the cost burden on students, minimizing the resource burden on the schools by sharing of resources through collaboratives, and increasing validity evidence or value of the comprehensive clinical skills exam by more proximal alignment with local curriculum and other performance metrics available at the school. SP exams can serve summative or formative assessment purposes. Using comprehensive clinical skills exams for summative assessment requires rigor in exam construction, administration, and standard setting, which may exceed the capacity of some individual schools.27 Alternatively, given the challenges with sufficient case numbers and rigor to achieve reliability, schools may opt to use their exams primarily for formative purposes.28

There were several limitations of this study. The response rate (50%) was modest, and nonrespondents may have different experiences and opinions. Response bias may have led schools that are not administering a comprehensive clinical skills exam to opt out of completing the survey. However, our sample represents diverse geographic and school characteristics, which may increase the generalizability of our findings, and a response rate of 50% is considered acceptable in social research surveys.29–31 Additionally, respondents’ roles varied. We also did not capture information about SP exams that were occurring on a smaller scale more frequently (e.g., within each clerkship). Our study relied on self-report from respondents, which may not always align with what actually happens at each institution. Finally, our study focused on the impact of the discontinuation of the Step 2 CS exam soon after the announcement of its discontinuation; the long-term impact of this change will require future study.

Conclusions

Despite the discontinuation of the USMLE Step 2 CS exam in January 2021, most medical schools currently administer a comprehensive clinical skills exam and report that this exam has become more important with the discontinuation of the Step 2 CS exam. Educators remain committed to the purpose of comprehensive clinical skills exams and see opportunities for collaboration and innovation. Adapting to changed licensing requirements while sustaining clinical skills exams enables innovation and improvement in assessment of clinical competence.

Acknowledgments:

The authors wish to thank Gillian Earnest, MS, research analyst, University of California, San Francisco, School of Medicine.

References

1. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645.
2. Swanson DB, van der Vleuten CPM. Assessment of clinical skills with standardized patients: State of the art revisited. Teach Learn Med. 2013;25:S17–S25.
3. Hauer KE, Hodgson CS, Kerr KM, Teherani A, Irby DM. A national study of medical student clinical skills assessment. Acad Med. 2005;80:S25–S29.
4. Howley LD. Standardized patients. In: Levine AI, DeMaria S Jr, Schwartz AD, Sim AJ, eds. The Comprehensive Textbook of Healthcare Simulation. New York, NY: Springer; 2013:173–190.
5. Association of American Medical Colleges. AAMC Curriculum Reports: SP/OSCE Final Examinations at US Medical Schools. https://www.aamc.org/data-reports/curriculum-reports/interactive-data/sp/osce-final-examinations-us-medical-schools. Accessed December 6, 2021.
6. Gillette C, Stanton RB, Rockich-Winston N, Rudolph M, Anderson HG. Cost-effectiveness of using standardized patients to assess student-pharmacist communication skills. Am J Pharm Educ. 2017;81:6120.
7. Bosse HM, Nickel M, Huwendiek S, Schultz JH, Nikendei C. Cost-effectiveness of peer role play and standardized patients in undergraduate communication training. BMC Med Educ. 2015;15:183.
8. Harden RM. Misconceptions and the OSCE. Med Teach. 2015;37:608–610.
9. Brown C, Ross S, Cleland J, Walsh K. Money makes the (medical assessment) world go round: The cost of components of a summative final year Objective Structured Clinical Examination (OSCE). Med Teach. 2015;37:653–659.
10. Goh HS, Ng E, Tang ML, Zhang H, Liaw SY. Psychometric testing and cost of a five-station OSCE for newly graduated nurses. Nurse Educ Today. 2022;112:105326.
11. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356:387–396.
12. Khan KZ, Gaunt K, Ramachandran S, Pushkar P. The Objective Structured Clinical Examination (OSCE): AMEE guide no. 81. Part II: Organisation & administration. Med Teach. 2013;35:e1447–e1463.
13. Hauer KE, Teherani A, Kerr KM, O’Sullivan PS, Irby DM. Impact of the United States Medical Licensing Examination Step 2 Clinical Skills exam on medical school clinical skills assessment. Acad Med. 2006;81:S13–S16.
14. Kogan JR, Hauer KE, Holmboe ES. The dissolution of the Step 2 Clinical Skills examination and the duty of medical educators to step up the effectiveness of clinical skills assessment. Acad Med. 2021;96:1242–1246.
15. Taylor ML, Blue AV, Mainous AG, Geesey ME, Basco WTJ. The relationship between the National Board of Medical Examiners’ prototype of the Step 2 Clinical Skills exam and interns’ performance. Acad Med. 2005;80:496–501.
16. United States Medical Licensing Examination. Work to relaunch USMLE Step 2 CS discontinued. https://www.usmle.org/work-relaunch-usmle-step-2-cs-discontinued. Accessed December 7, 2021.
17. Schell SF, Luke DA, Schooley MW, et al. Public health program capacity for sustainability: A new framework. Implement Sci. 2013;8:15.
18. Artino AR, La Rochelle JS, Dezee KJ, Gehlbach H. Developing questionnaires for educational research: AMEE guide no. 87. Med Teach. 2014;36:463–474.
19. Elo S, Kyngas H. The qualitative content analysis process. J Adv Nurs. 2008;62:107–115.
20. Grbic D, Hafferty FW, Hafferty PK. Medical school mission statements as reflections of institutional identity and educational purpose: A network text analysis. Acad Med. 2013;88:852–860.
21. Association of American Medical Colleges. Physician Competency Reference Set (PCRS). https://www.aamc.org/what-we-do/mission-areas/medical-education/curriculum-inventory/establish-your-ci/physician-competency-reference-set. Accessed December 28, 2022.
22. Lomis K, Amiel JM, Ryan MS, et al. Implementing an entrustable professional activities framework in undergraduate medical education: Early lessons from the AAMC core entrustable professional activities for entering residency pilot. Acad Med. 2017;92:765–770.
23. Skochelak SE, Stack SJ. Creating the medical schools of the future. Acad Med. 2017;92:16–19.
24. Klepper WM, Stodt MM. The benefits of consortium participation. New Directions Higher Educ. 1987;1987:87–93.
25. Lomis KD, Santen SA, Dekhtyar M, et al. The accelerating change in medical education consortium: Key drivers of transformative change. Acad Med. 2021;96:979–988.
26. Nevins AB, Boscardin CK, Kahn D, et al. A call to action from the California Consortium for the Assessment of Clinical Competence: Making the case for regional collaboration. Acad Med. 2022;97:1289–1294.
27. Boulet JR, De Champlain AF, McKinley DW. Setting defensible performance standards on OSCEs and standardized patient examinations. Med Teach. 2003;25:245–249.
28. Williams B, Song JJY. Are simulated patients effective in facilitating development of clinical competence for healthcare students? A scoping review. Adv Simul. 2016;1:6.
29. Richardson JTE. Instruments for obtaining student feedback: A review of the literature. Assess Eval Higher Educ. 2005;30:387–415.
30. Babbie ER. Survey Research Methods. Belmont, CA: Wadsworth Publishing Co; 1973.
31. Kidder LH, Judd CM, Smith ER. Research Methods in Social Relations. New York, NY: Holt, Rinehart and Winston; 1986.

Supplemental Digital Content

Copyright © 2023 by the Association of American Medical Colleges