In an era in which patient information is often gleaned from electronic medical records and imaging studies, the physical exam remains a critical tool.1,2 Several studies suggest that poor physical exam skills can lead to lower-quality care and medical errors.3,4
Teaching the physical exam is labor intensive, requiring significant human resources: standardized patients (SPs) and actual patients, as well as teachers including faculty and senior students. It can be difficult to find sufficient numbers of patients who are willing and able to allow students to practice the physical exam, and equally difficult to find the time and space for such practice. Although teaching medical students is rewarding for faculty, the competing demands of patient care, research, and administrative duties leave many physicians struggling to find time to teach.5,6 Today’s student who receives inadequate physical exam training becomes tomorrow’s physician who lacks expertise and confidence in physical examination, relies on unnecessary diagnostic testing, and downplays the value of the physical exam to future students.7
Concerns about the adequacy of physical exam education have been raised for decades,8–10 and up to 48% of clerkship directors report that they feel students are less prepared than necessary to perform interviewing and physical exam skills.11 Yet, little is known about how medical schools currently teach the physical exam. The aim of our study is to describe pedagogical approaches and resources used for teaching the physical exam to preclerkship medical students in Liaison Committee on Medical Education (LCME)-accredited U.S. medical schools. We present a summary of these findings here, focusing on resource utilization. Results regarding the pedagogical approaches, including assessment, will be presented elsewhere.
The cross-sectional survey was developed after a literature review by the Research Committee of the Directors of Clinical Skills Courses (DOCS). DOCS is the largest professional organization of preclerkship clinical skills educators at U.S. medical schools. Thirteen members (including T.U., F.I.A., A.D.B., M.B., J.M.F., D.G., J.H., R.K.O., R.S.) of the DOCS Research Committee used an iterative consensus-building process to develop the survey with an emphasis on capturing the diversity of approaches to physical exam education. The survey was piloted with a small group of clinical skills faculty from across the country that was not included in the final, analyzed sample. After incorporating feedback from the pilot, the survey was approved by the DOCS Executive Committee.
The survey has 49 items including questions about curricular design, instructional methods, assessment, and resource utilization, as well as demographic data. The complete survey is available as Supplemental Digital Appendix 1 at http://links.lww.com/ACADMED/A508, and the items addressed by this study are marked with an asterisk (21 questions). Survey items are predominantly multiple-choice with some items including a “choose all that apply” option, as well as several opportunities for free-text comments. Participants’ individual responses are confidential to the researchers, with data presented here only in aggregate. The Institutional Review Board at the University of Chicago Pritzker School of Medicine has determined that the survey is exempt from further review under federal regulations (45 CFR 46.101(b)).
Survey sampling and administration
Each LCME-accredited medical school has one faculty member identified as the institutional representative (IR) to DOCS. When applicable, the IR is the overall director of preclerkship clinical skills education for that medical school. For schools where there is not one director for preclerkship clinical skills education, the faculty decide in consultation with the DOCS Executive Committee who serves as the IR to DOCS.
In October 2015, we sent the survey to DOCS IRs from the 141 LCME-accredited medical schools, including schools with preliminary and provisional accreditation at the time. IRs were encouraged to consult with colleagues as needed to answer the survey questions accurately. We sent the survey in an electronic format via Survey Gizmo (SurveyGizmo.com), which allows respondents to save and return to the survey later. We then followed up with multiple e-mails and phone calls to the IRs. The survey was open between October 22, 2015, and February 8, 2016. Participants did not receive any incentives for completing the survey.
We used survey weights to refine the precision of responses and make inferences at the population level. Standard sampling techniques for constructing weights were applied in making adjustments.12,13 We used the survey response rate to create base weights, and poststratification weights to adjust for school size, hypothesizing that the instruction of and resources for physical exam education may vary between larger and smaller schools. We defined “large” schools as those with a class size of 150 or more students and “small” schools as those with a class size less than 150. Survey results were analyzed according to this variable. The resulting overall sampling error was ± 5%.
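The weighting scheme described above can be sketched as follows. The split of the 141-school population into large and small strata is assumed for illustration only, and the numbers are not the study's actual weights:

```python
# Hedged sketch of base + poststratification survey weighting.
# The population split between large and small schools is ASSUMED
# for illustration; only the totals (141 schools, 106 respondents)
# come from the study.
population = {"large": 70, "small": 71}   # assumed counts among 141 schools
respondents = {"large": 54, "small": 52}  # responding schools, by size

n_pop = sum(population.values())    # 141
n_resp = sum(respondents.values())  # 106

# Base weight: inverse of the overall response rate.
base_weight = n_pop / n_resp

# Poststratification: rescale within each stratum so weighted totals
# match the assumed population distribution; the per-school weight below
# folds the base weight and the poststratification factor together.
final_weight = {s: population[s] / respondents[s] for s in population}

# Weighted respondent totals now reproduce the population counts.
for s in population:
    assert abs(final_weight[s] * respondents[s] - population[s]) < 1e-9
```

In this scheme, responses from under-represented strata are weighted up so that population-level estimates are not dominated by whichever school size responded more readily.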
Survey weights were used to report sample distributions and were analyzed in aggregate form. We used sample-adjusted Pearson χ2 tests and t tests to test for differences in response patterns by school size. We conducted all analyses using the “svy” survey weighting procedure via Stata statistical software, version 14 (StataCorp, College Station, TX).
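As a rough illustration of the weighted comparison by school size, a Pearson chi-square statistic can be computed on a weighted contingency table. Note that Stata's svy procedures additionally apply a design correction (Rao-Scott) that this plain sketch omits, and the weighted counts below are hypothetical:

```python
def pearson_chi2(table):
    """Pearson chi-square statistic for a 2D contingency table."""
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    n = sum(row_totals)
    chi2 = 0.0
    for i, r in enumerate(table):
        for j, observed in enumerate(r):
            expected = row_totals[i] * col_totals[j] / n
            chi2 += (observed - expected) ** 2 / expected
    return chi2

# Weighted counts: each school's 0/1 response is multiplied by its
# survey weight before tabulating (numbers below are hypothetical).
weighted_table = [
    [48.0, 22.0],  # large schools: yes / no
    [46.0, 25.0],  # small schools: yes / no
]
stat = pearson_chi2(weighted_table)
```

The resulting statistic would then be compared against a chi-square distribution (with a design-based correction in a real svy analysis) to judge whether response patterns differ by school size.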
To calculate the mean and median number of preclerkship hours students spend practicing the physical exam with various subjects, we multiplied total preclerkship hours spent teaching the physical exam at each school by the percentage of time that the school reported students practice the exam with each type of subject.
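For example, the per-subject calculation works like the following sketch, using made-up school-level figures (total teaching hours and the fraction of practice time spent with SPs):

```python
# Sketch of the per-subject practice-hours calculation described above.
# Each record is (total preclerkship physical exam teaching hours,
# fraction of practice time with SPs); all values are hypothetical.
from statistics import mean, median

schools = [(82, 0.38), (59, 0.40), (120, 0.25), (30, 0.50)]

# Hours with SPs at each school = total hours x fraction with SPs.
sp_hours = [total * frac for total, frac in schools]

mean_sp_hours = mean(sp_hours)
median_sp_hours = median(sp_hours)
```

The same multiplication is repeated for each subject type (peers, inpatients, outpatients) to obtain the per-subject means and medians reported in the Results.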
Conventional content analysis was performed on the constructed-response portions of the survey,14 with themes identified directly from the text data without a priori hypotheses.
Of 141 schools, 106 completed the survey for a response rate of 75%. Of these, 54 (51%) were large schools and 52 (49%) were small schools.
The average number of hours spent teaching the physical exam in the preclerkship phase was 82 hours (SD 71). There was wide variation across schools, with 12 schools devoting 30 hours or fewer and 6 schools reporting 200 hours or more for physical exam instruction. The median number of hours spent teaching the physical exam was 59. The largest proportion of time teaching the physical exam was in a small-group classroom setting at 32% (SD 26), with lesser amounts of time spent in simulation centers (30%, SD 32), lectures (13%, SD 13), inpatient clinical settings (13%, SD 16), and outpatient clinical settings (9%, SD 13).
Subjects for physical exam practice
Table 1 shows the individuals with whom the physical exam was practiced. SPs were used most frequently (38% of the time, median 18 hours), followed by peer-to-peer practice (30%, median 20 hours), inpatients (13%, median 3 hours), and outpatients (10%, median 2 hours). Forty-eight percent of schools (n = 51) reported that their students spend less than 15% of physical exam practice time with actual patients (inpatients, outpatients, or emergency department patients).
A variety of teachers instructed students in the physical exam. Sixty-five percent (SD 25) of the instruction was by generalist faculty. Thirteen percent (SD 14) was by specialists teaching only the physical exam of their specialty, and 4% (SD 10) was by specialists teaching the entire exam. Five percent of physical exam teaching was by senior students without faculty present, and 12% was by SPs without faculty present. Because of rounding, these numbers do not add to 100%.
Excluding the genitourinary exams, SPs independently taught at least part of the physical exam at 31% of schools (n = 33). The breast exam was most frequently taught by SPs (22% of schools, n = 23), with 16% to 20% of schools (n = 17–21) using SPs to teach the various other nongenitourinary parts of the exam.
The genitourinary exams were most often taught by SPs (86% of schools, n = 92), either internally trained SPs (49%, n = 52) or externally contracted SPs (40%, n = 42). Mannequins and simulators were employed for genitourinary exam practice at 45% of schools (n = 48), while untrained patient volunteers were used at only 7% of schools (n = 7).
When a small-group classroom format was used for teaching the physical exam, the average size of student groups was 8 (SD 4, range 2–20). When students were assigned to an inpatient preceptor, on average there were 3 students per preceptor (SD 2, range 1–12), with a bimodal distribution: 37% of schools (n = 39) reported a group size of 2 students and 31% of schools (n = 33) reported a group size of 4 students. In the outpatient setting, on average 2 students (SD 1, range 1–5) were assigned to each preceptor.
Seventy-six percent of schools (n = 80) reported that faculty observe their students practicing with SPs at least part of the time. Seventy-six percent of schools (n = 80) also used direct observation of students at least part of the time during peer-to-peer practice. Students practicing with inpatients and outpatients were observed at least part of the time in 58% and 54% of schools (n = 61 and 57), respectively. Practice on mannequins and simulators was observed at least part of the time at 53% of schools (n = 56). At 19% of schools (n = 20), students were directly observed in all of these practice settings. Finally, although all schools included direct observation during some type of physical exam practice, 20% of schools (n = 21) did not have any faculty observation of students practicing the exam on either actual inpatients or outpatients.
Nearly all schools (98%, n = 103) required students to have a stethoscope. Other equipment, while often recommended, was required less often. For example, an otoscope/ophthalmoscope was required at 50% of schools (n = 53), and a sphygmomanometer was required at 40% of schools (n = 42). Eighty-nine percent of schools (n = 93) required textbook readings. A number of nontextbook resources were also used, including videos (76%, n = 80), simulators (64%, n = 67), ultrasonography (39%, n = 41), and virtual patients (11%, n = 12).
Table 2 summarizes compensation of clinical skills faculty and course directors. Only 6% of responding medical schools (n = 6) did not compensate their clinical skills course director(s). Forty-two percent of schools (n = 45) reported compensation in protected time only, 29% of schools (n = 31) reported monetary compensation only, and 22% of schools (n = 23) reported both protected time and monetary compensation. In contrast to course directors, fewer schools compensated their physical exam faculty. Specifically, small-group facilitators were uncompensated at 30% of schools (n = 32), inpatient clinical preceptors were uncompensated at 39% of schools (n = 41), and outpatient clinical preceptors and large-group lecturers were uncompensated at 48% of schools (n = 51). Of the 66% of schools (n = 70) that used senior students as teaching assistants, only 6% (n = 6) reported paying their student–teachers monetarily, with 59% (n = 63) compensating them with course credit, and 35% (n = 37) not compensating senior student–teachers at all.
Large versus small schools
All 54 large schools compensated their clinical skills course director(s) with either money or protected time, whereas 12% of the 52 small schools (n = 6) provided no compensation (P = .01). For all other items analyzed, there were no statistically significant differences between small and large schools.
Respondents were asked to list the two most successful and the two most challenging aspects of their physical exam curriculum. One hundred two respondents gave 178 comments on their most successful curricular aspects. Use of SPs was mentioned most often (40 comments), followed by the small-group format (24 comments) and dedicated faculty (22 comments), described by one respondent as those who “know students and hold them accountable for learning.”
One hundred respondents gave 159 comments regarding their greatest challenges. Difficulties included finding small-group and clinical faculty (34 comments), standardizing faculty teaching and the assessment of student physical exam skills (34 comments), and time constraints (25 comments).
With 75% of all LCME-accredited U.S. medical schools responding, we believe this is the largest reported survey of the resources used to teach the physical exam to preclerkship students. There is little empiric evidence about the optimal methods for teaching the physical exam, so our goal in this survey was to take the first step by documenting the current state of physical exam education. The survey results identify the wide spectrum of resources used in physical exam curricula, and some of the current challenges in physical exam education. There is significant variability not only in the number of hours devoted to teaching the physical exam but also in the use of SPs versus real patients, and in the compensation of instructors.
At first glance, the average total hours of preclerkship physical exam education appears robust (82 hours). A closer look, however, indicates that half of schools devote 59 hours or fewer to teaching the physical exam, with 11% of schools devoting no more than 30 hours. Students with more intensive physical exam training demonstrate improved exam skills,15,16 so these limited hours may be insufficient. Learning particularly complex skills like fundoscopy may be further compromised because 50% of schools do not require students to have their own ophthalmoscopes.17 Furthermore, studies have shown that beyond the preclerkship phase, students have fewer opportunities to hone their physical exam skills. In prior studies, 81% of clerkship students reported that they had never been observed by a faculty member doing a complete physical exam,18 and 36% to 60% of clerkship students reported that their residents spent no time teaching the physical examination.7,18 Given the relative lack of emphasis on the physical exam in the clerkship years, it is critical that students acquire strong physical exam skills in the preclerkship curriculum.
A large percentage of student time practicing physical exam skills was spent with subjects other than actual patients, with nearly three-quarters of practice time taking place with simulations and peers. The benefits of practicing skills in a standardized and controlled manner are well documented, especially in early learning.19,20 SPs can be masterful in observing technique and giving descriptive feedback on how the exam is perceived by the patient.21,22 Peer practice also has some advantages, including allowing students to experience the exam as subjects,23 but SPs and students typically lack abnormal physical findings. In nearly half of schools, less than 15% of physical exam practice was with actual patients, and across all schools, students spent a median of only three hours practicing with actual inpatients and two hours practicing with actual outpatients. With this limited amount of time, students may lack skill in examining ill and disabled patients and have difficulty identifying abnormal physical findings.24
Faculty are also a vital, but limited, resource in physical exam education. Direct observation by faculty is one of the most robust ways to teach clinical skills,25,26 but only about half of schools report that practice with actual patients was directly observed. This is a missed opportunity, as direct observation of physical exam skills has been associated with more self-reported confidence in the physical exam.27
Given the expense and limited availability of faculty, many medical schools have turned to other instructors for the physical exam. In nearly half of schools, SPs taught portions of the physical exam without faculty present. Students who learn the physical exam from SPs, however, may miss out on teaching contributions from faculty,22 including practical tips gleaned from years of clinical experience and pathophysiologic correlations.
Despite the labor-intensive role that clinical skills preceptors play in physical exam education, they were left uncompensated in one-third to one-half of schools. In addition, 12% of smaller schools did not even compensate their clinical skills course directors. Compensation with financial reimbursement or protected time is important in recruiting, but especially important in retaining, clinical educators.28,29
Senior students may represent one solution to the teaching needs of clinical skills courses and may mitigate demands on overburdened faculty. Our survey finds that 66% of schools used senior students to teach the physical exam, while a 2008 survey of 130 U.S. medical schools found that 45% of schools used students in this role.30 Additional benefits of having senior students teach physical exam skills include their clear identification with the students, their familiarity with the curriculum, mentorship opportunities, and potential cost savings. Senior students themselves likely benefit from teaching as they hone their physical exam skills shortly before residency.
There are several limitations to this study, and other important questions remain to be answered. Recall bias regarding the specific numbers and breakdown of hours may affect the interpretation of our findings, and the unique environment of each medical school may limit their generalizability. In addition, items of interest that we did not address include the number of hours preceptors and course directors spend per year in their respective roles and the precise compensation they receive. Increased transparency about resources may improve parity among schools in compensation for teaching.
The major strengths of this study are the high response rate of 75% and small sampling error of 5%, which make it likely that the results are representative of medical schools across the country. Furthermore, because we surveyed our organization’s members, we were largely able to ensure that the person most knowledgeable about physical exam education at his or her medical school completed the survey. Additionally, 102 IRs gave a total of 365 comments to the three free-text questions, which attests to the seriousness with which they approached the survey. A final strength is the internal consistency of the survey results. For example, one survey question found that schools spend, on average, 22% of their time teaching the physical exam in inpatient and outpatient settings, while another question found that students spend, on average, 23% of their time examining actual inpatients and outpatients. Given these strengths, we believe this survey provides an accurate snapshot of the current state of preclerkship physical exam education in the United States.
This study shows that the teaching of the physical exam to preclerkship students is a resource-intensive endeavor. Across U.S. medical schools there is large variation not only in how many hours are spent teaching this essential skill but also in the degree to which actual patients are used and the amount of faculty observation employed. It is possible that some schools do not allocate sufficient resources to this portion of the curriculum and that some medical students enter clerkships with insufficient physical exam training and skills.11
We believe the solution is multifaceted and includes improving direct observation of students by increasing faculty teaching time and compensating faculty adequately; creatively using senior students as instructors; finding new and better ways of accessing real patients willing to be examined by students; and sharing innovative ways schools and hospitals have found to ensure that students’ physical exam skills continue to grow.31 Without strategic investments such as these, the physical exam skills of U.S. medical students may be at risk, to the detriment of their future patients.
The authors wish to thank Maylene Liang, MPH, for her expert assistance in managing the online survey and data cleaning. The authors also wish to thank the members of the Directors of Clinical Skills Courses (DOCS) who devoted time to completing the survey.
2. Feddock CA. The lost art of clinical skills. Am J Med. 2007;120:374–378.
3. Reilly BM. Physical examination in the care of medical inpatients: An observational study. Lancet. 2003;362:1100–1105.
4. Verghese A, Charlton B, Kassirer JP, Ramsey M, Ioannidis JP. Inadequacies of physical examination as a cause of medical errors and adverse events: A collection of vignettes. Am J Med. 2015;128:1322–1324.e3.
5. Ryan MS, Vanderbilt AA, Lewis TW, Madden MA. Benefits and barriers among volunteer teaching faculty: Comparison between those who precept and those who do not in the core pediatrics clerkship. Med Educ Online. 2013;18:1–7.
6. Christner JG, Dallaghan GB, Briscoe G, et al. The community preceptor crisis: Recruiting and retaining community-based faculty to teach medical students—A shared perspective from the Alliance for Clinical Education. Teach Learn Med. 2016;28:329–336.
7. Smith MA, Gertler T, Freeman K. Medical students’ perceptions of their housestaffs’ ability to teach physical examination skills. Acad Med. 2003;78:80–83.
8. Wiener S, Nathanson M. Physical examination. Frequently observed errors. JAMA. 1976;236:852–855.
9. Mangione S, Nieman LZ. Cardiac auscultatory skills of internal medicine and family practice trainees. A comparison of diagnostic proficiency. JAMA. 1997;278:717–722.
10. Vukanovic-Criley JM, Criley S, Warde CM, et al. Competency in cardiac examination skills in medical students, trainees, physicians, and faculty: A multicenter study. Arch Intern Med. 2006;166:610–616.
11. Windish DM, Paulman PM, Goroll AH, Bass EB. Do clerkship directors think medical students are prepared for the clerkship years? Acad Med. 2004;79:56–61.
12. Lohr SL. Sampling: Design and Analysis. Boston, MA: Brooks Cole Publishing; 2010.
13. Valliant R, Dever JA, Kreuter F. Practical Tools for Designing and Weighting Survey Samples. New York, NY: Springer; 2013.
14. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15:1277–1288.
15. Smith MA, Burton WB, Mackay M. Development, impact, and measurement of enhanced physical diagnosis skills. Adv Health Sci Educ Theory Pract. 2009;14:547–556.
16. Roberts L, Lu WH, Go RA, Daroowalla F. Effect of bedside physical diagnosis training on third-year medical students’ physical exam skills. Teach Learn Med. 2014;26:81–85.
17. Gilmour G, McKivigan J. Evaluating medical students’ proficiency with a handheld ophthalmoscope: A pilot study. Adv Med Educ Pract. 2017;8:33–36.
18. Howley LD, Wilson WG. Direct observation of students during clerkship rotations: A multiyear descriptive study. Acad Med. 2004;79:276–280.
19. Cleland JA, Abe K, Rethans JJ. The use of simulated patients in medical education: AMEE guide no 42. Med Teach. 2009;31:477–486.
20. Giesbrecht EM, Wener PF, Pereira GM. A mixed methods study of student perceptions of using standardized patients for learning and evaluation. Adv Med Educ Pract. 2014;5:241–255.
21. Barley GE, Fisher J, Dwinnell B, White K. Teaching foundational physical exam skills: Study results comparing lay teaching associates and physician instructors. Acad Med. 2009;81(10 suppl):S95–S97.
22. Allen SS, Miller J, Ratner E, Santilli J. The educational and financial impact of using patient educators to teach introductory physical exam skills. Med Teach. 2011;33:911–918.
23. Wearn A, Bhoopatkar H. Evaluation of consent for peer physical examination: Students reflect on their clinical skills learning experience. Med Educ. 2006;40:957–964.
24. Jayakumar N. Bedside teaching with unwell patients: Can it ever be appropriate? Med Teach. 2017;39:323–324.
25. Hauer KE, Holmboe ES, Kogan JR. Twelve tips for implementing tools for direct observation of medical trainees’ clinical skills during patient encounters. Med Teach. 2011;33:27–33.
26. Fromme HB, Karani R, Downing SM. Direct observation in medical education: A review of the literature and evidence for validity. Mt Sinai J Med. 2009;76:365–371.
27. Chen W, Liao SC, Tsai CH, Huang CC, Lin CC, Tsai CH. Clinical skills in final-year medical students: The relationship between self-reported confidence and direct observation by faculty or residents. Ann Acad Med Singapore. 2008;37:3–8.
28. Peters AS, Schnaidt KN, Zivin K, Rifas-Shiman SL, Katz HP. How important is money as a reward for teaching? Acad Med. 2009;84:42–46.
29. Kumar A, Kallen DJ, Mathew T. Volunteer faculty: What rewards or incentives do they prefer? Teach Learn Med. 2002;14:119–123.
30. Soriano RP, Blatt B, Coplit L, et al. Teaching medical students how to teach: A national survey of students-as-teachers programs in U.S. medical schools. Acad Med. 2010;85:1725–1731.
31. Anderson RC, Fagan MJ, Sebastian J. Teaching students the art and science of physical diagnosis. Am J Med. 2001;110:419–423.