Purpose: Clerkship directors’ practices regarding the National Board of Medical Examiners (NBME) subject exam in medicine are important in enhancing educational evaluation policy. The study’s purpose was to determine clerkship directors’ use of the subject exam in medicine and related learning activities in the context of curricula and outcomes of the directors’ internal medicine clerkships.
Method: The authors conducted a survey of directors of internal medicine clerkships in 2007. They performed descriptive statistical and multivariate analyses on all responses.
Results: Of 110 clerkship directors, 82 responded to the survey, for an overall response rate of 75%. Eighty-eight percent of the clerkship directors required the NBME subject examination in medicine. The mean minimum passing score was 62 (SD = 4.2); this score was not adjusted throughout the academic year, and it contributed 20% to 25% of the final grade. Most (89%) clerkships allowed students a retake after a failed first attempt. Most clerkship directors prepared students for the NBME subject exam in their programs through some combination of lectures, independent self-study, and review sessions with exam-preparation review books. However, 42% of clerkship directors lacked a specific strategy for a retake after a failure.
Conclusion: Clerkship directors’ use of the NBME subject exam in medicine is high. Most allow a retake after a first failure, and a combination of strategies is currently provided to help students prepare. A need exists to develop remediation plans for students who fail the exam. This report may serve as a reference for curricular and programmatic clerkship decisions.
Dr. Torre is associate professor of medicine, Medical College of Wisconsin, Milwaukee, Wisconsin.
Dr. Papp is senior research associate, Case Western Reserve University School of Medicine, Cleveland, Ohio.
Dr. Elnicki is professor of medicine and director, Section of General Internal Medicine at Shadyside Hospital, University of Pittsburgh, Pittsburgh, Pennsylvania.
Dr. Durning is associate professor of medicine, Uniformed Services University of the Health Sciences F. Edward Hebert School of Medicine, Bethesda, Maryland.
Correspondence should be addressed to Dr. Torre, Department of Medicine, Medical College of Wisconsin, Froedtert Cancer Care Center, 9200 West Wisconsin Avenue, Fifth Floor, Milwaukee, WI 53226; telephone: (414) 805-0850; fax: (414) 805-0535; e-mail: (email@example.com).
The National Board of Medical Examiners (NBME) subject exam in medicine is a highly reliable exam that can provide a measure of the knowledge that students acquire during an internal medicine (IM) clerkship and that provides national, norm-referenced data.1 Scores that are reported to schools are standardized to a national mean of 70 (SD = 8). The subject exam in medicine has been shown to identify medical students who are at risk of poor performance on the United States Medical Licensing Exam (USMLE) Step 2, Clinical Knowledge.2 Therefore, it is important for educators and students to gather information about utilization and learning activities related to the subject exam in medicine that may eventually serve to develop and enhance students’ preparation for future licensing exams. Because USMLE Step 2 scores have also been used by many residency programs in the selection of residency candidates,3 the use by clerkships of the NBME subject exam in medicine may have implications for decisions about IM clerkship curricula as well as for students’ future residency applications.
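The standardization described above (scores reported on a scale with a national mean of 70 and SD of 8) can be pictured as a simple linear rescaling. The sketch below is illustrative only: the raw mean and SD are hypothetical, and the NBME's operational score equating is more elaborate than a single linear transformation.

```python
def scale_score(raw_score, raw_mean, raw_sd, target_mean=70.0, target_sd=8.0):
    """Rescale a raw score onto a reporting scale with the given mean and SD.

    Illustrative sketch only: the NBME's actual equating procedure is more
    elaborate than this single linear transformation.
    """
    z = (raw_score - raw_mean) / raw_sd   # standardize to a z-score
    return target_mean + target_sd * z    # map onto the reporting scale

# With a hypothetical raw mean of 50 (SD 10): a raw score one SD above
# the mean lands at 78 on the reported scale; the raw mean lands at 70.
print(scale_score(60, 50, 10))  # 78.0
print(scale_score(50, 50, 10))  # 70.0
```

On this scale, the mean minimum passing score of 62 reported later in this study sits exactly 1 SD below the national mean of 70.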
Previous research indicated that the NBME subject exam in medicine has been used to measure both the effects of a new program such as the implementation of an ambulatory rotation4 and the effect of an intensive 12-week conference series.5 The NBME subject exam in medicine also has been used to compare students’ knowledge-based performance during ambulatory and inpatient rotations.6 Furthermore, important collaborative efforts have been implemented to align the curricular objectives of the IM clerkship with the content of the subject exam.7 Despite the importance of assessing the exam’s current use, it has been almost a decade since the 1999 survey by Hemmer and colleagues8 that examined clerkship directors’ practices with respect to the NBME subject exam in medicine. The present study reexamines this issue from a current point of view and investigates whether remediation was provided after a second exam failure and whether minimum passing scores rose with students’ training as the academic year progressed.
Thus, the purpose of this study was to identify IM clerkship directors’ practices with respect to the subject exam in medicine within the context of the curricular activities and outcomes of the IM clerkship. In addition, this study explored the learning activities that were provided to prepare medical students for the first or second try at the examination. Finally, it examined whether there was a statistically significant relationship between school size, the clerkship directors’ demographics or academic role, and the opportunity to retake this exam.
Survey design and development
In April 2007, we conducted a survey of the institutional members of the Clerkship Directors in Internal Medicine (CDIM). The survey was confidential and was administered in both electronic and paper formats.
After a review of the literature on the subject exam in medicine in an IM clerkship,1,2,5,8–11 the CDIM Research Committee (which currently includes authors D.T. and K.P. and previously included S.D.) developed a series of questions and included them in the annual CDIM survey for 2007. The researchers developed questions that focused on the following constructs: the use in IM clerkships of the subject exam in medicine, the learning activities implemented to prepare students for the exam, the effect of the exam on clerkship outcomes, and the presence (and nature) or absence of remediation plans for students who failed the exam. The IRB at the Uniformed Services University of Health Sciences F. Edward Hebert School of Medicine approved this study.
The final survey contained 6 demographic questions and 23 questions about the NBME subject exam in medicine. The demographic questions asked about the clerkship director’s age, sex, academic rank, current role, and the number of years as a director, as well as the size of the medical school’s matriculating class. Many of the 23 questions on the exam were closed-ended questions that used a combination of categorical and continuous scales.
In the next section of the survey, we asked whether students were required to take the NBME subject exam in medicine as part of their medicine rotation. If so, we then asked whether a knowledge-based pretest was administered at the beginning of the clerkship. We also asked what learning activities were implemented by clerkship faculty to prepare students for the subject exam (whether first attempt or retake). The response format required selecting among such learning activities as lectures that cover topics frequently encountered on the exam, live board review sessions using the Medical Knowledge Self-Assessment Program for Students 3 or similar texts, self-directed Web-based tutorials, independent self-study of test material recommended by clerkship directors, and other options. Clerkship directors could select more than one answer.
Next, we asked clerkship directors to specify the percentage of the students’ final clerkship grade that was contributed by the exam result. We also (1) asked whether students had the option of retaking the test if they failed the first attempt, provided that clerkship performance in other areas (e.g., clinical evaluations or OSCE) was satisfactory, (2) asked the clerkship directors to specify the earliest date after the failed test on which students were allowed to retake the test and the latest date by which students had to retake the test (the questionnaire referred to these dates as “minimum time” and “maximum time”), and (3) asked the clerkship directors whether passing the retake exam would affect a student’s final clerkship grade. Next, we posed a series of yes/no questions about any remediation plans that were in place for students who failed the exam on the first try. The options included allowing the student to take the exam again, requiring the student to take the clerkship again, providing the student with counseling, giving the student extra assignments, and giving the student a passing grade if he or she had satisfactorily completed all other components of the clerkship. The answers were not mutually exclusive, and more than one answer could be checked. We asked the clerkship directors to specify the highest final clerkship grade that a student could achieve if he or she failed the test on the first attempt or if he or she also failed the retake (the grading scale included honors, high pass, pass, low pass, fail, and other).
The CDIM Research Committee reviewed the survey for content validity, overall design, and usability; the survey was subsequently approved by the CDIM Council. Feedback from the expert panel, which included all members of the CDIM Research Committee and members of the CDIM Council, was incorporated into the questionnaire, which was then piloted among 14 CDIM Research Committee members. We analyzed the pilot results for nonresponses, missing data, and comments by respondents, and those analyses led to additional revisions.
In April 2007, CDIM conducted its annual, voluntary, confidential survey of its 110 institutional members in the United States and Canada (of the 126 schools that are eligible by virtue of their membership in the Association of American Medical Colleges, 16 do not have a CDIM member). The survey was sent to participants via e-mail and included a link to a Web site where the survey could be completed. After distribution of the survey, we made up to four attempts to contact nonresponders by e-mail, postal mail, or telephone.
Each participating school has only one CDIM institutional member, to whom the survey is sent. We used current e-mail and postal mail addresses, which CDIM updates yearly, to contact respondents.
We performed descriptive statistical analyses on all responses to assess the practices of clerkship directors with respect to the use of the NBME exam. We calculated a point-biserial correlation coefficient to assess the correlation between school size (a continuous variable) and the opportunity to retake the subject exam in medicine (a dichotomous variable). We also performed a logistic regression analysis to assess whether school size, the clerkship director’s age, the number of years he or she had been a director, and his or her academic rank were statistically significant predictors of a student’s opportunity to retake the exam (the dependent dichotomous variable) after a failed first attempt. We conducted the descriptive, correlational, and multivariate analyses using a standard statistical software program (SPSS version 12; SPSS Inc., Chicago, Illinois).
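As a sketch of the first of these analyses: the point-biserial correlation is simply the Pearson correlation computed between a continuous variable and a 0/1 indicator. The example below uses invented data (the study's raw data are not reproduced here), with hypothetical school sizes and retake indicators standing in for the real survey responses.

```python
import math

def point_biserial(sizes, retake_allowed):
    """Point-biserial correlation between a continuous variable (school
    size) and a dichotomous one (retake allowed: 1 = yes, 0 = no).
    Equivalent to the Pearson correlation against a 0/1 variable."""
    n = len(sizes)
    group1 = [s for s, r in zip(sizes, retake_allowed) if r == 1]
    group0 = [s for s, r in zip(sizes, retake_allowed) if r == 0]
    m1 = sum(group1) / len(group1)          # mean size where retake allowed
    m0 = sum(group0) / len(group0)          # mean size where not allowed
    mean = sum(sizes) / n
    sd = math.sqrt(sum((s - mean) ** 2 for s in sizes) / n)  # population SD
    p, q = len(group1) / n, len(group0) / n  # group proportions
    return (m1 - m0) / sd * math.sqrt(p * q)

# Hypothetical illustration only (not the study's data).
sizes = [90, 110, 120, 140, 150, 160, 180, 200]
retake = [0, 1, 1, 0, 1, 1, 1, 1]
r = point_biserial(sizes, retake)
```

In practice this would be computed with a statistics package (as the authors did with SPSS) rather than by hand; the sketch shows only what the coefficient measures.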
Eighty-two clerkship directors responded to the survey, for an overall survey response rate of 75%. Most respondents (n = 50; 61%) were male. Respondents had been in the role of clerkship director a mean of 8.0 years (SD = 4.8), and most were at the rank of associate (n = 35; 43%) or assistant (n = 25; 30%) professor. The mean number of students who matriculated at the surveyed institutions was 145 (SD = 48).
The NBME subject exam in medicine was required by 72 (88%) of the responding clerkships. The great majority of IM clerkships (n = 77; 93%) did not administer a standardized multiple-choice knowledge pretest at the beginning of the clerkship, and none of the clerkships that used such a test included the students’ resulting scores in their final clerkship grades. The result of the NBME subject exam in medicine was part of the final clerkship grade in 71 (99%) of the clerkships that used the exam.
In 26 (36%) of the clerkships, the NBME exam result accounted for 25% of the final grade, and, in 17 (25%) of the clerkships, it accounted for 20% of the clerkship grade. Twelve clerkships (16%) weighted the exam result between 30% and 35% of the clerkship grade, and 10 clerkships (13%) weighted it between 10% and 15%. In the remaining clerkships, the percentage of a student’s final grade accounted for by the NBME exam result ranged from a minimum of 18% (n = 1) to 40% (n = 4) and 50% (n = 1).
Sixty-three (88%) of the clerkship directors indicated that their programs did have a minimum score for passing the NBME exam. The mean minimum score for passing the exam was 62 (SD = 4.2). The minimum passing score was between 60 and 65 for 58 (81%) clerkships, between 55 and 59 for 9 (12%) clerkships, and between 66 and 70 for 5 (7%) clerkships.
Most (68%) of the IM clerkships did not adjust the minimum passing score throughout the academic year to reflect students’ increasing clinical experience and knowledge. We asked the clerkship directors to specify the highest clerkship final grade that a student could achieve if he or she failed the subject exam at the first attempt but passed the retake. Most (62%) of the clerkship directors reported that the highest grade given after passing the retake was a pass, with other clerkship directors reporting highest grades of high pass (18%), honors (8%), and low pass (6%).
Most clerkship directors indicated that, to prepare students for the first attempt at the NBME subject exam in medicine, they used a combination of lectures and independent self-study of material (34%) or a combination of lectures, independent self-study, and board review sessions (31%) (Table 1). Only 11% of clerkship directors reported that their clerkship did not have a specific curriculum, whether self-directed or faculty-based, to help students prepare for the exam.
If students failed the NBME exam at the end of the clerkship, 63 (89%) of clerkships allowed them to retake the examination. When we asked the clerkship directors to describe how they facilitated students’ preparation to retake the exam, the most frequently used learning activity reported (42%) was that of self-study: students were directed to study board review multiple-choice-question practice books or other similar texts. However, 42% of clerkship directors reported that, in contrast with the learning opportunities made available to students taking the exam for the first time, their clerkship did not have a specific teaching strategy or curriculum to help students prepare to retake the subject exam in medicine (Table 2). In fact, if a student failed the retake, 42% of clerkship directors indicated that those students would have to retake the clerkship, and 30% reported no formal remediation plan for those who also failed the retake exam (Table 3).
Of the directors of the 89% of clerkships that allowed a retake, 51% reported that their clerkship formally specified how soon after the failed test a student was allowed to retake the test and the latest date by which a student had to retake the test. The mean earliest time at which the exam could be retaken was 6.5 (SD = 4.4) weeks after the failed exam, and the median was 6 weeks. The mean latest time by which the exam had to be retaken was 19 (SD = 22) weeks after the failed exam, and the median was 10 weeks.
There was no statistically significant correlation between school size and the opportunity to retake the exam (P > .05). Neither the clerkship directors’ characteristics (age, academic rank, or sex) nor school size predicted whether students would have the opportunity to retake the exam after one failed attempt (P > .05).
The NBME subject exam in medicine was used by a preponderance of the clerkships, and, for most of these clerkships, the subject exam accounted for 20% to 25% of the final grade. Most clerkship directors did not use a pretest to assess knowledge at the beginning of the clerkship. Most clerkship directors reported that a mean minimum score of 62 was required to pass the exam. Such a score is only 1 SD below the mean for the national reference group, and it represents a fairly high bar for students in completing the IM clerkship. For students who failed the NBME subject exam in medicine, most clerkship directors allowed at least one retake of the examination. A combination of lectures and self-study, in which students were directed to specific exam study material, was the most frequently used learning activity to help students prepare for their first attempt at the exam. In contrast, 42% of clerkships did not have specific plans in their curriculum to prepare students for a retake of the exam.
Despite evidence that the administration of a pretest at the beginning of the clerkship may help to identify students with knowledge deficiencies,9,10,12 the vast majority of clerkships did not administer such a knowledge test. This decision may result from the fact that the development of a valid and reliable test is a labor- and time-intensive task for faculty and clerkship personnel. Nonetheless, a combination of evaluation strategies, including just such a written test at the beginning of the clerkship, may be helpful in identifying students with marginal funds of knowledge.13,14
Hemmer and colleagues have reported that the proportion of IM clerkships that used the subject exam increased from the 66% reported by Magarian and Mazur in 19901 to 83% in 19998 and 85% in 2005.15 The current report confirms that the percentage of IM clerkships using the NBME subject exam has been slowly but steadily increasing during the past 10 years (now 88%). Moreover, the proportion of clerkships allowing students a retake after one failed attempt increased from 63% in 19998 to 89% in 2007. Whether this increase is related to the recognized greater level of difficulty of the current exam or to the more frequent incorporation of the exam score into the final grade remains to be determined. The current study also showed that most of the clerkships used a combination of teaching methods to prepare students for the exam. An educational approach that uses different learning strategies—whether traditional lectures, self-study guided by a clerkship faculty member (whether in the setting of problem-based learning or not), or Web-based instructional material—may be more likely to meet the needs of different learners and may be an appropriate and effective preparation strategy for such important exams as the USMLE Step 1 and Step 2.16,17 However, we do not know which approach is most effective, and this question should be explored in future studies.
It is interesting that most of the clerkships did not change their passing scores throughout the academic year, even though there is evidence that students in the second half of the academic year achieve better test performances than do students in the first half.18 National mean scores provided by the NBME for first-time test takers (by quarter of the previous year) also show an increase throughout the four quarters of the academic year.19 Nevertheless, the NBME does not recommend that clerkships require any specific passing score, nor does the board indicate how the exam should be weighted in the final clerkship grade. The reasons that most of the clerkship directors do not increase the passing score throughout the year are difficult to ascertain. One hypothesis is that increasing the passing score might penalize students who take the IM clerkship later in the year. Another hypothesis is that, if other clerkships within the same institution were to adopt a different scoring policy in reference to the subject exam, such as adjusting the score, students’ overall grading may be biased, and such action may be perceived by students as unfair.
Most of the clerkship directors reported that a retake of the test was allowed. Research has shown that students who undertook a period of self-study in preparation for a retake of the subject exam in medicine had a higher rate of passing the exam than did students with no self-study preparation. Thus, a self-study strategy may constitute a reasonable didactic approach.10 Constance and colleagues9 showed that a four-week, interactive, structured program using a problem-based learning approach reduced individual students’ deficits in the IM knowledge base and increased the students’ passing rates on the NBME exam retakes. Similarly, Magarian4 showed that a faculty-led conference series (mainly lectures but also some problem-based conferences) that covered core topics of IM as selected by faculty members and lasted an average of 12 hours per week for three months enhanced students’ performance on the NBME exam. However, the development and conduct of such programs requires a great amount of faculty time.20 The current study suggests that a combination of lectures and self-directed study was the learning activity most often used to prepare students for the exam. Such frequent use of independent self-study, as reported in our survey, may suggest an attempt by clerkship directors to foster self-directed learning activities, in an effort to encourage students to take control of their own learning, coupled with the need to save faculty time. Some clerkship directors reported the use of electronic learning resources, and there is evidence that the use of electronic rather than print learning resources has no effect on NBME exam scores in an IM clerkship.21
A number of limitations are evident in this study. First, we did not ask whether clerkship directors modified test scores for the retake exam by increasing or decreasing the cutoff score for passing, although we did ask about the effect of the retake exam score on the final clerkship grade. Second, we did not query clerkship directors about the content and implementation details of the learning activities conducted by clerkship faculty (such as how often the activities were conducted and how much time was devoted to each). Third, this survey does not report clerkship directors’ views on the use and effectiveness of local faculty-developed or externally developed end-of-clerkship exams in clerkships that do not use the NBME subject exam. Although such research was beyond the scope of the present survey, it would be important in the future to gather data about such end-of-clerkship exams, including information about their validity and reliability. Fourth, we asked clerkship directors to select preparatory learning activities from a predetermined list. All clerkship directors selected at least one learning activity from the list, and even though we provided an open-ended option (“other”), none entered any free text in that option. Fifth, we did not ask why clerkship directors used the subject exam in medicine rather than other written tests. However, it is possible that the high validity and reliability of the exam, coupled with the opportunity of saving faculty members the time that would be needed for test development, may play a role in clerkship directors’ decisions to use this exam. Sixth, the findings of this survey about IM clerkships may not be generalizable to other clerkship disciplines, but they point to the need for future studies in those disciplines. Given the fairly high response rate to the survey, nonresponse bias should not be significant.
Future research should focus on gathering information about the structure and perceived effectiveness of clerkship learning activities provided to students for exam preparation. It is crucial to obtain information about the features of existing clerkship remediation plans, which may serve to inform the development of effective and feasible educational programs for those students who fail the exam twice. In addition, it would be important to design intervention studies that implement and evaluate different remediation programs for those students who fail the exam.
1 Magarian GJ, Mazur DJ. Evaluation of students in medicine clerkships. Acad Med. 1990;65:341–345.
2 Ripkey DR, Case SM, Swanson DB. Identifying students at risk for poor performance on the USMLE Step 2. Acad Med. 1999;74(suppl):S45–S48.
3 Pangaro L, Gibson K, Russell W, Lucas C, Marple R. A prospective, randomized trial of a six-week ambulatory medicine rotation. Acad Med. 1995;70:537–541.
4 Magarian GJ. Influence of a medicine clerkship conference series on students’ acquisition of knowledge. Acad Med. 1993;68:923–926.
5 Berber ES, Brooks CM, Erdmann JB. Use of USMLE to select residents. Acad Med. 1993;68:753–759.
6 Fincher RE, Case SM, Ripkey DR, Swanson DB. Comparison of ambulatory knowledge of third-year students who learned in ambulatory settings with that of students who learned in inpatient settings. Acad Med. 1997;72(10 suppl 1):S130–S132.
7 Elnicki DM, Lescisin DA, Case S. Improving the National Board of Medical Examiners Internal Medicine Subject Exam for use in clerkship evaluation. J Gen Intern Med. 2002;17:435–440.
8 Hemmer PA, Szauter K, Allbritton TA, Elnicki DM. Internal medicine clerkship directors’ use of and opinions about clerkship examinations. Teach Learn Med. 2002;14:229–235.
9 Constance E, Dawson B, Steward D, Schrage J, Schermerhorn G. Coaching students who fail and identifying students at risk for failing the National Board of Medical Examiners medicine subject test. Acad Med. 1994;69(10 suppl):S69–S71.
10 Hemmer PA, Pangaro LN. Natural history of knowledge deficiencies following clerkships. Acad Med. 2002;77:350–353.
11 Rockney RM, Allister RG. Dropping the shelf examination: Does it affect student performance on the United States Medical Licensure Examination Step 2? Ambul Pediatr. 2005;5:240–243.
12 Denton GD, Durning SJ, Wimmer AP, Pangaro LN, Hemmer PA. Is a faculty developed pretest equivalent to pre-third year GPA or USMLE step 1 as a predictor of third-year internal medicine clerkship outcomes? Teach Learn Med. 2004;16:329–332.
13 Hemmer PA, Grau T, Pangaro LN. Assessing the effectiveness of combining evaluation methods for the early identification of students with inadequate knowledge during a clerkship. Med Teach. 2001;23:580–584.
14 Hemmer PA, Pangaro L. The effectiveness of formal evaluation sessions during clinical clerkships in better identifying students with marginal funds of knowledge. Acad Med. 1997;72:641–643.
15 Hemmer PA, Papp KK, Mechaber AJ, Durning SJ. Evaluation, grading, and use of the RIME vocabulary on internal medicine clerkships: Results of a national survey and comparison to other clinical clerkships. Teach Learn Med. 2008;20:118–126.
16 Enarson C, Cariaga-Lo L. Influence of curriculum type on student performance in the United States Medical Licensing Examination Step 1 and Step 2 exams: Problem-based learning vs. lecture-based curriculum. Med Educ. 2001;35:1050–1055.
17 Blake RL, Hosokawa MC, Riley SL. Student performances on Step 1 and Step 2 of the United States Medical Licensing Examination following implementation of a problem-based learning curriculum. Acad Med. 2000;75:66–70.
18 Magarian GJ, Mazur DJ. Does performance on the NBME Part II medicine examination when used as a clerkship examination reflect knowledge acquired during the medicine clerkship? J Gen Intern Med. 1991;6:145–149.
19 National Board of Medical Examiners. ________. Examiner. Winter 1999;46:3.
20 Barzansky B, Jonas HS, Etzel SI. Educational programs in US medical schools, 1997-1998. JAMA. 1998;280:803–808.
21 DeZee K, Durning S, Denton GD. Effect of electronic versus print format and different reading resources on knowledge acquisition in the third-year medicine clerkship. Teach Learn Med. 2005;17:349–354.
© 2009 Association of American Medical Colleges