The structure of traditional internal medicine training has been criticized for taking too long, lacking certain content, and not allowing trainees to focus on their specific interests.1 To address these issues, the Alliance for Academic Internal Medicine (AAIM) has created a task force to make recommendations for redesigning graduate medical education,2 and educational leaders in internal medicine have proposed better aligning graduate education with professional needs and making the field more attractive to medical students.3–6
One possible redesign would be reducing the length of training. Several pilot studies have shown that accelerated programs in internal medicine and family medicine can be successful (at least for students who excel academically), but most of these studies involve shortening medical school training—not residency training.7–9 Proposals for redesigning graduate education typically retain the three-year requirement while allowing some customization in the last year.
Recognizing that residents achieve competency at different rates, the AAIM task force has recommended implementing a framework, first set out by the Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties, in which residents are evaluated on the basis of six general competencies independent of completing a three-year training program.10 However, the task force was still reluctant to allow the third year to count as “other” training. Debate in the internal medicine community continues about whether to allow trainees to shorten their general internal medicine training in order to begin subspecialty training in the third year.6 Will using the third year of training for something other than the “core” of internal medicine negatively affect physicians’ clinical judgment when caring for patients?
To address some of the internal medicine community’s concerns, we decided to study a cohort of internal medicine physicians whose clinical training is, in fact, “short-tracked”: physician–scientists. The medical community recognizes the vital role of clinical research.11,12 Outcomes and population-based research, clinical trials, and translational research13,14 are necessary for advancing medical knowledge, improving patient care, preventing disease, and defining best practices. But it can be difficult to sustain a grant-supported research career.15,16 In 1985, to encourage trainees who intend to pursue careers largely devoted to clinical research, the American Board of Internal Medicine (ABIM) established an integrated program, now called the “Research Pathway” (RP). The program, which combines training in clinical care and clinical research, lasts five years for general internists and either six or seven years for subspecialists.17 Whereas trainees in the traditional pathway (TP) must complete 36 months of accredited categorical internal medicine training with a minimum of 24 months of direct patient care, RP trainees complete 24 months of accredited categorical internal medicine training with a minimum of 20 months of direct patient care. RP trainees may sit for the internal medicine certification exam after four years (i.e., after completing 24 months of clinical training and 24 months of research). To determine how “short-tracked” internal medicine training affects physicians’ competency, we compared the ABIM certification performance of those who followed the RP with that of those who completed traditional training. To determine whether research training was sustained throughout physicians’ careers, we examined RP physicians’ rates of maintaining an internal medicine certificate 10 years later and the characteristics of their current careers.
We looked at a national sample of 101,031 physicians who completed internal medicine training after 1992 and took the internal medicine certification examination for the first time in the years 1993 through 2008. Of those, 1,009 (1%) were RP trainees, and the rest were trainees who followed the traditional three-year pathway. To compare the groups, we used the scores from their first attempts to pass the ABIM certification exam and their eventual initial certification status as dependent measures in two separate analyses.
The certification exam assesses cognitive skills in the broad domain of internal medicine, including diagnosis and treatment of common and rare conditions (www.abim.org/exam/cert/im.aspx#content).18 Passing the exam is necessary for certification in internal medicine and is a prerequisite to certification in subspecialty areas of internal medicine. Exam reliability coefficients are high and range from 0.90 to 0.92. An absolute standard is used to set the passing score for certification. To compare performance across exam forms, we used standardized equated scores (mean = 500, SD = 100).
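The standardized equated score scale can be pictured with a minimal sketch. This is only an illustration of a linear rescaling onto a mean = 500, SD = 100 scale; the ABIM's actual equating procedure links exam forms through shared content rather than a simple within-group z-score, and the function name and sample values below are assumptions.

```python
import statistics

def to_standard_scale(raw_scores, target_mean=500.0, target_sd=100.0):
    """Linearly rescale raw scores so the group mean equals target_mean
    and the group SD equals target_sd. Illustrative only: true equating
    adjusts for form difficulty, which a within-group z-score cannot do."""
    m = statistics.mean(raw_scores)
    s = statistics.pstdev(raw_scores)
    return [target_mean + target_sd * (x - m) / s for x in raw_scores]

# Hypothetical raw exam scores for one group of examinees
scaled = to_standard_scale([52, 61, 70, 58, 66])
```

On this common scale, a score difference of 10 points corresponds to one-tenth of a standard deviation, which is the magnitude of the group difference discussed below.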
We collected comparison data from ABIM registration files, which include the physicians’ gender, location of medical school (U.S./Canadian or international), country of birth (U.S./Canadian or international), program directors’ ratings of their medical knowledge in the last year of residency training (1–3 = unsatisfactory, 4–6 = satisfactory, 7–9 = superior), training pathway (research or traditional), and whether they completed the exact number of requisite training months. We included these variables, which have been shown to be predictors of exam performance, in the model to better assess the direct impact of training pathways on performance.19 For the physicians who followed the TP, we determined whether their training programs also offered an RP, because we assumed that the characteristics of RP and TP trainees in those programs would be similar.
Since 1990, the ABIM has issued certifications that last 10 years, and so, to remain certified, physicians must complete the maintenance of certification (MOC) program every decade. To find out whether MOC program participation and performance differs between RP and TP physicians, we limited our analysis to physicians who needed to complete the MOC program because their most recent certificate (internal medicine or subspecialty) was issued before 2001. We obtained MOC data from registration files and a survey, completed on enrollment in the MOC program and updated every 18 months (with a typical response rate of 95%), in which physicians self-report their practice characteristics. The registration data include demographics, MOC program enrollment and completion rates, and MOC exam performance. The practice characteristics collected in the survey include rates of clinical activity, percentage of time spent in direct patient care or clinical research, and type of primary practice. For physicians who completed the survey more than once, we used the most recent survey.
We compared residents from the RP group with residents from all TP programs, as well as with TP residents whose programs offered RP tracks (TP-RP). We used descriptive statistics to examine physician characteristics. We used a chi-square test of significance for categorical variables and a t test of significance for continuous variables. We used the general linear model to assess the relationship between training pathway and score on the first attempt at the internal medicine exam, adjusting for physician characteristics. Stepwise linear regression was used to determine variable inclusion in the model. We also used stepwise logistic regression to examine the relationship between training pathway and eventual internal medicine certification status, also adjusting for physician characteristics. There was sufficient power to detect differences, given the extremely large sample size in the study. We performed analyses using SPSS software.
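The adjusted linear model step can be sketched as follows. This is a hedged illustration on synthetic data, not the study's SPSS code: every variable name, coefficient, and the use of NumPy least squares is an assumption. The idea is simply to regress exam score on a pathway indicator plus covariates and read off the adjusted pathway coefficient.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 20_000  # synthetic cohort size (illustrative, not the study data)

pathway = rng.integers(0, 2, n).astype(float)  # 1 = RP, 0 = TP (hypothetical coding)
pd_rating = rng.normal(6.8, 1.0, n)            # program director rating, 1-9 scale
age = rng.normal(33.0, 2.0, n)                 # age at first exam attempt

# Simulated scores: a strong rating effect, a small negative age effect,
# and a small negative pathway effect, chosen only for illustration.
score = (500.0 + 40.0 * (pd_rating - 6.8) - 5.0 * (age - 33.0)
         - 4.0 * pathway + rng.normal(0.0, 60.0, n))

# Ordinary least squares with an intercept: the coefficient on `pathway`
# is the pathway-score association adjusted for the other covariates.
X = np.column_stack([np.ones(n), pathway, pd_rating, age])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
pathway_effect = beta[1]
```

A stepwise procedure, as described above, would additionally add or drop covariates according to fit criteria before settling on the final model.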
Our study did not require institutional review board approval; when physicians register to take the certification exam or enroll in the MOC program, they enter into a business associate agreement20 that allows the ABIM to use their data—only at an aggregate level—for research purposes.
The RP group had an average of 63.1 (SD = 19.1) residents per year. Of the 500 residency training programs, 140 (28%) had at least one physician in the RP group, but only 23 programs accounted for 73% of the RP group’s training. Table 1 shows that the RP trainees were mostly men, born and trained in the United States or Canada; they passed their first attempts at the internal medicine certification exam at a slightly higher rate than the TP and TP-RP groups, and 98% of them were eventually certified in internal medicine. On average, RP residents were a year older (33.9 years) than the TP group (32.5 years), which makes sense because they cannot take the exam until they have completed four years of training. On average, the RP group had less formal clinical training (25.1 months) than the TP group (36.3 months), but the majority (91%) completed the 24-month training requirement as prescribed. Program directors rated the RP group distinctly higher on medical knowledge (mean = 7.5/superior versus 6.6 and 6.8/satisfactory for the TP and TP-RP groups).
The left panels of Table 2 show that, after adjusting for physician characteristics, type of program, and training, being in the RP group was significantly associated with slightly lower scores on the first attempt at the internal medicine exam, although the effect size was quite small (β = –0.02). The factors that explained the most variance in the model were program directors’ ratings of medical knowledge, followed by age at the time of the initial internal medicine exam and being a medical graduate born and trained outside the United States and Canada. Physicians scored highest if they received high ratings from their program directors, were younger, were born and trained abroad or were born and trained in the United States or Canada, were female, completed the exact number of requisite training months, trained in a program that also included an RP track, and did not train through the RP. Together, the variables explained 30% of the variance in the model.
Because the ultimate goal is certification in internal medicine, we used logistic regression to examine the relationship between certification status as the dependent variable and the same explanatory variables as in the linear regression. We show in the right panels of Table 2 that being born and trained abroad or born and trained in the United States or Canada, having high program director ratings, being in a program with an RP, completing the exact number of requisite training months, and being female all contributed to becoming certified. The variables explained 21% of the variance in the model. In this analysis, RP training did not contribute to the model, suggesting that it does not negatively impact overall competence in clinical judgment as measured by certification status.
We compared the RP group’s MOC participation rates and performance with those of the TP and TP-RP groups for those physicians whose certificates would have lapsed had they not returned for the MOC program. There were 239 RP physicians (Table 3) in this subset. Not surprisingly, significantly fewer RP physicians than TP or TP-RP physicians enrolled in the MOC program (77% versus 89% or 91%, respectively). Among those who enrolled, however, there were no significant differences among the groups in rates of completing the program, taking the MOC exam, or passing it, or in the number of knowledge and practice performance self-assessment points earned.
Table 4 shows the practice characteristics of a subset of RP physicians, who differed significantly from both the TP and the TP-RP groups. The RP group comprised fewer women (11% versus 37% or 39%, respectively), more academicians (63% versus 14% or 19%, respectively), more physicians who were clinically inactive (11% versus 3% or 4%, respectively), and more who spent, on average, a larger portion of their time in medical research (37% versus 3% or 5%, respectively).
Discussion and Conclusions
In this study, we addressed a gap in understanding whether the third year of internal medicine training can be customized to a trainee’s career goals without dramatically affecting the physician’s clinical judgment and, indirectly, patient care. After controlling for physicians’ ability (as measured through program directors’ ratings of medical knowledge), we found that a short-tracked RP for internal medicine training, in which trainees receive just two years of internal medicine training before moving on to research, did not adversely affect certification status. First-attempt certification scores were slightly lower for the RP group, but only in comparison with peers in the same institutions; the difference was less than one-tenth of a standard deviation, an effect size small enough that it is likely not practically significant.
RP physicians are clearly seen as exceptional, as reflected by the substantially higher ratings they receive from program directors regarding their medical knowledge. We confirmed—for a small subset of the group—that RP physicians continue to spend a respectable portion of time in medical research (37%) and that, to the credit of the medical community, 63% have remained in academic medicine. These findings validate the original motivation for offering this alternative “short-track” pathway: to enable academically oriented physician–scientists to pursue their career goals without compromising their medical knowledge and clinical skills. RP physicians value MOC as demonstrated by a substantial enrollment rate (77%). Not surprisingly, this rate is lower than the TP-RP rate (91%), probably because the certificate is less relevant to their careers.
This study has several limitations. First, we restricted our analyses to physicians who completed training after 1992 because the database before then was missing critical variables. Second, we did not examine differences in quality of patient care between the groups. Instead, we used certification and MOC exam performance as measures of clinical judgment. Certification scores have been correlated with clinical performance, but we recognize that performing well on a cognitive exam is not the same thing as delivering high-quality care.21 Third, the practice characteristics data came from a self-reporting survey that is updated only every 18 months. Fourth, we analyzed MOC enrollment and practice characteristics from a smaller subset of physicians; the results may not generalize to the whole population.
The medical community clearly values the role of the physician–scientist. This is demonstrated by the unique, shorter internal medicine training path made available to this small group of talented physicians. Redesigned GME could extend this model to benefit other talented trainees by allowing them to customize training to meet specific career goals. With the introduction of new work hours for U.S. residency training,22 we need to reconsider what constitutes good training. Perhaps, as Ludmerer23 states, “we should be focusing less on the time residents spend at work and more on the educational character of that work.” Our study supports the notion that different training pathways can lead to similar achievements in clinical knowledge and judgment. On the basis of this work, we believe that, at least for exceptional physicians, time-independent, competency-based education is a viable alternative to traditional, time-dependent education.
Acknowledgments: The authors wish to thank Ms. Nancy Rohowyj for her work on editing and formatting the manuscript.
Other disclosures: All authors are employed by the ABIM.
Ethical approval: Not applicable.
Previous presentations: A version of this work was presented at the 2009 American Educational Research Association annual conference.
1. Goldman L. Modernizing the paths to certification in internal medicine and its subspecialties. Am J Med. 2004;117:133–136
2. Meyers FJ, Weinberger SE, Fitzgibbons JP, Glassroth J, Duffy FD, Clayton CP; Alliance for Academic Internal Medicine Education Redesign Task Force. Redesigning residency training in internal medicine: The consensus report of the Alliance for Academic Internal Medicine Education Redesign Task Force. Acad Med. 2007;82:1211–1219
3. Charap MH, Levin RI, Pearlman RE, Blaser MJ. Internal medicine residency training in the 21st century: Aligning requirements with professional needs. Am J Med. 2005;118:1042–1046
4. Ludmerer KM, Johns MM. Reforming graduate medical education. JAMA. 2005;294:1083–1087
5. Smith LG, Humphrey H, Bordley DR. The future of residents’ education in internal medicine. Am J Med. 2004;116:648–650
6. Weinberger SE, Smith LG, Collier VU; Education Committee of the American College of Physicians. Redesigning training for internal medicine. Ann Intern Med. 2006;144:927–932
7. Petrany SM, Crespo R. The accelerated residency program: The Marshall University family practice 9-year experience. Fam Med. 2002;34:669–672
8. Chang LL, Grayson MS, Patrick PA, Sivak SL. Incorporating the fourth year of medical school into an internal medicine residency: Effect of an accelerated program on performance outcomes and career choice. Teach Learn Med. 2004;16:361–364
9. Delzell JE Jr, McCall J, Midtling JE, Rodney WM. The University of Tennessee’s accelerated family medicine residency program 1992–2002: An 11-year report. Fam Med. 2005;37:178–183
11. Savill J. More in expectation than in hope: A new attitude to training in clinical academic medicine. BMJ. 2000;320:630–633
12. Ley TJ, Rosenberg LE. The physician–scientist career pipeline in 2005: Build it, and they will come. JAMA. 2005;294:1343–1351
13. Oinonen MJ, Crowley WF Jr, Moskowitz J, Vlasses PH. How do academic health centers value and encourage clinical research? Acad Med. 2001;76:700–706
14. Schwartz DA. Medicine Science and Dreams: The Making of Physician–Scientists. New York, NY: Springer; 2011
15. Balaban CD. Toward revitalizing the role of physician–scientists in academic medicine. Otolaryngol Head Neck Surg. 2008;139:766–768
16. Shea JA, Stern DT, Klotman PE, et al. Career development of physician scientists: A survey of leaders in academic medicine. Am J Med. 2011;124:779–787
19. Lipner R, Song H, Biester T, Rhodes R. Factors that influence general internists’ and surgeons’ performance on maintenance of certification exams. Acad Med. 2011;86:53–58
21. Holmboe ES, Lipner R, Greiner A. Assessing quality of care: Knowledge matters. JAMA. 2008;299:338–340
22. Accreditation Council for Graduate Medical Education. Resident duty hours language: Final requirements. 2012. http://www.acgme.org. Accessed July 15.
23. Ludmerer KM. Redesigning residency education—Moving beyond work hours. N Engl J Med. 2010;362:1337–1338