Purpose: Prior studies suggest that students on a longitudinal integrated clerkship (LIC) have comparable academic performance to those on a rotation-based clerkship (RBC); however, most of these studies did not adjust for preclerkship academic performance. The objective of this study was to compare the academic performance of LIC and RBC students matched on prior academic performance over a three-year period.
Method: Each LIC student in the University of Calgary classes of 2009, 2010, and 2011 (n = 34) was matched with four RBC students (n = 136) of similar prior academic performance. Knowledge and clinical skills performance were compared between the streams. Knowledge was evaluated by internal summative examinations and the Medical Council of Canada Part 1 licensing exam. Clinical skills were evaluated via in-training evaluation reports (ITERs) and performance on the clerkship objective structured clinical examination (OSCE). Meta-analysis was used to compare knowledge evaluations and clinical performance for all core clerkship disciplines, and pooled effect sizes from the fixed-effect models were reported.
Results: Meta-analyses showed no statistically significant heterogeneity. There were no differences between LIC and RBC students on knowledge evaluations (pooled effect size 0.019; 95% confidence interval [−0.115, 0.152], P = .8), ITERs (pooled effect size −0.015 [−0.157, 0.127], P = .8), or mean OSCE ratings (67.9 [SD = 4.6] versus 68.6 [SD = 5.8], P = .5).
Conclusions: After matching on prior academic performance, LIC and RBC students at one school had comparable performance on summative evaluations of knowledge, clinical performance, and clinical skills over three years.
Dr. Myhre is associate dean for distributed learning and rural initiatives, University of Calgary, Calgary, Alberta, Canada.
Dr. Woloschuk is director of program evaluation, Office of Undergraduate Medical Education, University of Calgary, Calgary, Alberta, Canada.
Dr. Jackson is director, Rural Integrated Community Clerkship, University of Calgary, Calgary, Alberta, Canada.
Dr. McLaughlin is assistant dean of undergraduate medical education, University of Calgary, Calgary, Alberta, Canada.
Editor’s Note: Commentaries by D.A. Hirsh, E.S. Holmboe, and O. ten Cate and by C.D. Stevens, L. Wilkerson, and S. Uijtdehaage appear on pages 201–204 and 205–207.
Funding/Support: None reported.
Other disclosures: Drs. Myhre and Jackson were administrators of the University of Calgary Rural Integrated Community Clerkship during the time of this study.
Ethical approval: This study was carried out as one of the objectives cited in the three-year evaluation plan of the longitudinal integrated clerkship program, which received ethical approval from the conjoint health research ethics board at the University of Calgary.
Previous presentations: Dr. Myhre presented these findings at the Consortium of Longitudinal Integrated Clerkships (CLIC) Rendez-Vous 2012 conference in Thunder Bay, Canada.
Correspondence should be addressed to Dr. McLaughlin, University of Calgary, Office of Undergraduate Medical Education, 3330 Hospital Dr. NW, Calgary, Alberta T2N 4N1; telephone: (403) 220-4252; fax: (403) 210-3852; e-mail: firstname.lastname@example.org.
A longitudinal integrated clerkship (LIC) has been introduced in a variety of medical schools in the United States,1–5 Australia,6–8 New Zealand,9 South Africa,10 and Canada11,12 and represents a significant departure from the traditional rotation-based clerkship (RBC) model. Although some institutions have adopted LICs in the belief that this represents improved pedagogy with a focus on social learning theory, others have introduced this model for more pragmatic reasons—for example, to allow them to increase the number of medical students they enroll or to meet health care needs in underserved communities. Although LICs come in a variety of shapes and sizes, each has the same core elements: medical students participate in the comprehensive care of patients over time; students have continuing learning relationships with these patients’ clinicians; and students meet, through these experiences, the majority of the year’s core clinical competencies across multiple disciplines simultaneously.13
Evidence has demonstrated that LICs appear to address some of the shortcomings of the RBC model by increasing continuity-of-care learning opportunities and improving teamwork.2,11,14,15 Compared with RBC students, LIC students have been shown to be more satisfied with their learning,2,14 to feel better prepared to solve ethical dilemmas,14 and to give higher ratings for feedback and mentoring during clerkship.11,14 LICs may also help bridge health care gaps, as LIC students who trained at rural sites at one school were more likely to end up in rural practice.16 In considering academic outcomes, a recent comprehensive review by Walters et al17 suggests comparable outcomes for knowledge and clinical skills for LIC and RBC participants. Thus far, however, few studies addressing academic outcomes of LICs have evaluated or adjusted for differences in academic performance of participants prior to clerkship. This is an important limitation because many LICs screen their applicants, and this screening step may introduce a selection bias in favor of LICs. A recent Canadian study that compared LIC and RBC students' preclerkship academic performance found that LIC students did not perform as well in some areas, including evaluations of clinical skills.12
To address the issue of potential selection bias, in this study we matched each LIC student from three consecutive years at the University of Calgary with four RBC students from the same academic year who had similar preclerkship academic performance. We then compared LIC and RBC students’ performance on all summative clerkship evaluations of knowledge and clinical skills. We hypothesized that, if exposure to different clerkship models is associated with differences in academic performance, then performance of LIC and RBC students would differ after controlling for preclerkship performance. This study builds on the existing literature by examining the comprehensive academic performance (knowledge, skills, and behaviors) of LIC students and is unique in that it is a three-year, prospective, matched-cohort study.
The University of Calgary has a three-year undergraduate medical curriculum in which the third year is the clinical clerkship. Since the graduating class of 2009, we have had two clerkship streams: a traditional RBC and an LIC called the Rural Integrated Community Clerkship. As there are typically more LIC applicants than positions, we interview all LIC applicants and then randomly select students from a list of acceptable candidates. Students who are not selected for LIC enter the RBC stream.
Both the LIC and RBC versions of the clerkship last 54 weeks and share the same learning objectives and evaluations. In our LIC, after an initial 6 weeks of electives, pairs of students spend 36 weeks of their clerkship year at an established rural teaching site where their primary preceptors are family physicians. The students also have the opportunity to receive training from other clinical preceptors, including subspecialists, at their teaching site. At their LIC site, students complete all of their clinical experience in family medicine, emergency medicine, anesthesia, obstetrics–gynecology, and psychiatry, in addition to most of their training in internal medicine, surgery, and pediatrics. After these 36 weeks, the students return to Calgary to complete the final 12 weeks of their training, divided equally between internal medicine, surgery, and pediatrics. In the RBC curriculum, after an initial 6 weeks of electives, clerks rotate through eight mandatory disciplines: 2-week blocks of emergency medicine and anesthesia; 6 weeks each of family medicine, obstetrics–gynecology, psychiatry, surgery, and pediatrics; 10 weeks of internal medicine; and another 4 weeks of electives.
Our LIC cohort included all students (n = 34) from the classes of 2009, 2010, and 2011 who completed the LIC program. The 9 students from the class of 2009 were included in a previous study that combined data from LIC programs from three Western Canadian medical schools.12 Our RBC cohort of 136 students comprised four contemporaneous controls for each LIC student, matched on academic performance during the first two years of training. We chose four RBC controls for each LIC student because the statistical power of matched-cohort studies is raised by increasing the number of controls up to four, after which this effect is attenuated.18
We created separate RBC cohorts to analyze knowledge and clinical outcomes. For knowledge outcomes, LIC and RBC students were matched on grade point average (GPA) of summative multiple-choice examinations in the first two years. For clinical competence and clinical performance outcomes, LIC and RBC students were first matched on their performance on the medical skills examination in the second year, followed by GPA in the first two years.
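The article does not specify the algorithm used to select the four matched controls, so the following is a purely hypothetical sketch of one simple way to perform 1:4 matching on a single preclerkship score (greedy nearest-neighbor matching without replacement); the function and variable names are illustrative, not drawn from the study.

```python
# Hypothetical sketch: the paper does not describe its matching algorithm,
# so this greedy nearest-neighbor approach is an illustration only.

def match_controls(lic_students, rbc_students, k=4):
    """For each LIC student, select the k RBC students with the closest
    preclerkship GPA, without replacement (each control is used once)."""
    available = dict(rbc_students)          # id -> GPA; shrinks as controls are used
    matches = {}
    for student, gpa in lic_students.items():
        closest = sorted(available, key=lambda s: abs(available[s] - gpa))[:k]
        matches[student] = closest
        for s in closest:                   # remove chosen controls from the pool
            del available[s]
    return matches

lic = {"L1": 80.0}
rbc = {"R1": 79.0, "R2": 81.0, "R3": 70.0, "R4": 85.0, "R5": 80.5}
print(match_controls(lic, rbc))             # R5, R1, R2, and R4 are the four closest
```

A two-stage version of this idea (matching first on the medical skills examination score, then breaking ties on GPA) would correspond to the clinical-outcomes cohort described above.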
We collected the results of all summative clerkship evaluations for each participant. Each of the clerkships has two components to the summative evaluation: knowledge, assessed by a multiple-choice question (MCQ) examination; and clinical performance, which is assessed via an in-training evaluation report (ITER). In addition to these, we have a clerkship objective structured clinical examination (OSCE) as an assessment of clinical skills, and all of our students take the Medical Council of Canada (MCC) Part 1 licensing exam after completing clerkship.
This was a three-year, prospective, matched-cohort observational study in a three-year medical school setting. For each of the three class years included in the study, we created our matched cohorts for LIC students prior to the start of clerkship, and gathered data on our students from the first group of LIC students (class of 2009) until the third group (class of 2011) had completed clerkship. The study was approved by the University of Calgary conjoint health research ethics board.
We compared preclerkship performance of LIC and RBC cohorts using a two-sample t test. We considered outcomes of knowledge, clinical performance, and clinical skills separately for postclerkship performance. For MCQ examinations (including the MCC Part 1 examination), we used meta-analysis to convert means and standard deviations into standardized mean differences. In view of our small sample size, we used the Hedges g effect size calculation to correct for small sample bias.19 We used the Q statistic to assess for heterogeneity between clerkships, and reported the results of the fixed-effects model if there was no significant heterogeneity. We performed a similar analysis to generate the standardized mean difference for performance in ITERs. Because all students had a single OSCE, we compared OSCE ratings using a two-sample t test. We used STATA version 11.0 (Stata Corp, College Station, Texas) for our statistical analyses.
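The meta-analytic steps described above (Hedges g effect sizes pooled under an inverse-variance fixed-effect model, with the Q statistic for heterogeneity) can be sketched as follows. This is a minimal illustration of the standard formulas, not the authors' STATA code, and the example numbers are invented, not the study data.

```python
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    """Standardized mean difference with the Hedges g small-sample correction."""
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)            # small-sample bias correction
    g = j * d
    v = (n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2))  # approximate variance of g
    return g, v

def fixed_effect_pool(effects):
    """Inverse-variance fixed-effect pooling; returns pooled g, 95% CI, and Q."""
    w = [1 / v for _, v in effects]
    pooled = sum(wi * g for wi, (g, _) in zip(w, effects)) / sum(w)
    se = math.sqrt(1 / sum(w))
    q = sum(wi * (g - pooled)**2 for wi, (g, _) in zip(w, effects))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), q

# Invented per-examination summaries for illustration only:
# (LIC mean, LIC SD, LIC n, RBC mean, RBC SD, RBC n)
exams = [
    (72.0, 6.0, 34, 71.5, 6.5, 136),
    (68.0, 7.0, 34, 68.8, 6.9, 136),
    (75.0, 5.5, 34, 74.6, 6.1, 136),
]
effects = [hedges_g(*e) for e in exams]
pooled, ci, q = fixed_effect_pool(effects)
print(f"pooled g = {pooled:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}], Q = {q:.2f}")
```

If Q were significant relative to a chi-square distribution with (number of examinations − 1) degrees of freedom, a random-effects model would be reported instead, as the Method section implies.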
There was no significant difference between LIC and RBC students matched for preclerkship GPA (81.65% [SD = 5.01] versus 81.50% [SD = 4.93], P = .88). Similarly, for the cohort matched first by performance on the medical skills evaluation followed by GPA, there was no difference between LIC and RBC students (medical skills: 85.26% [SD = 3.96] versus 85.11% [SD = 3.75], P = .83; GPA: 81.65% [SD = 5.01] versus 80.61% [SD = 5.25], P = .30).
In our meta-analysis of knowledge evaluations, there was no significant heterogeneity between examinations (Q statistic = 12.7, P = .08). Overall, there was no significant difference between LIC and RBC students’ performance on MCQ examinations (pooled effect size = 0.019; 95% confidence interval = [−0.115, 0.152], P = .78). Figure 1 shows the standardized mean differences between LIC and RBC students for each examination along with the pooled effect size.
In our meta-analysis of ITERs, there was no significant heterogeneity between clerkships (Q statistic = 0.1, P > .99) and no significant difference in performance between LIC and RBC students (pooled effect size = −0.015, confidence interval = [−0.157, 0.127], P = .84). Figure 2 shows the standardized mean differences between LIC and RBC students for each clerkship ITER and the pooled effect size.
We found no difference in mean score on the summative clerkship OSCE between LIC and RBC students (67.9 [SD = 4.6] versus 68.6 [SD = 5.8], respectively, P = .5).
The goal of introducing an LIC may be slightly different for each medical school. The typical reason to introduce an LIC may be to improve the pedagogy of clerkship training (e.g., by emphasizing longitudinal, team-based care) and the perceived quality of clerkship learning experiences, while also addressing societal needs, such as bridging health care gaps in underserved communities. For the LIC model to be successful, however, it should deliver these benefits while at least matching the RBC standard for academic performance.
In this study, we selected a matched-cohort design to mitigate the potential selection bias that may have reduced the reliability of previous studies comparing academic performance of LIC and RBC students. When we compared performance on evaluations of knowledge, clinical performance, and clinical skills of LIC students versus RBC students with similar prior performance in these areas, we found no significant difference. Our findings are consistent with the existing literature demonstrating that LIC students perform academically at least as well as RBC students17 but, importantly, also demonstrate that comparable outcomes are achievable in a three-year medical school in which the objectives and evaluations were designed for a traditional RBC model.
Our study has limitations that we should acknowledge. Although we tried to mitigate the potential for selection bias using a matched-cohort design, this design is less reliable than a randomized controlled trial design. This was a single-institution study with a rural LIC program using family medicine preceptors, which reduces the generalizability of our results to other schools with a different LIC model. As with most studies on LIC curricula, we had a small sample size, which limits our statistical power to detect a difference in academic outcomes. For this reason we used meta-analysis to analyze our results where possible. Finally, students were matched only on academic performance and not on other factors, such as gender or age, which could potentially have influenced the results.
Successful LIC curricula should match or exceed the academic standard of the RBC curriculum. The results of our study suggest that, after matching on prior academic performance, LIC and RBC students have comparable performance on summative evaluations of knowledge, clinical performance, and clinical skills. LIC programs may allow us to address societal needs, including health care needs in underserved (e.g., rural) communities, without sacrificing LIC students' academic or clinical skills development. There is a need to advance the literature by examining the long-term benefits of LIC programs. Outcomes such as performance during residency and practice could be considered.
Acknowledgments: The authors wish to thank Ms. Jeanette Somlak Pedersen for her contributions to the literature review and manuscript revisions.
1. Hansen L, Simanton E. Comparison of third-year student performance in a twelve-month longitudinal ambulatory program with performance in traditional clerkship curriculum. S D Med. 2009;62:315–317
2. Hirsh D, Gaufberg E, Ogur B, et al. Educational outcomes of the Harvard Medical School–Cambridge integrated clerkship: A way forward for medical education. Acad Med. 2012;87:643–650
3. Poncelet A, Bokser S, Calton B, et al. Development of a longitudinal integrated clerkship at an academic medical center. Med Educ Online. April 4, 2011;16
4. Schauer RW, Schieve D. Performance of medical students in a nontraditional rural clinical program, 1998–99 through 2003–04. Acad Med. 2006;81:603–607
5. Zink T, Power DV, Finstad D, Brooks KD. Is there equivalency between students in a longitudinal, rural clerkship and a traditional urban-based program? Fam Med. 2010;42:702–706
6. Worley P, Esterman A, Prideaux D. Cohort study of examination performance of undergraduate medical students learning in community settings. BMJ. 2004;328:207–209
7. Worley P, Lines D. Can specialist disciplines be learned by undergraduates in a rural general practice setting? Preliminary results of an Australian pilot study. Med Teach. 1999;21:482–484
8. Worley P, Silagy C, Prideaux D, Newble D, Jones A. The parallel rural community curriculum: An integrated clinical curriculum based in rural general practice. Med Educ. 2000;34:558–565
9. Poole P, Bagg W, O’Connor B, et al. The Northland Regional–Rural program (Pūkawakawa): Broadening medical undergraduate learning in New Zealand. Rural Remote Health. 2010;10:1254
10. Norris TE, Schaad DC, DeWitt D, Ogur B, Hunt DD; Consortium of Longitudinal Integrated Clerkships. Longitudinal integrated clerkships for medical students: An innovation adopted by medical schools in Australia, Canada, South Africa, and the United States. Acad Med. 2009;84:902–907
11. Couper I, Worley PS, Strasser R. Rural longitudinal integrated clerkships: Lessons from two programs on different continents. Rural Remote Health. 2011;11:1665
12. McLaughlin K, Bates J, Konkin J, Woloschuk W, Suddards CA, Regehr G. A comparison of performance evaluations of students on longitudinal integrated clerkships and rotation-based clerkships. Acad Med. 2011;86(10 suppl):S25–S29
13. Consortium of Longitudinal Integrated Clerkships. CLIC overview. http://www.clicmeded.com/. Revised September 2011. Accessed October 29, 2013
14. Ogur B, Hirsh D, Krupat E, Bor D. The Harvard Medical School–Cambridge integrated clerkship: An innovative model of clinical education. Acad Med. 2007;82:397–404
15. Hauer KE, Hirsh D, Ma I, et al. The role of role: Learning in longitudinal integrated and traditional block clerkships. Med Educ. 2012;46:698–710
16. Worley P, Martin A, Prideaux D, Woodman R, Worley E, Lowe M. Vocational career paths of graduate entry medical students at Flinders University: A comparison of rural, remote and tertiary tracks. Med J Aust. 2008;188:177–178
17. Walters L, Greenhill J, Richards J, et al. Outcomes of longitudinal integrated clinical placements for students, clinicians and society. Med Educ. 2012;46:1028–1041
18. Pang D. A relative power table for nested matched case–control studies. Occup Environ Med. 1999;56:67–69
19. Egger M, Davey Smith G, Altman DG, eds. Systematic Reviews in Health Care: Meta-Analysis in Context. London, UK: BMJ Publishing Group; 2001