Innovation Reports

An Innovative Blended Preclinical Curriculum in Clinical Epidemiology and Biostatistics: Impact on Student Satisfaction and Performance

Evans, Kambria H. MEd, MA; Thompson, Atalie C. MD, MPH; O’Brien, Colin; Bryant, Madika MA; Basaviah, Preetha MD; Prober, Charles MD; Popat, Rita A. PhD

doi: 10.1097/ACM.0000000000001085

Abstract

Problem

“Flipped” or “blended” classrooms in medical and health professions education have gained attention recently. This model shifts content delivery outside of class, where students learn on their own, and dedicates class time to student-centered learning activities (e.g., problem-based learning, inquiry-oriented strategies).1 Little is understood, however, about the impact of teaching clinical epidemiology and biostatistics in a blended format in medical school.

At the Stanford University School of Medicine, Practice of Medicine (POM) is a six-quarter course that extends throughout the first two years of the MD program. POM interweaves skills training in medical interviewing and examination with instruction in quality/safety, nutrition, quantitative medicine (QM; i.e., epidemiology and biostatistics), psychiatry, ethics, health policy, and population health. POM’s seven-week QM curriculum takes place in the fall quarter for first-year medical students. The objectives of QM are for students to (1) understand the research methods needed to create and synthesize knowledge applied to patient care and (2) interpret and implement information from the literature as it relates to disease, patient care, and public health. From 2006 through 2012, QM used two in-class formats for its 12 sessions: 9 large-group, lecture-style 50-minute sessions and 3 small-group 75-minute sessions (total time: 11.25 hours). The small-group sessions facilitated collaborative application of underlying concepts from the lectures.

The benefits and limitations of using this traditional, classroom-based format for QM were identified through review of the literature, course satisfaction data, and in-class survey data regarding previous experience and preferred learning style, as well as the instructor’s experience. One benefit of a traditional format is real-time feedback; it was noted, though, that with a large class size (about 90 students), small groups tend to offer better opportunities for feedback than lecture-style sessions. There are three limitations to teaching QM using this traditional format.

First, learners are heterogeneous; some medical students have not taken an undergraduate course in basic statistics, whereas others have formal training in epidemiology or biostatistics. Many have clinical research experience. This population needs opportunities for self-paced learning. Second, the time available for application and synthesis of information is inflexible. To many students, learning clinical epidemiology and biostatistics terms is comparable to learning a new language. Mastering the basic vocabulary is essential for achieving proficiency in applying knowledge. In the traditional model, though, the nine QM lectures covering basic concepts were spread out over a seven-week period. By the time students were ready for higher-level thinking, the end of the quarter had arrived. Third, students vary in their preferences for how knowledge is acquired and processed (e.g., auditory/verbal, sensing/intuitive).2 The synchronous, physical format of in-class instruction offers few opportunities to adapt to different learning styles.

As possible new formats for the QM curriculum were considered, the “blended learning strategy guide” proposed by Singh and Reed3 was reviewed. This guide suggests that a blend of self-paced and in-class formats would be well suited for delivering QM’s objectives. Further, the results of a U.S. Department of Education meta-analysis of evidence-based studies of online learning suggest that, when compared with face-to-face instruction, blended instruction produces the largest positive change in student outcomes, and online instruction alone results in a modest improvement.4 Blended instruction allows for self-paced learning, time for application and synthesis of concepts, and balanced instruction for varied learning styles, and it provides online resources for longitudinal training.5

Medical school is an ideal setting for blended learning. Medical students face an overwhelming amount of information, so it is imperative that they have flexibility in how they learn and that they develop learning methods they can use throughout their careers. Blended learning gives medical students more control over their time and allows them to use their preferred learning style. In this report, we describe how the QM curriculum was redesigned to use a blended format. We also report our initial evaluation of the redesigned curriculum, comparing student satisfaction and performance data from the pilot year of the blended format with data from the prior two years.

Approach

Blended curriculum: Components and development

The redesigned QM curriculum, introduced in the fall quarter of 2013, blends online self-paced learning with in-class collaborative learning. Its innovative, asynchronous online component provides opportunities for active and self-monitored learning. Although the curricular format was altered, the curricular content remains the same as in previous years.

The self-paced online component consists of nine modules, each composed of a video and quiz. Each module runs 40 minutes on average (totaling 6 hours). The nine module topics are the same as in previous years: diagnostic tests and screening; measures of disease frequency; describing and summarizing data; randomized clinical trials; cohort studies; case–control studies; statistical inference; basic statistical tests; and bias, confounding, and effect modification.

Stanford’s CourseWork Web site provides access to the online curriculum and allows students to navigate their learning on the basis of their previous experience and comfort level with the material. Students have the option of watching the videos in their preferred order and can skip content they already know. For students who prefer reading over viewing videos, online text (an electronic version of the textbook, or e-text) is provided as an alternate option. Additional online resources available for students include PowerPoint slides and lecture notes.

Students complete three online modules every two weeks, followed by a small-group session. The three small-group sessions (totaling 4.3 hours) allow for synthesis and application of concepts through interactive, case-based discussions. The small-group session topics are critical appraisal of a screening study and application to clinical practice; critical appraisal of a cohort study and application to clinical practice; and critical appraisal of a case–control study (mock trial proceeding).

It was important to address time neutrality during the QM redesign. The redesigned curricular time totals 10.3 hours (4.3 in class; 6 online); the previous, traditional curriculum totaled 11.25 hours of in-class time. In the POM course, 190 minutes of protected time is set aside for students to complete the nine online QM modules. Some students may choose to spend less than 6 hours on the modules, whereas others may choose to spend more time, especially on challenging or new topics.
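
For reference, the stated totals can be reconstructed from the session counts and durations given above; the following worked check uses only figures already reported in this section:

```latex
% A worked check of the stated time totals (figures taken from the text above)
\begin{align*}
\text{Traditional in-class time:} \quad & 9 \times 50~\text{min} + 3 \times 75~\text{min} = 675~\text{min} = 11.25~\text{h}\\
\text{Blended online time:} \quad & 9 \times 40~\text{min} = 360~\text{min} = 6~\text{h}\\
\text{Blended total time:} \quad & 6~\text{h (online)} + 4.3~\text{h (in class)} = 10.3~\text{h}
\end{align*}
```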

The blended QM curriculum was taught in 2013 by one faculty member (R.A.P.) and one teaching assistant (A.C.T.); this faculty member also taught the traditional QM curriculum in 2011 and 2012. Neither received formal training to teach in the blended format. They worked with education technology specialists to create the online modules.

Learner population: Pilot cohort and controls

The pilot cohort of first-year Stanford medical students (n = 101) was enrolled in the redesigned QM curriculum in September 2013. This cohort was 46% female (n = 47); its self-identified ethnic composition, as reported on an in-class survey, was 36% white/white European (n = 37), 20% Chinese (n = 20), 15% South Asian (n = 15), and 30% other/declined to state (n = 30). Only 29% (n = 29) of the students reported that they had not taken an undergraduate course in statistics. Thirty-eight percent (n = 38) described themselves as active learners, 76% (n = 77) described themselves as visual learners, and 41% (n = 41) indicated that their preferred learning environment was a blended curriculum.

As controls, we used two years of historic data (exam scores and evaluations) from the first-year medical school classes in 2011 and 2012 (n = 178 students).

Evaluation methods

We expected that the blended instruction model would improve student satisfaction and performance. We evaluated performance outcomes and satisfaction with the overall format (2011–2013 cohorts), as well as satisfaction with the individual components of the blended format (i.e., online and in-class components; 2013 cohort). This study was deemed exempt by the Stanford University institutional review board because it was considered program evaluation and because all data were anonymous and deidentified.

Satisfaction.

We hypothesized that student satisfaction with the blended QM curriculum in 2013 would improve compared with student satisfaction with the traditional format offered in 2011 and 2012. We also sought to explore whether any improvements in satisfaction were attributable to the following variables: flexibility of online learning, teaching effectiveness, face-to-face student–instructor interactions, and student–student interactions. We hypothesized that students would use the online videos more than the other online QM resources, such as the PowerPoint slides or book chapters, in preparation for the online quizzes and small-group sessions.

Students in the 2013 cohort completed anonymous online midquarter and end-of-quarter evaluations in which they rated six course domains: overall production quality of the online videos; ease of use (i.e., ability to navigate and access materials); value of content in the online videos; learning efficiency (i.e., ability to review familiar material quickly and spend more time on other topics, ability to tailor learning to preferred learning style); value of online quizzes; and value of interactive small-group sessions.

Satisfaction data were collected anonymously through online surveys in EValue, the medical school’s password-protected evaluation database for medical students and faculty. Satisfaction data from the 2013 blended QM pilot cohort were compared with data from the traditional QM control cohorts (2011, 2012) using analysis of variance (ANOVA) and post hoc pairwise comparisons in SAS version 9.4 (SAS Institute Inc., Cary, North Carolina).
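
To illustrate this type of comparison for readers who wish to replicate it, the sketch below runs a one-way ANOVA with post hoc pairwise comparisons in Python. It is a minimal illustration under stated assumptions: the ratings, the per-year cohort sizes, and the Tukey HSD adjustment are hypothetical, since the study's analysis was performed in SAS version 9.4 and the report does not name the specific post hoc procedure.

```python
# Minimal sketch (not the authors' SAS code): one-way ANOVA on overall satisfaction
# ratings by cohort year, followed by post hoc pairwise comparisons.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(0)
# Hypothetical 1-5 ratings; per-year sizes are assumed splits of the 178 control students
ratings_2011 = rng.normal(2.8, 1.0, 85).clip(1, 5)
ratings_2012 = rng.normal(2.8, 1.0, 93).clip(1, 5)
ratings_2013 = rng.normal(3.4, 1.0, 101).clip(1, 5)

# Overall one-way ANOVA across the three cohort years
f_stat, p_value = stats.f_oneway(ratings_2011, ratings_2012, ratings_2013)
print(f"ANOVA: F = {f_stat:.2f}, P = {p_value:.4f}")

# Post hoc pairwise comparisons (Tukey HSD is one common choice of adjustment)
scores = np.concatenate([ratings_2011, ratings_2012, ratings_2013])
years = (["2011"] * len(ratings_2011)
         + ["2012"] * len(ratings_2012)
         + ["2013"] * len(ratings_2013))
print(pairwise_tukeyhsd(scores, years, alpha=0.05))
```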

Students in 2013 also self-reported patterns of utilization of the different online QM materials (i.e., the assigned text, PowerPoint slides, and online videos) available to students prior to completion of each of the nine QM module quizzes. This information was collected throughout the quarter by the teaching assistant.

Performance.

To assess whether the blended QM curriculum had an impact on student mastery of the QM core material, we compared performance on the 2013 QM final exam with historical performance data from 2011 and 2012 with an ANOVA and two post hoc mean comparisons (2013 vs 2012 and 2013 vs 2011). We also collected 2013 performance data for the nine online module quizzes, which averaged six multiple-choice questions per module.
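
As a rough sketch of this analysis (not the authors' SAS code), the following example performs the overall ANOVA on hypothetical exam scores and then the two planned comparisons described above (2013 vs 2012 and 2013 vs 2011); the scores, group sizes, and the Bonferroni adjustment shown are assumptions for illustration only.

```python
# Minimal sketch with hypothetical exam scores; the actual analysis was run in SAS,
# and the multiple-comparison adjustment shown (Bonferroni) is an assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
exam_2011 = rng.normal(86.5, 6.0, 88)   # hypothetical percent-correct scores
exam_2012 = rng.normal(85.4, 6.0, 90)
exam_2013 = rng.normal(86.0, 6.0, 101)

# Overall one-way ANOVA across the three cohorts
f_stat, p_value = stats.f_oneway(exam_2011, exam_2012, exam_2013)
print(f"ANOVA: F = {f_stat:.2f}, P = {p_value:.4f}")

# Two planned post hoc mean comparisons, Bonferroni-adjusted for two tests
for label, control in [("2013 vs 2012", exam_2012), ("2013 vs 2011", exam_2011)]:
    diff = exam_2013.mean() - control.mean()
    t_stat, p = stats.ttest_ind(exam_2013, control)
    print(f"{label}: mean difference = {diff:.2f}, adjusted P = {min(p * 2, 1.0):.4f}")
```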

All quizzes and final exam questions were administered through our institution’s learning management system (LMS), a password-protected online portal. All quiz and final exam scores were deidentified, tabulated, and recorded as aggregated data in the LMS.

Outcomes

Overall, our analysis of satisfaction and performance outcomes indicated that the blended curriculum had a positive impact on student satisfaction and mastery of the core material.

Student satisfaction

Online component of the blended curriculum.

Students in the 2013 cohort rated the online videos and quizzes positively on the midquarter and end-of-quarter evaluations, and there was an increase in their end-of-quarter satisfaction ratings (Table 1). The response rates were ≥ 94%.

Table 1: First-Year Student Satisfaction Ratings for Quantitative Medicine Online Modules, Stanford University School of Medicine, Fall 2013

Utilization of online material was high. The majority of the students reported choosing the online videos as their primary learning resource (the proportion varied from 69% to 85% across the nine modules) over the additional resources provided to students (e.g., PowerPoint slides, links to e-text).

Blended versus traditional format.

Table 2 shows mean scores for satisfaction in 2011, 2012, and 2013. When comparing 2013 ratings of the blended format with 2011 and 2012 ratings of the traditional format, we saw significant improvements in all four areas: overall rating (P < .0001), logical sequence (P = .008), organization (P < .0001), and value of content (P < .0001).

Table 2: Comparison of Student Satisfaction Scores for the Quantitative Medicine Course, Blended and Traditional Formats, Stanford University School of Medicine, 2011–2013

Small-group sessions.

Most students in the 2013 cohort rated their small-group experience positively (95/101; 94%). Mean (SD) ratings for each of the three small-group sessions were as follows (ratings used a five-point scale ranging from 1 = poor to 5 = excellent):

  1. Critical appraisal of a screening study and application to clinical practice: 3.34 (1.02);
  2. Critical appraisal of a cohort study and application to clinical practice: 3.39 (0.99); and
  3. Critical appraisal of a case–control study (mock trial proceeding): 3.47 (0.98).

Overall satisfaction with the small-group sessions increased in 2013: the average rating across the three sessions (mean [SD] = 3.40 [1.03]) was higher than the corresponding averages in both 2011 (mean [SD] = 2.79 [1.0]) and 2012 (mean [SD] = 2.83 [1.06]).

Comments on overall blended curriculum experience.

Of the 199 comments students provided regarding the QM blended curriculum on the midquarter and end-of-quarter evaluations, 55% (n = 110) were specifically about the online curriculum, and 42% (n = 83) offered positive feedback. The following themes emerged: accessibility, time neutrality, pace, additional learning aids, quizzes, and learning style preferences. (See Table 3 for example comments.) Some students viewed the blended format as adding a time commitment on top of regular classroom hours, and others indicated that they would prefer more time dedicated to traditional lectures. Still others felt the videos were an effective means of delivering QM lessons.

Table 3: Themes From Student Comments on the Quantitative Medicine (QM) Blended Curriculum, From Midquarter and End-of-Quarter Evaluations, Stanford University School of Medicine, Fall 2013

Student performance

Student performance on the 2013 QM final exam showed no significant changes compared with student performance for the prior two years. The mean score was slightly higher in 2013 compared with 2012 (mean difference = 0.65) and slightly lower in 2013 compared with 2011 (mean difference = −0.47), but these differences were not statistically significant based on post hoc pairwise comparisons.

All students in the 2013 cohort completed the nine online quizzes. The average score on the quizzes was 80%.

Next Steps

This initial comprehensive evaluation suggests that our blended QM curriculum is successful and that its features could serve as a model for future blended courses. In shifting from a traditional format to a blended format, achieving time neutrality is important to ensure that students have the time to dedicate to the online component. Students have offered suggestions for improving efficiency of the QM online component, such as simplifying the quiz wording and providing all of the lecture notes or PowerPoint slides in one downloadable PDF file. We are considering those suggestions, and we are continuing to build additional modules for students who are seeking more advanced learning.

One limitation of the blended QM format is that some students reported preferring in-class lectures, so this modality does not match their preferred learning style. Another limitation is generalizability: this study was conducted at a single medical school.

The positive impact of the curricular elements we studied will inform the continued development of our QM curriculum. In 2014, the second year of the blended curriculum, student ratings for both the value of the videos and the value of the quizzes were consistent with ratings from the 2013 pilot year. We will continue to examine which aspects need revision moving forward. Future research is planned to (1) compare performance on Step 1 of the United States Medical Licensing Examination for students who completed the traditional QM course (2011, 2012) with the performance of students who completed the 2013 blended course, and (2) evaluate satisfaction and performance of the 2013 and control cohorts as well as cohorts with/without previous QM experience.

Acknowledgments: The authors wish to express their gratitude to the following individuals for their contributions to the development of the quantitative medicine blended curriculum: Michael McAuliffe and Joe Benfield. The authors also gratefully acknowledge the support of a vice provost for online learning grant and the Stanford University Information Resources and Technology team.

References

1. McLaughlin JE, Roth MT, Glatt DM, et al. The flipped classroom: A course redesign to foster learning and engagement in a health professions school. Acad Med. 2014;89:236–243.
2. Vermetten YJ, Vermunt JD, Lodewijks HG. Powerful learning environments? How university students differ in their response to instructional measures. Learn Instr. 2002;12:263–284.
3. Singh H, Reed C; Centra Software. A white paper: Achieving success with blended learning. 2001. http://www.leerbeleving.nl/wbts/wbt2014/blend-ce.pdf. Accessed December 3, 2015.
4. U.S. Department of Education. Evidence-based practices in online learning: A meta-analysis and review of online learning studies. Revised September 2010. http://www2.ed.gov/rschstat/eval/tech/evidence-based-practices/finalreport.pdf. Accessed November 24, 2015.
5. Felder RM, Spurlin J. Applications, reliability, and validity of the index of learning styles. Int J Eng Educ. 2005;21:103–112.
Copyright © 2016 by the Association of American Medical Colleges