After adjusting for differences among modules, post-test scores were higher for the question format (mean ± standard error, 78.9% ± 1.0%) than for the standard format (76.2% ± 1.0%, p = .006; see Figure 3). This difference persisted after adjusting for study group, gender, postgraduate year, and clinic site (p = .005). Adjustment for additional covariates, including ethnic group, comfort using the Internet, prior experience with WBL, and perceptions of technical problems, yielded similar results. Adjustment for the number of self-assessment questions answered revealed a trend (p = .066) suggesting that those who answered more questions scored higher. After adjusting for postgraduate year and clinic site, women had higher scores (81.0% ± 1.7%) than men (76.7% ± 1.5%, p = .02) regardless of format, although without multivariate adjustment this difference did not reach statistical significance.
Format preference influenced test scores: residents who preferred the question format scored higher (79.7% ± 1.1%) than those who preferred the standard format (69.5% ± 2.3%, p < .001). Preference also interacted with format (p = .031), with the lowest scores occurring among residents who used the standard format and reported a preference for that format.
Scores on the delayed test were not significantly different between the question (70.3% ± 1.6%) and standard (69.9% ± 1.5%) formats both before (p = .771) and after (p = .873) multivariate adjustment, including time from post-test to delayed test. Once again, residents preferring the format with questions performed better than those who preferred the standard format (p < .001).
Self-reported time to complete the modules was greater for the question format (60.4 ± 3.6 minutes) than for the standard format (44.3 ± 3.3, p < .001). Despite the greater time required by the question format, 63 of 76 residents (82.9%, 72.5–90.6%) felt that the question format was more effective and 55 of 77 (71.4%, 60.0–81.2%) reported that it was more efficient.
Fifty-seven of 75 residents (76.0%, 64.8–85.1%) reported returning to the Web-based modules after completing them, with 25 (33.3%, 22.9–45.2%) returning more than three times. Forty-four residents (58.7%, 46.7–70.0%) used hyperlinks to access full-text journal articles.
Twenty-nine of 74 residents (39.2%, 28.0–51.2%) experienced significant technical problems at the beginning of the course, and 19 of 73 (26.0%, 16.5–37.6%) felt these were still significant at the end of the course. Eleven residents (14.9%, 7.7–25.0%) reported difficulty with passwords.
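The proportions above are reported with interval estimates (for example, 82.9% with bounds of 72.5–90.6% for the 63 of 76 residents favoring the question format), presumably 95% confidence intervals. The interval method is not stated; the reported bounds are consistent with an exact (Clopper-Pearson) binomial calculation. As an illustrative sketch only, not a reproduction of the authors' analysis, the closely related Wilson score interval can be computed with nothing beyond the standard library:

```python
from math import sqrt

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a binomial proportion.

    z = 1.96 corresponds to a 95% interval. Bounds are close to, but not
    identical with, the exact Clopper-Pearson interval the paper appears
    to report.
    """
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# 63 of 76 residents rated the question format more effective
# (paper reports 82.9%, interval 72.5-90.6%)
lo, hi = wilson_ci(63, 76)
print(f"{63/76:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

For 63/76 the Wilson bounds (roughly 72.9–89.7%) agree with the reported interval to within about a percentage point, as expected for samples of this size.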
In a crossover trial comparing WBL modules with self-assessment questions and feedback to modules without questions, we found that internal medicine residents had higher test scores when using the format with questions. Furthermore, residents strongly preferred the question format and felt it facilitated more effective and efficient learning even though it required more time to complete. These effects remained stable after adjusting for learner characteristics including gender, ethnic group, postgraduate year, prior experience with WBL, and learning styles.
While these data support the theory that instructional methods promoting learner interaction are more effective than less active methods, they contradict prior studies of CAL in medical education. One study found no difference between two case-based CAL formats with varying levels of interactivity, but the variation between formats was poorly defined.30 Another study found that basic science medical students using a CAL format with case-based questions and answers had lower scores than did those using a less interactive CAL format.32 The study’s authors attributed this unexpected finding to the outcome measure, which tested recall rather than application of knowledge, and to the learners’ lack of familiarity with case-based learning. A follow-up report studying the same CAL formats and the same group of learners in a clinical setting (neuroradiology) found no difference between formats,31 but in this case potential differences between study groups might have been diluted by learning from other sources during the eight-week lapse between pre-test and post-test. The present study avoids the limitations of this prior research and corroborates a study of college students33 demonstrating that self-assessment questions with feedback significantly improved post-test scores, as well as a study of medical students18 who preferred case-based modules over less interactive formats. Future research could confirm our findings and investigate alternate means of engaging learners in Web-based environments.27
These results have relevance beyond WBL, as they support the use of active learning methods in general. Although the literature abounds with courses and curricula using “active learning methods,” few studies have rigorously compared active methods with alternative instructional techniques. For example, many comparative studies in medical education are limited by the use of multifactorial educational interventions42–45 that confound the attribution of effect or lack thereof to any specific method or process.46,47 Other studies make comparisons with no-intervention controls,42–45,48,49 thus failing to inform selection of effective methods from multiple available options. We suggest that instead of studying whether medical students and physicians can learn using a designated method or combination of methods, we should study how best to facilitate learning. By carefully controlling for confounding variables, the present study has demonstrated that variations in instructional method—namely, methods that actively engage the learner—can positively influence learning.
Although residents felt that the format with questions required more time, they preferred this format and felt that it was more efficient. We ascribe this to their perception that this format was more effective. Additionally, learners who preferred the active learning format had higher test scores than did those preferring the standard format. While unmeasured factors such as motivation may play a role here, these findings should reassure educators concerned about employing instructional methods that demand more from learners.
It is important to consider the clinical (educational) significance of these findings. Although the treatment effect on test scores is modest, the true difference is likely attenuated by other sources of variance among the study groups.46 It is also possible that additional learning gains were realized yet unmeasured by our assessment. The observed effect compares favorably with findings of other education studies using active-intervention comparison groups, where differences are typically nonsignificant14–16,50–53 or small.6 Since learners preferred the more effective instructional method, the concordance among outcomes suggests that self-assessment questions with feedback do have a clinically significant benefit on learning.
With the exception of lower test scores among residents with verbal style, learning styles did not affect test scores or format preference. There were no aptitude-treatment interactions54 between learning style and format. This lack of effect is not wholly unexpected given the central importance of instructional method in facilitating learning, and supports the argument that use of effective instructional methods should be ensured before considering the influence of learning styles.36,55 Future studies might investigate theory- and evidence-based predictions36 regarding the adaptation of sound instructional methods to individual learning styles. While the association between visual-verbal styles and test scores is interesting, we caution that this finding should be considered preliminary: it was not predicted by theory, prior research using this style dimension has yielded inconsistent results,36 and a recent study showed poor test-retest reliability for ILS visual-verbal scores.56
The significant difference between formats was no longer present when residents were tested after a delay of several months. Although we are disappointed that the effect did not persist, knowledge retention is challenging in all educational settings, including CAL and WBL,14,57,58 and should be a focus of further research.
This report occupies a sparsely populated niche in WBL literature. The vast majority of publications to date are descriptive, akin to the clinical case report. While these demonstrate the feasibility of an intervention, they do little to inform practice. Almost all evaluative studies are limited either by the use of no-intervention controls or by comparison across different media (e.g., comparison of WBL to lecture or textbook). Inasmuch as authors have consistently denounced media-comparative research for at least 20 years,25–27,60,61 it is time for research in WBL to move forward through a line of research that produces generalizable knowledge and builds upon the past. This study provides a model of such research. Specific directions for research suggested by this study include comparisons of alternate instructional methods to engage learners, theory-based investigations of cognitive and learning styles, and the role of learner motivation in Web-based learning. We further suggest that comparing carefully selected variations in instructional method will provide more meaningful and generalizable results in medical education studies, regardless of the medium, than will other widely prevalent study designs.46 Such studies will answer the recent plea for rigorous controlled trials in medical education.62
In conclusion, we found that self-assessment questions and feedback enhanced learning for internal medicine residents using a Web-based course in ambulatory medicine. We suggest that these findings hold implications for “traditional” educational activities as well as WBL—namely, that teachers must incorporate methods to actively engage learners in the learning process. As educators struggle to assist learners in the face of a rapidly growing body of information and decreasing time in which to learn, it will be increasingly important to identify effective educational practices. Future research, both in WBL and other teaching modalities, should focus on further defining the effectiveness of selected instructional methods in specific learning contexts.
1 Bowen JL, Irby DM. Assessing quality and costs of education in the ambulatory setting: a review of the literature. Acad Med. 2002;77:621–80.
4 MacKenzie JD, Greenes RA. The World Wide Web: redefining medical education. JAMA. 1997;278:1785–86.
5 Zucker S, White JA, Fabri PJ, Khonsari LS. Instructional intranets in graduate medical education. Acad Med. 1998;73:1072–75.
6 Grundman J, Wigton R, Nickol D. A controlled trial of an interactive, Web-based virtual reality program for teaching physical diagnosis skills to medical students. Acad Med. 2000;75(10 suppl):S47–S49.
7 Kumta SM, Tsang PL, Hung LK, Cheng JCY. Fostering critical thinking skills through a Web-based tutorial programme for final year medical students: a randomized, controlled study. J Educ Multimedia Hypermedia. 2003;12:267–73.
8 Leong SL, Baldwin CD, Adelman AM. Integrating Web-based computer cases into a required clerkship: development and evaluation. Acad Med. 2003;78:295–301.
9 Balcezak TJ, Lynch P, Jackson S, Richter J, Jaffe CC, Cadman EC. A web-based risk management and medical-legal curriculum for graduate medical education. J Biocommun. 1998;25(4):2–5.
10 Curran VR, Hoekman T, Gulliver W, Landells I, Hatcher L. Web-based continuing medical education. (II): Evaluation study of computer-mediated continuing medical education. J Contin Educ Health Prof. 2000;20:106–19.
11 Lipman AJ, Sade RM, Glotzbach AL, Lancaster CJ, Marshall MF. The incremental value of Internet-based instruction as an adjunct to classroom instruction: a prospective randomized study. Acad Med. 2001;76:1060–64.
12 Gerbert B, Bronstone A, Maurer T, Berger T, McPhee S, Caspers N. The effectiveness of an Internet-based tutorial in improving primary care physicians’ skin cancer triage skills. J Cancer Educ. 2002;17:7–11.
13 Harris JM, Jr., Kutob RM, Surprenant ZJ, Maiuro RD, Delate TA. Can Internet-based education improve physician confidence in dealing with domestic violence? Fam Med. 2002;34:287–92.
14 Bell DS, Fonarow GC, Hays RD, Mangione CM. Self-study from web-based and printed guideline materials: a randomized, controlled trial among resident physicians. Ann Intern Med. 2000;132:938–46.
15 Spickard A, III, Alrajeh N, Cordray D, Gigante J. Learning about screening using an online or live lecture: does it matter? J Gen Intern Med. 2002;17:540–45.
16 Cook DA, Dupras DM, Thompson WG, Pankratz VS. Web-based learning in resident continuity clinics: a randomized, controlled trial. Acad Med. 2005;80:90–97.
17 Kronz JD, Silberman MA, Allsbrook WC, Epstein JI. A web-based tutorial improves practicing pathologists’ Gleason grading of images of prostate carcinoma specimens obtained by needle biopsy: validation of a new medical education paradigm. Cancer. 2000;89:1818–23.
18 Swagerty D Jr, Studenski S, Laird R, Rigler S. A case-oriented web-based curriculum in geriatrics for third-year medical students. J Am Geriatr Soc. 2000;48:1507–12.
19 Harris JM, Salasche SJ, Harris RB. Can Internet-based continuing medical education improve physicians’ skin cancer knowledge and skills? J Gen Intern Med. 2001;16:50–56.
20 Cook DA, Dupras DM. Teaching on the Web: Automated online instruction and assessment of residents in an acute care clinic. Med Teach. 2004;26:599–603.
21 Sisson SD, Hughes MT, Levine D, Brancati FL. Effect of an Internet-based curriculum on postgraduate education. A multicenter intervention. J Gen Intern Med. 2004;19:505–9.
22 Merrill MD. First principles of instruction. Educ Tech Res Dev. 2002;50(3):43–59.
23 Alur P, Fatima K, Joseph R. Medical teaching websites: do they reflect the learning paradigm? Med Teach. 2002;24:422–24.
24 Cook DA, Dupras DM. A practical guide to developing effective Web-based learning. J Gen Intern Med. 2004;19:698–707.
25 Clark R. Confounding in educational computing research. J Educ Comput Res. 1985;1:28–42.
26 Keane D, Norman G, Vickers J. The inadequacy of recent research on computer-assisted instruction. Acad Med. 1991;66:444–48.
27 Cook DA. The research we still are not doing: an agenda for the study of computer-based learning. Acad Med. 2005;80:541–48.
28 Brown G, Manogue M. AMEE Medical Education Guide No. 22: Refreshing lecturing: a guide for lecturers. Med Teach. 2001;23:231–44.
29 Kennedy GE. Promoting cognition in multimedia interactivity research. J Interact Learn Res. 2004;15:43–61.
30 Maleck M, Fischer MR, Kammer B, et al. Do computers teach better? A media comparison study for case-based teaching in radiology. Radiographics. 2001;21:1025–32.
31 Hudson JN. Computer-aided learning in the real world of medical education: does the quality of interaction with the computer affect student learning? Med Educ. 2004;38:887–95.
32 Devitt P, Palmer E. Computer-aided learning: an overvalued educational resource? Med Educ. 1999;33:136–39.
33 Gao T, Lehman JD. The effects of different levels of interaction on the achievement and motivational perceptions of college students in a Web-based learning environment. J Interact Learn Res. 2003;14:367–86.
34 Dillon A, Gabbard RB. Hypermedia as an educational technology: a review of the quantitative research literature on learner comprehension, control, and style. Rev Educ Res. 1998;68:322–49.
35 Chen SY, Paul RJ. Individual differences in web-based instruction: an overview [editorial]. Br J Educ Technol. 2003;34:385–92.
36 Cook DA. Learning and cognitive styles in Web-based learning: theory, evidence, and application. Acad Med. 2005;80:266–78.
37 Lieberman G, Abramson R, Volkan K, McArdle PJ. Tutor versus computer: a prospective comparison of interactive tutorial and computer-assisted instruction in radiology education. Acad Radiol. 2002;9:40–49.
38 Case SM, Swanson DB. Constructing Written Test Questions for the Basic and Clinical Sciences, 3rd ed. Philadelphia: National Board of Medical Examiners, 2001.
40 Cook DA, Smith AJ. Assessment of convergent and divergent validity of scores for Kolb’s Learning Style Inventory, Felder’s Index of Learning Styles, and Riding’s Cognitive Styles Analysis using the Multitrait Multimethod Matrix. Educ Méd. 2004;7(3):191.
41 Curry L. Individual differences in cognitive style, learning style, and instructional preference in medical education. In: Norman G, Van der Vleuten C, Newble D (eds). International Handbook of Research in Medical Education. Dordrecht: Kluwer Academic Publishers, 2002:263–76.
42 Fender GR, Prentice A, Gorst T, et al. Randomised controlled trial of educational package on management of menorrhagia in primary care: the Anglia menorrhagia education study. BMJ. 1999;318:1246–50.
43 Cornuz J, Humair JP, Seematter L, et al. Efficacy of resident training in smoking cessation: a randomized, controlled trial of a program based on application of behavioral theory and practice with standardized patients. Ann Intern Med. 2002;136:429–37.
44 Brown R, Bratton SL, Cabana MD, Kaciroti N, Clark NM. Physician asthma education program improves outcomes for children of low-income families. Chest. 2004;126:369–74.
45 Holmboe ES, Hawkins RE, Huot SJ. Effects of training in direct observation of medical residents’ clinical competence: a randomized trial. Ann Intern Med. 2004;140:874–81.
46 Norman G. RCT = results confounded and trivial: the perils of grand educational experiments. Med Educ. 2003;37:582–84.
47 Beckman TJ, Cook DA. Educational epidemiology. JAMA. 2004;292:2969.
48 Roy MJ, Herbers JE, Seidman A, Kroenke K. Improving patient satisfaction with the transfer of care: a randomized controlled trial. J Gen Intern Med. 2003;18:364–69.
49 Sloan DA, Plymale MA, Donnelly MB, Schwartz RW, Edwards MJ, Bland KI. Enhancing the clinical skills of surgical residents through structured cancer education. Ann Surg. 2004;239:561–66.
50 Graham HJ, Seabrook MA, Woodfield SJ. Structured packs for independent learning: a comparison of learning outcome and acceptability with conventional teaching. Med Educ. 1999;33:579–84.
51 Steele DJ, Medder JD, Turner P. A comparison of learning outcomes and attitudes in student- versus faculty-led problem-based learning: an experimental study. Med Educ. 2000;34:23–29.
52 Haidet P, Morgan RO, O’Malley K, et al. A controlled trial of active versus passive learning strategies in a large group setting. Adv Health Sci Educ. 2004;9:15–27.
53 Ochsendorf FR, Boehncke WH, Boer A, Kaufmann R. Prospective randomised comparison of traditional, personal bedside and problem-oriented practical dermatology courses. Med Educ. 2004;38:652–58.
54 Jonassen DH, Grabowski BL. Handbook of Individual Differences, Learning, and Instruction. Hillsdale, NJ: Lawrence Erlbaum Associates, 1993.
55 Merrill MD. Instructional strategies and learning styles: which takes precedence? In: Reiser R, Dempsey JV (eds). Trends and Issues in Instructional Design and Technology. Upper Saddle River, NJ: Merrill/Prentice Hall, 2002.
56 Cook DA. Reliability and validity of scores from the Index of Learning Styles. Acad Med. 2005;80(10 suppl):S97–S101.
57 Lynch TG, Steele DJ, Johnson Palensky JE, Lacy NL, Duffy SW. Learning preferences, computer attitudes, and test performance with computer-aided instruction. Am J Surg. 2001;181:368–71.
58 Naidr JP, Adla T, Janda A, Feberova J, Kasal P, Hladikova M. Long-term retention of knowledge after a distance course in medical informatics at Charles University Prague. Teach Learn Med. 2004;16:255–59.
59 Prystowsky JB, Bordage G. An outcomes research perspective on medical education: the predominance of trainee assessment and satisfaction. Med Educ. 2001;35:331–36.
60 Friedman C. The research we should be doing. Acad Med. 1994;69:455–57.
61 Adler MD, Johnson KB. Quantifying the literature of computer-aided instruction in medical education. Acad Med. 2000;75:1025–28.
62 Carney PA, Nierenberg DW, Pipas CF, Brooks WB, Stukel TA, Keller AM. Educational epidemiology: applying population-based design and analytic approaches to study medical education. JAMA. 2004;292:1044–50.
63 Felder RM, Silverman LK. Learning and teaching styles in engineering education. Eng Educ. 1988;78:674–81.
*Since some of these learning style dimensions may be unfamiliar, a brief review is in order. Active learners prefer practical application or exercise of information they have received. Reflective learners, on the other hand, internalize information—observing before passing judgment, examining from different perspectives, and looking for meaning as they create new knowledge. Sensing learners prefer what is real (facts, data, and experimentation), while intuitive learners look for patterns and meaning (principles and theories). Visual learners learn best from pictures, demonstrations, and displays, while verbal learners prefer the written or spoken word. Sequential learners follow a linear process of logical steps when they learn, while global learners seem to make large leaps, occasionally struggling until suddenly they “get it.” For a more detailed discussion see Felder and Silverman’s original description,63 and also Cook’s review36 and accompanying references.