There are other differences among CBL and WBL applications, such as the computer operating system (e.g., Windows, Macintosh, UNIX) and the programming/markup language (e.g., HTML, XML, Perl). While these differences are certainly important and merit investigation, from a teacher–student perspective the meaningful differences are subsumed by the categories above. For example, in comparing one Web programming language to another, the technical differences (which can be significant) are relevant to the learner only insofar as they bear on the configuration, instructional methods, or presentation of the course. Such technical considerations are beyond the scope of this essay.
Research should be conducted primarily within, rather than across, levels of instructional design. Doing otherwise leaves the reader unable to distinguish whether it was the configuration, the instructional method, or the presentation that had the effect. For example, a comparison of two instructional methods (e.g., case-based versus non-case-based questions) would likely yield meaningful and generalizable results. However, a comparison across levels (e.g., a discussion board using case-based learning versus Web-page-based interactive models) would be limited by confounding (Was it the discussion board, or the case-based questions, that made a difference?). As with comparisons of media, it will be difficult to control comparisons within the configuration level. For example, when comparing a discussion board to a Web-page-based tutorial, it may prove challenging to account for all the differences in instructional methods and presentation, as illustrated by a recent study.11 Although configuration-comparative research may yield important information, perhaps the greatest utility in identifying configuration as a distinct level will be to avoid the confounded comparison described above.
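To make the confounding problem concrete, consider the following simulation sketch. The numbers, variable names, and effect sizes are hypothetical assumptions of my own, not drawn from any study cited here; the sketch simply contrasts a confounded two-arm comparison, in which configuration and instructional method vary together, with a 2 × 2 factorial design that can estimate each effect separately.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # hypothetical learners per group

# Assumed "true" effects, chosen arbitrarily for illustration
EFFECT_CONFIG = 2.0  # e.g., discussion board vs. Web-page tutorial
EFFECT_METHOD = 5.0  # e.g., case-based vs. non-case-based questions

def scores(config, method):
    """Simulated test scores for one group (baseline 70, SD 10)."""
    return 70 + EFFECT_CONFIG * config + EFFECT_METHOD * method + rng.normal(0, 10, n)

# Confounded two-arm study: configuration and method change together,
# so the observed difference blends both effects inseparably.
confounded_diff = scores(1, 1).mean() - scores(0, 0).mean()

# 2 x 2 factorial design: all four combinations are observed,
# so each main effect can be estimated on its own.
cell = {(c, m): scores(c, m).mean() for c in (0, 1) for m in (0, 1)}
config_effect = (cell[1, 0] - cell[0, 0] + cell[1, 1] - cell[0, 1]) / 2
method_effect = (cell[0, 1] - cell[0, 0] + cell[1, 1] - cell[1, 0]) / 2

print(f"Confounded difference: {confounded_diff:.1f} (configuration + method, entangled)")
print(f"Factorial estimate, configuration effect: {config_effect:.1f}")
print(f"Factorial estimate, method effect: {method_effect:.1f}")
```

Under these assumed numbers, the two-arm study reports a single difference of roughly seven points that could plausibly be attributed to either factor, whereas the factorial design recovers separate estimates near the assumed effects of 2 and 5.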
There may be exceptions to the rule against complex comparisons. As in clinical research, where an effectiveness study demonstrates the clinical utility of an intervention shown to be efficacious in an efficacy study,23 it may at times be appropriate to investigate multifactorial interventions. However, such studies will have limited generalizability unless they recognize and carefully address the challenges noted above.
Several additional research themes warrant rigorous study under the CBL-CBL comparative paradigm. Within the context of instructional design, adaptation to individual learners, just-in-time learning, and simulation present singular challenges and opportunities. Research questions for each of these themes could draw upon virtually any combination of configuration, instructional method, or presentation, and would lend themselves to comparative studies using the framework discussed above. Comparative studies could also investigate integration of CBL within and between institutions. Regardless of the intervention, outcomes in CBL research should be carefully considered. These themes are discussed below and in Table 2.
Adaptation to differences in individual learners has been proposed as a way to improve WBL,24–27 and many of the arguments apply to CBL in general. In face-to-face teaching, effective teachers adapt to accommodate the various needs of individual learners. In contrast, traditional computer-based instruction presents the same material to every learner regardless of individual learning needs. By imitating the effective human teacher, computer systems that adapt to individual differences could enhance learning.28 In fact, given the diversity of the potential audience and the fact that the learner in most Web-based settings works alone, adaptation may be imperative to realize the full promise of WBL.29 In considering adaptation to individual differences, the aptitude–treatment interaction30 is critical (see Figure 2).
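For readers who want a concrete handle on the aptitude–treatment interaction, the sketch below fits a regression in which the benefit of a hypothetical adaptive course depends on learner aptitude. The data are simulated, the variable names and effect sizes are assumptions of mine rather than values from the cited literature, and the pandas and statsmodels packages are assumed to be available.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400  # hypothetical learners

# 'aptitude' is a standardized learner trait (e.g., prior knowledge);
# 'adaptive' flags assignment to a hypothetical adaptive course format.
aptitude = rng.normal(0, 1, n)
adaptive = rng.integers(0, 2, n)

# Assumed truth: the adaptive format helps low-aptitude learners most,
# so its benefit shrinks as aptitude rises (a crossing interaction).
score = 70 + 6 * aptitude + adaptive * (8 - 5 * aptitude) + rng.normal(0, 8, n)

df = pd.DataFrame({"score": score, "aptitude": aptitude, "adaptive": adaptive})

# The aptitude-by-treatment term is the quantity of interest: a nonzero
# coefficient means the treatment effect depends on the learner trait.
model = smf.ols("score ~ aptitude * adaptive", data=df).fit()
print(model.params)  # expect aptitude:adaptive near the assumed -5
```

A negative interaction coefficient reproduces the crossing pattern typically drawn in aptitude–treatment interaction figures (compare Figure 2): averaging over all learners would understate the benefit for precisely those learners the adaptation is designed to help.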
CBL must be integrated with other systems and curricular components. Since CBL is not an end in itself, how and when to use this tool are as important as the optimization of the tool itself. Friedman noted, “The thinking about how to integrate computer technology into medical school instruction is less mature than the thinking about how to design computer-based instruction itself,” and he proposed “a line of research that would explicitly compare different modes of integration.”1 More recently, integration has been identified as an area meriting continued research in both curriculum development and individual learning settings.5 As multi-institution initiatives40,41 continue to develop, interinstitutional issues will also need to be addressed.
Across the spectrum of CBL research, selection of outcomes is a critical issue. The predominant outcomes in current use—satisfaction, self-efficacy, and knowledge/performance—are only surrogates for the outcomes of real interest: physician performance and patient outcomes.42–44 Furthermore, the benefits of a given instructional design may differ for different learning outcomes.24 Additional outcomes to consider, some of them unique to CBL, are presented in Table 2.
In proposing this comparative agenda, I acknowledge the potential contribution of qualitative research. In contrast to most of the descriptive reports prevalent in the literature, studies employing rigorous qualitative methods can shed light on the complex pedagogical, technical, and organizational aspects of CBL and uncover truths applicable to other settings.45–48 Such research49–51 complements the comparative research paradigm presented in this article.
CBL is not a panacea. Aspirin does not cure all ailments, and CBL does not solve all educational problems. It will not work equally well in all settings, and with current technology it is likely suboptimal in many contexts. Rather, CBL is a powerful tool, to be used with wisdom and judgment to enhance the learning process. Instead of deciding to use CBL and then working to fit it into the curriculum, educators should define instructional objectives first and use CBL only when it appears to be the most effective means of achieving them. Research should focus on when to use CBL, and on how to use it most effectively once that decision has been made.
The interpretation of most existing CBL research is limited by the lack of an adequate control group. Even well-controlled media-comparative research will remain difficult to generalize, because observed effects cannot confidently be ascribed to any one variable. In contrast, CBL-CBL comparisons of instructional design (configuration, instructional method, and presentation) are more likely to yield meaningful results. Studies employing systematic variations within each of these levels will advance the science of CBL. Within the CBL-CBL framework, special attention should be paid to factors such as adaptation to individual differences, just-in-time learning, simulation, and integration within and between institutions, while assessing meaningful outcomes. Such investigations will help realize and refine the role of computers in medical education.
The author thanks D. M. Dupras and T. J. Beckman for their critical review of the manuscript.
1 Friedman C. The research we should be doing. Acad Med. 1994;69:455–7.
2 Clark R. Confounding in educational computing research. J Educ Comput Res. 1985;1:28–42.
3 Clark R. Dangers in the evaluation of instructional media. Acad Med. 1992;67:819–20.
4 Keane D, Norman G, Vickers J. The inadequacy of recent research on computer-assisted instruction. Acad Med. 1991;66:444–8.
5 Adler MD, Johnson KB. Quantifying the literature of computer-aided instruction in medical education. Acad Med. 2000;75:1025–8.
6 Chumley-Jones HS, Dobbie A, Alford CL. Web-based learning: sound educational method or hype? A review of the evaluation literature. Acad Med. 2002;77(10 suppl):S86–93.
7 Santayana G. In: Hirsch ED Jr, Kett JF, Trefil J (eds). The New Dictionary of Cultural Literacy. 3rd ed. Boston: Houghton Mifflin Company, 2002. Available online at 〈http://www.bartleby.com/59〉. Accessed 14 April 2005.
8 Norman G. RCT = results confounded and trivial: the perils of grand educational experiments. Med Educ. 2003;37:582–4.
9 Cook DA, Dupras DM, Thompson WG, Pankratz VS. Web-based learning in resident continuity clinics: a randomized, controlled trial. Acad Med. 2005;80:90–7.
10 Chueh H, Barnett GO. “Just-in-time” clinical information. Acad Med. 1997;72:512–7.
11 Brunetaud JM, Leroy N, Pelayo S, et al. Comparative assessment of two interfaces for delivering a multimedia medical course in the French-speaking Virtual Medical University (UMVF). Stud Health Technol Inform. 2003;95:738–43.
12 Gagne RM, Briggs LJ, Wager WW. Principles of Instructional Design. 4th ed. Belmont, CA: Wadsworth/Thomson Learning, 1992.
13 Merrill MD. First principles of instruction. Educ Technol Res Dev. 2002;50(3):43–59.
14 Spiro RJ, Coulson RL, Feltovich PJ, Anderson DK. Cognitive Flexibility Theory: Advanced Knowledge Acquisition in Ill-structured Domains. Center for the Study of Reading Technical Report. Champaign, IL: University of Illinois at Urbana-Champaign, 1988.
15 Bearman M, Cesnik B, Liddell M. Random comparison of “virtual patient” models in the context of teaching clinical communication skills. Med Educ. 2001;35:824–32.
16 Yoder ME. Preferred learning style and educational technology: linear vs interactive video. Nurs Health Care. 1994;15:128–32.
17 Ford N, Chen SY. Matching/mismatching revisited: an empirical study of learning and teaching styles. Br J Educ Technol. 2001;32:5–22.
18 Hsu TE, Frederick FJ, Chung ML. Effects of learner cognitive styles and metacognitive tools on information acquisition paths and learning in hyperspace environments. Paper presented at the National Convention of the Association for Educational Communications and Technology, Nashville, TN, February 16–20, 1994.
19 Garg AX, Norman GR, Eva KW, Spero L, Sharan S. Is there any real virtue of virtual reality? The minor role of multiple orientations in learning anatomy from computers. Acad Med. 2002;77(10 suppl):S97–9.
20 DiBartola LM, Miller MK, Turley CL. Do learning style and learning environment affect learning outcome? J Allied Health. 2001;30:112–5.
21 Spickard A, Smithers J, Cordray D, Gigante J, Wofford JL. A randomised trial of an online lecture with and without audio. Med Educ. 2004;38:787–90.
22 Triantafillou E, Pomportsis A, Demetriadis S, Georgiadou E. The value of adaptivity based on cognitive style: an empirical study. Br J Educ Technol. 2004;35:95–106.
23 Hulley S, Cummings S, Browner W, Grady D, Hearst N, Newman T. Designing Clinical Research: An Epidemiologic Approach. 2nd ed. Philadelphia: Lippincott Williams & Wilkins, 2001.
24 Dillon A, Gabbard RB. Hypermedia as an educational technology: a review of the quantitative research literature on learner comprehension, control, and style. Rev Educ Res. 1998;68:322–49.
25 Chen C, Czerwinski M, Macredie RD. Individual differences in virtual environments—introduction and overview. J Am Soc Inf Sci. 2000;51:499–507.
26 Merrill MD. Instructional strategies and learning styles: which takes precedence? In: Reiser R, Dempsey JV (eds). Trends and Issues in Instructional Design and Technology. Upper Saddle River, NJ: Merrill/Prentice Hall, 2002.
27 Cook DA. Learning and cognitive styles in Web-based learning: theory, evidence, and application. Acad Med. 2005;80:266–78.
28 Chen SY, Paul RJ. Editorial: Individual differences in Web-based instruction: an overview. Br J Educ Technol. 2003;34:385–92.
29 Brusilovsky P. Adaptive educational systems on the World-Wide-Web: a review of available technologies. Paper presented at the Fourth International Conference on Intelligent Tutoring Systems, San Antonio, TX, August 16–19, 1998.
30 Jonassen DH, Grabowski B. Handbook of Individual Differences, Learning, and Instruction. Hillsdale, NJ: Lawrence Erlbaum Assoc, 1993.
31 Wlodkowski RJ. Strategies to enhance adult motivation to learn. In: Galbraith MW (ed). Adult Learning Methods: A Guide to Effective Instruction. 2nd ed. Malabar, FL: Krieger, 1998:91–111.
32 Shatzer J. Instructional methods. Acad Med. 1998;73(9 suppl):S38–45.
33 Hunt DL, Haynes RB, Hanna SE, Smith K. Effects of computer-based clinical decision support systems on physician performance and patient outcomes: a systematic review. JAMA. 1998;280:1339–46.
34 Bates DW, Kuperman GJ, Wang S, et al. Ten commandments for effective clinical decision support: making the practice of evidence-based medicine a reality. J Am Med Inform Assoc. 2003;10:523–30.
35 Issenberg SB, McGaghie WC, Hart IR, et al. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861–6.
36 Friedman C. Anatomy of the clinical simulation. Acad Med. 1995;70:205–9.
37 Ziv A, Wolpe PR, Small SD, Glick S. Simulation-based medical education: an ethical imperative. Acad Med. 2003;78:783–8.
38 Kneebone R. Simulation in surgical training: educational issues and practical implications. Med Educ. 2003;37:267–77.
39 Norman GR. Simulation—saviour or Satan? Adv Health Sci Educ. 2003;8:1–3.
40 Harden R, Hart I. An international virtual medical school (IVIMEDS): the future of medical education? Med Teach. 2002;24:261–7.
41 Sisson SD, Hughes MT, Levine D, Brancati FL. Effect of an Internet-based curriculum on postgraduate education: a multicenter intervention. J Gen Intern Med. 2004;19:505–9.
42 Whitcomb ME. Research in medical education: what do we know about the link between what doctors are taught and what they do? Acad Med. 2002;77:1067–8.
43 Prystowsky JB, Bordage G. An outcomes research perspective on medical education: the predominance of trainee assessment and satisfaction. Med Educ. 2001;35:331–6.
44 Chen FM, Bauchner H, Burstin H. A call for outcomes research in medical education. Acad Med. 2004;79:955–60.
45 Owston RD. Evaluating Web-based learning environments: strategies and insights. Cyberpsychol Behav. 2000;3:79–87.
46 Lederman NG. What works: a commentary on the nature of scientific research. Contemp Issues Technol Teach Educ. 2003;3(1):4–10.
47 Bradley P, Postlethwaite K. Simulation in clinical learning. Med Educ. 2003;37:1–5.
48 Savenye WC, Robinson RS. Qualitative research issues and methods: an introduction for educational technologists. In: Jonassen DH (ed). Handbook of Research on Educational Communications and Technology. 2nd ed. Mahwah, NJ: Lawrence Erlbaum, 2004:1045–71.
49 Kneebone R, ApSimon D. Surgical skills training: simulation and multimedia combined. Med Educ. 2001;35:909–15.
50 Steele DJ, Johnson Palensky JE, Lynch TG, Lacy NL, Duffy SW. Learning preferences, computer attitudes, and student evaluation of computerised instruction. Med Educ. 2002;36:225–32.
51 Bearman M. Is virtual the same as real? Medical students’ experiences of a virtual patient. Acad Med. 2003;78:538–45.
52 Brusilovsky P. Adaptive navigation support in educational hypermedia: the role of student knowledge level and the case for meta-adaptation. Br J Educ Technol. 2003;34:487–97.
53 Specht M, Kobsa A. Interaction of domain expertise and interface design in adaptive educational hypermedia. Paper presented at the Workshop on Adaptive Systems and User Modeling on the World Wide Web, Eighth International World Wide Web Conference, Toronto, Canada, May 1999.
54 Weibelzahl S, Weber G. Adapting to prior knowledge of learners. Paper presented at the Second International Conference on Adaptive Hypermedia and Adaptive Web-Based Systems, Malaga, Spain, 2002.
55 Abouserie R, Moss D. Cognitive style, gender, attitude toward computer-assisted learning and academic achievement. Educ Stud. 1992;18:151–60.
56 Lynch TG, Steele DJ, Johnson Palensky JE, Lacy NL, Duffy SW. Learning preferences, computer attitudes, and test performance with computer-aided instruction. Am J Surg. 2001;181:368–71.
57 Lieberman G, Abramson R, Volkan K, McArdle PJ. Tutor versus computer: a prospective comparison of interactive tutorial and computer-assisted instruction in radiology education. Acad Radiol. 2002;9:40–9.
58 Billings DM, Cobb KL. Effects of learning style preferences, attitude and GPA on learner achievement using computer assisted interactive videodisc instruction. J Comput Based Instruct. 1992;19:12–6.
59 Ford N, Chen SY. Individual differences, hypermedia navigation, and learning: an empirical study. J Educ Multimedia Hypermedia. 2000;9:281–311.
60 Eklund J, Sinclair K. An empirical appraisal of the effectiveness of adaptive interfaces for instructional systems. Educ Technol Soc. 2000;3(4):165–77.
61 Riding R, Cheema I. Cognitive styles: an overview and integration. Educ Psychol. 1991;11(3/4):193–215.
62 Kolb D. Experiential Learning: Experience as the Source of Learning and Development. Englewood Cliffs, NJ: Prentice-Hall, 1984.
63 Leung GM, Johnston JM, Tin KY, et al. Randomised controlled trial of clinical decision support tools to improve learning of evidence based medicine in medical students. BMJ. 2003;327:1090.
64 Tsai TL, Fridsma DB, Gatti G. Computer decision support as a source of interpretation error: the case of electrocardiograms. J Am Med Inform Assoc. 2003;10:478–83.
65 Bordage G. Elaborated knowledge: a key to successful diagnostic thinking. Acad Med. 1994;69:883–5.
66 Maran NJ, Glavin RJ. Low- to high-fidelity simulation: a continuum of medical education? Med Educ. 2003;37:22–8.
67 Schuwirth LWT, van der Vleuten CPM. The use of clinical simulations in assessment. Med Educ. 2003;37:65–71.
68 Koschmann T. Medical education and computer literacy: learning about, through, and with computers. Acad Med. 1995;70:818–21.
69 Cartwright CA, Korsen N, Urbach LE. Teaching the teachers: helping faculty in a family practice residency improve their informatics skills. Acad Med. 2002;77:385–91.
70 Davis MH, Harden RM. E is for everything—e-learning? Med Teach. 2001;23:441–4.
71 Nowacek G, Friedman C. Issues and challenges in the design of curriculum information systems. Acad Med. 1995;70:1096–100.
72 Candler CS, Andrews MD. Avoiding the great train wreck: standardizing the architecture for online curricula. Acad Med. 1999;74:1091–5.
73 Kaplan B, Brennan PF, Dowling AF, Friedman CP, Peel V. Toward an informatics research agenda: key people and organizational issues. J Am Med Inform Assoc. 2001;8:235–41.
74 Berner ES, McGowan JJ, Hardin JM, Spooner SA, Raszka WV, Berkow RL. A model for assessing information retrieval and application skills of medical students. Acad Med. 2002;77:547–51.
75 Ramnarayan P, Kapoor RR, Coren M, et al. Measuring the impact of diagnostic decision support on the quality of clinical decision making: development of a reliable and valid composite score. J Am Med Inform Assoc. 2003;10:563–72.
76 Downs S, Marasigan F, Abraham V, Wildemuth B, Friedman C. Scoring performance on computer-based patient simulations: beyond value of information. Paper presented at the Annual Symposium of the American Medical Informatics Association, Washington, DC, 1999.
77 Bransford J, Brown A, Cocking R. How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Academy Press, 2000.
78 Jonassen DH, Wang SR. Acquiring structural knowledge from semantically structured hypertext. J Comput Based Instruct. 1993;20(1):1–8.
79 Jonassen DH, Reeves TC. Learning with technology: using computers as cognitive tools. In: Jonassen DH (ed). Handbook of Research for Educational Communications and Technology. New York: Simon and Schuster Macmillan, 1996:693–719.
80 Kamin C, O'Sullivan P, Deterding R, Younger M. A comparison of critical thinking in groups of third-year medical students in text, video, and virtual PBL case modalities. Acad Med. 2003;78:204–11.
*Confounding is present when multiple factors simultaneously influence the dependent variable, resulting in outcomes that can be interpreted in more than one way. As applied to media-comparative research, Clark stated, “Studies are often vulnerable to rival hypotheses that learning gains resulted from different instructional methods, content, or from student enthusiasm for a novel medium, not from the computer per se.”2