Fall, Leslie H. MD; Berman, Norman B. MD; Smith, Sherilyn MD; White, Christopher B. MD; Woodhead, Jerold C. MD; Olson, Ardis L. MD
Changes in medical practice over the past ten years have also changed the way medicine must be taught. Medical educators face many challenges because of the shift of clinical care from the inpatient to the ambulatory setting.1 In addition, demands on medical school faculty for increased productivity and greater reliance on community-based preceptors to provide outpatient clinical learning opportunities pose formidable challenges as medical schools strive to fulfill the requirement of the Liaison Committee on Medical Education (LCME) that all medical students receive an equivalent educational experience and achieve the same core competencies.1–3 Concurrently, the Association of American Medical Colleges (AAMC) has placed a renewed emphasis on improving medical students’ foundational knowledge and skills in history-taking, physical examination, clinical reasoning, lifelong learning, and use of information technology.4 To respond to these competing challenges, the AAMC has called for the development of effective and innovative educational technology applications to enhance and standardize medical student learning in the clinical years, particularly at multiple and distant training sites.1,5
Realistic, computer-based patient simulations have the potential to address many of these issues. Case simulations can relieve some of the teaching burden of busy faculty while ensuring that all students have a consistent exposure to important inpatient and outpatient clinical problems. They can also deliver standardized curricular material, model clinical reasoning and best practice, and provide instant access to global information resources.6,7 Cases with features that allow for questioning and feedback encourage student self-assessment and provide a safe environment for students to practice without risk.8,9 Computer simulations can also incorporate images, video, and sound, making them more realistic than traditional textbook learning allows. Finally, if computer case simulations are Internet based, they can be updated easily and made available at any training site. Preliminary studies have demonstrated that this method of learning is well accepted by students, is efficient and effective, and can be integrated well into the clerkship.6,8
Even though the potential of computer-assisted instruction (CAI) cases is great, broad use of CAI for medical student education has been slow to occur. To our knowledge, only three case-based Internet CAI programs have been developed to comprehensively teach a medical school curriculum.6,10,11 Many of the currently developed CAI programs are limited by (1) content designed to teach a specific topic that does not have a clear role in the medical student curriculum; (2) development at a single institution, which can limit the program’s applicability at other institutions; (3) limited participation by intended course directors and students during development; and (4) extended case length.6,7,11–13
Despite these challenges, the AAMC has encouraged medical schools to make better use of educational technology applications to facilitate medical student learning in the clinical years.5 In response, we describe below the development of the Computer-assisted Learning in Pediatrics Project (CLIPP), established in 2000, an Internet-based CAI program for teaching the core third-year pediatrics clerkship curriculum. Comprehensively based on the Council on Medical Student Education in Pediatrics (COMSEP) General Pediatrics Clerkship Curriculum,14 CLIPP demonstrates that a CAI program of this scope is feasible and will be broadly used, despite significant development costs.
In 1993, COMSEP, the U.S. and Canadian organization of undergraduate pediatrics educators, developed and implemented the national pediatrics clerkship curriculum. The core curriculum outlines the appropriate learning objectives and competencies for a general pediatrics clerkship and currently serves as the foundation for more than 90% of pediatrics clerkships in the United States.14 The curriculum development process succeeded because of broad support from pediatrics educators and leaders for both the creation of the curricular content and for the vital process of its adoption and implementation. Throughout development of the curriculum, the need for effective CAI methodologies to teach the curriculum became clear.14
CLIPP, with the support of Health Resources and Services Administration (HRSA) funding, was developed over three years (2000–2003) to respond to this need and to build upon the success and momentum of the COMSEP curriculum project. The CLIPP Project Advisory Group, composed of the project directors (LF, NB, AO) and COMSEP leaders, agreed that the goals of the project were twofold: to develop quality computer-assisted instruction material and to ensure its broad use. To accomplish these goals, the advisory group recommended (1) a series of uniform and interactive CAI case simulations that would comprehensively teach the COMSEP curricular objectives; (2) a multi-institutional team approach to case development, using pediatrics clerkship directors as authors, and pairing them with fellow clerkship directors experienced in CAI and in curriculum development; and (3) extensive case evaluation using blinded peer review and student pilot-testing at three COMSEP schools. This model of combining CAI and content expertise to develop computer-based teaching cases has since been effectively used and described in a similar project.15
Project Objectives and Development
Comprehensive coverage of the COMSEP curriculum
A content development team thoroughly reviewed the curriculum and created detailed outlines for a total of 31 CAI case simulations (see Table 1). The team paid particular attention to ensuring that (1) the series of cases contained all of the core components of the COMSEP curriculum; (2) the content of the cases was complementary and not redundant; (3) the proposed case scenarios were realistic; (4) the case author was given flexibility for creativity in writing each case; and (5) the generalist focus of the curriculum was emphasized in all cases. The project leaders tracked the inclusion of each of the 350 COMSEP curricular objectives into the case outlines. Upon completion of case authoring, 93% of the COMSEP curricular objectives were included in at least one case and 57% were included in more than one case.
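The coverage percentages above come from a straightforward tally of how many cases include each curricular objective. As a minimal sketch of that bookkeeping (the objective-to-case mapping below is hypothetical, not the actual COMSEP data, which comprised 350 objectives across 31 cases):

```python
# Illustrative sketch of curriculum-coverage tracking. The mapping
# of cases to objectives here is hypothetical; the CLIPP project
# tracked 350 COMSEP objectives across 31 cases.
from collections import Counter

# Hypothetical mapping: case number -> objective IDs covered in that case
cases = {
    1: ["obj-001", "obj-002", "obj-017"],
    2: ["obj-002", "obj-044"],
    3: ["obj-001", "obj-090"],
}
total_objectives = 100  # stand-in for the project's 350 objectives

# Count how many cases cover each objective
counts = Counter(obj for objs in cases.values() for obj in objs)
in_at_least_one = len(counts)
in_more_than_one = sum(1 for n in counts.values() if n > 1)

print(f"covered in at least one case: {in_at_least_one / total_objectives:.0%}")
print(f"covered in more than one case: {in_more_than_one / total_objectives:.0%}")
```

Applied to the real data, this kind of tally yields the 93% and 57% figures reported above.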
Blinded peer review of each case by a pediatrics content expert and a COMSEP educator provided content evaluation, both for accuracy and for consistency with the COMSEP curriculum. In addition, medical students provided feedback regarding the suitability of the case content for the third-year student level (described below). Peer review and case-specific student feedback (see Tables 2 and 3) suggest that the cases were well written and directed appropriately at the third-year medical student level.
A uniform approach to CAI pedagogy
A CAI development team created a uniform case template to guarantee that sound educational principles would be used in each case (see List 1). To ensure that students could complete the full series of 31 cases during a typical clerkship, the team established a desired case length of 30 minutes. (After case authoring and pilot-testing experience, this limit was extended to 50 minutes.) The project strictly limited the use of digital video to retain the ability to access the cases using a modem and standard telephone line. Two early cases underwent qualitative usability testing by medical students at one school (Dartmouth) to provide feedback and allow modification of the proposed case template and instructional strategies. Each case underwent blinded peer review by a COMSEP educator to evaluate its educational effectiveness. Students at the three pilot schools also provided similar feedback (described below).
Both COMSEP educator peer review and case-specific student feedback (see Tables 2 and 3) indicated that the case teaching template was well developed. Positive qualitative comments from students addressed the multimedia (from 28% of the students), interactive learning (26%), and realistic scenarios (12%). Many students specifically described the realistic impact of unexpectedly “discovering” an abdominal mass in the case of an otherwise healthy nine-month-old infant (see Table 1, Case 2). Negative qualitative comments included that cases were too long (from 26% of the students) and that there was no easy way to review case content (6%). Thirty-five percent of students offered negative feedback about specific aspects of the case interactivity, such as excessive use of hyperlinks (not enough “bang for the click”) and wanting less interactivity in the history portion of the case, an observation supported by peer review ratings (see Table 2). Student and peer reviewer feedback was used (and continues to be used) to further refine the case pedagogy and interactivity.
Multi-institutional development by pediatrics educators
The project leaders chose a proprietary software package for case authoring, CASUS, to avoid the significant time and expense of software development. This easy-to-use software met many of the project’s needs. It offered (1) a straightforward, linear case structure; (2) a variety of interactive question-and-answer formats; (3) the ability for a student to save work-in-progress; (4) an “expert” feature allowing “mentoring” of the student throughout the case; and (5) a unique diagnostic reasoning tool that allows students to create a differential diagnosis from the key findings within the case.
Project leaders recruited individual case authors through the COMSEP listserv and trained them in a four-hour workshop at the annual COMSEP meeting. Where possible, authors created cases in their area of expertise or subspecialty training. Members of the content and CAI development teams served as mentors to each case author. Following the workshop, the case authors created and shared case drafts with the development team mentors via e-mail. Later, authors made final case revisions, based on peer review and student feedback. When their cases were completed, authors received a small honorarium, and their department chairs received letters detailing the scope of the project and the authors’ contributions to a peer-reviewed, multi-institution project.
Using focused telephone interviews, the project leaders conducted an assessment of the case development process with a cross-section of development team members and case authors. Factors that motivated members to participate included contribution to an innovative project that provided a clear benefit to their students and to COMSEP. The project also provided the opportunity to learn new skills in CAI case development.
Primary challenges described by case authors included finding time to write the first case draft and determining how best to use the many teaching options afforded by the software. Average time to complete a case was approximately 60 hours. Authors found the work involved in case writing to be consistent with authoring a review article or book chapter. In a minority of cases (seven), authors were unable to complete their role in CLIPP. Failure stemmed mainly from the time commitment required for project participation. New case authors (two), project directors (two), or fourth-year medical students working with the project directors (three) ultimately authored these cases.
Factors that facilitated the authoring process included strong leadership and support from the project development teams, clear authoring instructions, ease of authoring in a word-processing format, and support for participation from individual department chairs and home institutions.
Ensuring that the cases are valued by students
Three COMSEP schools integrated and pilot-tested the CLIPP cases by requiring students to independently complete each case prior to the conclusion of the clerkship. Students at these schools provided feedback at two levels: (1) case-specific feedback regarding individual case content and pedagogy, using online questionnaires at the conclusion of each CLIPP case (discussed in previous sections, above); and (2) case-series feedback regarding student satisfaction with the overall series of cases and their experience with this comprehensive CAI approach, using a questionnaire at the conclusion of the clerkship. Additionally, project leaders conducted qualitative feedback sessions with medical students at one pilot school (Dartmouth) to further explore the overall effectiveness of the CLIPP pedagogical approach, case integration with other educational experiences throughout the clerkship, and barriers to access.
A total of 211 students at the three pilot institutions provided case-series feedback at the conclusion of the clerkship, providing information about their experience using the CLIPP cases during the pediatrics clerkship (see Table 3). Students estimated that they spent an average of 50 minutes to complete each case. Students accessed the cases from home (53% of students), the hospital (21%), a preceptor’s office (19%), or the library (7%). Using a five-point Likert-type scale in which 1 = strongly disagree and 5 = strongly agree, students agreed that the cases complemented the clerkship (mean rating of 4.36) and provided teaching about conditions they did not encounter clinically (4.09). Most students worked on the cases alone (94% of students). During qualitative feedback sessions at the end of the clerkship, students expressed the desire to have CLIPP tied more directly into their learning throughout the clerkship, including lectures and bedside teaching sessions. Students described frustration with having few faculty, preceptors, or residents who were knowledgeable about the cases. Factors that limited student access to the cases were slow Internet connections (35% of students), Internet or server access problems (18%), insufficient access to a computer (8%), too little time to work on the cases (6%), and software not functioning correctly (5%). Compared to the case-specific value students placed on the time they spent on individual cases (mean rating of 4.22), students gave a lower rating to the value of their time spent on the overall series of cases (3.80). Despite these ratings, students agreed that they would enjoy having similar learning programs available in other clinical clerkships (4.08).
Dissemination of Cases
The CLIPP cases were made available for widespread use by U.S. and Canadian medical schools in March 2003. Project leaders created the CLIPP Web site 〈www.clippcases.org〉 to provide a portal to the CLIPP cases, user support information, and tracking of case utilization. At two annual COMSEP meetings, project team members conducted workshops aimed at familiarizing clerkship directors with the CLIPP cases and with effective methods for integrating the cases into the clerkship. Feedback from clerkship directors at these workshops helped to further refine the CLIPP Web site and led to the development of additional student and teacher resources based on the CLIPP cases. Following formal dissemination of the CLIPP cases, project leaders tracked the use of the cases at individual schools and made these data available to respective clerkship directors on the CLIPP Web site.
Since dissemination to COMSEP in 2003, CLIPP has become accepted and widely used by COMSEP schools and medical students. As of June 2005, more than 50 schools have begun using CLIPP, with more than 8,000 students completing over 98,000 case sessions, at an average of 2,000 case sessions completed per week. Each case has been completed by more than 3,000 students. Case-specific feedback at these 50 institutions from 30,243 questionnaires has been consistent with student pilot-testing results (see Table 3). Many conversations with COMSEP clerkship directors confirm that students’ experiences with CLIPP at these schools are consistent with our pilot experience and that, in general, students value and enjoy the cases. In addition to favorable student and peer review ratings, clerkship directors noted that they (and many of their associate deans) value the cases for their ability to meet LCME accreditation standards for providing documented clinical experiences based upon accepted learning objectives, as well as consistent clinical education across training sites and times of year.
Given the scope of this project, the cost of development was substantial (see Table 4). The three-year (2000–2003) cost of the project, including direct and in-kind contributions, was $558,234. (Data for in-kind cost estimates were obtained from telephone interviews with development team members and case authors, described above.) The estimated overall development cost per case (direct and in kind) was $18,000. Each case required 310 total development hours. Given the use of the cases to date, the development cost is $6 per completed case session. Currently the cost of CLIPP development is approximately $70 per student user, roughly equaling the purchase price of a student pediatrics textbook. At the current level of usage, the cost per student will drop to $38 in one additional year and $26 in two years.
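The per-case, per-session, and per-student figures follow directly from the totals reported above. As a quick arithmetic check, using only the numbers stated in the text:

```python
# Sanity check of the reported CLIPP cost figures, using totals
# stated in the article (direct plus in-kind, 2000-2003).
total_cost = 558_234        # total development cost in dollars
n_cases = 31                # CAI case simulations developed
sessions_completed = 98_000 # case sessions completed as of June 2005
student_users = 8_000       # students who completed cases

cost_per_case = total_cost / n_cases                 # ~ $18,000
cost_per_session = total_cost / sessions_completed   # ~ $6
cost_per_student = total_cost / student_users        # ~ $70

print(round(cost_per_case))     # 18008
print(round(cost_per_session))  # 6
print(round(cost_per_student))  # 70
```

The projected per-student costs ($38 in one additional year, $26 in two) follow the same logic: the fixed development cost is spread over a growing cumulative pool of student users.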
The project is currently supported by HRSA grant funding, and the cases are provided free of charge to COMSEP and other educational institutions. The cost to maintain the project beyond current grant funding is estimated to be $120,000 annually. At the current level of use, the estimated cost per school is $2,500 annually, or approximately $25 per student.
Our project demonstrates that the development and broad use of CAI in clinical education are feasible when the program is designed to meet a fundamental need and when the value to both educators and students is clearly evident in the final product. Our work supports previous experience that initial development of a CAI program is both time-consuming and costly,6,9,13 though both of these initial investments diminish significantly with use over time. Even without further grant support, we feel the cost to maintain the cases is reasonable. We attribute the success of this project to our adherence to the project’s four initial objectives and to a firm belief from the beginning that simply creating the program and making it available would not result in its use. While we cannot definitively state which of our four project objectives were most important to our success, the combination has resulted in broad utilization of a CAI program in medical education to an extent not previously described.
Our project supports the premise that the value of CAI to both clerkship directors and students lies in a balance between the quality of the content and the pedagogy. The delivery of comprehensive content with a uniform approach improves the efficiency of both learning and teaching a large amount of new information during a short clerkship. Students also value learning methods that are well designed, interactive, realistic, and engaging. By dividing the process of case development into two independent and complementary development teams, we responded to previously described challenges9,11,13 and ensured that both “what to teach” and “how to teach” received individual and appropriate attention. This successful approach has recently been described in a similar project15 and was further validated by our case authors, who found that despite the innovative nature of creating CAI, the challenge is not only in using the software, but also in deciding how to teach.
Unlike similar CAI projects,6,11 we encountered little difficulty recruiting and retaining case authors. Involvement of the COMSEP membership in the oversight of the project and case development was a key factor in the success of the project. Approximately 40% of COMSEP members participated in the development of CLIPP, either as members of project teams, as case authors, or as peer reviewers. We benefited greatly from the strong COMSEP culture of support and collaboration on projects such as the development of the pediatrics core curriculum. We were fortunate to begin with a well-designed and broadly accepted curriculum, and we were able to build on the momentum created during its development. A project advisory group that included key COMSEP leaders ensured that the project’s scope and priorities remained consistent with those of the COMSEP membership, and that the educational results of the project justified the resources it required. A clear vision of how the final product would benefit the organization and also enhance medical student education in pediatrics motivated members to participate, and case authoring became a successful grassroots effort. One author stated it well: “If I don’t help create it, then it’s not mine.” The fact that pediatrics department chairs showed support for and recognition of author efforts at their home institutions was another important factor in successful case completion. At least five authors have since been promoted, citing their cases and participation in CLIPP as evidence of academic scholarship. We believe that future CAI development in other disciplines will similarly benefit from a multi-institutional approach that builds on a preexisting organization of educators, peer review, and extensive pilot-testing.
Identifying an important need and a clear use for the program from the beginning was vital to ensuring that the cases would have a “home” within the broader clerkship curriculum and structure, and to answering many of the challenges encountered by previous projects.6,10,12 The development of the cases by COMSEP leaders and clerkship directors themselves guaranteed that the case content and instructional strategies were appropriate and flexible enough to meet the teaching needs of individual clerkships. Limiting case length, using limited video, and ensuring that students could save work in progress all facilitated the utility of the cases in busy office or inpatient settings. Case authoring by the membership also “primed the pump” for use. Clerkship directors understood the benefits and limitations of the cases and were excited to use the product they had worked so hard to create. An iterative qualitative and quantitative evaluation process at each stage of development provided vital information that allowed us to further adapt the program to student and clerkship director needs, to identify important barriers to its use, and to generate many new ideas for adding value to the cases. Simply stated by one project leader: “If you [just] build it, they won’t come.”
Although individual CLIPP cases were well received by students, the full series of 31 cases was in general less highly rated. Balancing our objectives of covering the curriculum comprehensively and creating a series of cases that feasibly could be completed during the clinical clerkship was difficult and highlights the need to limit CAI case length during the clinical years. In addition, the need to ensure sufficient access to computers with appropriate Internet connections at all clerkship training sites remains a challenge for some community sites. Students in our study accessed just over half of the cases from home, making computer access at off-site housing an important consideration. While CAI is intended to provide effective self-directed learning, students expressed an unmet need for better integration of the cases into the traditional clinical curriculum. Consistent with similar CAI projects,10,12,13 further research into effective integration strategies in both the inpatient and outpatient settings has become a clear priority for us.
Finally, is CLIPP more “effective” than other traditional teaching modalities? The answers depend on what teaching methods CLIPP is compared to, and how learning effectiveness is measured. CLIPP is specifically designed to teach the learning objectives of the COMSEP curriculum, and as such may not improve performance on exams such as the National Board of Medical Examiners subject examination or the United States Medical Licensing Examination Step 2, which are not limited to this same set of learning objectives. In addition to knowledge acquisition, the teaching approach utilized in CLIPP focuses heavily on clinical reasoning skills, which are inherently more difficult to measure. However, with the widespread use of CLIPP comes the potential to answer vitally important questions about the effectiveness of CAI that cannot be studied with smaller projects. Answering these questions will be critical to ultimately determining if the benefits of CLIPP outweigh the significant time and effort required to develop this or similar projects.
The project is now in the process of developing an editorial board, derived from the membership of COMSEP, that will oversee and update the cases as needed. With additional grant support, further resources are being developed to assist instructors in integrating CLIPP cases into the traditional clerkship model (e.g., lectures, bedside teaching rounds) and to promote formal faculty development about CLIPP. New cases based on more complex topics such as cultural competence, chronic illness, and genetics are also in development. Last, a working group of six COMSEP medical schools is collaborating to define successful CAI integration strategies in various clerkship settings, and to study the educational contexts in which CAI is most effective for educators and for students.
The authors are indebted to the outstanding work of the CLIPP Project Manager, Ms. Carol Edwards, and to the dedicated leadership and membership of COMSEP, particularly David Levine, MD, without whom this project would not have been possible. The authors also thank Patricia Carney, PhD, Greg Ogrinc, MD, and Carole Stashwick, MD, for their editorial support and expertise. This manuscript is dedicated to the memory of Drs. Richard Sarkin and Steven Miller, COMSEP leaders and dear friends whose vision and work were integral to the success of CLIPP. They are sorely missed.
This work was supported by two US Public Health Service Predoctoral Training in Pediatrics Grants (5D16HP00059-03 and 8D56HP00059-04).
1 Hunt CE, Kallenber GA, Whitcomb ME. Medical students’ education in the ambulatory care setting: Background paper 1 of the Medical Schools Objectives Project. Acad Med. 1999;74:290–96.
2 Liaison Committee on Medical Education. Standards for accreditation of medical education programs leading to the M.D. degree. Functions and standards of a medical school. II. Educational programs for the M.D. degree 〈http://www.lcme.org/standard.htm#latestadditions〉. Washington, DC. Updated 8 June 2004. Accessed 13 June 2005.
3 Carney PA, Eliassen S, Pipas CF, Genereaux SH, Nierenberg DW. Ambulatory care education: how do academic medical centers, affiliated residency teaching sites and community-based practices compare? Acad Med. 2004;79:69–77.
4 Medical School Objectives Writing Group. Learning objectives for medical student education—guidelines for medical schools: report I of the Medical School Objectives Project. Acad Med. 1999;74:13–18.
5 Moberg TF, Whitcomb ME. Educational technology to facilitate medical students’ learning: background paper 2 of the Medical Schools Objectives Project. Acad Med. 1999;74:1146–50.
6 Leong SL, Baldwin CD, Adelman AM. Integrating web-based computer cases into a required clerkship: development and evaluation. Acad Med. 2003;78:295–301.
7 MacKenzie JD, Greenes RA. The World-Wide Web: redefining medical education. JAMA. 1997;278:1785–86.
8 Gordon JA, Oriol NE, Cooper, JB. Bringing good teaching cases “to life”: a simulator-based medical education service. Acad Med. 2004;79:23–27.
9 Adams A. Pedagogical underpinnings of computer-based learning. J Adv Nurs. 2004;46:5–12.
10 Cooksey A, Kohlmeier M, Plaisted C, Adams K, Zeisel SH. Getting nutrition education into medical schools: a computer-based approach. Am J Clin Nutr. 2000;72 (suppl):S868–878.
11 Reid JR, Goske MJ, Hewson MG, Obuchowski N. Creating an international comprehensive web-based curriculum in pediatric radiology. Am J Radiol. 2004;182:797–801.
12 Greenhalgh T. Computer-assisted learning in undergraduate medical education. BMJ. 2001;322:40–4.
13 Hamilton NM, Furnace J, Duguid KP, Helms PJ, Simpson JG. Development and integration of CAL: a case study in medicine. Med Educ. 1999;33:298–305.
14 Olson A, Woodhead J, Berkow R, Kaufmann NM, Marshall SG. A national general pediatrics clerkship curriculum: the process of development and implementation. Pediatrics. 2000;106(1 Pt 2):216–22.
15 Laidlaw JM, Harden RM, Robertson LJ, Hesketh EA. The design of distance-learning programmes and the role of content experts in their production. Med Teach. 2003;25:182–87.