A large body of literature discusses the techniques of critical appraisal and evidence-based medicine (EBM). EBM is defined as the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients. Considerable resources have been applied to teaching EBM principles and practice, but there is a paucity of evidence that such programs increase knowledge, change clinical behavior, or improve patient care. EBM has not been universally accepted as a novel or broadly applicable method. A review of problems associated with using EBM pointed out the constraints of applying EBM to the care of individual patients. Even in the literature that espouses universal practice of EBM, most clinical decisions are not supported by the results of randomized controlled trials, contrary to what some advocates suggest.1 Ultimately, however, many of the goals of teaching EBM seem fundamentally important to practicing physicians. EBM can improve patient care decisions and potentially enhance patient outcomes, as well as maximize the effectiveness of care. It may be that the best method for transferring the knowledge of and ability to practice EBM in residency programs, and eventually, in clinical practice after residency, has not yet been developed. One credible goal of residency education should be to endow our residents with a practical working knowledge of EBM, so we can graduate competent, confident life-long learners who provide first-class patient care according to the best evidence available.
Many published articles outline proposals and curricula in EBM and critical appraisal at all stages of medical education; most of these articles have little or no experimental evidence of the effectiveness of teaching EBM. Several articles present the design and implementation of innovative EBM programs for students,2,3,4 residents,5 and practicing physicians,6 but many of these studies were methodologically weak. Most lacked control groups and reported only increased scores on indirect measures, such as multiple-choice tests.
These studies lacked convincing controlled-trial experimental validation to support the teaching of EBM. A published systematic review of the literature supporting EBM interventions in medical education found that no study had affirmed the effectiveness of any specific technique of teaching these skills to medical students or residents.7 The review contained only one study that had a control group and showed significant improvement on a free-text response test and inquiries into the perceived clinical use of EBM. Another review found that no study had included measurements based on actual patient interactions.8
Since 1998, a number of studies have investigated the effectiveness of EBM education. A study that used responses to the Association of American Medical Colleges' Graduation Questionnaire and focused surveys demonstrated a difference in confidence between graduating classes that received EBM training and those that did not. This study did not test knowledge or application of EBM principles, and the assessments of control and experimental groups were separated by a significant time lag.9 A study in Scotland that tested the effectiveness of a workshop program in EBM in a before-and-after design had no control group.10 A significant improvement in EBM knowledge assessed by performance on a multicomponent test was evident in a group of gastroenterology fellows who attended a seminar on EBM but, once again, this study did not have a control group.11 Another study demonstrated first-year medical students' ability to incorporate EBM skills when they read medical literature on a clinical problem and to discern methodologic flaws in published clinical trials.12 We could not find any study that had a control group of residents and that measured the change in their knowledge or clinical application of EBM.
Our study investigated the effectiveness of an easily integrated curriculum in increasing residents' knowledge and application of the principles of EBM during outpatient continuity care. To our knowledge, ours is the only educational, interventional study that had a control group of residents during objective testing of EBM knowledge and that analyzed application of EBM principles during patient care.
Setting and Participants
We conducted our study during the 1999–2000 academic year. Participants were aware of the recording of resident–preceptor interactions and were informed that they were participating in an experiment to improve precepting. Because of the necessity of masking the purpose of our recording (to avoid contamination) and the fact that this was an educational intervention, only verbal consent was deemed necessary to proceed with the study. The intervention was viewed as an educational event by the two institutional review boards involved in the approval process. Participation in our study was voluntary, and participants received no rewards for their participation. Our experimental group of 18 residents was in the Merle West Medical Center (MWMC)/Cascades East Family Practice Residency, a university-administered, community-based residency program in Oregon designed to train residents to practice in rural settings. MWMC is a 146-bed hospital in a community of approximately 40,000 inhabitants. The control group of 30 residents was in the Oregon Health and Science University Family Practice Residency, an urban, university-based program where care is delivered both at a hospital clinic and at two suburban satellite locations.
Needs Assessment and Curriculum Design
Prior to introducing the EBM curriculum, we used the following three techniques to investigate the perceived needs of our residency program regarding specific teaching of EBM skills:
1. Review of literature: One of us (RR) searched the medical and educational literature (Medline 1966–2001, Embase, and ERIC) using the terms “education,” “EBM,” “evidence-based medicine,” and “critical appraisal.” RR searched the bibliographies of the retrieved articles for any other pertinent literature that was overlooked by the electronic search.
2. Group meetings: At our regularly scheduled executive committee meetings at the residency program (attended by the residency director, chief residents, and representatives from administration and clinic staff), there was broad consensus for introducing a specific course to address perceived deficiencies in EBM skills.
3. Survey: We designed a questionnaire and circulated it to all residents and faculty members at Merle West Medical Center/Cascades East FPR. Most respondents (25 questionnaires distributed with a 100% response rate: 18 residents, seven faculty) felt they possessed at least a rudimentary knowledge of EBM, with a modal score on a Likert scale of 3/7 (1 = strongly disagree to 7 = strongly agree, responding to the statement: “I know how to apply the principles of evidence-based medicine when I review articles in the medical literature”). However, all except one respondent felt that there would be great value in introducing an EBM course into the residency curriculum (modal score 6/7 responding to the statement: “I think that a course specifically designed to help me use evidence-based medicine would be valuable in residency”).
Based on this information, we introduced a series of ten workshops accompanied by pre- and post-intervention measurements of knowledge and the application of EBM during continuity clinic patient care. The workshops were interactive, one- to two-hour sessions with a brief lecture (30–40 minutes) on an EBM topic or principle followed by practical application of the material in an interactive session using articles from the medical literature. We encouraged participants' active involvement. List 1 summarizes the workshop topics.
We then formulated the following research hypothesis: “The introduction of an evidence-based medical curriculum into the residency program will result in increased knowledge of EBM and the application of more critically appraised interventions during the care of patients seen at the family practice center during resident continuity clinics.” We tested our hypothesis by tape-recording and analyzing preceptor–faculty/resident interactions during these clinics before and after the workshops. The institutional review boards of Oregon Health and Science University and Merle West Medical Center approved our study.
Because of the inherent difficulty of conducting randomized controlled trials in the educational environment (especially among a small number of residents, such as at our location), we decided to use a control group of family practice (FP) residents in the same department but located at an urban site 300 miles away. The same test-taking and recording procedures were performed with the control group, on the same timeline as the experimental group. The control site, like the experimental site, is a fully accredited FP residency, but it does not share any significant residency education programs with the experimental site, has no specific course in EBM, and represented an adequate and relatively accessible control group. We did not formally test the similarities of the control and experimental groups because we were attempting to replicate a “real-life” situation faced by most residents. The residents we receive through the residency match will usually remain in our programs for the duration of their residencies. Because of the control group design and the historic difficulty of obtaining questionnaire responses from residents, we did not formally pursue background information about the residents. We felt that the groups were as closely matched as was practicable. Both groups had the same institutional access to computers, electronic references, and library resources. The majority of residents in both programs were graduates of U.S. medical schools, were about equally divided between sexes, and had had variable exposures to the principles and practice of EBM prior to graduation from medical school. The residents also had very similar scores on the pre-workshop multiple-choice exam, indicating similar depths of EBM knowledge prior to participating in our study.
We tape-recorded resident–preceptor interactions prior to introducing an EBM curriculum. Both preceptors and residents were unaware of the intended use of the recordings and were later informed that the recordings would be used to improve teaching during continuity clinic. We tabulated interactions that contained key phrases or words that we assumed were surrogates for EBM care. We agreed on the terms after one of us (RR) consulted EBM/research colleagues and authorities at McMaster University in Ontario, Canada, about an acceptable method for choosing key phrases or words for inclusion. Our consultants suggested that including all widely used terms* was a valid method of initiating the study and analyzing the recordings. Before use, we and other Oregon Health and Science University (OHSU) faculty members reviewed the terms for any questionable inclusions or exclusions.
We then introduced the ten-workshop EBM curriculum at the experimental site. The workshops took place once a week for ten weeks. We gave the attendees pocket-sized reference cards outlining the EBM techniques discussed at each session. The workshops covered the major subjects that a practicing family physician is likely to encounter, with an emphasis on the analysis of review articles and articles on therapy (see List 1). One of us (RR) designed the curriculum in house and crafted it to represent the learning needs of family practice residents.
Six months after the participants completed the workshops, we undertook an identical recording process. During the study, we did not disclose the description of the study or its intent to any participants. Comparable procedures were performed with the control group of residents in the OHSU residency program in Portland: the original recording/observation at that site and the follow-up six months later. The control group received no special instruction in EBM. Multiple recording sessions occurred at both sites during continuity clinics.
We supplemented both groups' continuity clinic recordings with a 50-item pre- and post-workshop open-book multiple-choice examination based on the workshops' content. The examination, designed by one of us (RR), was pre-tested for content and face validity by administering it to faculty at Cascades East Family Practice Residency in Klamath Falls, Oregon, and family practice faculty at the State University of New York, Albany. The multiple-choice examination was reviewed by peers at the study sites (Cascades East and OHSU) and by outside reviewers at the University of Southern California, Los Angeles, and the SUNY family practice residency to ensure the validity of content and question design. We chose an open-book format because our study was designed to closely mirror the use of EBM in daily practice. We felt that most practitioners of EBM rely on assistance from written or electronic resources when they analyze articles and reference material.
Our analysis of pre-intervention multiple-choice test results using t-tests to compare means showed no significant difference, with a mean test score of 56% for control residents and 53% for the experimental group (p > .22). Post-intervention test scores for the 17 experimental group residents who were present for both the pre- and post-tests (one resident at the experimental site was absent for the post-test) were significantly improved, with a mean of 72% (p < .001). There was no significant change in the multiple-choice test scores of the control group (mean 62%, p > .05). During recorded resident–preceptor interactions, there was a dramatic increase in the use of terms that indicated awareness and/or use of EBM principles. Prior to our data analysis, we eliminated terms used by residents who did not participate in both the pre- and post-workshop recording sessions.
The results of our analysis of the tape-recorded interactions are summarized in Table 1 as the numbers of EBM terms recorded, divided into three groups: residents, preceptors, and a combined total of residents and preceptors. The table also includes the amount of time recorded for analysis at each site. For example, prior to the workshops at the experimental site, a total of 1,165 minutes were recorded, and following the workshops, a total of 735 minutes were recorded. Both residents and preceptors in the experimental group demonstrated remarkable increases in the use of EBM terminology, with residents using four terms before and 36 terms after, and preceptors using six before and 31 after the workshop series. The preceptors' increased use of EBM terminology may reflect the fact that the EBM workshop series was open to them as well as to the residents. By contrast, there was a decrease in the use of EBM terminology at the control site. We analyzed the results for statistical significance using Pearson's chi-square for all but the preceptor terms, for which we used Fisher's exact test. As can be seen from Table 1 and our analysis, the changes were dramatic and significant in the resident analysis alone, as well as for the total number of interactions. Although numerically impressive, the results for terms used by preceptors alone were not statistically significant when analyzed with the Fisher test.
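As an illustration of the kind of chi-square comparison described above, the following sketch tests whether the rate of EBM terms per recorded minute changed at the experimental site, using the published counts (four terms in 1,165 minutes before the workshops; 36 terms in 735 minutes after). The construction of the contingency table, with recorded minutes as the unit of observation, is our assumption for illustration; the paper does not specify how its table was built.

```python
def chi2_yates(table):
    """Yates-corrected chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    rows = (a + b, c + d)
    cols = (a + c, b + d)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = rows[i] * cols[j] / n
            stat += (abs(observed - expected) - 0.5) ** 2 / expected
    return stat

# Experimental site, treating each recorded minute as an observation
# (an illustrative assumption, not the paper's stated unit of analysis):
#   pre-workshop:  4 EBM terms in 1,165 minutes
#   post-workshop: 36 EBM terms in   735 minutes
table = [[4, 1165 - 4], [36, 735 - 36]]
stat = chi2_yates(table)
print(f"chi-square (Yates) = {stat:.1f}")
# The statistic far exceeds 10.83, the critical value for p = .001 at 1 df,
# consistent with a significant post-workshop increase in term use.
```

Under this construction the result agrees in direction with the reported resident analysis; the preceptor comparison, with its smaller cell counts, is the kind of table for which Fisher's exact test is preferred over chi-square.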
Our study demonstrated the effectiveness of a workshop series in improving family practice residents' knowledge of EBM. Perhaps more importantly, the workshops enhanced use of EBM terminology and increased discussion of EBM principles during patient care. Our study makes a significant contribution to our knowledge of effective EBM education because we used a control group of residents during objective testing of EBM knowledge, as well as during the analysis of application of EBM principles during patient care. There was a substantial increase in the knowledge base of residents in the experimental group, as well as a marked increase in the use of EBM terms during interactions with patients, both by residents and by preceptors.
Limitations of our study include the fact that the experimental and control groups were not randomized samples, and that the preceptors and residents recorded in the initial and follow-up sessions were not identical before the data were trimmed. Although we attempted to use similar faculty and residents during the recorded continuity clinic sessions at the control and experimental sites before and six months after the intervention, this proved to be impossible. Because of the nature of family practice residency programs, many residents were on “away” rotations or vacation during the recording sessions. Despite this problem, the same ten preceptors were involved in both of the recording sessions at the experimental site, and 12 of the 14 residents were at both recording sessions (three first-year residents, four second-year, and five third-year). At the control site, 16 preceptors participated, only one of whom did not precept at some time during both recording sessions. Thirty residents were recorded at the control site, 23 of whom were in both the initial and follow-up recordings (ten first-year residents, three second-year, and ten third-year). Only eight terms during all of the recording sessions at both sites originated from residents who did not participate in both pre- and post-continuity clinic recording sessions. We removed these data prior to reporting or analysis. In addition, our study had relatively small numbers of participants in both groups of residents (17 in the experimental group, 30 in the control group). We addressed these limitations by analyzing only the recordings of residents who were recorded in both the pre- and post-workshop sessions, eliminating the possibility that residents who used EBM more regularly would bias the results.
Randomized controlled trials of educational interventions are very difficult to execute. There is usually no method of preventing cross-contamination of groups, even in large populations. It is difficult to assign residents to different sites after random allocation, and random allocation is not always an effective means of controlling contamination. Because of our residents' transition at the end of the academic year, we were unable to obtain as much recording time at the control group site in the six-month follow-up as we did prior to the workshops. Although recording time at the experimental site also decreased considerably after the intervention, the use of EBM terminology among these participants increased markedly. In addition, the recording times at the control site were longer than at the experimental site, so any bias from unequal recording time would, if anything, strengthen our study's conclusion.
We could not demonstrate that the use of EBM terms by residents and faculty in continuity clinics translated into changes in patient care or improved health outcomes. In listening to the tapes, it became clear to us that essentially all of the actions discussed in precepting sessions would result in significant exploration and/or application of the EBM principle during that or subsequent patient visits or interventions. Thus, we believe that our data reflect the actual implementation of EBM during patient care better than any previously published data. Further investigation into the actual patient outcomes of EBM education and intervention is warranted. To achieve clinical outcome measurement, a very complex study would have to be undertaken. Given the importance of using effective interventions during medical practice, such a study could provide valuable insights into the most effective ways of transferring EBM theory into practice.
The intervention in our study was effective in a small family practice residency program, and the techniques may not be transferable to other programs or different training venues. However, our residency program is not unique among community-based FP residencies, nor is it unlike many primary care programs located in the community hospital setting. Certainly the workshops and our study of improving EBM skills are worth replicating, introducing, and testing in other settings.
Introducing an EBM workshop series in a family practice residency effectively increased the residents' use of EBM terminology and at least conversational application of EBM principles during continuity clinics. There was also a significant improvement in test scores on an objective measurement instrument in the experimental group of residents. Our study adds credibility to the addition of an EBM-specific curriculum to residency programs by providing evidence of improvements in both residents' objective knowledge and its application in the clinical setting. Further studies are warranted to demonstrate whether educational interventions will alter directly measured patient outcomes as well.
1. Feinstein AR, Horwitz RI. Problems in the “evidence” of “evidence-based medicine.” Am J Med. 1997;103:529–35.
2. Fagan MG, Griffith RA. An evidence-based physical diagnosis curriculum for third-year internal medicine clerks. Acad Med. 2000;75:528–9.
3. Ellis P, Green M, Kernan W. An evidence-based medical curriculum for medical students. Acad Med. 2000;75:528.
4. Matson CC, Morrison RD, Ullian JA. A medical school-managed care partnership to teach evidence-based medicine. Acad Med. 2000;75:526–7.
5. Rucker L, Morrison E. The “EBM Rx.” Acad Med. 2000;75:527–8.
6. Dunn K, Wallace EZ, Leipzig RM. A dissemination model for teaching evidence-based medicine. Acad Med. 2000;75:525–6.
7. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. Can Med Assoc J. 1998;158:177–81.
8. Green ML, Ellis PJ. Impact of an evidence-based medicine curriculum based on adult learning theory. J Gen Intern Med. 1997;12:742–50.
9. Wadland WC, Barry HC, Farquhar HC, et al. Training medical students in evidence-based medicine: a community campus approach. Fam Med. 1999;10:703–8.
10. Ibbotson T, Grimshaw J, Grant A. Evaluation of a programme of workshops promoting the teaching of critical appraisal skills. Med Educ. 1998;32:486–91.
11. Schoenfield P, Cruess D, Peterson W. Effect of an evidence-based medicine seminar on participants' interpretations of clinical trials. Acad Med. 2000;75:1212–4.
12. Neville AJ, Reiter HI, Eva KW, Norman GR. Critical appraisal turkey shoot. Linking critical appraisal to clinical decision making. Acad Med. 2000;75(suppl):S87–S89.
* Key words or phrases: literature, evidence, articles or article, searches, studies (relating to journals or articles), Medline, Grateful Med, critical appraisal, guidelines, references, peer review. Epidemiological terms: specificity, sensitivity, likelihood ratio, odds ratio, reliability, validity, analysis, efficacy, effectiveness, probabilities, utility, controlled, blinded, cohort, retrospective, sample, inclusion criteria, meta-analysis, trial, relative risk, absolute risk, number needed to treat, gold standard, positive and negative predictive values, case–control, pre- and post-test probability, outcome measures, confounding variable. References to literature: names of specific journals such as JAMA, New England Journal of Medicine, or specific reviews such as POEMs, ACP Journal Club, Evidence-Based Medicine.