Academic Medicine, April 2003, Volume 78, Issue 4
Research Reports

Introducing an Evidence‐based Medicine Curriculum into a Family Practice Residency—Is It Effective?

Ross, Robert MD, MScEd; Verdieck, Alex MD


Author Information

Dr. Ross is associate professor of Family Medicine at Oregon Health and Science University and Cascades East Family Practice Residency, in Klamath Falls, Oregon. At the time of this study, Dr. Verdieck was a chief resident at Oregon Health and Science University in Portland.

Correspondence and requests for reprints should be addressed to Dr. Ross, Cascades East FPR/Oregon Health Sciences University, 2801 Daggett, Klamath Falls, OR 97601; telephone: (541) 885-4612; fax: (541) 885-0328; e-mail: robr@cdsnet.net.


Abstract

Purpose: To investigate whether teaching an evidence-based medicine (EBM) curriculum increased the knowledge and use of EBM principles in residents' continuity clinics.

Method: In 1999, the authors performed a needs assessment with residents and faculty of Cascades East Family Practice Residency in Oregon and constructed a ten-session EBM workshop series that was introduced into the curriculum in 2000. Resident–preceptor interactions during outpatient continuity clinic were tape-recorded prior to and six months following introduction of the curriculum. A 50-item, multiple-choice examination was administered before and after the workshop series. Residents at another FP residency at the same university served as a control group. The same assessments were applied to the experimental and control groups. The tape recordings were analyzed for interactions that contained key EBM phrases or words.

Results: Pre-intervention multiple-choice test results were similar (control mean 56%, experimental 53%, p > .22 NS). Post-intervention test scores for the experimental group were significantly improved (mean 72%, p < .001). There was no significant improvement in test results among members of the control group (p > .05 NS). In the recorded resident–preceptor interactions, a marked increase in the use of EBM terms indicated awareness and/or use of EBM in the experimental group. In 1,165 minutes recorded prior to the workshops, EBM terms were used in a total of ten events. In 735 minutes recorded after the workshops, EBM terms were recorded in 67 events. In the control group, the number of EBM terms recorded decreased.

Conclusion: Administering a structured EBM curriculum increased residents' knowledge and use of EBM constructs during patient care.

A large body of literature discusses the techniques of critical appraisal and evidence-based medicine (EBM). EBM is defined as the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients. Considerable resources have been applied to teaching EBM principles and practice, but there is a paucity of evidence that such programs increase knowledge, change clinical behavior, or improve patient care. EBM has not been universally accepted as a novel or broadly applicable method. A review of problems associated with using EBM pointed out the constraints of applying EBM to the care of individual patients. Contrary to the interpretations of some advocates who espouse universal practice of EBM, most clinical decisions are not supported by the results of randomized controlled trials.1 Ultimately, however, many of the goals of teaching EBM seem fundamentally important to practicing physicians. EBM can improve patient care decisions and potentially enhance patient outcomes, as well as maximize the effectiveness of care. It may be that the best method for transferring the knowledge of, and the ability to practice, EBM in residency programs and, eventually, in clinical practice after residency has not yet been developed. One credible goal of residency education should be to endow our residents with a practical working knowledge of EBM, so we can graduate competent, confident life-long learners who provide first-class patient care according to the best evidence available.

Many published articles outline proposals and curricula in EBM and critical appraisal at all stages of medical education; most of these articles have little or no experimental evidence of the effectiveness of teaching EBM. Several articles present the design and implementation of innovative EBM programs for students,2,3,4 residents,5 and practicing physicians,6 but many of these studies were methodologically weak. Most lacked control groups and reported only increased scores on indirect measures, such as multiple-choice tests.

These studies lacked convincing controlled-trial experimental validation to support the teaching of EBM. A published systematic review of the literature supporting EBM interventions in medical education found that no study had affirmed the effectiveness of any specific technique of teaching these skills to medical students or residents.7 The review contained only one study that had a control group and showed significant improvement on a free-text response test and inquiries into the perceived clinical use of EBM. Another article demonstrated that no study had measurements based on actual patient interactions.8

Since 1998, a number of studies have investigated the effectiveness of EBM education. A study that used responses to the Association of American Medical Colleges' Graduation Questionnaire and focused surveys demonstrated a difference in confidence between graduating classes that received EBM training and those that did not. This study did not test knowledge or application of EBM principles, and the assessments of control and experimental groups were separated by a significant time lag.9 A study in Scotland that tested the effectiveness of a workshop program in EBM in a before-and-after design had no control group.10 A significant improvement in EBM knowledge assessed by performance on a multicomponent test was evident in a group of gastroenterology fellows who attended a seminar on EBM but, once again, this study did not have a control group.11 Another study demonstrated first-year medical students' ability to incorporate EBM skills when they read medical literature on a clinical problem and to discern methodologic flaws in published clinical trials.12 We could not find any study that had a control group of residents and that measured the change in their knowledge or clinical application of EBM.

Our study investigated the effectiveness of an easily integrated curriculum in increasing residents' knowledge and application of the principles of EBM during outpatient continuity care. To our knowledge, ours is the only educational, interventional study that had a control group of residents during objective testing of EBM knowledge and that analyzed application of EBM principles during patient care.


METHOD

Setting and Participants

We conducted our study during the 1999–2000 academic year. Participants were aware of the recording of resident–preceptor interactions and were informed that they were participating in an experiment to improve precepting. Because of the necessity of masking the purpose of our recording (to avoid contamination) and the fact that this was an educational intervention, only verbal consent was deemed necessary to proceed with the study. The intervention was viewed as an educational event by the two institutional review boards involved in the approval process. Participation in our study was voluntary, and participants received no rewards for their participation. Our experimental group of 18 residents was in the Merle West Medical Center (MWMC)/Cascades East Family Practice Residency, a university-administered, community-based residency program in Oregon designed to train residents to practice in rural settings. MWMC is a 146-bed hospital in a community of approximately 40,000 inhabitants. The control group of 30 residents was in the Oregon Health and Science University Family Practice Residency, an urban, university-based program where care is delivered both at a hospital clinic and at two suburban satellite locations.

Needs Assessment and Curriculum Design

Prior to introducing the EBM curriculum, we used the following three techniques to investigate the perceived needs of our residency program regarding specific teaching of EBM skills:

1. Review of literature: One of us (RR) searched the medical and educational literature (Medline 1966–2001, Embase, and ERIC) using the terms “education,” “EBM,” “evidence-based medicine,” and “critical appraisal.” RR searched the bibliographies of the retrieved articles for any other pertinent literature that was overlooked by the electronic search.

2. Group meetings: At our regularly scheduled executive committee meetings at the residency program (attended by the residency director, chief residents, and representatives from administration and clinic staff), there was broad consensus for introducing a specific course to address perceived deficiencies in EBM skills.

3. Survey: We designed a questionnaire and circulated it to all residents and faculty members at Merle West Medical Center/ Cascades East FPR. Most respondents (25 questionnaires distributed with a 100% response rate: 18 residents, seven faculty) felt they possessed at least a rudimentary knowledge of EBM, with a modal score on a Likert scale of 3/7 (1 = strongly disagree to 7 = strongly agree, responding to the statement: “I know how to apply the principles of evidence-based medicine when I review articles in the medical literature”). However, all except one respondent felt that there would be great value in introducing an EBM course into the residency curriculum (modal score 6/7 responding to the statement: “I think that a course specifically designed to help me use evidence-based medicine would be valuable in residency”).
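As a small illustration of how the modal Likert score could be tabulated from responses of this kind, the sketch below uses hypothetical response values (the raw survey data are not reproduced in the article):

```python
from collections import Counter

# Hypothetical responses (1 = strongly disagree ... 7 = strongly agree) to the
# statement "I know how to apply the principles of evidence-based medicine when
# I review articles in the medical literature." These values are illustrative only.
responses = [3, 2, 3, 4, 3, 5, 3, 2, 4, 3, 3, 6, 3, 4, 2, 3, 5, 3, 4, 3, 2, 3, 4, 3, 3]

# The mode is the most frequently chosen score on the seven-point scale.
modal_score, frequency = Counter(responses).most_common(1)[0]
print(f"Modal score: {modal_score}/7 ({frequency} of {len(responses)} respondents)")
```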

Based on this information, we introduced a series of ten workshops accompanied by pre- and post-intervention measurements of knowledge and the application of EBM during continuity clinic patient care. The workshops were interactive, one- to two-hour sessions consisting of a brief lecture (30–40 minutes) on an EBM topic or principle followed by practical application of the material, using articles from the medical literature. We encouraged participants' active involvement. List 1 summarizes the workshop topics.

List 1 (summary of workshop topics)

We then formulated the following research hypothesis: “The introduction of an evidence-based medical curriculum into the residency program will result in increased knowledge of EBM and the application of more critically appraised interventions during the care of patients seen at the family practice center during resident continuity clinics.” We tested our hypothesis by tape-recording and analyzing resident–preceptor interactions during these clinics before and after the workshops. The institutional review boards of Oregon Health and Science University and Merle West Medical Center approved our study.

Study Design

Because of the inherent difficulty of conducting randomized controlled trials in the educational environment (especially among a small number of residents, such as at our location), we decided to use a control group of family practice (FP) residents in the same department but located at an urban site 300 miles distant. The same techniques of test taking and recording were performed with the control group, on the same time line as the experimental group. The control site, like the experimental site, is a fully accredited FP residency, but it does not share any significant residency education programs with the experimental site, has no specific course in EBM, and represented an adequate and relatively accessible control group. We did not formally test the similarity of the control and experimental groups because we were attempting to replicate a “real-life” situation faced by most residents. The residents we receive through the residency match will usually remain in our programs for the duration of their residencies. Because of the control group design and the historic difficulty in obtaining questionnaire responses from residents, we did not formally pursue background information about the residents. We felt that the groups were as closely matched as was practicable. Both groups had the same institutional access to computers, electronic references, and library resources. The majority of residents in both programs were graduates of U.S. medical schools, were about equally divided between sexes, and had had variable exposure to the principles and practice of EBM prior to graduation from medical school. The residents also had very similar scores on the multiple-choice exam administered before the EBM workshops, indicating similar depths of EBM knowledge prior to participating in our study.

We tape-recorded resident–preceptor interactions prior to introducing an EBM curriculum. Both preceptors and residents were unaware of the intended use of the recordings and were later informed that the recordings would be used to improve teaching during continuity clinic. We tabulated interactions that contained key phrases or words that we assumed were surrogates for EBM care. We agreed on the terms after one of us (RR) consulted EBM and research colleagues and authorities at McMaster University in Ontario, Canada, about an acceptable method for choosing key phrases or words for inclusion. Our consultants suggested that including all widely used terms* was a valid way of initiating the study and analyzing the recordings. Before use, we and other Oregon Health and Science University (OHSU) faculty members reviewed the terms for any questionable inclusions or exclusions.
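Counting "events" amounted to identifying precepting exchanges that contained any of the agreed-upon key words or phrases (listed in the footnote); the authors did this by listening to the tapes. Purely as an illustration, a minimal sketch of how such a count could be automated over text transcripts, using an abbreviated subset of the terms, might look like this:

```python
import re

# Abbreviated, illustrative subset of the key EBM terms (see the footnote for the full list).
EBM_TERMS = [
    "evidence", "critical appraisal", "likelihood ratio", "odds ratio",
    "number needed to treat", "meta-analysis", "relative risk", "cohort",
    "sensitivity", "specificity", "Medline", "randomized",
]

# One case-insensitive pattern with word boundaries around each term.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(t) for t in EBM_TERMS) + r")\b",
    re.IGNORECASE,
)

def count_ebm_events(transcript: str) -> int:
    """Count exchanges (here approximated as blank-line-separated paragraphs)
    that contain at least one EBM term."""
    exchanges = [p for p in transcript.split("\n\n") if p.strip()]
    return sum(1 for exchange in exchanges if PATTERN.search(exchange))

if __name__ == "__main__":
    sample = (
        "Resident: I found a meta-analysis on this in the ACP Journal Club.\n\n"
        "Preceptor: What was the number needed to treat?\n\n"
        "Resident: The patient's mother will pick up the prescription tomorrow."
    )
    print(count_ebm_events(sample))  # -> 2
```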

We then introduced the curriculum of ten one-hour EBM workshops at the experimental site. The workshops took place once a week for ten weeks. We gave the attendees pocket-sized reference cards outlining the EBM techniques discussed at each session. The workshops covered the major subjects that a practicing family physician is likely to encounter, with an emphasis on the analysis of review articles and articles on therapy (see List 1). One of us (RR) designed the curriculum in-house and tailored it to the learning needs of family practice residents.

Six months after the participants completed the workshops, we undertook an identical recording process. During the study, we did not disclose the description of the study or its intent to any participants. Comparable procedures were performed with the control group of residents in the OHSU residency program in Portland: the original recording/observation at that site and the follow-up six months later. The control group received no special instruction in EBM. Multiple recording sessions occurred at both sites during continuity clinics.

We supplemented both groups' continuity clinic recordings with a 50-item, open-book, multiple-choice examination, administered before and after the workshops and based on the workshops' content. The examination, designed by one of us (RR), was pre-tested for content and face validity by administering it to family practice faculty at Cascades East Family Practice Residency in Klamath Falls, Oregon, and at the State University of New York, Albany. The examination was also reviewed by peers at the study sites (Cascades East and OHSU) and by outside reviewers at the University of Southern California, Los Angeles, and the SUNY family practice residency to ensure the validity of its content and question design. We chose an open-book format because our study was designed to closely mirror the use of EBM in daily practice; we felt that most practitioners of EBM rely on assistance from written or electronic resources when they analyze articles and reference material.


RESULTS

Our analysis of pre-intervention multiple-choice test results using t-tests to compare means showed no significant difference, with a mean test score for control residents of 56% and an experimental group mean of 53% (p > .22 NS). Post-intervention test scores for the 17 experimental group residents who were present for both the pre- and post-tests (one resident at the experimental site was absent for the post-test) were significantly improved, with a mean of 72% (p < .001). There was no significant change in the multiple-choice test scores of the control group (mean 62%, p > .05 NS). During recorded resident–preceptor interactions, there was a dramatic increase in the use of terms that indicated awareness and/or use of EBM principles. Prior to our data analysis, we eliminated terms used by residents who did not participate in both the pre- and post-workshop recording sessions.
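A minimal sketch of this kind of comparison, using hypothetical individual scores (only group means are reported in the article) and assuming an unpaired t-test for the between-group baseline comparison and a paired t-test for the experimental group's pre/post comparison:

```python
from scipy import stats

# Hypothetical percent-correct scores; the article reports only group means
# (control 56%, experimental 53% at baseline; experimental 72% post-workshop).
control_pre       = [55, 60, 52, 58, 54, 57, 56, 53, 59, 56]
experimental_pre  = [50, 55, 48, 56, 52, 54, 51, 53, 55, 56]
experimental_post = [70, 74, 68, 75, 71, 73, 70, 72, 74, 73]

# Baseline comparison between groups (unpaired).
t_base, p_base = stats.ttest_ind(control_pre, experimental_pre)

# Pre/post comparison within the experimental group (paired, same residents).
t_pair, p_pair = stats.ttest_rel(experimental_pre, experimental_post)

print(f"Baseline between groups: t = {t_base:.2f}, p = {p_base:.3f}")
print(f"Pre vs. post (experimental): t = {t_pair:.2f}, p = {p_pair:.3f}")
```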

The results of our analysis of the tape-recorded interactions are summarized in Table 1 as the numbers of EBM terms recorded, divided into three groups: residents, preceptors, and a combined total of residents and preceptors. The table also includes the amount of time recorded for analysis at each site. For example, prior to the workshops at the experimental site, a total of 1,165 minutes were recorded, and following the workshops, a total of 735 minutes were recorded. Both residents and preceptors in the experimental group demonstrated remarkable increases in the use of EBM terminology, with residents using four terms before and 36 terms after, and preceptors using six before and 31 after the workshop series. The preceptors' increased use of EBM terminology may reflect the fact that the EBM workshop series was open to them as well as to the residents. By contrast, there was a decrease in the use of EBM terminology at the control site. We analyzed the results for statistical significance using Pearson's chi-square for all but the preceptor terms, for which we used Fisher's exact test. As can be seen from Table 1 and our analysis, the changes were dramatic and significant in the resident analysis alone, as well as for the total number of interactions. Although numerically impressive, the results for terms used by preceptors alone were not statistically significant when analyzed with the Fisher test.

Table 1. Numbers of EBM terms recorded (by residents, by preceptors, and combined) and minutes recorded for analysis at the experimental and control sites, before and after the intervention
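The article does not spell out how the 2 × 2 tables for these tests were constructed. The sketch below shows one plausible construction, pairing the reported term counts at the experimental site with the term-free recorded minutes before and after the workshops; because this table layout is an assumption, its p-values will not necessarily reproduce those the authors report:

```python
from scipy.stats import chi2_contingency, fisher_exact

# Term counts and recorded minutes at the experimental site, as reported in the text.
resident_terms  = {"pre": 4,  "post": 36}
preceptor_terms = {"pre": 6,  "post": 31}
minutes         = {"pre": 1165, "post": 735}

def table(terms):
    """Hypothetical 2x2 table: [terms, minutes without a term] for pre vs. post.
    This is one plausible way to account for the unequal recording times; the
    authors' actual table construction is not described in the article."""
    return [
        [terms["pre"],  minutes["pre"]  - terms["pre"]],
        [terms["post"], minutes["post"] - terms["post"]],
    ]

chi2, p_res, _, _ = chi2_contingency(table(resident_terms))
print(f"Residents:  chi-square = {chi2:.1f}, p = {p_res:.2g}")

_, p_prec = fisher_exact(table(preceptor_terms))
print(f"Preceptors: Fisher's exact p = {p_prec:.2g}")
```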

DISCUSSION

Our study demonstrated the effectiveness of a workshop series in improving family practice residents' knowledge of EBM. Perhaps more importantly, the workshops enhanced use of EBM terminology and increased discussion of EBM principles during patient care. Our study makes a significant contribution to our knowledge of effective EBM education because we used a control group of residents during objective testing of EBM knowledge, as well as during the analysis of application of EBM principles during patient care. There was a substantial increase in the knowledge base of residents in the experimental group, as well as a marked increase in the use of EBM terms during interactions with patients, both by residents and by preceptors.

Limitations of our study include the fact that the experimental and control groups were not randomized samples and that, before the data were adjusted, the preceptors and residents recorded in the initial and follow-up sessions were not identical. Although we attempted to use similar faculty and residents during the recorded continuity clinic sessions at the control and experimental sites before and six months after the intervention, this proved to be impossible. Because of the nature of family practice residency programs, many residents were on “away” rotations or vacation during the recording sessions. Despite this problem, the same ten preceptors were involved in both of the recording sessions at the experimental site, and 12 of the 14 residents were at both recording sessions (three first-year residents, four second-year, and five third-year). At the control site, 16 preceptors participated, only one of whom did not precept at some time during both recording sessions. Thirty residents were recorded at the control site, 23 of whom were in both the initial and follow-up recordings (ten first-year residents, three second-year, and ten third-year). Only eight terms during all of the recording sessions at both sites originated from residents who did not participate in both the pre- and post-workshop continuity clinic recording sessions; we removed these data prior to reporting or analysis. In addition, our study had relatively small numbers of participants in both groups of residents (17 in the experimental group, 30 in the control group). We addressed these limitations by analyzing only the recordings of residents who were recorded in both the pre- and post-workshop sessions, eliminating the possibility that residents who used EBM more regularly would bias the results.

Randomized controlled trials of educational interventions are very difficult to execute. There is usually no method of preventing cross-contamination of groups, even in large populations. It is difficult to assign residents to different sites after random allocation, and random allocation is not always an effective means of controlling contamination. Because of our residents' transition at the end of the academic year, we were unable to obtain as much recording time at the control group site in the six-month follow-up as we did prior to the workshops. Notably, although recording time after the intervention also decreased considerably at the experimental site, there was a large increase in the use of EBM terminology among these participants. In addition, the recording times at the control site were longer than at the experimental site, so any potential bias would tend to support our study's conclusion.
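To make this argument concrete, the raw event counts at the experimental site can be normalized by recording time; the short calculation below (an illustrative normalization, not an analysis reported in the article) uses the minutes and total term counts given above:

```python
# Total EBM-term events and recorded minutes at the experimental site (from the text).
pre_events,  pre_minutes  = 10, 1165
post_events, post_minutes = 67, 735

rate_pre  = 100 * pre_events  / pre_minutes   # about 0.9 events per 100 recorded minutes
rate_post = 100 * post_events / post_minutes  # about 9.1 events per 100 recorded minutes

print(f"Before workshops: {rate_pre:.1f} EBM-term events per 100 minutes")
print(f"After workshops:  {rate_post:.1f} EBM-term events per 100 minutes")
```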

We could not demonstrate that the use of EBM terms by residents and faculty in continuity clinics translated into changes in patient care or improved health outcomes. In listening to the tapes, it became clear to us that essentially all of the actions discussed in precepting sessions would result in significant exploration and/or application of EBM principles during that or subsequent patient visits or interventions. Thus, we believe that our data reflect the actual implementation of EBM during patient care better than any previously published data. Further investigation into the actual patient outcomes of EBM education and intervention is warranted. To achieve clinical outcome measurement, a very complex study would have to be undertaken. Given the importance of using effective interventions during medical practice, such a study could provide valuable insights into the most effective ways of transferring EBM theory into practice.

The intervention in our study was effective in a small family practice residency program, and the techniques may not be transferable to other programs or different training venues. However, our residency program is not unique among community-based FP residencies, nor is it unlike many primary care programs located in the community hospital setting. Certainly the workshops and our study of improving EBM skills are worth replicating, introducing, and testing in other settings.

Introducing an EBM workshop series in a family practice residency effectively increased the residents' use of EBM terminology and at least the conversational application of EBM principles during continuity clinics. There was also a significant improvement in test scores on an objective measurement instrument in the experimental group of residents. Our study adds credibility to the addition of an EBM-specific curriculum to residency programs by providing evidence of improvement in both residents' objective knowledge of EBM and their application of it in the clinical setting. Further studies are warranted to determine whether such educational interventions also alter directly measured patient outcomes.


REFERENCES

1. Feinstein AR, Horwitz RI. Problems in the “evidence” of “evidence-based medicine.” Am J Med. 1997;103:529–35.

2. Fagan MG, Griffith RA. An evidence-based physical diagnosis curriculum for third-year internal medicine clerks. Acad Med. 2000;75:528–9.

3. Ellis P, Green M, Kernan W. An evidence-based medical curriculum for medical students. Acad Med. 2000;75:528.

4. Matson CC, Morrison RD, Ullian JA. A medical school-managed care partnership to teach evidence-based medicine. Acad Med. 2000;75:526–7.

5. Rucker L, Morrison E. The “EBM Rx.” Acad Med. 2000;75:527–8.

6. Dunn K, Wallace EZ, Leipzig RM. A dissemination model for teaching evidence-based medicine. Acad Med. 2000;75:525–6.

7. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. Can Med Assoc J. 1998;158:177–81.

8. Green ML, Ellis PJ. Impact of an evidence-based medicine curriculum based on adult learning theory. J Gen Intern Med. 1997;12:742–50.

9. Wadland WC, Barry HC, Farquhar HC, et al. Training medical students in evidence-based medicine: a community campus approach. Fam Med. 1999;10:703–8.

10. Ibbotson T, Grimshaw J, Grant A. Evaluation of a programme of workshops promoting the teaching of critical appraisal skills. Med Educ. 1998;32:486–91.

11. Schoenfield P, Cruess D, Peterson W. Effect of an evidence-based medicine seminar on participants' interpretations of clinical trials. Acad Med. 2000;75:1212–4.

12. Neville AJ, Reiter HI, Eva KW, Norman GR. Critical appraisal turkey shoot. Linking critical appraisal to clinical decision making. Acad Med. 2000;75(suppl):S87–S89.

* Key words or phrases: literature, evidence, articles or article, searches, studies (relating to journals or articles), Medline, Grateful Med, critical appraisal, guidelines, references, peer review. Epidemiological terms: specificity, sensitivity, likelihood ratio, odds ratio, reliability, validity, analysis, efficacy, effectiveness, probabilities, utility, controlled, blinded, cohort, retrospective, sample, inclusion criteria, meta-analysis, trial, relative risk, absolute risk, number needed to treat, gold standard, positive and negative predictive values, case–control, pre- and post-test probability, outcome measures, confounding variable. References to literature: names of specific journals such as JAMA, New England Journal of Medicine, or specific reviews such as POEMs, ACP Journal Club, Evidence-Based Medicine.


© 2003 Association of American Medical Colleges
