Academic Medicine: December 2000, Volume 75, Issue 12
Institutional Issues: Commentaries

What Evidence Supports Teaching Evidence‐based Medicine?

Dobbie, Alison E. MB, ChB; Schneider, F. David MD, MSPH; Anderson, Anthony D. MD; Littlefield, John PhD

Author Information

Dr. Dobbie is director of medical student programs, Department of Family Practice, University of Texas Health Science Center at San Antonio (UTHSC San Antonio); Dr. Schneider is residency director, Department of Family and Community Medicine, UTHSC San Antonio; Dr. Anderson is director of research, Family Practice Residency of the Brazos Valley, College Station, Texas; and Dr. Littlefield is evaluation specialist, Educational Resources and Development, UTHSC San Antonio.

Evidence-based teaching and learning are “hot topics” in medical education. Teaching critical thinking and appraisal skills to learners should give them a current knowledge base, a constantly questioning attitude, and the tools for lifelong learning. However, what is the evidence that teaching evidence-based medicine (EBM) actually changes learners' behaviors and that such changes eventually translate into better patient care and outcomes?

Ironically, few published studies have evaluated the teaching of evidence-based medicine, and two recent critical reviews of EBM curricula offer disappointing conclusions as to the effectiveness of such programs. Norman and Shannon1 found evidence that teaching critical appraisal skills can increase students', but not residents', knowledge of epidemiology. Green2 reviewed published reports on 18 teaching programs and found that most of the studies had poor teaching methods and inadequate evaluation methods. He concluded that "in those studies that were methodologically rigorous, the curricula's effectiveness in improving knowledge and skills was modest." There are thus few good tools for measuring short-term outcomes of evidence-based teaching and learning (for example, how well learners acquire the basic knowledge and skills of EBM), and fewer yet to measure whether learners' behaviors change or are maintained over time. It is even more difficult to determine whether teaching EBM techniques benefits patients in terms of reduced morbidity and mortality. In fact, some authors have postulated that using EBM can adversely affect patient care by devaluing the "non-evidentiary aspects of medical practice," such as clinical judgment and expert opinion.3

We conducted a small-group discussion session entitled “How Can We Best Evaluate The Teaching of Evidence-based Medicine?” at the 1999 annual meeting of the Association of American Medical Colleges. Approximately 40 MD and non-MD faculty at all levels of seniority participated, representing many different specialties and medical schools across the country. Most were currently involved in the development and administration of EBM teaching programs for students and residents at their home institutions. The discussion centered on the following four questions: (1) What is the evidence that teaching evidence-based learning techniques changes learners' behaviors? (2) How can we collaborate to produce this evidence? (3) What new tools and strategies can we invent as a group? (4) Is there any valid and reliable way to measure the effect of EBM teaching and learning on patient outcomes? This paper presents a summary of the group's discussion, with suggestions for future work needed to evaluate the outcomes of EBM teaching programs.

WHAT IS THE EVIDENCE THAT TEACHING EVIDENCE-BASED LEARNING TECHNIQUES CHANGES A LEARNER'S BEHAVIOR?

There is little evidence that EBM teaching programs change learners' behaviors. As we mentioned above, in 1998 Norman and Shannon evaluated four student programs and three residency interventions that met their inclusion criteria. The student teaching interventions, which ranged in length from three to 16 hours, showed an average 17% improvement in students' knowledge of critical appraisal. However, none of the residency teaching interventions produced a clinically significant change in knowledge. The authors concluded that, "Although the goal of EBM (and by extension, of teaching critical appraisal skills) is ultimately to improve patient care decisions by providing practicing physicians with tools to keep up to date with current literature, there is as yet no evidence that the gains in knowledge demonstrated in undergraduate critical appraisal courses can be sustained into residency and practice and eventually translated into improved patient outcomes."1

None of the participants knew of any additional studies showing that the teaching of EBM techniques changes learners' behaviors in either the short or the long term, leads to changes in practice behaviors, or improves patient outcomes. Several participants commented that to change behavior, the teaching of EBM should be brought "out of the journal club and into the clinic." Some novel ways suggested to evaluate learner behavior included tracking how often learners cite evidence from the literature in the medical record (through chart review), during rounds (as recorded by the attending), and during patient encounters (through direct or video observation, or by simulated-patient raters).

HOW CAN WE COLLABORATE TO PRODUCE THIS EVIDENCE?

There was general agreement that we need multi-program collaboration over an extended period in order to obtain sufficient data to evaluate evidence-based teaching interventions. This could be accomplished using a multi-center database of learners, who could be tracked after graduation and periodically evaluated for practice style and the use of EBM techniques through surveys, chart review, board scores, patient-satisfaction ratings, and clinical outcomes. Participants generally agreed that collaborating institutions would need to standardize their educational interventions and evaluation tools to some degree. However, as the teaching of EBM techniques becomes more and more widespread in the medical community, suitable control groups will become increasingly difficult to find, which could bias results.

The practice of EBM by community physicians may erode significantly without continuing formal reinforcement; developing evidence-based continuing medical education might address this need.

WHAT TOOLS AND STRATEGIES CAN WE INVENT TO MEASURE THE OUTCOMES OF EBM TEACHING PROGRAMS?

Participants suggested using the National Board of Medical Examiners' questions on EBM strategies and usage, or OSCE-type stations, to evaluate learners' abilities to retrieve information from Medline databases. Computers could be used to log and track learners' specific search strategies and to follow changes in their search-behavior patterns.

Population-based evaluation methods such as HEDIS (Health Plan Employer Data and Information Set) reporting will become useful tools for measuring patient outcomes. As electronic medical records become more widely used, it may become the norm for physician groups to audit most aspects of their practice, as it already is, for example, for physicians in the United Kingdom.

IS THERE ANY VALID AND RELIABLE WAY TO MEASURE THE EFFECT OF EBM TEACHING AND LEARNING ON PATIENT OUTCOMES?

The consensus of the group was, "Not yet." Until there are national mechanisms to measure population outcomes and the behavior patterns of physician groups, there is no way to accomplish this. We can monitor the use of, and patient outcomes from, clinical guidelines in small population settings (for example, one hospital or HMO), but guidelines are not always evidence-based, and, even when they are, front-line practitioners may well not know the evidence behind them. To address this issue, medical schools will have to produce "medical scholars" in addition to "competent practitioners." In fact, given the rapidly changing knowledge base in medicine, becoming an "information acquisition master" is now not only desirable but mandatory.

WHAT IF THE EMPEROR HAS NO CLOTHES?

Are we accomplishing anything in our efforts to teach the techniques and philosophy of evidence-based medicine? There is currently no good evidence that EBM teaching programs produce sustained changes in learners' practice behaviors or improvements in patient treatment and outcomes. We must continue to move forward with the teaching of EBM, but belief in the cause is not enough. Fanaticism has been defined as "redoubling one's efforts while simultaneously losing sight of one's goals," and we must not fall into this trap. The medical education community must collaborate to formally measure and evaluate the impact of current EBM teaching on student, resident, and practitioner behaviors, and ultimately on patient outcomes. We must teach EBM principles as seriously and systematically, and evaluate EBM programs as rigorously, as we do any other educational intervention.

If we cannot prove that teaching EBM changes medical practice and patient outcomes for the better, should we continue our crusade? It is difficult, and arguably inappropriate, to continue teaching EBM without some evidence that our interventions benefit learners and patients. After rigorous evaluation of the outcomes of EBM teaching programs, we may discover that the emperor truly has no clothes. If so, we must face facts and be prepared to abandon the teaching of EBM. We must not wait for a child in the crowd to point out our foolishness.

REFERENCES

1. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. Can Med Assoc J. 1998;158:177–81.

2. Green ML. Graduate medical education training in clinical epidemiology, critical appraisal, and evidence-based medicine: a critical review of curricula. Acad Med. 1999;74:686–94.

3. Tonelli MR. The philosophical limits of evidence-based medicine. Acad Med. 1998;73:1234–40.

© 2000 Association of American Medical Colleges
