The term evidence-based medicine describes a clinical learning strategy that was developed by faculty at McMaster Medical School in the 1980s. Evidence-based medicine, disseminated to the medical community in the early 1990s, applies methods derived from clinical epidemiology to clinical decision making.1 Over the last decade, other medical professions adopted an evidence-based approach, and in 2000 the American Physical Therapy Association (APTA) identified evidence-based practice as one of five key areas for achieving the vision of autonomous practice by 2020.2
The recent emphasis on evidence-based practice challenges physical therapy educators to learn the principles of evidence-based medicine, to translate these principles into a framework for physical therapy practice, and to infuse evidence-based practice into course content and curricula. In order to understand the implications of evidence-based practice for physical therapy education, it is important to consider factors driving the evidence-based practice movement and differences between evidence-based and traditional clinical practice.
What Factors Are Driving the Evidence-Based Practice Movement?
The promise of evidence-based practice rests in its ability to provide a sound method for making clinical decisions, to reduce idiosyncratic variation in clinical practice, and to bridge the gap between knowledge and practice.3,4 The founders of evidence-based medicine sought to ground medical decision making in rigorous scientific evidence, to invalidate previously accepted but ineffective diagnostic tests and treatments, and to replace them with newer ones that are more powerful, accurate, efficacious, and safe.5 Thus, evidence-based practice offers an objective method for making clinical decisions and, because research evidence is embedded in the clinical decision making process, promotes the translation of research into practice. The evidence-based practice movement was fueled by the dissemination potential of the Internet, which currently provides the medical community with powerful resources for searching, interpreting, organizing, and prioritizing research information, along with support for using research information in clinical practice.
How Does Evidence-Based Practice Differ From Traditional Physical Therapy Practice?
Faculty, students, and clinicians may not fully understand how evidence-based practice differs from the long-standing tradition of using research evidence in clinical practice. The quote below from a physical therapy student demonstrates how misconceptions can blur distinctions between evidence-based practice and the traditional use of research in clinical practice.
It seems the most common misconception among my peers is that evidence-based practice simply involves reading and applying the research. I learned that EBP requires critical appraisal of literature and quantifying the clinical relevance of research findings.6
In contrast to traditional physical therapy practice, evidence-based practice uses a well-defined framework and systematic approach for making clinical decisions based on research evidence.
This paper outlines competencies, describes the evidence-based practice process, and identifies program and learner conditions that support teaching evidence-based practice. Strategies for teaching critical competencies and promoting program and learner conditions are also identified. The concepts and teaching strategies presented in this paper are a compilation of recommendations extracted from the literature and the following sources: author participation in the International Conference of Evidence-Based Health Care Teachers and Developers (Sicily, Italy, September 2003) and a working group meeting (Ulm, Germany, October 2004); experience organizing and conducting the Faculty Summer Institute: Teaching Evidence-Based Practice in Rehabilitation Professional Curricula and interaction with the nearly 300 faculty who attended the Institute from 2000-2003; and working with faculty and students at Sargent College of Health and Rehabilitation Sciences. Quotations used throughout the paper were selected from a journal kept by a physical therapy student, Frederick Weiss, PT, DPT, during an independent study of teaching and learning evidence-based practice.6
An evidence-based approach to clinical practice requires competency in the following domains: asking clinical questions, conducting literature searches, critically appraising literature, applying evidence, and evaluating the process.7 Table 1 presents a summary of skills in each evidence-based practice competency domain.
Domain I: Asking a Focused Clinical Question
The evidence-based practice process begins with a patient problem that cannot be adequately answered with existing knowledge. The student or clinician recognizes a knowledge gap and begins to define the learning issue. Before formulating a focused clinical question, the student identifies whether the learning issue relates to diagnosis, prognosis, therapy (intervention), or harm.
A well-formulated clinical question consists of four components, identified by the acronym PICO: Patient, Intervention, Comparison, and Outcome.7 A well-constructed clinical question establishes a link between research evidence and clinical decisions about patient care because the PICO components (intervention, comparison groups, and outcomes) are also examined in randomized controlled trials.
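The PICO structure described above can be made concrete with a small sketch. The example below is purely illustrative (the class, question wording, and sample values are hypothetical, not part of the PICO framework itself); it simply shows how the four components assemble into a focused, answerable question.

```python
from dataclasses import dataclass

@dataclass
class ClinicalQuestion:
    """A focused clinical question in PICO format."""
    patient: str       # P: patient or problem
    intervention: str  # I: intervention of interest
    comparison: str    # C: comparison intervention
    outcome: str       # O: outcome of interest

    def as_text(self) -> str:
        # One common phrasing pattern for a PICO question.
        return (f"In {self.patient}, does {self.intervention}, "
                f"compared with {self.comparison}, improve {self.outcome}?")

# A question a student might formulate from a patient case (made-up example):
q = ClinicalQuestion(
    patient="adults with chronic low back pain",
    intervention="supervised exercise therapy",
    comparison="usual care",
    outcome="pain and functional status",
)
print(q.as_text())
```

Writing the question in this form makes the link to randomized controlled trials explicit: the intervention, comparison, and outcome slots correspond directly to trial arms and outcome measures.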
Teaching strategies. The first step in formulating a clinical question is to distinguish between background questions that focus on learning issues related to disease or pathology and evidence-based practice clinical questions that are answered with research evidence.7 Students can begin to identify learning issues during their clinical courses and, in the context of patient cases, develop evidence-based clinical questions, using the PICO format. As students progress through the curriculum and develop more sophisticated evidence-based practice skills, they can explore answers to clinical questions in greater detail. Faculty can help students develop evidence-based practice behaviors by requiring students to keep a notebook of learning issues and clinical questions that arise in the academic setting and during clinical education.
Domain II: Conducting Literature Searches
Given the volume of available information, identifying valid and relevant research evidence is a challenge for busy clinicians. Evidence-based practice uses a hierarchical framework, based on validity issues, to grade research evidence; this framework is available online through the Centre for Evidence-Based Medicine Web site.8 According to the evidence hierarchy, meta-analyses and systematic reviews, which summarize results of randomized controlled trials meeting specific validity criteria, provide stronger evidence than a single randomized controlled trial.7 An efficient literature search therefore begins at the top of the hierarchy, with meta-analyses and systematic reviews. If no appropriate meta-analyses or systematic reviews on the subject are available, the search should then focus on randomized controlled trials and, failing that, on well-designed cohort studies.
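The tiered search order described above can be sketched as a simple loop over the evidence hierarchy. This is a hypothetical illustration only: `find_studies` stands in for a real database query, and the citation strings are invented.

```python
# Evidence hierarchy, strongest first, matching the search order described above.
EVIDENCE_HIERARCHY = [
    "systematic review or meta-analysis",
    "randomized controlled trial",
    "cohort study",
]

def tiered_search(find_studies):
    """Return results from the highest evidence level that yields any hits.

    `find_studies` is a placeholder for an actual database query: it takes a
    study-design name and returns a list of matching citations.
    """
    for level in EVIDENCE_HIERARCHY:
        results = find_studies(level)
        if results:
            return level, results
    return None, []

# Toy example: only a single RCT exists for this question, so the search
# proceeds past systematic reviews and stops at the RCT level.
fake_database = {"randomized controlled trial": ["Smith 2003"]}
level, hits = tiered_search(lambda design: fake_database.get(design, []))
print(level, hits)
```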
Search engines have greatly facilitated the literature search process and, in fact, the structure and design of several popular literature search engines incorporate an evidence-based practice framework. PubMed9 recently added an evidence-based medicine search filter termed “clinical queries.” This feature, accessed via a button on the left side of the search Web page, allows the user to conduct a search based on the category of evidence (diagnosis, prognosis, or therapy) and to request results that are “focused” (specific) or “expanded” (sensitive). SUMsearch,10 a “meta” search engine developed at the University of Texas Health Science Center at San Antonio, facilitates literature searches by organizing results from multiple search engines based on the strength of evidence.
The evidence-based practice movement fostered development of “predigested” evidence summaries. The premier example of this effort is the Cochrane Collaboration,11 the largest worldwide organization dedicated to combining similar randomized trials to produce a more statistically sound result. The latest edition of the Cochrane Library contains 3,670 systematic reviews.11 The fact that reviews for the top 10 causes of disability in developed and developing nations are included in the Cochrane Library demonstrates the relevance of this resource to physical therapy practice.12 Abstracts of the Cochrane reviews are available electronically in a searchable database at the Cochrane Library Web site. Additional electronic databases with evidence summaries that are useful to physical therapists include the following: Hooked on Evidence,13 PEDro,14 OTSeeker,15 and DARE.16
Teaching strategies. Students can begin to develop competency in conducting effective and efficient literature searches by summarizing the specific features, advantages, and limitations of different literature search engines (eg, MEDLINE, PubMed, CINAHL, SUMsearch) and evidence databases (eg, Cochrane Library, PEDro, DARE). To become more familiar with search engines and databases, students can go on a literature scavenger hunt and return to compare and critique results and resources used. Throughout the curriculum, students can complete searches, document the tools used, and share search strategies. As students become adept in using the available resources, they can outline their intended strategy before beginning their search and critique their plan upon completion.
Domain III: Critically Appraising Literature
Critical appraisal of literature requires two steps: determining if the study is valid, and if validity is established, calculating the importance of the research findings.
Teaching strategies. To develop critical appraisal competency, students can begin by engaging in the following activities: critiquing the validity of journal articles, discussing the logic behind the evidence hierarchy and rating research articles using this framework, and reading several articles to compare and contrast validity issues. When assessing the importance of the research, students can apply clinical epidemiology knowledge to interpret study results and progress to assignments that require using data from the article to calculate the measures of importance, such as positive predictive value (PPV) for studies of diagnostic tests and numbers needed to treat (NNT) for evidence related to interventions.
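The two measures of importance named above reduce to simple arithmetic: positive predictive value is the proportion of positive test results that are true positives, and number needed to treat is the reciprocal of the absolute risk reduction. The sketch below uses made-up study numbers purely for illustration.

```python
def positive_predictive_value(true_pos, false_pos):
    """PPV: proportion of positive test results that are true positives."""
    return true_pos / (true_pos + false_pos)

def number_needed_to_treat(control_event_rate, treated_event_rate):
    """NNT: patients treated to prevent one adverse event.

    NNT = 1 / absolute risk reduction (ARR), where ARR is the difference
    between the control and treated event rates.
    """
    arr = control_event_rate - treated_event_rate
    return 1 / arr

# Hypothetical diagnostic study: 90 true positives, 30 false positives.
ppv = positive_predictive_value(90, 30)   # 90 / 120 = 0.75

# Hypothetical trial: events in 20% of controls vs 15% of treated patients,
# so ARR = 0.05 and NNT = 20.
nnt = number_needed_to_treat(0.20, 0.15)
print(f"PPV = {ppv:.2f}, NNT = {nnt:.0f}")
```

Working such numbers by hand from a published 2×2 table is exactly the kind of assignment described above.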
The Canadian Medical Association Journal recently published a series of articles aimed at helping clinicians understand and apply the principles of evidence-based medicine.17-19 Articles are available online, at no cost, at the Canadian Medical Association Web site.20 This series is a unique and valuable resource for educators because it offers an online teacher's version of each article and provides access to interactive teaching exercises and PowerPoint slides.17 At the time this manuscript was submitted, published articles addressed relative risk reduction, absolute risk reduction, numbers needed to treat,18 and measures of precision.19 Future articles will address understanding estimates of interrater reliability and the impact of heterogeneity on systematic reviews.17
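The risk measures covered in that series all follow from the event rates in the two trial arms. The sketch below (illustrative numbers only) computes relative risk reduction, absolute risk reduction, and a 95% confidence interval for the ARR using the standard normal approximation for a difference of two proportions.

```python
import math

def risk_measures(control_events, n_control, treated_events, n_treated):
    """Return (RRR, ARR, 95% CI for ARR) using the normal approximation."""
    cer = control_events / n_control   # control event rate
    eer = treated_events / n_treated   # experimental (treated) event rate
    arr = cer - eer                    # absolute risk reduction
    rrr = arr / cer                    # relative risk reduction
    # Standard error of a difference of two independent proportions.
    se = math.sqrt(cer * (1 - cer) / n_control + eer * (1 - eer) / n_treated)
    ci = (arr - 1.96 * se, arr + 1.96 * se)
    return rrr, arr, ci

# Hypothetical trial: 40/200 events in the control arm, 24/200 in the treated arm.
rrr, arr, (low, high) = risk_measures(40, 200, 24, 200)
print(f"RRR = {rrr:.0%}, ARR = {arr:.0%}, 95% CI {low:.3f} to {high:.3f}")
```

The width of the confidence interval is the "measure of precision" the series refers to: a narrow interval around the ARR supports a more confident treatment decision than a wide one.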
Assignments that require students to complete critically appraised topics (CATs) can help students develop this competency. David Sackett7 recommends a format for developing CATs that provides an excellent structure for appraisal of validity and importance. Critical appraisal worksheets for diagnosis, harm, prognosis, therapy, and systematic reviews are available at the Centre for Evidence-Based Medicine Web site.21 Worksheets provide a list of questions to establish validity and formulas for calculating measures of importance. Faculty can customize these worksheets so that students can use a common CAT template across all courses. Ideally, completed CATs can be stored, updated regularly, and shared with faculty, students, and clinical educators.
To develop critical appraisal competency, students can review and critique completed CATs and progress to developing CATs as a group activity. At the highest level, students can independently develop a CAT by synthesizing research evidence with conflicting findings.
Domain IV: Applying Evidence
Applying research evidence to individual patients is one of the most challenging aspects of evidence-based practice. While specific rules and processes guide the preceding evidence-based practice steps, applying research evidence to individual patients requires sound clinical judgment. Research evidence may not be applicable to an individual patient if age, disease characteristics, or comorbidities differ in important ways from subjects in the study sample. Patient preferences also affect decisions about how to apply research evidence.
The increase in consumerism and emphasis on patient-centered care reinforces the need to consider patient preferences when deciding on treatment choices. Patients and their families have access to a wealth of information through the Internet, the federal government, and national disability organizations. To help patients make informed choices, clinicians must be able to effectively communicate with patients by clearly summarizing research evidence, outlining treatment options, and discussing estimates of potential benefits and risks.
In addition to learning how to use research evidence to make decisions about individual patients, students can use research evidence to make service delivery decisions. For example, research evidence can help establish performance indicators, such as length of stay or expected outcomes for different patient groups.
Teaching strategies. Competency in applying evidence can be developed through written assignments and practical examinations that require students to summarize the research evidence, discuss the evidence with patients and, when appropriate, modify the treatment approach to match the patient's expectations, interests, and issues.22 Additional assignments can require students to use research evidence to make service delivery decisions.
Domain V: Evaluating the Process
The final step in the evidence-based practice process involves evaluation of the process with the goal of continuous improvement. Critical self-reflection is important to the learning process.23 The competencies presented in Table 1 provide a simple means for students to assess their evidence-based practice competencies and identify areas to target for improvement.
Teaching strategies. Assignments and clinical education experiences can include a self-assessment component to develop competency in evaluating the evidence-based practice process. Expanding this competency to include metacognition, as described in the next section, enhances the evaluation process.
Metacognition: The Sixth Step in Evidence-Based Practice
Critical reflection on the evidence-based practice process and development of an improvement plan expand the evaluation of the evidence-based practice process.24 This sixth step involves metacognition, often referred to as “thinking about thinking”: the processes used to oversee cognition and determine whether a goal has been met.25,26 Metacognition consists of three knowledge components,26 listed below with examples of activities to facilitate each.
- Person: Knowledge about how human beings learn and process information and one's own learning processes. Students keep a journal and reflect on their progress learning evidence-based practice.
- Task: Learning about processing demands of the task. Students outline evidence-based practice processes and steps.
- Strategy: Processes that one uses to control cognitive activities, check the outcome, and determine if a goal has been met. Students assess progress and develop improvement plans.
Embedding processes that facilitate metacognition in course content and curricula may promote continued professional development and life-long learning.
The Evidence-Based Practice Process
While there are specific evidence-based practice competencies, evidence-based practice is a process that begins and ends with a patient problem. Figure 1 presents a schematic diagram of the evidence-based practice process. Whenever possible, evidence-based practice teaching and student evaluation should emphasize the entire process rather than discrete competencies.
Specific evidence-based practice competencies, such as searching the literature27 and critical appraisal skills,28-30 are often assessed. However, evidence-based practice requires the integration of all five domains and, therefore, performance-based examinations (PBEs) that assess the ability to use knowledge and apply skills are preferable.31-34 The Berlin34 and Fresno31 tests are valid PBEs used in medical education to assess evidence-based medicine knowledge and skill. The Berlin Test was designed to evaluate postgraduate evidence-based medicine courses and consists of questions based on clinical scenarios and published research studies. The Berlin Test measures basic knowledge and the following skills: relating a clinical problem to a clinical question, choosing the best research design to answer the question, and using quantitative information from the article to answer questions. The Fresno Test requires that students review patient scenarios suggesting clinical uncertainty and respond to short-answer questions. Students develop a clinical question, identify the most appropriate research design for answering the question, describe use of electronic resources, identify issues for determining the relevance and validity of a research article, and discuss the magnitude and importance of research findings. The Fresno Test details and scoring rubric are available at the British Medical Journal Web site.35
PBEs of evidence-based content and communication skills can also occur in a practical examination setting32,33 or by using patient simulations.33 Patient simulations can begin by asking students to respond to a patient's request for evidence to support a clinical decision.32 Students are then required to complete the following steps: develop a clinical question and state how it relates to the patient case; conduct a literature search and document the search terms, databases, and search engines used; list three persuasive articles; and use details from these resources to justify their answer to the question. Students communicate the information to the patient and are evaluated on the following points: avoids medical jargon, organizes the explanation, allows time for questions, and demonstrates concern for the patient's interests.
Unfortunately, the existing evidence-based practice PBEs are based on medical examples that are not applicable to rehabilitation. We need valid outcome measures that incorporate patient examples relevant for physical therapy to assess the efficacy of evidence-based practice teaching methods.
The next sections focus on program structure and learner attitudes that are necessary conditions for teaching evidence-based practice. These conditions help develop evidence-based practice competencies, allow students to internalize the evidence-based practice process, and promote evidence-based practice behaviors.
The optimal program structure for teaching evidence-based practice includes the following components: active student involvement, integrated teaching of all evidence-based practice components, and coordination with clinical courses.36
Programs with strong active learning components are effective in changing behaviors and promoting life-long learning.37 Opportunities to learn evidence-based practice in collaborative activities empower students to be independent problem solvers and encourage teamwork. Students learn the value of pooling resources and the individual's perspective is expanded by input from others. When students work in small groups on evidence-based practice assignments, they learn how to work as a team and model behaviors they will need to become evidence-based practitioners.
Integrating evidence-based practice with clinical courses allows students to learn these skills as they are used in practice, within the context of a patient problem. Unfortunately, evidence-based practice is often taught as a separate topic or in stand-alone courses that are poorly integrated with clinical training. As a result, students lack opportunities to integrate evidence-based practice knowledge with clinical decision making and patient communication skills. The clinical relevancy and usefulness of evidence-based practice as a component of clinical decision making is reinforced when evidence-based practice is emphasized throughout the curriculum, integrated with patient cases,36 and taught in conjunction with courses that emphasize essential patient care skills.38 Constructing evidence-based practice learning experiences that are clinically relevant helps achieve the ultimate objective: to view literature searching, critical appraisal, and use of evidence as fundamental clinical skills.36
Ideally, students would continue to develop evidence-based practice competencies and learn strategies for implementing an evidence-based approach during their clinical education experiences. Unfortunately, evidence-based practice knowledge and skills gained in the academic setting are often not adequately reinforced during clinical education. A recent systematic review of the evidence for teaching evidence-based medicine highlights the importance of integrating teaching with clinical practice.39 The results should be considered with the following caveat: the systematic review involved studies of postgraduate medical education, and the results, while intriguing, may not directly apply to physical therapy professional education. The review included 23 studies of evidence-based medicine courses that were stand-alone (N=18) or integrated with clinical activities (N=5). While outcomes from both the stand-alone and integrated courses demonstrated improved knowledge, outcomes for the integrated courses also showed improvement in skills, attitudes, and behavior. The authors suggest that integrated teaching is more effective because the learner relates the evidence-based medicine process to real patient problems and learns to identify and resolve practice barriers. The efficacy of evidence-based practice teaching strategies needs further study; if integrated teaching proves more effective for improving evidence-based practice knowledge, skills, attitudes, and behavior, faculty should redouble efforts to ensure that evidence-based practice is a strong component of clinical education.
Evidence-based practice provides an ideal vehicle for collaboration between academic and clinical faculty because it links research and clinical practice. Working together, academic and clinical faculty can examine strategies for improving evidence-based practice in clinical education. Table 2 presents examples of collaborative activities that may be undertaken by academic and clinical faculty to advance evidence-based practice in clinical education.
A recent survey of physical therapists indicated a positive attitude toward evidence-based practice but difficulty implementing an evidence-based approach due to barriers which included a lack of time and the need for further development of knowledge and skills.40 Academic and clinical faculty can work together to identify barriers and facilitators to evidence-based practice and develop and test implementation strategies. Table 3 presents a summary of evidence-based practice implementation strategies developed by academic and clinical faculty.
Learner attitudes influence behavior.41 Since the ultimate goal is to promote evidence-based practice behavior, academic programs should address critical learner attitudes.
Tolerance for Uncertainty
As the quotes below suggest, evidence-based practice requires a tolerance for uncertainty.
I expected that evidence for the efficacy of physical therapy practice was well documented. I expected to be able to look up research in physical therapy and apply it. I was disappointed that research to support physical therapy practice was so meager.6

Answers are not always black and white. Decisions are made based on “relative” certainty.6
Evidence-based practice exposes students to the reality that absolute certainty in clinical decision making is not attainable. As students evaluate and rate evidence, they learn to weigh degrees of certainty. In addition to the strength of evidence, students must learn to make clinical decisions on a case-by-case basis after considering patient preferences and mediating factors such as risks, expense, and severity of the patient's condition.42
A study of medical residents' use of evidence in clinical practice revealed two patterns of behavior regarding use of research evidence in practice that were related to their tolerance for uncertainty.41 The group described as “librarians” were not comfortable with uncertainty. When using evidence, they consulted any published resource to quickly resolve problems, often relying on textbooks, guidelines, and review articles. “Librarians” were not comfortable using primary literature and skimmed the methods section before going directly to the conclusions. In contrast, “researchers” were comfortable with uncertainty and used statistical standards to make distinctions between research studies. They did not expect specific answers and approached the literature from a relativist perspective. It is interesting that residents classified as “librarians” identified themselves as evidence-based practitioners, even though they did not use systematic search strategies or critically appraise evidence—the hallmarks of an evidence-based approach.
To encourage evidence-based practice behaviors, faculty can structure learning activities involving uncertainty that require students to make decisions that are not black-and-white. It may be challenging for faculty to acknowledge uncertainty without conveying a sense of futility. However, presenting students with idealized examples of evidence-based practice can lead to unrealistic expectations and frustration when complex problems arise.42
As the following quote demonstrates, evidence-based practice requires students to accept responsibility for clinical decision making and to make a commitment to life-long learning.
Evidence-based practice requires that students accept individual responsibility for using evidence to make clinical decisions. Students must be able to critically think and not just accept information that is given. Evidence-based practice requires a life-long commitment to learning and openness to change. Knowledge is not static.6
Life-long learning is defined as “a concept involving a set of self-initiated activities (behavioral aspect) and information-seeking skills (capabilities) that are activated in individuals with a sustained motivation (predisposition) to learn and the ability to recognize their own learning needs (cognitive aspect).”43 Life-long learning is essential to individual and professional growth and, therefore, it is important to assess this capacity and examine educational strategies to promote its development.
The Jefferson Scale of Physician Lifelong Learning was developed for physicians and is a psychometrically valid instrument consisting of 19 items.43 Analysis of the items revealed the following five factors related to life-long learning: need recognition, research endeavor, self-initiation, technical skills, and motivation. A similar scale, adapted and validated for physical therapists, could examine the effects of curricular approaches, teaching strategies, and learner attitudes on life-long learning.
Evidence-based practice has had a profound impact on physical therapy education. Over the last decade, faculty have translated evidence-based medicine concepts and developed an evidence-based practice framework for physical therapy curricula and course content. We have much work to do to articulate a vision for evidence-based physical therapy education. This special edition of the Journal of Physical Therapy Education provides an important opportunity to share experiences and resources and identify areas for future work. The ultimate goal of evidence-based practice education is to promote evidence-based practice behaviors in clinical settings.44 It is ironic that evidence-based practice emphasizes research evidence, but there is little evidence that teaching evidence-based practice alters clinician behaviors or improves patient outcomes.45 Examining the effects of curricular designs and teaching approaches on evidence-based practice knowledge, skills, attitudes, and behaviors, and studying the impact of evidence-based practice on patient outcomes are important next steps.
1. Evidence-Based Medicine Working Group. Evidence-based medicine. JAMA.
2. Massey B. Making vision 2020 a reality. Phys Ther.
3. Institute of Medicine, Committee on Quality of Health Care in America. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
4. Chassin MR, Galvin RW, and the National Roundtable on Health Care Quality. The urgent need to improve health care quality. JAMA
5. Sackett DL, Rosenberg WMC, Muir Gray JA, Haynes RB, Richardson WS. Evidence-based medicine: what it is and what it isn't. BMJ
6. Weiss F. Independent Study, Teaching and Learning Evidence-Based Practice [personal journal].
7. Sackett DL, Richardson WS, Rosenberg W, Haynes RB. Evidence-Based Medicine: How to Practice and Teach EBM. Edinburgh, Scotland: Churchill Livingstone; 2000.
8. Centre for Evidence-Based Medicine. Available at: http://www.cebm.utoronto.ca. Accessed November 15, 2004.
9. PubMed [database online]. Bethesda, Md: National Center for Biotechnology Information (NCBI); 2004.
10. SUMSearch [database online]. San Antonio, Tex: University of Texas Health Science Center at San Antonio; 2004.
11. The Cochrane Library. Vol 4: John Wiley & Sons, Inc; 2004.
12. Grimshaw J. So what has the Cochrane Collaboration ever done for us? A report card on the first 10 years. Can Med Assoc J.
13. Hooked on Evidence [database online]. Alexandria, Va: American Physical Therapy Association; 2004.
14. Physiotherapy Evidence Database (PEDro) [database online]. Sydney, Australia: Centre for Evidence-Based Physiotherapy, University of Sydney; 2004. Updated November 19, 2004.
15. Occupational Therapy Systematic Evaluation of Evidence (OTSeeker) [database online]. Brisbane, Queensland, Australia: The University of Queensland; 2004.
16. Database of Abstracts of Reviews and Effects (DARE) [database online]. York, UK: Centre for Reviews and Dissemination, University of York; 2004.
17. Wyer PC, Keitz S, Hatala R, et al. Tips for learning and teaching evidence-based medicine: introduction to the series. Can Med Assoc J.
18. Barratt A, Wyer PC, Hatala R, et al. Tips for learners of evidence-based medicine: 1. relative risk reduction, absolute risk reduction and number needed to treat. Can Med Assoc J.
19. Montori VM, Kleinbart J, Newman TB, et al. Tips for learners of evidence-based medicine: 2. measures of precision (confidence intervals). Can Med Assoc J. 2004;171(6):611-615.
20. Can Med Assoc J Web site. Available at: http://www.cmaj.ca.
21. Critical Appraisal Worksheets. Centre for Evidence-Based Medicine. Available at: www.cebm.utoronto.ca/teach/materials/caworksheets.htm. Accessed November 15, 2004.
22. Tickle-Degnen L. Client-centered practice, therapeutic relationship, and use of research evidence. American Journal of Occupational Therapy.
23. Schon DA. Educating the Reflective Practitioner: Toward a New Design for Teaching and Learning in the Professions. San Francisco, Calif: Jossey-Bass; 1990.
24. Porzsolt F, Ohletz A, Thim A, et al. Evidence-based decision making—the 6th step approach. ACP Journal Club.
25. Flavell JH. Metacognition and cognitive monitoring: a new area of cognitive-developmental inquiry. American Psychologist.
26. Flavell JH. Speculations about the nature and development of metacognition. In: Weinert FE, Kluwe RH, eds. Metacognition, Motivation, and Understanding. Hillside, NJ: Lawrence Erlbaum Associates; 1987.
27. Burrows SC, Tylman V. Evaluating medical student searches of MEDLINE for evidence-based information: process and application of results. Bulletin of the Medical Library Association.
28. Stern DT, Linzer M, O'Sullivan PS, Weld L. Evaluating medical residents' literature-appraisal skills. Acad Med.
29. Bennett KJ, Sackett DL, Haynes RB, Neufeld VR, Tugwell P, Roberts R. A controlled trial of teaching critical appraisal of the clinical literature to medical students. JAMA
30. Norman GR, Shannon SI. Effectiveness of instruction in critical appraisal (evidence-based medicine) skills: a critical appraisal. Can Med Assoc J.
31. Ramos KD, Schafer S, Tracz SM. Validation of the Fresno test of competence in evidence-based medicine. BMJ
32. Davidson R, Duerson M, Romrell L, Pauly R, Watson RT. Evaluating evidence-based medicine skills during a performance-based examination. Acad Med.
33. Bradley P, Humphris G. Assessing the ability of medical students to apply evidence in practice: the potential of the OSCE. Medical Education.
34. Fritsche L, Greenhalgh T, Falck-Ytter Y, Neumayer H-H, Kunz R. Do short courses in evidence-based medicine improve knowledge and skill? Validation of Berlin questionnaire before and after study of courses in evidence-based medicine. BMJ. 2002;325(7):1338-1341.
35. British Medical Journal Web site. Available at: http://www.bmj.com.
36. Ghali WA, Saitz R, Eskew AH, Gupta M, Quan H, Hershman WY. Successful teaching of evidence-based medicine. Medical Education.
37. Candy P. Self-Direction for Life-Long Learning: A Comprehensive Guide to Theory and Practice. San Francisco, Calif: Jossey-Bass; 1991.
38. Korenstein D, Dunn A, McGinn T. Mixing it up: integrating evidence-based medicine and patient care. Acad Med.
39. Coomarasamy A, Khan KS. What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ
40. Jette DU, Bacon K, Batty C, et al. Evidence-based practice: beliefs, attitudes, knowledge, and behaviors of physical therapists. Phys Ther.
41. Timmermans S, Angell A. Evidence-based medicine, clinical uncertainty, and learning to doctor. Journal of Health and Social Behavior.
42. Welch HG, Lurie JD. Teaching evidence-based medicine: caveats and challenges. Acad Med.
43. Hojat M, Nasca TJ, Erdmann JB, Frisby AJ, Veloski JJ, Gonnella JS. An operational measure of physician lifelong learning: its development, components and preliminary psychometric data. Medical Teacher.
44. Hatala R, Guyatt G. Evaluating the teaching of evidence-based medicine. JAMA. 2002;288(9):1110-1112.
45. Dobbie AE, Schneider FD, Anderson AD, Littlefield J. What evidence supports teaching evidence-based medicine? Acad Med.
46. Modified from a presentation at the International Conference of Evidence-Based Health Care Teachers and Developers (Sicily, Italy, September 2003).
47. Compilation of suggestions by clinical educators at the Boston University Faculty Summer Institutes 2001-2003.