McCord, Gary MA; Smucker, William D. MD; Selius, Brian A. DO; Hannan, Scott MD; Davidson, Elliot MD; Schrop, Susan Labuda MS; Rao, Vinod; Albrecht, Paula
Most physicians believe that practicing evidence-based medicine (EBM) improves patient care,1 but physicians do not consistently practice EBM.2,3 For example, when questions are pursued, informal and formal consultations, personal collections, books, and pocket references are preferred more often than evidence-based sources.4–7 The most important obstacle to evidence-based practice seems to be the nature of physicians' job responsibilities, principally, time constraints.1–4 During a typical office day, physicians are required to manage clinical decisions, provide for the needs of patients, provide medication information, track and manage patient data, and calculate clinical indices.7,8
Ramos et al5 investigated whether residents differ from faculty in applying evidence-based principles to clinical decisions and found that the two groups were very similar. Rarely were evidence-based sources used to answer clinical questions.
With the above issues in mind, we designed and carried out the present study to (1) determine the types of information resources that EBM-trained, family medicine residents use to answer clinical questions at the point of care, (2) assess whether these sources are evidence-based, (3) determine the resources available or required by the participating residencies, (4) examine residents' self-reports of the best resources available to practice family medicine, and (5) provide suggestions on how to approach information management and evidence-based practice more effectively within residencies.
We conducted this study at five hospital-based family medicine residency programs affiliated with the Northeastern Ohio Universities College of Medicine in Rootstown, Ohio. The office of research of the department of family medicine designed and supervised the study. The office consists of the department chair, six faculty physician research directors from the residency sites, a community-based physician, a research coordinator/statistician, and an administrator.
Study participants were all 25 third-year residents practicing family medicine at the five participating residency sites. The study took place in July 2005 after graduating residents had departed. In their second year of training, all of the participating third-year residents had completed a required EBM curriculum during which EBM principles were informally taught and discussed during the delivery of patient care, seminars, journal clubs, self-study projects, and research or scholarly projects. Toward the end of the residents' second year, an EBM workshop had been conducted during which residents received lectures about keeping up with the literature, pharmaceutical detailing, and limitations of EBM. During the workshop, residents also evaluated a clinical question that they researched in advance, and they participated in small-group discussions to evaluate the best evidence-based answers available to that question. After the workshop, residents completed a questionnaire to evaluate the curriculum.
In our study, a family physician researcher trained two medical students to directly observe and record residents' information-retrieval activities, using a checklist designed by the office of research. Questions and issues that arose during the training of these students were resolved to ensure standardized procedures for recording residents' behavior during the study. Residents received an orientation to the study and were instructed to practice in their normal manner during the observation period.
Students directly observed each participating resident for two half-days. Direct observation started as soon as residents arrived at the family medicine center and ended when residents completed their half-day sessions. Data were collected before and after patient visits and during the delivery of patient care inside and outside of the exam room. If a resident determined that the half-day session observed was not typical (e.g., an abnormally low number of patients), data from the observed session were not counted and attempts were made to reschedule another half day of observation.
Students recorded every instance of residents' information retrieval related to a clinical question and documented the type of source (book, journal, etc.), the retrieval location (inside exam room, outside exam room with the patient present, and outside exam room with the patient not present), the actual name of the information source (e.g., ePocrates), the estimated amount of time consulting the source (less than one minute, one to two minutes, two to five minutes, or more than five minutes), the date of observation, the time the office session started and ended, the number of patients (both observed and unobserved by the student) inside the exam room, and the name of the family medicine center.
Patients were asked for their verbal consent to allow the student to observe the encounter and were reassured that no information about them was being recorded. For patients who declined, residents self-reported any information retrieval to the student after the visit was complete. If there were questions about what source the resident was consulting or whether the event was patient related, the student asked for clarification. If questions arose during a physician–patient encounter, the student asked for clarification after the encounter was complete to avoid interfering with the physician–patient interaction.
When the direct observation phase of the project was complete at each center, an end-of-study questionnaire was distributed to the 37 full-time faculty and the 25 third-year residents. They were asked to (1) list up to three of the best electronic resources available for practicing family physicians, (2) list their electronic resource subscriptions, (3) indicate how often they use personal digital assistant (PDA) resources for patient care, (4) list the advantages and disadvantages of using a PDA in practice, (5) list the reasons why they use a PDA to practice medicine, and (6) state whether they had ever prevented medical errors using a PDA (if they had, they were asked to provide examples). The institutional review board at the Northeastern Ohio Universities College of Medicine reviewed and approved the study protocol.
All of the 25 third-year residents eligible for this study had participated in the EBM curriculum as second-year residents and had evaluated the curriculum after the EBM workshop. Table 1 presents their ratings of the questions used in the evaluation. Residents felt they had a better than basic understanding of EBM principles, rated the amount of curricular time as about right, agreed that EBM is discussed in relation to patient care at their center, rated EBM as valuable to clinical care, and were able to find evidence-based answers to clinical questions more than 50% of the time. Table 2 presents residents' self-reports of the sources they used to answer their clinical questions. The Internet, clinical practice guidelines, and review articles were the most frequently used sources.
All of the 25 third-year residents agreed to be directly observed. Three residents reported that they had an abnormal day, in each case because they had seen fewer patients than normal. The final sample consisted of 23 residents who were observed for two normal half days and two residents who were observed for one normal half day. Attempts to reschedule the latter two residents for another half day were not successful. Post-study questionnaires were returned by 76% (28/37) of faculty and 52% (13/25) of residents.
The average time spent in the office for a half day of patient care was approximately three and a half hours (209.1 minutes, SD = 29.9, range = 135–280), and residents saw a total of 328 patients during the study period. The average number of patients per half day was 6.8 (SD = 1.6, range = 4–11), of which 6.1 were observed in the exam room (SD = 1.8, range = 3–11) and 0.7 were not observed (SD = 1.0, range = 0–4). There were 532 instances of clinical information retrieval during the 48 observed half days, for an average of 11.1 instances per half day, or 1.6 per patient.
Table 3 shows the sources that the residents used to obtain answers for the total of 532 clinical inquiries they made during the study period. As the table indicates, the greatest number of answers were obtained from attending physicians, with PDA programs and medical textbooks providing most of the remaining answers. These three sources accounted for 87% of residents' information retrieval.
Eighty-seven percent (461/532) of clinical inquiries were made during office visit time but outside of direct physician–patient contact. Thirteen percent (68/532) were made with patients inside the exam room, and fewer than 1% (3/532) with patients outside of the exam room. Forty-eight percent of the clinical inquiries (256/532) took less than one minute, 24% (127/532) took one to two minutes, 17% (89/532) took two to five minutes, and 11% (60/532) took more than five minutes.
From the post-study questionnaire, Table 4 presents faculty members' and residents' opinions about the best electronic information sources available for family physicians. The online version of UpToDate was rated as the best source available. Of the sources listed in Table 4, ePocrates, Griffith's 5-Minute Clinical Consult, and UpToDate were the only sources actually used during the direct observation portion of the study.
Eighty-two percent (23/28) of the responding faculty and 92% (12/13) of the responding residents reported that they use PDA resources for patient care an average of once every day. Seventy-one percent (20/28) of the faculty and 54% (7/13) of the residents reported having discovered or prevented medical errors by using a PDA program. Every medical error example provided by respondents was related to drug dosages and/or interactions.
From the open-ended questions asking respondents to list the advantages and disadvantages of PDA use when practicing medicine, 71 advantages were listed by 41 respondents (mean = 1.7 per physician). Sixty-nine percent (49/71) involved use factors: time (17 responses), portability (11 responses), convenience (eight responses), accessibility (seven responses), and one response each for ease of use, comforting patients, effectiveness, ability to customize for practice, personal use, and calculating indices. Thirty-one percent (22/71) of the advantages involved information factors: point-of-care information (six responses), current information (four responses), medication information/cost (four responses), quantity of data (four responses), frequent updates (three responses), and quality of data (one response).
Thirty-seven disadvantages were listed by the 41 respondents (mean = 0.9 per physician). Eighty-one percent (30/37) involved use factors: damage or malfunctions (five responses), small screen (four responses), dependence on the PDA (three responses), cost (three responses), battery life (two responses), slow response (two responses), losing the PDA (two responses), and one response each for “It's in the way,” memory capacity, no Internet access, PDA becomes out of date quickly, impersonal to patients, noninteractive programs, and scrolling to read. Nineteen percent (7/37) of the disadvantages involved information factors: “You stop learning” (three responses), “You cannot evaluate the information” (three responses), and “The information is not available” (one response).
From the closed-ended question regarding the reasons why respondents use a PDA while practicing medicine, 85% (35/41) responded ease of use, 81% (33/41) time factors, 78% (32/41) accessibility, 63% (26/41) quality of data, 56% (23/41) current data, 49% (20/41) quantity of data, and 2% (1/41) responded that “Patients like it.”
None of the family medicine centers require the use of any specific information source. Residents had subscriptions to six different PDA programs: ePocrates (11 subscriptions), Griffith's 5-Minute Clinical Consult (three subscriptions), Medcalc (two subscriptions), PediSuite (one subscription), 5-Minute Pediatric Consult (one subscription), and Shots 2004 (one subscription). Faculty had subscriptions to ePocrates (17 subscriptions), Griffith's 5-Minute Clinical Consult (four subscriptions), and 5-Minute Pediatric Consult (three subscriptions). Faculty also owned 10 additional subscriptions, each of which was listed only once.
The direct observation method we used suggests that our results describe what actually happens when EBM-trained residents answer clinical questions at the point of care. At the end of their second year, residents self-reported that Internet sources, clinical practice guidelines, and review articles were the three sources they used most frequently to answer clinical questions, but the same residents were observed in our study to consult attending physicians, PDA programs (mostly ePocrates), and textbooks for 87% of their information retrieval.
Despite being trained to use traditional EBM methods, our residents most often used authoritative sources to answer their clinical questions, principally direct physician-to-physician consultation. These authoritative sources may or may not be evidence-based, depending on the EBM skills of the consulting physicians as well as the accuracy and reliability of the electronic programs and textbooks used. It is important to determine whether this type of information retrieval is useful and accurate and prevents medical errors, or whether it is merely convenient. When designing the study, we chose not to collect the clinical questions investigated, so we are unable to correlate residents' use of EBM or non-EBM resources with the types of questions asked. We were concerned that residents would change their normal practice patterns if they realized that we would be able to generate a “report card” of their performance. The purpose of our study was to determine the nature and (self-reported) quality of information sources residents used to answer clinical questions, not to evaluate individual residents' performance.
Rather than practicing EBM, residents operated more as information managers within the constraints of time limitations and job responsibilities. These findings are consistent with the premise that teaching EBM skills only and expecting physicians to manage information effectively will not work because the EBM skills that physicians are taught are not relevant to daily practice.9 Slawson and Shaughnessy9 recently stated that there is a “need to teach the applied science of information management along with, or perhaps even instead of, teaching the basic science of EBM” (emphasis is ours).
An important aspect of information management is the ability to deal effectively with the vast volume of medical information, which is increasing too rapidly for physicians to keep pace.10,11 Improved information technologies, such as electronic health records (EHRs) and PDAs, allow more questions to be answered than is possible using traditional sources, and such technologies are potential solutions to the ever-increasing amount of medical data.10 Given the frequency of PDA use in the present study, it is vital that the information included in electronic programs for both PDA and EHR systems be carefully monitored, up to date, accurate, and reliable. More accurate and timely information may lead to a decrease in medical errors,11–13 which cost the United States approximately $38 billion a year.12 The Agency for Healthcare Research and Quality (AHRQ) reports that most medical errors are system related and that the key to reducing errors is to improve systems of delivering care, but patients attribute medical errors to physician failure, not to problematic medical systems.12 Improved systems of delivering evidence-based care would benefit patients and physicians alike.
EHRs and PDAs have significant implications for system improvement because they make data immediately accessible at the point of care.12,14,15 In our study, residents used PDAs frequently, especially for medication information. Prior studies have shown that the most important barrier to practicing EBM is time constraints,1–5,16 and we found that time, ease of use, and accessibility (not data quality or quantity) were the most important reasons residents use PDAs. The process of leaving a patient in the exam room, going to a computer, searching for and reviewing information, and then returning to the patient is one that does not work. The residents whom we studied seem to use PDAs as a time-effective form of decision support and as a repository of what they perceive as relevant, reliable medical information. In a practical sense, however, our findings reinforce the idea that residents use the resources that are available to them. Residents self-reported UpToDate as the best available resource, but it was used for only 5 of the 532 instances of information retrieval. ePocrates and Griffith's 5-Minute Clinical Consult constituted more than 90% of the electronic sources used, probably because residents owned subscriptions to those resources. As electronic health records become more pervasive, the PDA may become less important in obtaining information at the point of care, especially if the health care software on EHR systems proves more reliable and comprehensive than that on the PDA. Regardless of the type of electronic system used, careful monitoring, revising, and updating of the system are vital if health care is to improve.
The results of our study suggest to us the following recommendations for medical system improvements to enhance patient health care:
▪ Provide residents with sources that are current, easy to use, and evidence based; that are quickly accessible at the point of care; and that adequately answer most of the common clinical questions encountered in practice, especially for medications.
▪ Provide educational support for physicians on the accuracy and reliability of individual databases so they can make wise decisions about the information sources they decide to use.
▪ Encourage medical database manufacturers for both EHR and PDA systems to produce unbiased, reliable, up-to-date, and easy-to-use programs at a reasonable cost.
Our study has certain limitations. First, we cannot determine whether the Hawthorne effect occurred, that is, whether residents diverged from their normal practice patterns as a result of being observed. To minimize this effect, residents were asked to practice in their normal manner; furthermore, all of the residents were accustomed to working with students on a regular basis during clinical care outside of this study. Second, the mere presence of faculty physicians may lead residents to use them as a primary source of information rather than search for evidence-based answers, and it is unknown how residents would practice without preceptors present, because physicians in training must be supervised. Third, even though we had 100% resident participation during direct observation, we observed only 25 residents, and only for a short time. Our residency programs may be qualitatively different from other programs, and our results may not generalize to all residency programs, especially the results of the PDA survey, which was completed by only 52% of the residents. Fourth, our results may not be relevant to residency programs using EHRs; electronic access to patient health records is currently unavailable at our practice sites. Finally, the results of this study are limited by the fact that we studied residents' searches for answers to questions that needed to be answered at the point of care rather than their searches for answers to a wider array of questions.
Future studies may focus on comparing the information and retrieval strategies of residents, faculty, and private practice physicians. The needs of each group of physicians may differ because of varying levels of clinical experience, job structure, and the availability of resources to answer clinical questions.17 Another factor that is unknown is whether therapeutic and diagnostic decisions occurring at the point of care conform to evidence-based practice or guideline-based practice.
In summary, the results of our study support the AHRQ's call for medical system improvements at the point of care. To improve patient outcomes, it may be necessary to teach physicians better information-management skills in addition to teaching EBM skills.
The authors wish to thank the Ohio Academy of Family Physicians Foundation, the American Academy of Family Physicians Foundation, and the Northeastern Ohio Universities College of Medicine for sponsoring the medical students who collected data for this study.
1 McColl A, Smith H, White P, Field J. General practitioners' perceptions of the route to evidence based medicine: a questionnaire survey. BMJ. 1998;316:361–365.
2 Ely JW, Osheroff JA, Ebell MH, et al. Analysis of questions asked by family doctors regarding patient care. BMJ. 1999;319:358–361.
3 Covell DG, Uman GC, Manning PR. Information needs in office practice: are they being met? Ann Intern Med. 1985;103:596–599.
4 Mangrulkar RS. Targeting and structuring information resource use: a path toward informed clinical decisions. J Contin Educ Health Prof. 2004;24:S13–S21.
5 Ramos K, Linscheid R, Schafer S. Real-time information-seeking behavior of residency physicians. Fam Med. 2003;35:257–260.
6 Bryant SL. The information needs and information seeking behaviour of family doctors. Health Info Libr J. 2004;21:84–93.
7 Haug JD. Physicians' preferences for information sources: a meta-analytic study. Bull Med Libr Assoc. 1997;85:223–232.
8 Torre DM, Wright SM. Clinical and educational uses of handheld computers. South Med J. 2003;96:947–948.
9 Slawson DC, Shaughnessy AF. Teaching evidence-based medicine: should we be teaching information management instead? Acad Med. 2005;80:685–689.
10 Pluye P, Grad RM. How information retrieval technology may impact on physician practice: an organizational case study in family medicine. J Eval Clin Pract. 2004;10:413–430.
11 Maviglia SM, Strasberg HR, Bates DW, Kuperman GJ. KnowledgeLink update: just-in-time context-sensitive information retrieval. AMIA Annu Symp Proc. 2003:902.
13 Arnold JL, Levine BN, Mammatha R, et al. Information sharing in out-of-hospital disaster response: the future role of information technology. Prehospital Disaster Med. 2004;19:201–207.
14 Johnston JM, Leung GM, Tin KYK, Ho LM, Lam W, Fielding R. Evaluation of a handheld clinical decision support tool for evidence-based learning and practice in medical undergraduates. Med Educ. 2004;38:628–637.
15 Joy S, Benrubi G. Personal digital assistant use in Florida obstetrics and gynecology residency programs. South Med J. 2004;97:430–433.
16 D'Alessandro DM, Kreiter CD, Peterson MW, Kingsley P, Johnson-West J. An analysis of patient care questions asked by pediatricians at an academic medical center. Ambul Pediatr. 2004;4:18–23.
17 McLeod TG, Ebbert JO, Lymp JF. Survey assessment of personal digital assistant use among trainees and attending physicians. J Am Med Inform Assoc. 2003;10:605–607.