Research Report

Why Do Residents Fail to Answer Their Clinical Questions? A Qualitative Study of Barriers to Practicing Evidence-Based Medicine

Green, Michael L. MD, MSc; Ruff, Tanya R. MD


Abstract

Evidence-based practice has emerged as a national priority in efforts to improve health care quality.1 Physicians are encouraged to identify, appraise, and apply the best evidence in their decision making for individual patients. However, this ideal remains far from realization. Physicians leave the majority of their clinical questions unanswered,2,3 witness their medical knowledge deteriorate after their training,4 and demonstrate wide practice variations for clinical maneuvers with established efficacy. Similarly, residents pursue only 28% of their clinical questions, often consulting non-evidence-based information resources.5

Practicing physicians in several studies6–12 identified lack of time as the primary barrier to answering their clinical questions. In addition, they reported difficulty in phrasing clinical questions; not knowing when to stop searching; and a lack of awareness, access, and skills in searching medical information resources.

Less is known, however, about the barriers to evidence-based medicine (EBM) faced by physicians-in-training. In a prospective study of information needs, residents commonly identified lack of time and forgetting as reasons for failing to pursue their clinical questions.5 The degree of time pressure, in an analysis of structured interviews, determined medical residents’ decision to either seek quick information from a consultant or to directly examine “the evidence.”13 Finally, in a focus-group study, surgery residents also identified lack of time among several “challenges to EBM practice.” However, unlike medical residents, they also faced more fundamental obstacles, including confusion over the definition of EBM, lack of mentorship, and widespread resistance to EBM.14

To facilitate residents’ evidence-based practice, we sought to learn more about the barriers they encounter in attempting to answer their clinical questions. We chose a qualitative inquiry, as we wished to openly explore this experiential phenomenon and avoid the presuppositions inherent in a quantitative survey. Suspecting that residents might share many frustrations, we believed the interchange of a focus group would generate insights that might not be captured by individual interviews.

Method

Participants and setting

In 2003, we studied categorical residents in the Yale primary care internal medicine program. They rotate through a university-based tertiary care medical center and two urban community hospitals. They follow continuity patients weekly in one of two community hospital-based clinics. In addition, residents have a yearly three-month ambulatory block, which includes placement in community primary care practices and non-internal medicine specialty experiences.

We selected a convenience sample of residents arbitrarily assigned to three consecutive ambulatory block rotations. No resident refused to participate. Thirty-four residents, representing 54% of those in the categorical program, participated in the three focus groups (of 12, 11, and 11 residents, respectively) described below. The demographic characteristics of the participating residents (mean age 20 years, 54% women, 30% interns) were representative of the residents in the program as a whole.

The university's human investigation committee granted this project an exemption from formal review.

Outcome measurement

We convened three 90-minute focus groups over a four-month period. A professional facilitator led the first two focus groups. One or two of the investigators remained in the room to clarify points of confusion, as the facilitator did not have a medical background. One of us (MG) conducted the final focus group because of scheduling conflicts with the facilitator. The investigators and facilitator met immediately after each session to “debrief” about question effectiveness, group dynamics, and time schedule. The focus-group discussions were recorded and transcribed, eliminating any mention of the participants’ names.

The focus-group questions related to barriers to answering clinical questions and solutions to overcoming them. We developed the focus-group script to progress from introduction to transition to key questions (see Table 1).15 The introduction question was designed to stimulate thinking about the concept of clinical questions. The transition question prompted the residents to recall and reflect upon recent attempts to respond to clinical questions. The key questions directly solicited the responses of interest. Finally, when responses to the broad open-ended key questions were exhausted, the facilitator asked probing follow-up questions. We revised the focus-group script based on two pilot “mock” focus groups with general medicine faculty and a critical review by a group process expert. In addition, after the first resident focus group, we deleted one follow-up question, which had consistently elicited tangential responses.

Table 1: Questions to Participants from the Focus Group Discussion Guide, Yale Primary Care Internal Medicine Program, 2003

Analysis

We conducted a thematic analysis of the focus-group transcripts, using common coding techniques and the constant comparison method of qualitative analysis.16 In open coding, each of us independently read each transcript line by line and assigned codes (representative words or phrases) to key themes. In our analysis of later transcripts, we applied codes from earlier transcripts, assigned additional codes to new themes, and revised some existing codes. We then grouped related codes together in axial coding to develop a hierarchical classification scheme. In an iterative process, we met after independently analyzing each of the three transcripts to compare coding structures, review theme exemplars, and reach consensus for coding differences. Throughout the analytic process, we remained open to alternative explanations and considered comments that appeared to contradict our emerging explanation of EBM barriers (deviant cases).
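
To make the bookkeeping behind this constant-comparison process concrete, the sketch below shows one hypothetical way to record open codes against transcript excerpts and then group them under higher-level themes during axial coding. The excerpts, code names, and data structures are illustrative inventions, not the tools or actual codes used in the study.

```python
from collections import defaultdict

# Hypothetical sketch of the coding bookkeeping: open codes assigned to
# transcript excerpts, then grouped under broader themes (axial coding).
# All excerpts, codes, and theme names are invented for illustration.
open_coding = [
    ("no computer in the emergency room", "point-of-care access"),
    ("searches time out on old terminals", "inferior technology"),
    ("forgot the question after clinic", "question tracking"),
    ("20-minute visits leave no time", "time pressure"),
]

theme_hierarchy = {
    "point-of-care access": "access to electronic resources",
    "inferior technology": "access to electronic resources",
    "question tracking": "clinical question tracking",
    "time pressure": "time",
}

def axial_grouping(coded_excerpts, hierarchy):
    """Group coded excerpts under their parent themes."""
    themes = defaultdict(list)
    for excerpt, code in coded_excerpts:
        themes[hierarchy[code]].append((code, excerpt))
    return dict(themes)

for theme, items in axial_grouping(open_coding, theme_hierarchy).items():
    print(theme, "->", [code for code, _ in items])
```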

We kept an audit trail to document coding sessions, consensus meetings, and evolving coding structures. We presented the “final” classification scheme to a post hoc focus group composed of representatives from the three study groups and some naïve residents. We asked them whether our interpretation faithfully and exhaustively represented their experience (member checking).

We selected several illustrative quotations to include in our results. For ease of reading, we corrected grammatical inconsistencies and inserted obviously deleted words but did not make any substantive changes.

Results

Our analysis revealed several barriers to answering clinical questions, which we collapsed into four technical or pragmatic themes and four emotional or cultural themes (see List 1). In the following paragraphs, we explicate these themes and include illustrative quotations from the focus-group transcripts.

List 1 Eight Themes That Characterize Barriers Residents Found to Answering Clinical Questions, Yale Primary Care Internal Medicine Program, 2003

Access to electronic information resources.

Residents described several barriers to accessing electronic medical information, which varied widely between hospitals and, to a lesser degree, between locations within a hospital.

They consistently encountered more obstacles at the two community hospitals. First, computer terminals were often not located at the point of care. They often found themselves wasting precious time in search of computers at some distance from their patients.

It's often very useful to have one key clinical question and to answer it right on the spot because you need to act fairly soon and you have like three other patients to see and you may not get back to that clinical question in a timely enough manner if you don't have the resources right there to be able to pursue that question. And, for example, one of the barriers here is that there's just nothing in the Emergency Room to look at. And so your immediate clinical questions…can't get answered on the spot when you have the questions on your mind.

Residents also lamented inferior technology, including outdated hardware, slow Internet connections, firewall restrictions, and inability to make printouts. The repeated frustrations with computer access at the community hospitals, for some residents, created an overall sense of nihilism.

The thing is, you get trapped into things that make you so frustrated that you're loathe to try any new settings…. And I think that happens to us over and over again. We encounter different problems, each one unpredictable, and that ends up causing inconvenience not only for us but for other people and so we sort of threw up our hands and it's just too hard.

Residents contrasted their frustration at the community hospitals with the point-of-care availability, speed, exclusive access, and thoughtful design of electronic information resources at the academic medical center.

[The academic medical center] is right at the point of contact. You have a computer—the same computer that you're checking your labs on, it's literally two clicks to getting a general clinical answer done right there…. You have somebody on a medication. You're not familiar with it. Don't necessarily know what the specific side effects are and you're not going to get all of the answers out of your palm pilot. You can make up your mind very quickly about what medications you're going to start this patient on or not.

Finally, while some residents enjoyed the convenience of personal handheld devices, most found their cost prohibitive and the program stipend inadequate.

Skills in searching electronic information resources.

Residents lamented their lack of familiarity with, or even awareness of, the wide range of electronic information resources currently available. They sensed that searching for original articles on Medline may not be the most efficient way to answer many of their questions. However, they did not know which resource might be most appropriate for a particular type of question.

I wasn't aware of half the Websites that you showed us…. So I think part of my problem is at least knowing appropriately where to go and not just going to Ovid or PubMed, but even having access to The Cochrane Library or different other Websites that are available to us that we just don't know about.

To overcome this barrier, one resident suggested a centralized Web site that would direct a searcher to appropriate information sites based on the nature of the clinical question.
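
As a purely illustrative sketch of that suggestion, the snippet below maps broad question types to candidate resources. The categories and resource names are hypothetical and are not drawn from any system the residents or the program actually used.

```python
# Hypothetical sketch of the resident's suggestion: a single entry point
# that points a searcher toward a resource suited to the question type.
# Question categories and resource names are illustrative only.
RESOURCE_MAP = {
    "therapy effectiveness": "Cochrane Library (systematic reviews)",
    "drug dosing or side effects": "a drug-reference database",
    "background or pathophysiology": "an online textbook",
    "diagnosis or prognosis": "Medline (original studies)",
}

def suggest_resource(question_type: str) -> str:
    """Return a candidate resource for a broad question type."""
    return RESOURCE_MAP.get(question_type, "Medline (general search)")

print(suggest_resource("therapy effectiveness"))
```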

In addition, residents professed difficulty articulating “answerable questions” and translating them into effective search terms and strategies.

And so, a lot of times, if I'm getting frustrated, it's usually that I didn't formulate my clinical question very well. So, I'm getting way too many resources, or not enough resources from my search strategies.

Finally, residents had difficulty knowing when to stop searching, because they remained uncertain of the validity, timeliness, or exhaustiveness of the information they retrieved.

It was regarding an algorithm for brain mets’ of unknown primary. Maybe because I don't have the skill, I didn't know when enough was enough. OK, I had four articles…and I found myself saying, “I need more. I need the latest data.” If the article was from ’96 I thought it was not up to date enough….I think we need to know when we have a clinical question, when do we have an answer to it? And that also will limit our time because we would know this is what we need…. But I think, many times, when clinical questions are assigned to someone, we find ourselves being trapped in this world of research.

Clinical question tracking.

If unable to respond to a question as a clinical scenario unfolded, residents often deferred the question to a later time. However, they rarely pursued these postponed questions.

And I feel like, if I leave clinic without answering the question, that the likelihood of it happening at home is pretty low, for various reasons.

Residents forgot these questions, despite their good intentions, and lamented the lack of an adequate system to track them.

I've had about five different systems in the last three years and there are remnants of all five around the house…. If they were all in one place, it's easier. So I had a little notebook I was going to use but, over time, your shoulders get so sore. You start dropping things out of your white coat and you can't stand to have something else weighing you down. So then it becomes scraps of paper.

Time.

In all three focus groups, residents immediately identified insufficient time in response to the key question about barriers. And throughout the discussion, time loomed as a pervasive barrier. With steady patient throughput in the hospital, short visit times in continuity clinic, and multiple competing responsibilities, they simply did not have time to seek answers to many of their clinical questions. The recent implementation of work hours restrictions exacerbated this time pressure.

And it's so limited by time, particularly in our clinic site where we have 20 minutes to see a patient. Ten minutes is spent waiting for someone to get the patient in the room, five to ten minutes seeing the patient, and ten minutes waiting to talk to your [preceptor] to check out. Then you're left with the 30 seconds that you have to answer, basically, all the questions the patient has.

To help overcome the time and skill barriers, residents encouraged the program to organize educational venues with protected time, specific curricula, and mentoring to pursue their clinical questions. These might take the form of “clinical question conferences” or modifications of existing teaching sessions such as resident report. However, a few residents objected to the paternalistic aspect of this approach, preferring to take personal responsibility only for their own learning.

I'm actually against that. Having to come up with a clinical question and then review it with mentors. I think it's for our betterment. If you have a legitimate clinical question and you're truly interested, I think you'll follow-up on it. But I don't think you should be forced to come up with questions for each patient you see.

Clinical question priority.

Residents selected which questions to pursue based on clear priorities.

Literally, there's a million things you could look up on patients. You have to stratify things. The ones that make it come up to the top of the bubble.

They were less likely to pursue questions that they perceived as nonurgent:

If I needed to know a question that night because I'm admitting the patient, there's no way that you're not going to look up that question and answer that question that night. But a lot of times, questions come up that are of interest but they don't affect the care that night and then, by the next day, the consultant has seen the patient and weighed in on what they think and you kind of go along with that without having reviewed the literature yourself.

They were also less likely to pursue questions that they felt were less relevant to patient care or unfeasible because of practical barriers, such as medication costs or poor patient adherence.

And I can suggest putting my patient on the ACE inhibitor. But if they come back to me and say, “Doctor, you know, my lights ain't on [i.e., I cannot even afford to pay my electricity bill], I'm going to take this fluid pill [instead] because it costs a couple of dollars,” you know, there's a difference. So we can cite the clinical question, but if the therapy that we're looking at cannot fit into that patient's lifestyle, not only do we give a useless treatment plan, we lose that patient because they say, “You don't relate to me. You don't understand me.”

Personal initiative.

While residents were quick to identify external barriers, they also recognized their own responsibility in educating themselves. They acknowledged that even unlimited access to state-of-the-art technology would not increase their pursuit rate in the absence of their own personal investment in the process. And this level of commitment waned with residents’ increasing fatigue or burnout.

Part of this training [is that] we have to be invested. So even if you gave us a laptop or PDA and a little visor with a little screen, we still have to have the discipline to access it and to go look it up…. So, what's most important is that we still have to take the initiative. Because you can have all the resources in the world, but at 2:00 in the morning, we choose the extra five minutes of sleep, and we don't look up the question.

Team dynamics.

The learning climate established by the teaching attending greatly influenced the residents’ motivation to pursue their clinical questions. Some attendings cultivated an atmosphere of collaborative learning and academic inquiry.

Sometimes when there's a really good team rapport, your whole motivation and your standards go up because everybody's really doing their work. This happened to me last month and I looked up more stuff last month than probably in the last I don't know how many months because the whole team dynamics were so good. And we were just as busy.

In contrast, attendings who assumed an authoritative style suppressed residents’ inclination to seek answers to their questions.

And the other side of that, too, I've had attendings who wanted to tell me so much stuff that I just wasn't motivated to go look up anything. They'll talk and—literally this happened like my first month—I had an attending who was a specialist and he talked like 20 minutes on one thing. And I just didn't have anything left.

The degree of decision-making autonomy also influenced residents’ information-seeking behavior. They were less likely to pursue clinical questions if the attending of record did not grant them the authority to act on the answers.

If you don't have control over that patient, you can come up with a really great answer, but if they're not used to using that drug and they're not comfortable with it, it's not going to get used…. I had a patient with hepato-renal syndrome and…we were getting ready to read his eulogy. But there were some things I thought we could have done. But our consults felt…the answer was, “Well, we have never used it. Well, maybe we could get it here, but we're not too comfortable.” Well, you know, the man's slippin’ now. And those things happen because we don't have complete control.

Many levels of training were represented on the “team,” including students, interns, residents, and attendings. In addition, the team often brought together individuals with different learning styles. This heterogeneity created barriers through confusion of roles or conflicting learning preferences as the team tried to collectively meet its information needs.

Something that just came to mind is who's responsible for the question?… As an intern I looked up a lot on my own. But was it the resident's responsibility as a teacher to do all the clinical question finding and try to bring that to rounds the next day? Or does it make more sense for the intern to try to look those things up because they're taking ownership in the patient by doing that? And how to learn the best? Because, as a resident, I struggle with this issue. When I bring everything to the table and say, instead of just gushing information, “Do you learn that way? Or do you learn better by looking it up yourself?” And different interns have responded differently.

Institutional culture.

Residents perceived that the institutional cultures surrounding information services differed between the community hospitals and the university-based medical center. This difference clearly affected both their access to medical information and their inclination to seek it.

At the two community hospitals, residents found the culture more inhospitable to meeting their information needs. They encountered a prevailing perception that computers in clinical areas are intended for managing patient data and not for looking up medical information.

Yeah, but even those, you get yelled at for using them. You go to use them, and they say, “These are not for using…They're for looking up data—patient data.”

Residents also felt under suspicion for inappropriate use of computers, such as personal correspondence or “surfing” the Internet.

They see these flashy pictures and the scrolling on the screen and they really think we're sort of wasting time.

And they perceived pressure to “relinquish their seats” to hospital staff, who needed the computer terminals for higher priority purposes.

The residents hypothesized that the contrast in institutional culture between the community hospitals and the academic medical center originated in differing institutional priorities regarding medical education.

Yes, but it all goes down to what's the top priority? Is your top priority healing people and getting them out? Or is your top priority training people to take care of people? It just matters what your thrust is. Certainly, I've always gotten the feeling that my job here [community hospital] is to work first. And then, when I have my time…I'll do some kind of clinical questions.

In contrast, they perceived no essential differences in the front-line hospital staffs, whose behavior merely reflected the institutional priorities.

I don't think the staff's any different, though…. I think that your staff here would be just as interested in hearing about clinical questions…. I think the difference is that you need an institutional investment in the process. I mean, we just spent all this time talking about all the hardware that you need to make this happen in an efficient way and so the institution, as a whole, needs to say, “This is a worthwhile investment for education. This is one of our main missions and so we're going to pour this money into this buying of hardware and make it available to use.” And I think once you have that, nursing and other ancillary staff would be happy to be a part of it. But we just haven't seen it yet.

Conceptual model of themes within the sequence of EBM steps.

Finally, we considered the eight barrier themes within the sequence of steps involved in answering clinical questions. We found that residents may encounter specific barriers at every step, as illustrated in our conceptual model (see Figure 1). The process begins when a resident encounters a clinical question. It may be immediately apparent to her, or an attending physician or another resident may reveal to her a previously unrecognized information need. If she chooses to defer the question, she may never pursue it because of difficulties in tracking her questions. She may consider pursuing a question but not commit to it if she assigns it a low priority, faces discouraging team dynamics, or lacks personal initiative. After committing to pursue a question, her quest may be scuttled by limited access to electronic medical information resources. Finally, if she reaches the searching step, she may lack the skills to search an electronic database effectively for the answer to her question. (To complete the evidence-based decision-making process, the resident must then appraise the information and apply it to a particular patient. However, our focus groups did not inquire about barriers a resident might encounter after finding the information to answer her question.) In addition to the barriers associated with particular EBM steps, two more pervasive barriers, time and institutional culture, loom over the entire process. It is also noteworthy that attitudinal or cultural barriers may lead a resident to abandon the pursuit of a question before she would encounter some of the technical barriers.

Figure 1: A conceptual model that locates the eight EBM-barrier themes within the sequence of steps involved in answering clinical questions. A resident may encounter barriers at each step. For example, she may consider pursuing a question but not commit to it if she assigns it a low priority, faces discouraging team dynamics, or lacks personal initiative. The more pervasive barriers of time and institutional culture, in contrast, loom over the entire process. Of note, to complete the evidence-based decision-making process, a clinician must then appraise the information and apply it to a particular patient. However, the focus-group script used in this study (see the text) did not inquire about barriers a resident might face after finding the information to answer her question.
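
For readers who prefer a compact restatement of the model, the sketch below encodes each step as a mapping to the barrier themes that may derail it, with the two pervasive barriers listed separately. The step labels are our own shorthand, and the structure is illustrative rather than part of the study.

```python
# Hypothetical, compact restatement of the conceptual model: each step
# in pursuing a clinical question paired with the barrier themes that
# may derail it. Step labels are shorthand invented for illustration.
STEP_BARRIERS = {
    "encounter a clinical question": [],
    "defer the question": ["clinical question tracking"],
    "commit to pursuing it": [
        "clinical question priority",
        "team dynamics",
        "personal initiative",
    ],
    "reach an information resource": ["access to electronic information resources"],
    "search for the answer": ["skills in searching electronic information resources"],
}

# Barriers that loom over the entire process rather than any single step.
PERVASIVE_BARRIERS = ["time", "institutional culture"]

for step, barriers in STEP_BARRIERS.items():
    print(f"{step}: {', '.join(barriers) if barriers else 'none specific'}")
print("pervasive:", ", ".join(PERVASIVE_BARRIERS))
```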

Discussion

We are confident that our barrier themes and conceptual model accurately portray the residents’ experience, since we used several accepted techniques in our data collection and analysis.16–20 We enhanced validity (trustworthiness in qualitative research) by constantly comparing emerging themes, analyzing deviant cases, vetting our findings with a post hoc focus group, and interpreting our results in the context of existing theory. To ensure reliability (exhaustiveness in qualitative research), we used professional facilitation for the focus groups, followed a discussion guide, audiotaped and transcribed the discussions, had each transcript coded by multiple investigators, and maintained an audit trail of data collection and analysis.

Our study extends previous work on barriers to the practice of EBM. Several studies of practicing physicians, including surveys,6–8 focus groups,9,10 an analysis of test searchers’ field notes,11 and a multifaceted study of a group practice,12 have explored barriers to practicing EBM. Physicians consistently cited lack of time as the primary barrier to answering their clinical questions. In addition, they identified difficulty phrasing clinical questions, not knowing when to stop searching, and their lack of awareness, access, and skills in searching medical information resources. While some physicians expressed negative perceptions of EBM, most did not identify skepticism of the idea of EBM as a barrier.

Less is known about EBM barriers faced by physicians in training. In our earlier quantitative study of medical information needs, medicine residents identified lack of time and forgetting as reasons for failing to pursue their clinical questions.5 Montori et al.13 conducted individual semistructured interviews to explore medical residents’ perceptions and practice of EBM. The degree of time constraint largely determined their path of pursuit of medical information. Finally, in a focus-group study, surgery residents also identified lack of time among “challenges to EBM practice.” However, they also faced more fundamental obstacles, including confusion over the definition of EBM, lack of mentorship, fear of reprisal from skeptical faculty, and widespread resistance to EBM.14

Like practicing physicians, residents in our study encountered the technical barriers of insufficient time, limited access to information resources, and poor searching skills. In addition, their inability to track their questions may represent a barrier unique to their status as trainees, who lack a physical home base from which to organize their hectic lives. Among residents’ skill deficiencies, not knowing when to stop searching presents an interesting challenge. One resident felt “trapped in a world of research,” uncertain whether he had enough information to stop searching and start acting. In his qualitative studies of learning, Slotnick21,22 found that practicing physicians stop searching, often well before exhausting all of the relevant resources, based on several learning heuristics. They move on to implementation if they believe they must act urgently, have a clear answer, have a reasonable plan, or will gain little from additional pursuit.

Additionally, emotional or cultural barriers blunted residents’ inclination to pursue their clinical questions. Residents prioritized their clinical questions based on urgency, clinical relevance, or practicality. Practicing physicians similarly employ a winnowing process, which is at first affective and then quite practical.21,22 They acknowledge a question based on a “feeling” of a discrepancy between what is and what should be. Then, before committing to pursuing an answer, they ask themselves four questions: (1) Is this problem for me? (2) Is there a solution? (3) Are resources available to learn the solution? and (4) Is practice change acceptable?

In contrast, residents remain uniquely vulnerable to variable team dynamics and inhospitable institutional cultures. They learn and practice within the team-centered organization of training programs, which are based in and sponsored by hospitals. Team dynamics determine the learning microclimate, which can limit residents’ autonomy, stifle their thirst for learning, or bring together diverse learning styles and learner levels. The hospital, which serves many constituencies in addition to its residency programs, sets the learning macroclimate. At the community hospitals, residents in our study encountered a more inhospitable climate. The importance of team dynamics and hospital culture confirms the influence of the “hidden curriculum”23 on residents’ learning.

Our findings suggest several interventions to help residents practice EBM, some of which were offered by the residents themselves. Clearly, reliable, rapid, and preferably exclusive access to electronic information resources at the point of care remains essential. While hospitals must understandably attend to Internet security, their “firewalls” must permit the penetration of legitimate medical information. Residents also need more training in articulating clinical questions, selecting and searching evidence-based information resources, and “knowing when to stop.” As suggested by the residents, “clinical question conferences” or inclusion of mentored information searching in existing educational venues may address issues of time and skills. In support of this approach, attendance at a problem-based learning conference was positively correlated with surgery residents’ in-training exam scores,24 while attendance at a traditional noon conference bore no relation to medical residents’ certifying exam scores.25

Residents need better systems to manage their accumulating questions, lest they quickly fade from memory. Two investigators developed electronic clinical question databases, which allow entry of questions, sources searched, and summaries of answers.26,27 As these databases mature, they can themselves serve as information resources, provided that someone oversees the quality and decay of the entries. We can envision a similar, but more interactive, Internet-based, practice-based learning “log,” which would track residents’ questions, remind them of outstanding ones, and generate self-directed learning reports.
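
To illustrate what such a log might record, the sketch below is a minimal, hypothetical implementation that stores questions, the sources searched, and answer summaries, and can list the questions still outstanding. It is not a description of the databases cited above, and all names are invented.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Minimal, hypothetical sketch of a practice-based learning "log":
# it records clinical questions, sources searched, and answer summaries,
# and reports which questions remain outstanding. Illustrative only.

@dataclass
class QuestionEntry:
    question: str
    asked_on: date
    sources_searched: List[str] = field(default_factory=list)
    answer_summary: Optional[str] = None

class LearningLog:
    def __init__(self) -> None:
        self.entries: List[QuestionEntry] = []

    def add(self, question: str) -> QuestionEntry:
        entry = QuestionEntry(question, date.today())
        self.entries.append(entry)
        return entry

    def outstanding(self) -> List[QuestionEntry]:
        """Questions still awaiting an answer summary (reminder list)."""
        return [e for e in self.entries if e.answer_summary is None]

    def report(self) -> str:
        """A simple self-directed learning summary."""
        answered = len(self.entries) - len(self.outstanding())
        return f"{answered} of {len(self.entries)} logged questions answered"

log = LearningLog()
entry = log.add("Which diuretic regimen is preferred in hepatorenal syndrome?")
entry.sources_searched.append("Cochrane Library")
print(log.report())            # 0 of 1 logged questions answered
print(len(log.outstanding()))  # 1
```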

While informatics training and electronic infrastructure are necessary, our results suggest that they are not sufficient. “Don't drop 100 laptops in our laps and then think we're going to change overnight,” one resident warned. We must help residents overcome attitudinal and cultural obstacles as well.

We should not challenge residents’ question “priority system,” since it reflects urgency, relevance, and practicality—all motivators of “need to know” in adult learners. However, we might help them to reach lower down on their list of questions. Questions with less urgency or relevance for the patient before them may, nonetheless, retain future educational value. And, considering opportunity costs, an EBM approach to lifelong learning will likely bear more fruit than will traditional lecture-based continuing medical education approaches.28

Program faculty can influence team dynamics. Based on our findings, a favorable EBM microclimate is one that fosters academic inquiry and shared learning. Faculty must also preserve residents’ autonomy while ensuring adequate supervision. Finally, the hospital institutional culture may represent the most formidable barrier, since it is larger than the residency program itself. Program directors should advocate prominent representation of educational programs in the hospital's mission. In their negotiations, they can articulate the powerful influence of institutional culture on learning behaviors, the benefits of facilitating evidence-based practice, and the institutional requirement of the Accreditation Council for Graduate Medical Education that “a sponsoring institution must be organized for the conduct of GME in a scholarly environment and must be committed to excellence in both medical education and patient care.”29

A few limitations should temper the interpretation of our results. First, since we studied residents at a single program and three hospitals, some of the barriers and solutions may not be generalizable to other settings. In particular, technical issues, such as access to information, likely vary from hospital to hospital. Second, we relied on convenience (rather than theoretical) sampling. Nonetheless, we believe the participants represent a broad range of experiences and perspectives. Finally, sacrificing breadth for depth, we limited our inquiry to two of the four steps of the evidence-based decision-making process. The focus-group questions related to barriers encountered in asking clinical questions and searching for medical information. Thus, we could not detect barriers residents may encounter as they proceed to appraising information and applying it to their patients.

In conclusion, residents encounter several barriers to answering their clinical questions, some of which are unique to their status as trainees. While access to electronic information resources and informatics training remain necessary, they are not sufficient to help residents practice EBM. Educators must also attend to residents’ attitudes toward self-directed learning and to larger programmatic and institutional cultures. Future research should examine the effectiveness of strategies to overcome these obstacles.

ACKNOWLEDGMENTS

The authors wish to thank Ms. Marjie Lipsitz-Shapiro for facilitating the focus groups. Drs. Julie Rosenbaum and Eric Holmboe and Ms. Lynnea Ladouceur also deserve appreciation for their guidance in analyzing qualitative data.

References

1 Institute of Medicine. Crossing the Quality Chasm: a New Health System for the 21st Century. Washington, DC: National Academy Press, 2001.
2 Covell DG, Uman GC, Manning PR. Information needs in office practice: are they being met? Ann Intern Med. 1985;103:596–9.
3 Gorman PN, Helfand M. Information seeking in primary care: how physicians choose which clinical questions to pursue and which to leave unanswered. Med Decis Making. 1995;15:113–9.
4 Ramsey PG, Carline JD, Inui TS, et al. Changes over time in the knowledge base of practicing internists. JAMA. 1991;266:1103–7.
5 Green ML, Ciampi MA, Ellis PJ. Residents’ medical information needs in clinic: are they being met? Am J Med. 2000;109:218–23.
6 McColl A, Smith H, White P, Field J. General practitioners' perceptions of the route to evidence based medicine: a questionnaire survey. BMJ. 1998;316:361–5.
7 McAlister FA, Graham I, Karr GW, Laupacis A. Evidence-based medicine and the practicing clinician. J Gen Intern Med. 1999;14:236–42.
8 Young JM, Ward JE. Evidence-based medicine in general practice: beliefs and barriers among Australian GPs. J Eval Clin Pract. 2001;7:201–10.
9 Putnam W, Twohig PL, Burge FI, Jackson LA, Cox JL. A qualitative study of evidence in primary care: what the practitioners are saying. CMAJ. 2002;166:1525–30.
10 Freeman AC, Sweeney K. Why general practitioners do not implement evidence: qualitative study. BMJ. 2001;323:1100–2.
11 Ely JW, Osheroff JA, Ebell MH, et al. Obstacles to answering doctors’ questions about patient care with evidence: qualitative study. BMJ. 2002;324:710.
12 Oswald N, Bateman H. Treating individuals according to evidence: why do primary care practitioners do what they do? J Eval Clin Pract. 2000;6:139–48.
13 Montori VM, Tabini CC, Ebbert JO. A qualitative assessment of 1st-year internal medicine residents’ perceptions of evidence-based clinical decision-making. Teach Learn Med. 2001;14:114–8.
14 Bhandari M, Montori V, Devereaux PJ, Dosanjh S, Sprague S, Guyatt GH. Challenges to the practice of evidence-based medicine during residents’ surgical training: a qualitative study using grounded theory. Acad Med. 2003;78:1183–90.
15 Krueger RA. Developing questions for focus groups. In: Morgan DL, Krueger RA, eds. The Focus Group Kit. Vol 3. Thousand Oaks, CA: Sage Publications, 1998.
16 Strauss A, Corbin J. Basics of Qualitative Research. 2nd ed. Thousand Oaks, CA: Sage Publications, 1998.
17 Mays N, Pope C. Qualitative research: rigour and qualitative research. BMJ. 1995;311:109–12.
18 Mays N, Pope C. Qualitative research in health care. Assessing quality in qualitative research. BMJ. 2000;320:50–2.
19 Pope C, Ziebland S, Mays N. Qualitative research in health care. Analysing qualitative data. BMJ. 2000;320:114–6.
20 Krueger RA. Analyzing and reporting focus group results. In: Morgan DL, Krueger RA (eds). The Focus Group Kit. Vol 6. Thousand Oaks, CA: Sage Publications, 1998:139
21 Slotnick HB. How doctors know when to stop learning. Med Teach. 2000;22:189–96.
22 Slotnick HB. How doctors learn: physicians’ self-directed learning episodes. Acad Med. 1999;74:1106–17.
23 Hafferty FW. Beyond curriculum reform: confronting medicine's hidden curriculum. Acad Med. 1998;73:403–7.
24 Itani KM, Miller CC, Church HM, McCollum CH. Impact of a problem-based learning conference on surgery residents’ in training exam (ABSITE) scores. American Board of Surgery in Training Exam. J Surg Res. 1997;70:66–8.
25 FitzGerald JD, Wenger NS. Didactic teaching conferences for IM residents: who attends, and is attendance related to medical certifying examination scores? Acad Med. 2003;78:84–9.
26 Ely JW, Osheroff JA, Ferguson KJ, Chambliss ML, Vinson DC, Moore JL. Lifelong self-directed learning using a computer database of clinical questions. J Fam Pract. 1997;45:382–8.
27 Crowley SD, Owens TA, Schardt CM, et al. A Web-based compendium of clinical questions and medical evidence to educate internal medicine residents. Acad Med. 2003;78:270–4.
28 Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700–5.
29 Accreditation Council for Graduate Medical Education. Institutional Requirements. http://www.acgme.org/acWebsite/irc/irc_IRCpr703.asp. Accessed November 2004.