Literature Search Strategies
Reproducibility of Literature Search Reporting in Medical Education Reviews
Maggio, Lauren A. MS(LIS), MA; Tannery, Nancy H. MLS; Kanter, Steven L. MD
Ms. Maggio is medical education librarian, Lane Medical Library, Stanford University School of Medicine, Stanford, California.
Ms. Tannery is associate director, User Services, Health Sciences Library System, University of Pittsburgh, Pittsburgh, Pennsylvania.
Dr. Kanter is vice dean, University of Pittsburgh School of Medicine, Pittsburgh, Pennsylvania.
Correspondence should be addressed to Ms. Maggio, Stanford University, Lane Medical Library, 300 Pasteur Drive, Room L-109, Stanford, CA 94035; telephone: (650) 725-5493; e-mail: email@example.com.
First published online June 20, 2011
Purpose: Medical education literature has been found to lack key components of scientific reporting, including adequate descriptions of literature searches, thus preventing medical educators from replicating and building on previous scholarship. The purpose of this study was to examine the reproducibility of search strategies as reported in medical education literature reviews.
Method: The authors searched for and identified literature reviews published in 2009 in Academic Medicine, Teaching and Learning in Medicine, and Medical Education. They searched for citations whose titles included the words “meta-analysis,” “systematic literature review,” “systematic review,” or “literature review,” or whose publication type was listed in MEDLINE as “meta-analysis” or “review.” The authors created a checklist to identify key characteristics of literature searches and of literature search reporting within the full text of the reviews. The authors deemed searches reproducible only if the review reported both a search date and Boolean operators.
Results: Of the 34 reviews meeting the inclusion criteria, 19 (56%) explicitly described a literature search and mentioned MEDLINE; however, only 14 (41%) also mentioned searches of nonmedical databases. Eighteen reviews (53%) listed search terms, but only 6 (18%) listed Medical Subject Headings, and only 2 (6%) mentioned Boolean operators. Fifteen (44%) noted the use of limits. None of the reviews included reproducible searches.
Conclusions: According to this analysis, literature search strategies in medical education reviews are highly variable and generally not reproducible. The authors provide recommendations to facilitate future high-quality, transparent, and reproducible literature searches.
In the last decade, medical education has been increasingly recognized as a discipline of scientific inquiry that has the potential to affect the outcomes of learners and ultimately their patients.1–3 With this recognition has come a call for the publication of rigorous, scientifically sound, outcomes-based inquiry that is equivalent in quality to clinical research.4–7 However, some recent researchers have concluded that the medical education literature lacks the “essential elements of scientific reporting,”8 thus inhibiting the ability of medical educators to replicate and build on previous scholarship.9–11
Because, as Cook and colleagues9 write, scholarly innovations do “not appear from thin air; they must build on prior work,” this analysis focuses on the reporting of search strategies in medical education reviews, which, as defined by PubMed, include articles that examine published materials on a subject. Prior research in the clinical realm has examined the reporting of search strategies in clinical reviews,12 and guidelines for the scientific reporting of search strategies for clinical studies are also available.10 Our own search of the medical education literature revealed that no similar guidelines for literature search reporting exist for medical education reviews. In this analysis we seek both to characterize the reporting of literature searches in recent medical education reviews and to make recommendations for conducting and reporting high-quality, transparent, and reproducible searches.
Method
In this study we examined literature reviews from three key academic medical education journals: Academic Medicine, Teaching and Learning in Medicine, and Medical Education. We selected these journals because they focus on medical education research, are indexed in the MEDLINE database, and are included in Journal Citation Reports. To locate medical education reviews in these three publications, we searched MEDLINE via PubMed on May 21, 2010, using the following search strategy:
(meta-analysis[pt] OR review[pt] OR meta-analysis[ti] OR systematic literature review[ti] OR systematic review[ti] OR literature review[ti]) AND (“Acad Med”[Journal] OR “Med Educ”[Journal] OR “Teach Learn Med”[Journal])
This search strategy indicates that we queried the MEDLINE database to retrieve citations indexed by the National Library of Medicine (NLM) that met the following criteria:
* their publication type (pt) was “meta-analysis” or “review,” or
* their titles (ti) contained any of the phrases “meta-analysis,” “systematic literature review,” “systematic review,” or “literature review,” and
* they were found in the journals Academic Medicine, Medical Education, or Teaching and Learning in Medicine.
The Boolean operator “OR” expanded our search while the operator “AND” limited it.
We further limited our search to reviews published in 2009, the most recent year for which all reviews would have been indexed. We managed and shared citations using RefWorks (Bethesda, Md). Two of us (L.A.M. and N.H.T.) retrieved and examined the full text of all the items cited. We created a checklist (List 1) to identify the presence or absence of key characteristics of literature searches and of reports of literature searches. Two of us (L.A.M. and N.H.T.) independently completed the checklist for all reviews and then compared results. We agreed unanimously on the presence or absence of all characteristics. We (L.A.M., N.H.T., S.L.K.) considered the literature searches to be reproducible only if Boolean operators and a search completion date (day/month/year) were present in the review.
Results and Discussion
Thirty-six reviews met the search criteria. We retrieved 27 (75%) of them from Academic Medicine,13–39 1 (3%) from Teaching and Learning in Medicine,40 and 8 (22%) from Medical Education.41–48 Two of the articles were commentaries (both from Academic Medicine), and thus we excluded them from this study.38,39 Of the remaining 34, over half (19 [56%]) explicitly described a literature search (Table 1).17,19,24,26,28–31,35,36,40–45,47,48 Notably, the other 15 (44%), although designated as reviews by NLM criteria, did not explicitly describe a literature search.
All 19 reviews reporting literature searches explicitly mentioned the use of a database (MEDLINE in all cases). Three reviews17,30,35 listed only one database (MEDLINE), even though research shows that no single database provides comprehensive coverage of all medical education literature.8,49,50 Although it indexes approximately 5,000 journals, MEDLINE primarily indexes biomedical journals and therefore may not include all journals that publish articles on medical education research, such as the Journal of Graduate Medical Education or the Journal of the International Association of Medical Science Educators. In the future, MEDLINE may include these journals, but at this time searching other databases is necessary to retrieve citations to articles from these publications. Additionally, MEDLINE does not index alternate publication types such as books, dissertations, and reports; for example, reports published by the Association of American Medical Colleges are indexed in the Education Resources Information Center (ERIC) database, but not MEDLINE. Therefore, medical education researchers should search multiple databases to ensure comprehensive retrieval.51
The most common databases named in the reviews that reported searches of multiple databases were ERIC and CINAHL (both mentioned by eight reviews26,28,29,31,36,40–42). Fewer than half (14 [41%]) also reported searches of nonmedical databases such as ERIC or Web of Science (List 2).19,24,26,28,29,36,40–45,47,48 A minority of reviews named resources such as PsycINFO (which covers psychological literature),24,41,42 ISI Web of Knowledge,19 Google,43 or Scopus40—all of which provide broad coverage of areas such as the social sciences and engineering. Additionally, one review19 listed ABI Inform, a business literature database. Using a variety of databases, including those not focused exclusively on medicine, gives medical education researchers a multidisciplinary lens through which to view and build on educational concepts in other arenas, such as business or law. Therefore, we strongly recommend that medical education researchers search a variety of relevant databases to ensure comprehensive searches.
Reporting search terms is essential for search reproducibility and judging search completeness.10 This study revealed that even though all 19 reviews describing a literature search strategy did include search terms, the reporting of these search terms was highly variable. Six (18%) of the reviews19,28–30,36,48 reporting search terms stated that the authors used terms from Medical Subject Headings (MeSH), a controlled vocabulary that has been hailed as a powerful tool for generating unambiguous searches of the biomedical literature.52 MeSH terms are specialized descriptors that describe MEDLINE-indexed articles. To enable comprehensive searches in MEDLINE, MeSH has a hierarchical structure that is applied in varying ways, depending on the platform used. For example, in MEDLINE searches via PubMed, all MeSH terms are automatically “exploded.” This means that all related, more specific terms are also integrated into the search. Thus, a PubMed search using the broad MeSH term “education, medical” also retrieves citations related to more specific MeSH terms such as “education, medical, graduate.” PubMed's default to explode MeSH facilitates comprehensive searching; however, this default can be deactivated, which would significantly alter the search. Furthermore, other MEDLINE platforms (e.g., Ovid, EBSCO) control the explode function in their own ways. Thus, for reproducibility, authors must note explicitly whether or not the MeSH terms in their search were exploded.
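For illustration, the difference can be made explicit with PubMed's own field tags ([mh:noexp] is PubMed's tag for suppressing explosion; the term shown is the one discussed above):

```
"education, medical"[mh]        retrieves the term plus all narrower terms
                                (e.g., "education, medical, graduate")
"education, medical"[mh:noexp]  retrieves only citations indexed with this
                                exact term; explosion is deactivated
```

Reporting the field tags actually used, rather than bare terms, removes this ambiguity for readers attempting to replicate a search.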
Searches using MeSH terms alone “often fail to capture the subject requirements of medical education”50; therefore, including key words and their synonyms in a literature search is necessary to ensure comprehensiveness. The failure of searches to uncover important literature relevant to a study may be partially attributed to researchers seeking information on concepts, such as education or new trends, outside the traditional scope of biomedicine. For example, the term “faculty development” is not a MeSH term. Additionally, searching only MeSH terms can fail to retrieve citations to recently published articles. As newly published articles are entered into PubMed, they are initially considered “in process”; that is, the most recent items are being indexed with MeSH and migrated to MEDLINE, which is a subset of PubMed. Therefore, searches using only MeSH terms will not retrieve citations (no matter how relevant) that are not yet indexed. Furthermore, indexing cycles for journals vary. Personal correspondence with the NLM (John Thomas, March 29, 2010) revealed, for example, that the quarterly journal Teaching and Learning in Medicine is not indexed in the month it is published but, rather, several months after publication. Thus, the time delay of applying MeSH terms to this journal's citations means that a PubMed search using only MeSH terms would fail to retrieve articles in this journal's most recent issues.
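A search that pairs a MeSH term with free-text synonyms, sketched in PubMed syntax (the specific key words are illustrative), would retrieve both indexed and still-in-process citations:

```
"education, medical"[mh] OR "medical education"[tiab]
    OR "faculty development"[tiab]
```

The [tiab] tag searches words in titles and abstracts, which are available as soon as a citation enters PubMed, before MeSH indexing is complete.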
This time gap highlights the necessity for authors to include the date of their search. We found that only one review (3%) reported the date on which the investigators executed their literature search.48 The reporting of a search date enables accurate replication of the search to ensure identical retrieval. Because citations are added daily to databases, a search date clearly demarcates citations that were available at the time of the search. Additionally, the reporting of a search date enables readers to determine the currency of the review10 and helps to clarify inclusion criteria.
As mentioned, MeSH terms are designed specifically for MEDLINE. Although researchers can certainly use MeSH terms to search other databases, these terms may not be the most appropriate in those resources. ERIC, for example, uses its own controlled vocabulary. None of the reviews we examined in this study mention controlled vocabularies other than MeSH. As in searches of MEDLINE, using a combination of key words and each database's own controlled vocabulary is important for ensuring comprehensiveness.51
In addition to carefully selecting search terms, researchers must know how to logically combine search terms, because different combinations can retrieve different sets of articles. Only two (6%) of the reviews we examined for this study listed the Boolean operators that the investigators used in their search strategy.24,28 Boolean operators combine two or more terms using “AND,” “OR,” and/or “NOT.” Analysis of the search strategies used for Cochrane Collaboration reviews showed that 90% of the search strategies studied had at least one error.53 Without clear reporting of the Boolean logic used in a literature search, identifying potential errors (such as those discovered in the analysis of Cochrane reviews53), ensuring a study's validity, and assessing the appropriateness of a project's methods are all impossible.
Fifteen (44%) of the reviews we examined report the use of limits.17,19,24,26,28–32,35,36,40,41,45,47 The most common limits reported were restricting results either to a certain date range or to English-language articles. The use of limits is an important component of honing searches and defining inclusion criteria. Researchers using limits in PubMed should note that, as with the delayed application of MeSH terms, limits are not instantaneously applied to all PubMed citations; rather, they are delayed until the articles are indexed for MEDLINE. For example, when searching MEDLINE via PubMed and applying a limit, such as “review,” a searcher may not retrieve a citation for a review that is present in the database but not yet tagged “review article.” Therefore, when applying limits, it is important to ensure that recent relevant citations are not excluded; and, again, explicitly reporting the limits used and the date of the search is vital.
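Common limits such as language and publication date can also be written directly into the search string with PubMed field tags, which makes them explicit in the reported strategy (the values shown are illustrative):

```
("education, medical"[mh] OR "medical education"[tiab])
    AND english[la] AND 2000:2009[dp]
```

Here [la] restricts by language and [dp] by date of publication; as noted above, some limits may not yet apply to the newest, not-yet-indexed citations.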
Hand searching, which was specifically mentioned in 15 (44%) of the reviews we examined,17,19,26,28–32,35,36,43–45,47,48 can mitigate the inconsistencies of database searching and enable the inclusion of recent articles or journals that are not indexed in the databases searched. Research has shown that, in the clinical realm, hand searching leads to relevant citations that would not otherwise have been found.54 Thus, hand searches are essential for creating a comprehensive search in biomedical and medical education literature.
Previous researchers have recommended clearly identifying the roles of everyone involved in a publication, including any individuals—such as librarians—who design and/or conduct literature searches.10 Five (15%) of the reviews we examined report some degree of involvement of an information professional or librarian,17,19,29,32,35 but only two (6%) clearly identify the specific role of the librarian.29,35 Librarians have specialized knowledge of database structure and information retrieval, and they are well qualified to undertake several tasks, including not only designing and running comprehensive literature searches but also retrieving and managing citations, and writing manuscripts.55 In fact, research has demonstrated that because experienced librarians have superior recall compared with subject experts who are novice searchers, collaborating with them may add value to MEDLINE searches.56 Golder and colleagues57 note that reviews in which librarians participated were more likely to feature complex search strategies and to be well described.
The International Committee of Medical Journal Editors (ICMJE)'s Uniform Requirements for Manuscripts Submitted to Biomedical Journals instruct authors to “[i]dentify the methods, apparatus (give the manufacturer's name and address in parentheses), and procedures in sufficient detail to allow others to reproduce the results.”58 Although all three of the journals included in our study recommend that submitting authors follow ICMJE guidelines, none of the reviews we examined described their search strategies in enough detail to be, based on our criteria, reproducible. Most often, the absence of the Boolean operators used to combine search terms and the absence of a specific search date precluded the ability to reproduce a search. These absences are not unique to medical education: A study of randomly selected meta-analyses in clinical medicine found that only 6.7% reported a search strategy in enough detail to be reproduced.53
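By our criteria, then, a minimally reproducible search report would state at least the database and platform, the complete strategy with its Boolean operators, any limits applied, and the search date. A hypothetical sketch (the strategy and limits shown are illustrative, not drawn from any review we examined):

```
Database/platform:  MEDLINE via PubMed
Search date:        21 May 2010
Full strategy:      ("education, medical"[mh] OR "medical education"[tiab])
                    AND simulation[tiab]
Limits:             English language; published 2000-2009
```

A report of this form allows a reader to rerun the identical search and account for any citations added after the stated date.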
Our study has some limitations. We examined the characteristics of literature search reporting in medical education research reviews; however, we included only three journals focused on medical education research and chose not to include journals such as the Journal of Graduate Medical Education or the Journal of the International Association of Medical Science Educators because they are not indexed in MEDLINE or included in Journal Citation Reports. Clinical literature also includes medical education reviews, but we did not explore articles in these publications. Additionally, this report provides a snapshot of 2009 and may not be representative of earlier (or later) years of medical education research.
Conclusions
This examination of medical education reviews revealed that literature search strategies in three medical-education-focused journals are of variable quality. Additionally, when reviews report a search strategy, several key elements of the search strategy may be absent, which precludes reproducibility. Comprehensive search strategies and thorough descriptions of those strategies are key to high-quality, transparent, reproducible medical education research.
One of the authors (S.L.K.) serves as editor-in-chief of one of the journals included in the analysis (Academic Medicine).
3 Morrison JM, Sullivan F, Murray E, Jolly B. Evidence-based education: Development of an instrument to critically appraise reports of educational interventions. Med Educ. 1999;33:890–893.
5 Patrick LJ, Munro S. The literature review: Demystifying the literature search. Diabetes Educ. 2004;30:30–34, 36–38.
6 Guidelines for evaluating papers on educational interventions. BMJ. 1999;318:1265–1267.
7 Cook DA, Bordage G, Schmidt HG. Description, justification and clarification: A framework for classifying the purposes of research in medical education. Med Educ. 2008;42:128–133.
8 Beckman TJ, Cook DA. Developing scholarly projects in education: A primer for medical teachers. Med Teach. 2007;29:210–218.
9 Cook DA, Bowen JL, Gerrity MS, et al. Proposed standards for medical education submissions to the Journal of General Internal Medicine. J Gen Intern Med. 2008;23:908–913.
10 Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. PLoS Med. 2009;6:e1000100.
12 Sampson M, McGowan J, Cogo E, Grimshaw J, Moher D, Lefebvre C. An evidence-based practice guideline for the peer review of electronic search strategies. J Clin Epidemiol. 2009;62:944–952.
40 Tempelhof MW. Personal digital assistants: A review of current and potential utilization among medical residents. Teach Learn Med. 2009;21:100–104.
41 Bokken L, Linssen T, Scherpbier A, van der Vleuten C, Rethans JJ. Feedback by simulated patients in undergraduate medical education: A systematic review of the literature. Med Educ. 2009;43:202–210.
42 Cook DA, Triola MM. Virtual patients: A critical literature review and proposed next steps. Med Educ. 2009;43:303–311.
43 Dowell J, Merrylees N. Electives: Isn't it time for a change? Med Educ. 2009;43:121–126.
44 Hill AG, Yu TC, Barrow M, Hattie J. A systematic review of resident-as-teacher programmes. Med Educ. 2009;43:1129–1140.
45 Jha V, Quinton ND, Bekker HL, Roberts TE. Strategies and interventions for the involvement of real patients in medical education: A systematic review. Med Educ. 2009;43:10–20.
46 Martimianakis MA, Maniate JM, Hodges BD. Sociological interpretations of professionalism. Med Educ. 2009;43:829–837.
47 Ponnamperuma GG, Karunathilake IM, McAleer S, Davis MH. The long case and its modifications: A literature review. Med Educ. 2009;43:936–941.
48 Ruiz JG, Cook DA, Levinson AJ. Computer animations in medical education: A critical literature review. Med Educ. 2009;43:838–846.
49 Haig A, Dozier M. BEME guide no. 3: Systematic searching for evidence in medical education—Part 2: Constructing searches. Med Teach. 2003;25:463–484.
50 Haig A, Dozier M. BEME guide no 3: Systematic searching for evidence in medical education—Part 1: Sources of information. Med Teach. 2003;25:352–363.
51 Reed D, Price EG, Windish DM, et al. Challenges in systematic reviews of educational intervention studies. Ann Intern Med. 2005;142:1080–1089.
52 Lowe HJ, Barnett GO. Understanding and using the medical subject headings (MeSH) vocabulary to perform literature searches. JAMA. 1994;271:1103–1108.
53 Sampson M, McGowan J. Errors in search strategies were identified by type and frequency. J Clin Epidemiol. 2006;59:1057–1063.
54 Hopewell S, Clarke M, Lusher A, Lefebvre C, Westby M. A comparison of handsearching versus MEDLINE searching to identify reports of randomized controlled trials. Stat Med. 2002;21:1625–1634.
55 Klem M, Saghafi E, Abromitis R, Stover A, Dew MA, Pilkonis P. Building PROMIS item banks: Librarians as co-investigators. Qual Life Res. 2009;18:881–888.
56 McGowan J, Sampson M. Systematic reviews need systematic searchers. J Med Libr Assoc. 2005;93:74–80.
57 Golder S, Loke Y, McIntosh HM. Poor reporting and inadequate searches were apparent in systematic reviews of adverse effects. J Clin Epidemiol. 2008;61:440–448.
58 Uniform requirements for manuscripts submitted to biomedical journals. International Committee of Medical Journal Editors. JAMA. 1997;277:927–934.
© 2011 Association of American Medical Colleges