Federated searches: why a one-stop shop approach to literature searching falls short for evidence synthesis

Solomons, Terena1,2; Hinton, Elizabeth3,4,5,6

doi: 10.11124/JBIES-21-00177

Developing robust strategies to search the literature and identify studies is a crucial step in evidence synthesis methodology. A key difference between undertaking a literature review and a systematic review or other type of evidence synthesis is that search strategies for the latter need to be transparent, reproducible, and, most importantly, comprehensive in order to capture all relevant studies.

JBI recommends a three-step process for developing search strategies for systematic and scoping reviews. The first step is identifying keywords from initial exploratory searches of relevant databases, usually MEDLINE and CINAHL, followed by an analysis of the text words in the titles and abstracts, and of the index terms (subject headings), used to describe the “seed references” from which search strategies are developed. The second step is developing a search strategy using these identified keywords and index terms to search all databases and gray literature sources. The third step is to check the reference lists of included studies for any additional studies.1

Subject headings (also referred to as index terms, controlled vocabulary, or thesauri) are an important way for reviewers to find studies of interest when they have not searched with the same vocabulary that the author of a published paper used. Baumann explains how the National Library of Medicine employs indexers to read articles and assign Medical Subject Headings (MeSH), which are “official words or phrases selected to represent particular biomedical concepts.”2(p.171) A modern-day analogy, offered in the What is MeSH YouTube video, is that MeSH terms are like social media hashtags.3

JBI, Cochrane, and the Campbell Collaboration recommend searching both keywords and subject headings in several databases.1,4,5 The English language has many variations and nuances in meaning, and in our experience of searching for studies, subject headings help to improve the precision of the search and cater for these differences in language. For example, suppose we were conducting a search for studies on the effectiveness of educational strategies in improving parental/caregiver management of fever in their child. To us, a “caregiver” is a person who cares for children, whereas a “caretaker” is a person who looks after properties. However, a descriptive, comparative study by Kelly et al. used the term “caretaker” in the title, abstract, and throughout the full text of the paper.6 As shown in Table 1, the subject heading “caregiver” was assigned to the article by indexers for the MEDLINE, Embase, and CINAHL databases. Had we not used the “Map Term to Subject Heading” function in these databases, we would not have discovered this study. Most health databases have subject headings: MEDLINE uses MeSH, Embase uses Emtree, and CINAHL has major and minor subject headings. Databases in other disciplines, such as law and social care, do not have subject headings, so a comprehensive list of keywords needs to be searched.7

Table 1:
Abstract excerpt from “Improving caretakers’ knowledge of fever management in preschool children: is it possible?” with corresponding database index terms
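To illustrate how subject headings and keywords can be combined to bridge such vocabulary differences, a hypothetical search fragment in Ovid MEDLINE syntax (indicative only; not a strategy from any published review) might look like:

```
1  exp Caregivers/
2  (caregiver* or caretaker* or carer*).ti,ab.
3  1 or 2
4  exp Fever/
5  (fever* or febrile).ti,ab.
6  4 or 5
7  3 and 6
```

Line 1 would retrieve the Kelly et al. paper through its assigned MeSH heading even though the authors wrote “caretaker,” while line 2 covers synonyms in titles and abstracts for records that are not yet indexed.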

So where does federated searching fit in the context of JBI reviews? Is there a one-stop shop for finding studies for evidence synthesis?

Federated searching, also known as meta-searching or cross-database searching, is the practice of using one interface to search multiple sources in one fell swoop.8 Federated searching was developed in the late 1990s and was heralded as the next best thing in searching. Not only could it be used to search multiple websites, but it could also be integrated into a library's online catalog to allow easier user access to subscription materials and database content. Libraries still use these discovery tools today. They are often helpful for locating much of a library's subscription content; however, depending on database licensing agreements, not all subscription databases are searchable through them.

De Groote and Appelt9 compared the use of WebFeat, a federated search engine, with searching MEDLINE, Embase, CINAHL, Web of Science, and Biological Abstracts databases individually. They concluded that “while WebFeat might be convenient for searching multiple databases simultaneously, it does not offer the advanced features including limits and mapping to subject headings offered by directly searching health sciences databases.”9(p.39) Federated search engines such as this are useful for busy clinicians needing information quickly; they are not recommended for evidence synthesis.

The Preferred Reporting Items for Systematic Reviews and Meta-Analyses literature search extension (PRISMA-S) outlines the importance of transparent reporting of literature searching and provides a 16-item checklist for reporting the search for studies.10 The search algorithms behind federated search engines are often opaque, making the reporting of a reproducible search nearly impossible. Rethlefsen, lead author of the PRISMA-S guidelines, cautioned that there are “no one-stop solutions for quality research,” and that “Federated search doesn’t necessarily make anything easy. In an ideal world, it would, but no matter the implementation or what technology may come, federated search will always be an overly simplistic solution to an extremely complex problem.”11(p.12)

The first two items on the PRISMA-S checklist relate to the databases searched for evidence synthesis. Item #1 requires the name of each database searched and the database's corresponding platform (eg, MEDLINE [Ovid], CINAHL [EBSCO]). Each platform or search interface provides different functionality and syntax for searching the databases. If a database uses subject headings (on the ProQuest and EBSCO platforms these are referred to as a thesaurus), it is advisable to use these tools in combination with keywords from the title and abstract. Different libraries subscribe to different numbers of databases within a platform, so from a transparency and reproducibility point of view, it is important to list the names of all databases searched, not simply that, for example, ProQuest was searched.
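As an indicative example of why the platform matters (syntax illustrative only), the same caregiver concept is expressed quite differently on two platforms:

```
MEDLINE (Ovid):  exp Caregivers/ or caregiver*.ti,ab.
CINAHL (EBSCO):  (MH "Caregivers+") OR TI caregiver* OR AB caregiver*
```

Reporting the platform alongside each database name tells readers which syntax the documented strategy uses and allows it to be rerun as written.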

Item #2 on the PRISMA-S checklist refers to multi-database searching, where several databases are searched on one platform, such as MEDLINE and Embase being searched simultaneously on Ovid. Including the full search strategy, showing which keywords and subject headings were used, “helps readers immediately understand how the search was constructed and executed. This helps readers determine how effective the search strategy will be for each database.”10(p.5) Importantly, PRISMA-S Item #8 requires that the full search strategies for all databases and information sources be documented exactly as run, to demonstrate transparency and so that the search can be replicated.

Federated searching of library discovery tools does have a place in evidence synthesis, at the beginning of a search. These tools are useful for finding the “seed references” from which keywords and subject headings are obtained when developing the full search strategy. Federated searching is also useful for finding gray literature in resources such as MedNar and TRIP.

To maintain the rigor, reproducibility, and unbiased nature of a systematic review, it is important to continue searching each database individually, using an appropriate combination of subject headings and keywords to maximize the functionality of each database. Gusenbauer and Haddaway12 recently conducted a systematic evaluation of 28 academic search engines and databases against 27 criteria, including search reproducibility. The study found Google Scholar to be inappropriate as a principal source for searching in a systematic review due to its lack of algorithm transparency and limited functionality, although it may be suitable for secondary searching and for gray literature.12

A full-service federated searching environment for evidence synthesis is still a utopia. JBI, Cochrane, and the Campbell Collaboration all advise that information specialists and librarians be involved in review teams, or at least consulted, to explain the functionality and syntax of databases and the use of subject headings and keywords for finding studies.


1. Aromataris E, Munn Z. JBI Manual for Evidence Synthesis. Adelaide: JBI; 2020 [cited 2021 Apr 30]. Available from:
2. Baumann N. How to use the medical subject headings (MeSH). Int J Clin Pract 2016;70 (2):171–174.
3. Medical College of Wisconsin Libraries. What is MeSH? [video]. 2018 [cited 2021 May 3]. Available from:
4. Lefebvre C, Glanville J, Briscoe S, Littlewood A, Marshall C, Metzendorf M-I, et al. Chapter 4: Searching for and selecting studies. In: Higgins JPT, Thomas J, Chandler J, Cumpston M, Li T, Page MJ, et al., editors. Cochrane Handbook for Systematic Reviews of Interventions. Cochrane; 2021 [cited 2021 May 3]. Available from:
5. Kugley S, Wade A, Thomas J, Mahood Q, Jørgensen AMK, Hammerstrøm K, et al. Searching for studies: a guide to information retrieval for Campbell systematic reviews. Campbell Syst Rev 2017;13 (1):1–73.
6. Kelly L, Morin K, Young D. Improving caretakers’ knowledge of fever management in preschool children: is it possible? J Pediatr Health Care 1996;10 (4):167–173.
7. Forster B, Catterall J, Clough A. A collaboration between information specialists and a public health researcher to investigate search strategies in systematic reviews in interdisciplinary topics: a progress report. J Health Inf Libr Australas 2021;21 (1):19–23.
8. Surratt B. Federated Search Engines, 2001–2003. Chicago: American Library Association; 2007 [cited 2021 Apr 19]. Available from:
9. De Groote SL, Appelt K. The accuracy and thoroughness of a federated search engine in the health sciences. Int Ref Service Quart 2007;12 (1/2):27–47.
10. Rethlefsen ML, Kirtley S, Waffenschmidt S, Ayala AP, Moher D, Page MJ, et al. PRISMA-S: an extension to the PRISMA Statement for Reporting Literature Searches in Systematic Reviews. Syst Rev 2021;10 (1):39.
11. Rethlefsen ML. Easy ≠ Right. Libr J 2008;133:12–14.
12. Gusenbauer M, Haddaway NR. Which academic search systems are suitable for systematic reviews or meta-analyses? Evaluating retrieval qualities of Google Scholar, PubMed, and 26 other resources. Res Syn Meth 2020;11 (2):181–217.
© 2021 JBI