Rapid reviews and the methodological rigor of evidence synthesis: a JBI position statement

Tricco, Andrea C.1,2,3; Khalil, Hanan4; Holly, Cheryl5; Feyissa, Garumma6; Godfrey, Christina3; Evans, Catrin7; Sawchuck, Diane8; Sudhakar, Morankar9; Asahngwa, Constantine10; Stannard, Daphne11; Abdulahi, Misra12; Bonnano, Laura13; Aromataris, Edoardo14; McInerney, Patricia15; Wilson, Rosemary3; Pang, Dong16; Wang, Zhiwen15; Cardoso, Ana Filipa17; Peters, Micah D.J.18,19,20; Marnie, Casey18; Barker, Timothy14; Pollock, Danielle14; McArthur, Alexa14; Munn, Zachary14

doi: 10.11124/JBIES-21-00371



Rapid reviews aim to provide more timely information for decision-makers (such as health care planners, providers, policymakers, patients, and others), often at a reduced cost. Rapid reviews generally follow similar steps to systematic reviews; however, because the objective is to expedite the review process, standard workflows and processes involved in a systematic review1,2 may be omitted, modified, or simplified.

Various definitions of rapid reviews exist,1,2 with the most recent proposed by the Cochrane Rapid Reviews Methods Group. If an organization produces rapid reviews only for decision-making, then this definition can be used: “A rapid review is a form of knowledge synthesis that accelerates the process of conducting a traditional systematic review through streamlining or omitting a variety of methods to produce evidence for stakeholders in a resource-efficient manner.”3(p.81) This definition highlights the importance and intention of rapid reviews to specifically meet the needs of stakeholders/decision-makers.

One of the key barriers to the use of research evidence in decision-making is the lack of timely and/or relevant research. Policymakers and health care planners often need to make difficult decisions in short timeframes, and when research is available, rapid reviews provide a practical, feasible, and efficient way to summarize this evidence.4 In some settings, due to constraints in funding, resources, or methodological expertise, rapid reviews may be all that is possible. It is likely that, in the majority of these cases, some evidence is better than no evidence. It should be acknowledged that rapid reviews do present a risk where evidence may have been missed or inadequately appraised or synthesized, and these potential limitations need to be sufficiently reported.5

Demand for rapid reviews

The demand for rapid reviews has exploded in recent years. This has been especially apparent during the COVID-19 pandemic, where more than 3000 rapid reviews were conducted to inform dynamic decision-making.6 This demonstrates the utility of rapid reviews as a type of evidence synthesis to inform urgent, rapid health system responses. There is also an increased public awareness of urgent issues impacting the health system and the need for rapid dissemination of public health measures that in turn drives the need for rapid responses.7

Increasingly, health care decision-makers seek quality evidence in a short timeframe to support urgent and emergent decisions related to procurement, clinical practice, and policy. Rapid response reports are ideally tailored to the contextual needs of health care decision-makers, representing a range of options about depth, breadth, and time-to-service delivery. A rapid review is an emerging approach that allows evidence to be brought to the forefront of health care decision-making in a timely, relevant way; however, it may require methodological trade-offs when compared with systematic reviews.8

The JBI approach to evidence synthesis

JBI is an international collaboration that is world-renowned for its evidence synthesis and implementation methodologies.9-12 JBI has developed and published guidance on 11 different types of evidence synthesis, as outlined in Table 1.13-23

Table 1 - Review types for which published JBI guidance is available
1. Systematic reviews of qualitative evidence 13
2. Systematic reviews of effectiveness 14
3. Systematic reviews of text and opinion 15
4. Systematic reviews of prevalence and incidence 16
5. Systematic reviews of economic evidence 17
6. Systematic reviews of etiology and risk 18
7. Mixed methods systematic reviews 19
8. Diagnostic test accuracy systematic reviews 20
9. Umbrella reviews 21
10. Scoping reviews 22
11. Systematic reviews of measurement properties 23

In this paper, we present the JBI position statement for rapid reviews.

Rapid reviews versus other types of evidence synthesis

It has been said that rapid reviews are not a type of evidence synthesis24 unto themselves and that many different types of evidence syntheses can be completed rapidly.25,26 In light of this, all the review types listed in Table 1 could be undertaken using a “rapid approach.” Rapid reviews can therefore be considered similar to living reviews, which are not necessarily a review type, but rather an approach or mindset when conducting any type of review. Living reviews refer to systematic reviews that are continuously updated as new studies emerge to ensure their relevance.27 Like rapid reviews, many types of evidence syntheses can be “living.” Rapid reviews and living reviews should be considered approaches to evidence synthesis, rather than novel evidence synthesis types.

Most of the methodological inquiry and guidance on rapid reviews has focused on reviews of interventions, although investigations into other review types are emerging.28,29 There is an increasing amount of methodological research evaluating deviations or omissions from the traditional systematic review process, and whether these omissions present genuine threats to the validity of the results in systematic reviews. For JBI's evidence synthesis toolkit, investigating the impact of omitting or abbreviating review processes for other review types could be an important program of future methodological research. Additionally, automation, machine learning, artificial intelligence, and the digitization of evidence are further areas of work that offer opportunities to streamline review processes.

Different types of rapid evidence products

There are four major rapid evidence products that differ from one another in their purpose, methodological rigor, comprehensiveness, and the time taken for their production.25 These include: i) inventories—a list of available evidence sources that lack appraisal, synthesis, and recommendations, and can be completed within a few hours to a few days30,31; ii) rapid response briefs—a summary of already existing synthesized evidence (systematic reviews or guidelines) without formal analysis, which can be completed within days to weeks30,31; iii) rapid reviews—appraised and synthesized knowledge products that can be completed within weeks to months; and iv) automated products—rapid evidence products produced by computer-based analysis from databases of extracted studies to address queries defined by the user, which can be completed within days to weeks.32,33 The product that is the closest to a systematic review is the rapid review; a rapid response brief is closest to an overview of reviews if only systematic reviews are included. Inventories and automated products are further from the systematic review process and are not the focus of this paper.

The SelecTing Approaches for Rapid Reviews (STARR) is a decision tool that offers researchers guidance on planning a rapid review.34 The tool includes 20 items that cover interactions with the decision-maker(s) who commissioned the rapid review; scoping the literature; selecting streamlined approaches to literature searches, methods for data abstraction, and synthesis; as well as reporting the methods used in a rapid review. STARR is useful as a starting point to select broad approaches that may be considered for a rapid review.

There are several ways that rapid reviews can be expedited through the efficient utilization of team resources. Process maps or work flowcharts outline specific activities, activity dependencies, timelines, and allocated accountabilities to ensure the entire team understands all aspects of the project, along with each individual role and associated responsibility. Concept mapping utilizes diagrams to demonstrate complex relationships between constructs, and is recommended for visualizing the interpretation of, and relationship between, studies included in the evidence synthesis. These are tools that can be used to expedite the evidence synthesis process and produce a rapid review. Other emerging approaches that combine human and machine effort are also being discussed in the literature. This approach combines methods such as crowdsourcing and automation tools for various steps of the review, which facilitate the production of reviews in a shorter amount of time than systematic reviews.35
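As a purely illustrative sketch of the kind of automation described above, the code below ranks title/abstract records by simple keyword relevance so that human reviewers can screen the most likely includes first. The keywords, records, and scoring function are hypothetical; published automation tools typically use trained machine-learning classifiers rather than keyword counts.

```python
# Minimal sketch: keyword-based prioritization of records for
# title/abstract screening. Keywords and records are hypothetical;
# real screening tools use trained machine-learning classifiers.

def relevance_score(record: str, keywords: list[str]) -> int:
    """Count how many query keywords appear in a title/abstract."""
    text = record.lower()
    return sum(1 for kw in keywords if kw.lower() in text)

def prioritize(records: list[str], keywords: list[str]) -> list[str]:
    """Order records so the most keyword-rich are screened first."""
    return sorted(records, key=lambda r: relevance_score(r, keywords),
                  reverse=True)

records = [
    "Effect of exercise on blood pressure in older adults",
    "A qualitative study of nursing handover practices",
    "Rapid review of exercise interventions for hypertension",
]
keywords = ["rapid review", "exercise", "hypertension"]

for r in prioritize(records, keywords):
    print(relevance_score(r, keywords), r)
```

In a rapid review, an ordering like this would not replace human screening; it would simply concentrate reviewer effort on the records most likely to be eligible, which is one of the time-saving mechanisms discussed in the literature.35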

Assessing the quality of rapid reviews

There is currently no specific tool to critically appraise the methodological quality of a rapid review. Rather, tools to assess the risk of bias or quality of systematic reviews are used, such as ROBIS,36 the JBI critical appraisal tools,21 or the assessment of multiple systematic reviews (AMSTAR). The AMSTAR tool is likely more commonly used than the others, based on the number of citations for each in Google Scholar (searched on January 17, 2022). The most up-to-date version of this tool37 covers 16 items related to the conduct of a systematic review, including the protocol, literature search strategy, study selection, data extraction, risk of bias/appraisal, meta-analysis, and conflict of interest. Many of these items are relevant to rapid reviews, and so the tool can easily be tailored for their appraisal (ie, exclusion of items related to meta-analysis if this was not conducted in the rapid review). Quality assessment can also provide important information to decision-makers regarding how trustworthy the rapid review results are for decision-making.
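To illustrate how such tailoring might work in practice, the sketch below marks meta-analysis items as not applicable when no meta-analysis was conducted and reports a simple count of the remaining items met. The item labels are paraphrased and hypothetical, not the verbatim AMSTAR 2 wording, and the tally is illustrative rather than a validated scoring scheme.

```python
# Minimal sketch of tailoring a checklist for appraising a rapid review.
# Item labels are paraphrased/hypothetical, not the verbatim AMSTAR 2 items.

# Responses for a hypothetical rapid review; None marks items tailored out.
responses = {
    "protocol registered before review": "yes",
    "comprehensive literature search": "partial",
    "duplicate study selection": "no",          # single-reviewer screening
    "risk of bias assessed": "yes",
    "appropriate meta-analysis methods": None,  # no meta-analysis conducted
    "conflicts of interest reported": "yes",
}

# Exclude non-applicable items, then count those fully met.
applicable = {item: r for item, r in responses.items() if r is not None}
met = sum(1 for r in applicable.values() if r == "yes")

print(f"{met}/{len(applicable)} applicable items fully met")
```

Reporting the per-item responses alongside such a summary, rather than the count alone, is what allows decision-makers to judge how trustworthy the rapid review results are.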

JBI and rapid reviews

To date, apart from its well-established, well-documented, and systematic approach to the development of evidence summaries,38 JBI has not endorsed a specific approach to the modification of its existing systematic review guidance to accommodate a rapid review. Indeed, the hallmarks of JBI reviews are their comprehensiveness, rigor, transparency, and focus on applicability to clinical practice. Given this, it is unclear at the present time how JBI would endorse and publish abridged reviews for rapid decision-making purposes.

However, there are some circumstances in which JBI does conduct rapid reviews for policymakers and other commissioning agencies due to time or funding constraints. In these cases, a standard methodology for the rapid review is not followed; rather, the approach is tailored to the timeframes, the resources available, and the needs of the funders. This aligns with JBI's pragmatic ethos that is applied to all programs and products.39,40 When conducting these reviews, the urgent need to deliver timely evidence to decision-makers is acknowledged, although conflicts exist with JBI's mission to produce high-quality, trustworthy evidence. The increasing demand for a more rapid delivery of the available evidence and the need to maintain methodological rigor in the conduct of evidence synthesis need not be irreconcilable. It is the responsibility of the review team to conduct their rapid review with as much rigor and transparency as is expected of JBI reviews. Details regarding the decisions to be made when tailoring rapid review methods are beyond the scope of this article; however, we direct readers to resources provided on the methodological conduct of rapid reviews.25,30 Most importantly, transparency is essential, and the rapid review should clearly report where any abbreviations of the methodological process have been made.

Rapid reviews, despite their limitations, represent a substantial effort by the review team within very short timelines, and where relevant, may be of use to audiences other than the funding body. One way to ensure broader dissemination beyond the purpose of providing evidence for the decision-maker is for authors to return to their rapid review after meeting the needs of the funders and complete the additional steps to meet publication and methodological standards. These rapid reviews can then evolve into full evidence syntheses and could be converted into “full” systematic reviews for publication in journals that do not accept rapid reviews. However, this often takes substantial resources (funding, time) that many review teams may not have, for example, in low-resource settings. We encourage authors wishing to publish in JBI Evidence Synthesis to refer to the journal guidelines.


Conclusion

Rapid reviews have a valuable and necessary place in the evidence synthesis toolbox, particularly where they are needed to provide rapid-turnaround evidence for decision-makers who need guidance in fast-paced contexts. Although there are several definitions of rapid reviews, at JBI we believe the simplest definition is that they are reviews characterized by the omission, abbreviation, or simplification of the traditional steps in a systematic review. Due to the urgent need for trustworthy evidence, reviewers may consider measures to streamline their approaches and conduct reviews more efficiently, without compromising on quality, and it is hoped that automation methods will provide solutions to this challenge. As methodological guidance and guidelines for various forms of evidence synthesis evolve over time, a more robust and coherent methodology for rapid reviews may also take shape.


Funding

ACT is funded by a Tier 2 Canada Research Chair in Knowledge Synthesis. ZM is supported by an NHMRC Investigator Grant APP1195676. The other authors have no funding to declare.


Acknowledgments

The authors thank Shazia Siddiqui and Navjot Mann for assisting with obtaining co-author feedback and merging comments, inserting references, and ensuring the document conformed to the journal requirements.


References

1. Haby MM, Chapman E, Clark R, Barreto J, Reveiz L, Lavis JN. What are the best methodologies for rapid reviews of the research evidence for evidence-informed decision making in health policy and practice: a rapid review. Health Res Policy Syst 2016;14 (1):83.
2. Tricco AC, Antony J, Zarin W, Strifler L, Ghassemi M, Ivory J, et al. A scoping review of rapid review methods. BMC Med 2015;13:224.
3. Hamel C, Michaud A, Thuku M, Skidmore B, Stevens A, Nussbaumer-Streit B, et al. Defining rapid reviews: a systematic scoping review and thematic analysis of definitions and defining characteristics of rapid reviews. J Clin Epidemiol 2021;129:74–85.
4. Saul JE, Willis CD, Bitz J, Best A. A time-responsive tool for informing policy making: rapid realist review. Implement Sci 2013;8 (1):103.
5. Plüddemann A, Aronson JK, Onakpoya I, Heneghan C, Mahtani KR. Redefining rapid reviews: a flexible framework for restricted systematic reviews. BMJ Evid Based Med 2018;23 (6):201.
6. COVID-END: COVID-19 Evidence Network to Support Decision-making. COVID-END [internet]. McMaster University. n.d. [cited 2021 Jan 10]. Available from:
7. Fretheim A, Brurberg KG, Forland F. Rapid reviews for rapid decision-making during the coronavirus disease (COVID-19) pandemic, Norway. Euro Surveill 2020;25 (19): 2000687.
8. Khangura S, Polisena J, Clifford TJ, Farrah K, Kamel C. Rapid review: an emerging approach to evidence synthesis in health technology assessment. Int J Technol Assess Health Care 2014;30 (1):20–27.
9. JBI. JBI [internet]. Adelaide: JBI; n.d. [cited 2021 Jan 10]. Available from:
10. Jordan Z, Lockwood C, Munn Z, Aromataris E. The updated Joanna Briggs Institute Model of Evidence-Based Healthcare. Int J Evid Based Healthc 2019;17 (1):58–71.
11. Aromataris E, Munn Z. JBI Manual for Evidence Synthesis [internet]. Adelaide: JBI; 2020 [cited 2021 Jan 10]. Available from
12. Porritt K, McArthur A, Lockwood C, Munn Z. JBI Handbook for Evidence Implementation [internet]. Adelaide: JBI, 2020. [cited 2021 Jan 10]. Available from:
13. Lockwood C, Munn Z, Porritt K. Qualitative research synthesis: methodological guidance for systematic reviewers utilizing meta-aggregation. Int J Evid Based Healthc 2015;13 (3):179–187.
14. Tufanaru C, Munn Z, Stephenson M, Aromataris E. Fixed or random effects meta-analysis? Common methodological issues in systematic reviews of effectiveness. Int J Evid Based Healthc 2015;13 (3):196–207.
15. McArthur A, Klugárová J, Yan H, Florescu S. Innovations in the systematic review of text and opinion. Int J Evid Based Healthc 2015;13 (3):188–195.
16. Munn Z, Moola S, Lisy K, Riitano D, Tufanaru C. Methodological guidance for systematic reviews of observational epidemiological studies reporting prevalence and cumulative incidence data. Int J Evid Based Healthc 2015;13 (3):147–153.
17. Gomersall JS, Jadotte YT, Xue Y, Lockwood S, Riddle D, Preda A. Conducting systematic reviews of economic evaluations. Int J Evid Based Healthc 2015;13 (3):170–178.
18. Moola S, Munn Z, Sears K, Sfetcu R, Currie M, Lisy K, et al. Conducting systematic reviews of association (etiology): The Joanna Briggs Institute's approach. Int J Evid Based Healthc 2015;13 (3):163–169.
19. Pearson A, White H, Bath-Hextall F, Salmond S, Apostolo J, Kirkpatrick P. A mixed-methods approach to systematic reviews. Int J Evid Based Healthc 2015;13 (3):121–131.
20. Campbell JM, Klugar M, Ding S, Carmody DP, Hakonsen SJ, Jadotte YT, et al. Diagnostic test accuracy: methods for systematic review and meta-analysis. Int J Evid Based Healthc 2015;13 (3):154–162.
21. Aromataris E, Fernandez R, Godfrey CM, Holly C, Khalil H, Tungpunkom P. Summarizing systematic reviews: methodological development, conduct and reporting of an umbrella review approach. Int J Evid Based Healthc 2015;13 (3):132–140.
22. Peters MDJ, Marnie C, Tricco AC, Pollock D, Munn Z, Alexander L, et al. Updated methodological guidance for the conduct of scoping reviews. JBI Evid Synth 2020;18 (10):2119–2126.
23. Stephenson M, Riitano D, Wilson S, Leonardi-Bee J, Mabire C, Cooper K. Chapter 12: Systematic reviews of measurement properties. In: Aromataris E, Munn Z, editors. JBI Manual for Evidence Synthesis [internet]. Adelaide: JBI; 2020.
24. Munn Z, Stern C, Aromataris E, Lockwood C, Jordan Z. What kind of systematic review should I conduct? A proposed typology and guidance for systematic reviewers in the medical and health sciences. BMC Med Res Methodol 2018;18 (1):5.
25. Tricco AC, Langlois EV, Straus SE, editors. Rapid reviews to strengthen health policy and systems: a practical guide [internet]. Geneva: World Health Organization; 2017 [cited 2021 Jan 10]. Available from:
26. Peters MDJ, Marnie C. I want to write a literature review, where do I start? [internet]. ANMJ 2020.
27. Akl EA, Haddaway NR, Rada G, Lotfi T. Future of evidence ecosystem series: evidence synthesis 2.0: when systematic, scoping, rapid, living, and overviews of reviews come together. J Clin Epidemiol 2020;123:162–165.
28. Biesty L, Meskell P, Glenton C, Delaney H, Smalle M, Booth A, et al. A QuESt for speed: rapid qualitative evidence syntheses as a response to the COVID-19 pandemic. Syst Rev 2020;9 (1):256.
29. Arevalo-Rodriguez I, Tricco AC, Steingart KR, Nussbaumer-Streit B, Kaunelis D, Alonso-Coello P, et al. Challenges of rapid reviews for diagnostic test accuracy questions: a protocol for an international survey and expert consultation. Diagn Progn Res 2019;3 (1):7.
30. Tricco AC, Garritty CM, Boulos L, Lockwood C, Wilson M, McGowan J, et al. Rapid review methods more challenging during COVID-19: commentary with a focus on 8 knowledge synthesis steps. J Clin Epidemiol 2020;126:177–183.
31. Haby MM, Chapman E, Clark R, Barreto J, Reveiz L, Lavis JN. Designing a rapid response program to support evidence-informed decision-making in the Americas region: using the best available evidence and case studies. Implement Sci 2016;11 (1):117.
32. Hartling L, Guise J-M, Kato E, Anderson J, Belinson S, Berliner E, et al. A taxonomy of rapid reviews links report types and methods to specific decision-making contexts. J Clin Epidemiol 2015;68 (12):1451–1462.
33. Hartling L, Guise JM, Kato E, Anderson J, Aronson N, Belinson S, et al. AHRQ comparative effectiveness reviews. EPC methods: an exploration of methods and context for the production of rapid reviews. Rockville (MD): Agency for Healthcare Research and Quality (US); 2015.
34. Pandor A, Kaltenthaler E, Martyn-St James M, Wong R, Cooper K, Dimairo M, et al. Delphi consensus reached to produce a decision tool for SelecTing Approaches for Rapid Reviews (STARR). J Clin Epidemiol 2019;114:22–29.
35. Thomas J, Noel-Storr A, Marshall I, Wallace B, McDonald S, Mavergames C, et al. Living systematic reviews: 2. Combining human and machine effort. J Clin Epidemiol 2017;91:31–37.
36. Whiting P, Savović J, Higgins JPT, Caldwell DM, Reeves BC, Shea B, et al. ROBIS: a new tool to assess risk of bias in systematic reviews was developed. J Clin Epidemiol 2016;69:225–234.
37. Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. BMJ 2017;358:j4008.
38. Munn Z, Lockwood C, Moola S. The development and use of evidence summaries for point of care information systems: a streamlined rapid review approach. Worldviews Evid Based Nurs 2015;12 (3):131–138.
39. Hannes K, Lockwood C. Pragmatism as the philosophical foundation for the Joanna Briggs meta-aggregative approach to qualitative evidence synthesis. J Adv Nurs 2011;67 (7):1632–1642.
40. Munn Z. Implications for practice: should recommendations be recommended in systematic reviews? JBI Database System Rev Implement Rep 2015;13 (7):1–3.

Keywords: evidence synthesis; knowledge synthesis; rapid reviews; research methodology

© 2022 JBI