
Paper 2: Performing rapid reviews



Health policy-makers must often make decisions in compressed time frames and with limited resources. Hence, rapid reviews have become a pragmatic alternative to comprehensive systematic reviews. However, it is important that rapid review methods remain rigorous to support good policy development and decisions. There is currently little evidence about which review steps can be streamlined without introducing unacceptable levels of uncertainty while still producing a product that remains useful to policy-makers.


This paper summarizes current research describing commonly used methods and practices that are used to conduct rapid reviews and presents key considerations and options to guide methodological choices for a rapid review.


The most important step for a rapid review is for an experienced research team to have early and ongoing engagement with the people who have requested the review. A clear research protocol, derived from a needs assessment conducted with the requester, serves to focus the review, defines the scope of the rapid review, and guides all subsequent steps. Common recommendations for rapid review methods include tailoring the literature search in terms of databases, dates, and languages. Researchers can consider using a staged search to locate high-quality systematic reviews and then subsequently published primary studies. The approaches used for study screening and selection, data extraction, and risk-of-bias assessment should be tailored to the topic, researcher experience, and available resources. Many rapid reviews use a single reviewer for study selection, risk-of-bias assessment, or data abstraction, sometimes with partial or full verification by a second reviewer. Rapid reviews usually use a descriptive synthesis method rather than quantitative meta-analysis. Use of brief report templates and standardized production methods helps to speed final report publication.


Researchers conducting rapid reviews need to make transparent methodological choices, informed by stakeholder input, to ensure that rapid reviews meet their intended purpose. Transparency is critical because it is unclear how or how much streamlined methods can bias the conclusions of reviews. There are not yet internationally accepted standards for conducting or reporting rapid reviews. Thus, this article proposes interim guidance for researchers who are increasingly employing these methods.




Health policy-makers and other stakeholders need evidence to inform their decisions. However, their decisions must often be made in short time frames, and they may have other resource constraints, such as the available budget or personnel [1,2,3,4,5,6]. Rapid reviews are increasingly being used and are increasingly influential in the health policy and system arena [3, 7,8,9,10]. One needs assessment [11] showed that policy-makers want evidence reviews to answer the right question, be completed in days to weeks, rather than months or years, be accurate and reproducible, and be affordable.

As much as policy-makers may desire faster and more efficient evidence syntheses, it is not yet clear whether rapid reviews are sufficiently rigorous and valid, compared to systematic reviews which are considered the “gold standard” evidence synthesis, to inform policy [12]. Only a few empirical studies have compared the findings of rapid reviews and systematic reviews on the same topic, and their results are conflicting and inconclusive, leaving questions about the level of bias that may be introduced because of rapid review methods [7, 13,14,15,16,17,18,19].

A standardized or commonly agreed-upon set of methods for conducting rapid reviews had not existed until recently, [1, 9, 14, 20,21,22,23] and while there is little empiric evidence on some of the standard elements of systematic reviews, [24] those standards are well articulated [25, 26]. A minimum interim set of standards was developed by the Cochrane Rapid Reviews Methods Group [1, 2] to help guide rapid review production during the COVID-19 pandemic, and other researchers have proposed methods and approaches to guide rapid reviews [5, 21, 22, 27,28,29,30,31,32,33,34,35,36].

This article gives an overview of potential ways to produce a rapid review while maintaining a synthesis process that is sufficiently rigorous, yet tailored as needed, to support health policy-making. We present options for common methods choices, summarized from descriptions and evaluations of rapid review products and programs in Table 1, along with key considerations for each methodological step.

Table 1 Common methods, approaches, and key considerations for the steps in a rapid review


The World Health Organization (WHO) published Rapid reviews to strengthen health policy and systems: a practical guide [5] in 2017. The initial work for this article was completed as a chapter for that publication and included multiple literature searches and layers of peer review to identify important studies and concepts. We conducted new searches using Ovid MEDLINE, the Cochrane Library’s methodology collection, and the bibliography of studies maintained by the Cochrane Rapid Reviews Methods Group, to identify articles, including both examples of rapid reviews and those on rapid review methodology, published after the publication of the WHO guide. We have not attempted to perform a comprehensive identification or catalog of all potential articles on rapid reviews or examples of reviews conducted with these methods. As this work was not a systematic review of rapid review methods, we do not include a flow of articles from search to inclusion and have not undertaken any formal critical appraisal of the articles we did include.


Needs assessment, topic selection, and topic refinement

Rapid reviews are typically conducted at the request of a particular decision-maker, who has a key role in posing the question, setting the parameters of the review, and defining the timeline [40,41,42]. The most common strategy for completing a rapid review within a limited time frame is to narrow its scope. This can be accomplished by limiting the number of questions, interventions, and outcomes considered in the review [13, 15]. Early and continuing engagement of the requester and any other relevant stakeholders is critical to understand their needs, the intended use of the review, and the expected timeline and deliverables [15, 28, 29, 40,41,42]. Policy-makers and other requesters may have vaguely defined questions or unrealistic expectations about what any type of review can accomplish [41, 42]. A probing conversation or formal needs assessment is the critical first step in any knowledge synthesis approach to determine the scope of the request, the intended purpose for the completed review, and to obtain a commitment for collaboration over the duration of the project [28, 30, 41]. Once the request and its context are understood, researchers should fully develop the question(s), including any needed refinement with the requester or other stakeholders, before starting the project [5]. This process can be iterative and may require multiple contacts between the reviewers and the requester to ensure that the final rapid review is fit for its intended purpose [41, 42]. In situations where a definitive systematic review might be needed, it may be useful to discuss with the requester the possibility of conducting a full systematic review, either in parallel or serially with the rapid review [43].

Protocol development

A research protocol clearly lays out the scope of the review, including the research questions and the approaches that will be used to conduct the review [44]. We suggest using the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) 2015 statement for guidance [37]. Most reviewers use the PICO format (population, intervention, comparator, outcome), with some adding elements for time frame, setting, and study design. The PICO elements help to define the research questions, and the initial development of questions can point to needed changes in the PICO elements. For some types of research questions or data, other framework variations such as SPICE (setting, perspective, intervention, comparison, evaluation) may be used, although the PICO framework can generally be adapted [45]. Health services and policy research questions may call for more complex frameworks [5]. This initial approach assists both researchers and knowledge users to know what is planned and enables documentation of any protocol deviations; however, the customized and iterative nature of rapid reviews means that some flexibility may be required. Some rapid review producers include the concept of methods adjustment in the protocol itself [46, 47]. However, changes made beyond the protocol stage and the rationale for making them must be transparent and documented in the final report.

The international prospective register of systematic reviews (PROSPERO) [44] accepts registration of protocols that include at least one clinically or patient-relevant outcome. The Open Science Framework (OSF) [48] platform also accepts protocol registrations for rapid reviews. We advise protocol submitters to include the term “rapid review” or a similar term in the registered title, as this will assist tracking the use, validity, and value of rapid reviews [1]. Protocol registration helps to decrease research waste and allows both requesters and review authors to avoid duplication. Currently, most rapid review producers report using a protocol, but few register their protocols [13, 17].

Literature search

Multiple authors have conducted inventories of the characteristics of and methods used for rapid reviews, including the broad categories of literature search, study selection, data extraction, and synthesis steps [13, 15, 17, 20, 24, 49]. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) standards call for documentation of the full search strategy for all electronic databases used [38]. Most published rapid reviews search two or more databases, with PubMed, Embase, and the Cochrane Library mentioned frequently [13, 17, 20, 49]. Rapid reviews often streamline systematic review methods by limiting the number of databases searched and the search itself by date, language, geographical area, or study design, and some rapid reviews search only for existing systematic reviews [13, 15, 17, 20, 49, 50]. Other rapid reviews use a layered searching approach, identifying existing systematic reviews and then updating them with a summary of more recent eligible primary studies [13, 15, 18, 20, 36]. Studies of simplified search strategies have generally demonstrated acceptable retrieval characteristics for most types of rapid review reports [51, 52]. Searching the reference lists of eligible studies (sometimes known as the “snowballing” technique) and searching the gray literature (i.e., reports that are difficult to locate or unpublished) are done in about half of published rapid reviews and may be essential for certain topics [13, 15, 20, 49]. However, rapid reviews seldom report contact with authors and other experts to identify additional unpublished studies [13, 15, 20, 49]. One study found that peer review of the search strategy, using a tool such as the PRESS (peer review of electronic search strategies) checklist, [39] was reported in 38% of rapid reviews, but that it was usually performed internally rather than by external information specialist reviewers [13]. 
Peer review of search strategies has been reported to increase retrieval of relevant records, particularly for nonrandomized studies [53].

Screening and study selection

Methodological standards for systematic reviews generally require independent screening of citations and abstracts by at least two researchers to arrive at a set of potentially eligible references, which are in turn subjected to dual review in full-text format to arrive at a final inclusion set. Rapid reviews often streamline this process, with up to 40% using a single researcher at each stage [13, 15, 17, 18, 20, 49]. Some rapid reviews report verification of a sample of the articles by a second researcher or, occasionally, use of full dual screening by two independent researchers [13, 17, 20, 49]. One methodological study reported that single screener selection missed an average of 5% of eligible studies, ranging from 3% for experienced reviewers to 6% for those with less experience [54]. If time and resources allow, we recommend that dual screening of all excluded studies, at both the title and full-text stages, be used to minimize the risk of selection bias through the inappropriate exclusion of relevant studies. However, there is some evidence that the use of a single experienced reviewer for particular topics may be sufficient [18, 46, 54].

Data extraction

As with citation screening and study selection, the number of independent reviewers who extract study data for a rapid review can vary. One study found that the most common approach is single-reviewer extraction (41%), although another 25% report verification of a sample by a second reviewer and nearly as many used dual extraction [13]. A more recent study reported that only about 10% of rapid reviews examined reported dual data extraction, although nearly twice as many simply did not report this feature [17]. Data abstraction generally includes PICO elements, although data abstraction was often limited by the scope of the review, and authors were contacted for missing data very infrequently [13].

Risk-of-bias assessment

Risk-of-bias assessment, sometimes called critical appraisal or methodological quality appraisal, examines the quality of the methods employed for each included study and is a standard element of systematic reviews [25]. The vast majority of rapid review producers perform some type of critical appraisal [17, 20]. Some rapid reviews report the use of a single assessor with verification of a sample of study assessments by another assessor [17, 49]. There is no consensus as to which risk-of-bias assessment tools should be used, although most reviews use study design-specific instruments (e.g., an instrument designed for randomized controlled trials (RCTs) if assessing RCTs) intended for assessing internal validity [13, 20].

Knowledge synthesis

Nearly all rapid review producers conduct a descriptive synthesis (also often called a narrative synthesis) of results, but a few perform additional meta-analyses or economic analyses [13, 17, 20]. The synthesis that is conducted is often limited to a basic descriptive summary of studies and their results, rather than the full synthesis that is recommended for systematic reviews [26]. Most rapid reviews present conclusions, recommendations, or implications for policy or clinical practice as another component of the synthesis. Multiple experts also recommend that rapid reviews clearly describe and discuss the potential limitations arising from methodological choices [5, 9, 13, 15, 23].

Many systematic review producers use the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system [55] to rate the certainty of the evidence about health outcomes. Guideline developers and others who make recommendations or policy decisions use GRADE to rate the strength of recommendations based on that evidence. The GRADE evidence to decisions (EtD) framework has also been used to help decision-makers developing health system and public health [56] and coverage [57] policies. Rapid review authors can also employ GRADE to rate the certainty of synthesized evidence and develop policy implications for decision-makers if time and resources permit. However, the GRADE system works best for interventions that have been subject to RCTs and where there is at least one meta-analysis to provide a single estimate of effect.

Report production and dissemination

Standard templates for each stage of the review, from protocol development to report production, can assist the review team in performing each step efficiently. Use of a report template, with minimum methodological standards, reporting requirements, and standard report sections, can assist the producer in streamlining production of the report and can also enhance transparency [15, 20, 28, 40]. An extension of the PRISMA statement for rapid reviews is under development and has been registered with the EQUATOR Network [58]. Until it is available, the PRISMA checklist for systematic reviews can serve as a reporting template to increase the transparency of rapid reviews [8, 40, 59].

Research about the formatting and presentation of rapid reviews is being conducted, but it is likely that the forms employed and tested will need to be adapted to the individual requester and stakeholder audiences [47]. Khangura and colleagues [28] have presented a figure showing formatted sections of a sample report, and many other rapid review producers have examples of reports online that can serve as formatting examples. In addition, findings from evidence summary presentation research for decision-makers in low- and middle-income countries can be translated into other settings [60, 61].

Most rapid review producers conduct some form of peer review for the resulting reports, but such review is often internal and may include feedback from the requester [13]. Most producers disseminate their reports beyond the requester, but dissemination varies by the sensitivity or proprietary nature of the product [13, 20]. When reports are disseminated, it is common for them to be posted online, for example, at an organizational website [13, 20].

Operational considerations

Evaluations and descriptions of research programs that produce rapid reviews typically include some helpful pragmatic and operational considerations for undertaking a rapid review or developing a rapid review program [5, 15, 18, 27,28,29, 31, 36, 40, 62, 63]. Highly experienced, permanent staff with the right skill mix, including systematic reviewers, information specialists, methodologists, and content experts [15, 18, 30, 40, 49], are essential. It is time-consuming to assemble staff on a per-project basis, so the presence of an existing team (which may only do rapid reviews or may also do systematic reviews or other research) with review infrastructure already in place allows projects to get off to a quick start. The existence of a dedicated team also creates the potential to build relationships with requesters and to cultivate mutual trust. Staff with experience conducting systematic reviews will be familiar with standard methods and may be alert to any needed protocol changes as the review proceeds [49]. The rapid review team must understand the methodological implications of decisions taken and must convey these implications to the requesters, to allow them to understand the caveats and potential limitations. Continuing relationships and longer-term contracting with requesters, to allow for a quick start and “good faith” initiation of work before a contract is in place, can speed the early development stages [31, 40]. It is important for rapid review producers to confirm that the choices they make to streamline the review are acceptable to the requester [41]. Whether it is a decision to limit the scope to a single intervention or outcome, restrict the literature search to existing systematic reviews, or forgo a meta-analysis, the knowledge user must be aware of the implications of streamlining decisions [15, 27, 31, 41]. 
Some programs also emphasize the need for follow-up with review requesters to develop the relationship and continuously improve knowledge products [28, 63]. Although it is beyond the scope of this article, we note that both systematic and rapid review producers are currently using various automated technologies to speed review production. There are examples of tools to help search for references, screen citations, abstract data, organize reviews, and enhance collaboration, but few evaluations of their validity and value in report production [64, 65]. The Systematic Review Toolbox [66] is an online searchable database of tools that can help perform tasks in the evidence synthesis process.

Table 1 summarizes the commonly described approaches and key considerations for the major steps in a rapid review that are discussed in detail in the preceding sections.

Suggested approaches to rapid reviews

The previous sections have summarized the numerous approaches to conducting rapid reviews. Abrami and colleagues [27] summarized several methods of conducting rapid reviews and developed a brief review checklist of considerations and recommendations, which may serve as a useful parallel to Table 2. A “one-size-fits-all” approach may not be suitable to cover the variety of topics and requester needs put forward. Watt and colleagues [9] observed over a decade ago, “It may not be possible to validate methodological strategies for conducting rapid reviews and apply them to every subject. Rather, each topic must be evaluated by thorough scoping, and appropriate methodology defined.” Plüddemann and colleagues [23] advocated for a flexible framework for what they term “restricted reviews,” with a set of minimum requirements and additional steps to reduce the risk of bias when time and resources allow. Thomas, Newman, and Oliver [29] noted that it might be more difficult to apply rapid approaches to questions of social policy than to technology assessment, in part because of the complexity of the topics, underlying studies, and uses of these reviews. The application of mixed methods, such as key informant interviews, stakeholder surveys, primary data, and policy analysis, may be required for questions with a paucity of published literature and those involving complex subjects [29]. However, rapid review producers should remain aware that streamlined methods may not be appropriate for all questions, settings, or stakeholder needs, and they should be honest with requesters about what can and cannot be accomplished within the timelines and resources available [31]. For example, a rapid review would likely be inappropriate as the foundation for a national guideline on cancer treatment due to be launched 5 years in the future. 
A decision tool, STARR (SelecTing Approaches for Rapid Reviews), has been published by Pandor and colleagues [67] to help guide decisions about interacting with report requesters, making informed choices regarding the evidence base, selecting methods for data extraction and synthesis, and reporting on the approaches used for the report.

Table 2 Interim guidance for rapid reviews

Tricco and colleagues [21] conducted an international survey of rapid review producers, using a modified Delphi ranking to solicit opinions about the feasibility, timeliness, comprehensiveness, and risk of bias of six different rapid review approaches. Ranked best in terms of both risk of bias and feasibility was “approach 1,” which included published literature only, based on a search of one or more electronic databases, limited in terms of both date and language. With this approach, a single reviewer conducts study screening, and both data extraction and risk-of-bias assessment are done by a single reviewer, with verification by a second researcher. Other approaches were ranked best in terms of timeliness and comprehensiveness, [21] representing trade-offs that review producers and knowledge users may want to consider. Because the survey report was based on expert opinion, it did not provide empirical evidence about the implications of each streamlined approach [21]. However, in the absence of empirical evidence, it may serve as a resource for rapid review producers looking to optimize one of these review characteristics. Given that evidence regarding the implications of methodological decisions for rapid reviews is limited, we have developed interim guidance for those conducting rapid reviews (Table 2).


Rapid reviews are being used with increasing frequency to support clinical and policy decisions [6, 22, 34]. While policy-makers are generally willing to trade some certainty for speed and efficiency, they do expect rapid reviews to come close to the validity of systematic reviews [51]. There is no universally accepted definition of a rapid review [2]. This lack of consensus is, in part, related to the grouping of products with different purposes, audiences, timelines, and resources. Although we have attempted to summarize the major choices available to reviewers and requesters of information, there are few empiric data to guide these choices. We may have missed examples of rapid reviews and methodological research that could add to the conclusions of this paper. However, our approach to this work has been pragmatic, much like a rapid review itself, and is based on our international experience as researchers involved in the Cochrane Rapid Reviews Methods Group, as well as authors who participated in the writing and dissemination of Rapid reviews to strengthen health policy and systems: a practical guide [5]. This paper has, in addition, been informed by our research about rapid reviews and our collective work across several groups that conduct rapid reviews [1, 68]. The Cochrane Rapid Reviews Methods Group also conducted a methods opinion survey in 2019 and released interim recommendations to guide Cochrane rapid reviews during the SARS-CoV-2 pandemic [2]. These recommendations are specific to the needs of Cochrane reviews and offer more detailed guidance for rapid review producers than those presented in this paper. We encourage readers to sign up for the Cochrane Rapid Reviews Methods Group newsletter on its website and to check the list of methodological publications, which is updated regularly, to continue to learn about research pertinent to rapid reviews [68].


We have summarized the rapid review methods that can be used to balance timeliness and resource constraints with a rigorous knowledge synthesis process to inform health policy-making. Interim guidance suggestions for the conduct of rapid reviews are outlined in Table 2. The most fundamental key to success is early and continuing engagement with the research requester to focus the rapid review and ensure that it is appropriate to the needs of stakeholders. Although the protocol serves as the starting point for the review, methodological decisions are often iterative, involving the requester. Any changes to the protocol should be reflected in the final report. Methods can be streamlined at all stages of the review process, from search to synthesis, by limiting the search in terms of dates and language; limiting the number of electronic databases searched; using one reviewer to perform study selection, risk-of-bias assessment, and data abstraction (often with verification by another reviewer); and using a descriptive synthesis rather than a quantitative summary. Researchers need to make transparent methodological choices, informed by stakeholder input, to ensure that the evidence review is fit for its intended purpose. Given that it is not clear how these choices can bias a review, transparency is essential. We are aware that an increasing number of journals publish rapid reviews and related evidence synthesis products, which we hope will further increase the availability, transparency, and empiric research base for progress on rapid review methodologies.



Abbreviations

EQUATOR: Enhancing the QUAlity and Transparency Of health Research
GRADE: Grading of Recommendations Assessment, Development and Evaluation
PICO: Population, intervention, comparator, outcomes
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
PRISMA-P: Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols
RCT: Randomized controlled trial
SPICE: Setting, Perspective, Intervention, Comparator, Evaluation
STARR: SelecTing Approaches for Rapid Reviews
PRESS: Peer review of electronic search strategies
WHO: World Health Organization


  1. Garritty C, Stevens A, Gartlehner G, King V, Kamel C. Cochrane Rapid Reviews Methods Group to play a leading role in guiding the production of informed high-quality, timely research evidence syntheses. Syst Rev. 2016;5(1):184.

  2. Garritty C, Gartlehner G, Nussbaumer-Streit B, King VJ, Hamel C, Kamel C, et al. Cochrane Rapid Reviews Methods Group offers evidence-informed guidance to conduct rapid reviews. J Clin Epidemiol. 2021;130:13–21.

  3. Peterson K, Floyd N, Ferguson L, Christensen V, Helfand M. User survey finds rapid evidence reviews increased uptake of evidence by Veterans Health Administration leadership to inform fast-paced health-system decision-making. Syst Rev. 2016;5(1):132.

  4. Thomas J, Newman M, Oliver S. Rapid evidence assessments of research to inform social policy: taking stock and moving forward. Evid Policy. 2013;9(1):5–27.

  5. Tricco AC, Langlois EV, Straus SE, editors. Rapid reviews to strengthen health policy and systems: a practical guide. Geneva: World Health Organization; 2017.

  6. Langlois EV, Straus SE, Antony J, King VJ, Tricco AC. Using rapid reviews to strengthen health policy and systems and progress towards universal health coverage. BMJ Glob Health. 2019;4(1): e001178.

  7. Hite J, Gluck ME. Rapid evidence reviews for health policy and practice. 2016; Accessed 20 June 2021.

  8. Moore GM, Redman S, Turner T, Haines M. Rapid reviews in health policy: a study of intended use in the New South Wales’ Evidence Check programme. Evidence Policy. 2016;12(4):505–19.

  9. Watt A, Cameron A, Sturm L, et al. Rapid reviews versus full systematic reviews: an inventory of current methods and practice in health technology assessment. Int J Technol Assess Health Care. 2008;24(2):133–9.

  10. Moore G, Redman S, Rudge S, Haynes A. Do policy-makers find commissioned rapid reviews useful? Health Res Policy Syst. 2018;16(1):17.

  11. Gluck M. Can evidence reviews be made more responsive to policymakers? Paper presented at: Fourth Global Symposium on health systems research: resilient and responsive health systems for a changing world. 2016; Vancouver.

  12. Wagner G, Nussbaumer-Streit B, Greimel J, Ciapponi A, Gartlehner G. Trading certainty for speed - how much uncertainty are decisionmakers and guideline developers willing to accept when using rapid reviews: an international survey. BMC Med Res Methodol. 2017;17(1):121.

  13. Abou-Setta AM, Jeyaraman M, Attia A, et al. Methods for developing evidence reviews in short periods of time: a scoping review. PLoS ONE. 2016;11(12): e0165903.

  14. Haby MM, Chapman E, Clark R, Barreto J, Reveiz L, Lavis JN. What are the best methodologies for rapid reviews of the research evidence for evidence-informed decision making in health policy and practice: a rapid review. Health Res Policy Syst. 2016;14(1):83.

  15. Hartling L, Guise JM, Kato E, et al. A taxonomy of rapid reviews links report types and methods to specific decision-making contexts. J Clin Epidemiol. 2015;68(12):1451-1462.e1453.

  16. Reynen E, Robson R, Ivory J, et al. A retrospective comparison of systematic reviews with same-topic rapid reviews. J Clin Epidemiol. 2018;96:23–34.

  17. Tricco AC, Zarin W, Ghassemi M, et al. Same family, different species: methodological conduct and quality varies according to purpose for five types of knowledge synthesis. J Clin Epidemiol. 2018;96:133–42.


  18. Eiring O, Brurberg KG, Nytroen K, Nylenna M. Rapid methods including network meta-analysis to produce evidence in clinical decision support: a decision analysis. Syst Rev. 2018;7(1):168.


  19. Taylor-Phillips S, Geppert J, Stinton C, et al. Comparison of a full systematic review versus rapid review approaches to assess a newborn screening test for tyrosinemia type 1. Res Synth Methods. 2017;8(4):475–84.


  20. Polisena J, Garritty C, Kamel C, Stevens A, Abou-Setta AM. Rapid review programs to support health care and policy decision making: a descriptive analysis of processes and methods. Syst Rev. 2015;4:26.


  21. Tricco AC, Zarin W, Antony J, et al. An international survey and modified Delphi approach revealed numerous rapid review methods. J Clin Epidemiol. 2016;70:61–7.


  22. Aronson JK, Heneghan C, Mahtani KR, Pluddemann A. A word about evidence: ‘rapid reviews’ or ‘restricted reviews’? BMJ Evid-Based Med. 2018;23(6):204–5.


  23. Pluddemann A, Aronson JK, Onakpoya I, Heneghan C, Mahtani KR. Redefining rapid reviews: a flexible framework for restricted systematic reviews. BMJ Evid-Based Med. 2018;23(6):201–3.


  24. Robson RC, Pham B, Hwee J, et al. Few studies exist examining methods for selecting studies, abstracting data, and appraising quality in a systematic review. J Clin Epidemiol. 2019;106:121–35.


  25. Higgins JPT, Lasserson T, Chandler J, Tovey D, Churchill R. Methodological Expectations of Cochrane Intervention Reviews (MECIR). 2016. Accessed 20 June 2021.

  26. Higgins JPT, Green S. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. The Cochrane Collaboration; 2011.

  27. Abrami PC, Borokhovski E, Bernard RM, et al. Issues in conducting and disseminating brief reviews of evidence. Evid Policy. 2010;6(3):371–89.


  28. Khangura S, Konnyu K, Cushman R, Grimshaw J, Moher D. Evidence summaries: the evolution of a rapid review approach. Syst Rev. 2012;1:10.


  29. Thomas J, Newman M, Oliver S. Rapid evidence assessments of research to inform social policy: taking stock and moving forward. Evid Policy. 2013;9:5–27.


  30. Varker T, Forbes D, Dell L, et al. Rapid evidence assessment: increasing the transparency of an emerging methodology. J Eval Clin Pract. 2015;21(6):1199–204.


  31. Wilson MG, Lavis JN, Gauvin FP. Developing a rapid-response program for health system decision-makers in Canada: findings from an issue brief and stakeholder dialogue. Syst Rev. 2015;4:25.


  32. Featherstone RM, Dryden DM, Foisy M, et al. Advancing knowledge of rapid reviews: an analysis of results, conclusions and recommendations from published review articles examining rapid reviews. Syst Rev. 2015;4:50.


  33. Silva MT, Silva END, Barreto JOM. Rapid response in health technology assessment: a Delphi study for a Brazilian guideline. BMC Med Res Methodol. 2018;18(1):51.


  34. Patnode CD, Eder ML, Walsh ES, Viswanathan M, Lin JS. The use of rapid review methods for the U.S. Preventive Services Task Force. Am J Prev Med. 2018;54(1S1):S19–S25.

  35. Strudwick K, McPhee M, Bell A, Martin-Khan M, Russell T. Review article: methodology for the ‘rapid review’ series on musculoskeletal injuries in the emergency department. Emerg Med Australas. 2018;30(1):13–7.


  36. Dobbins M. Rapid review guidebook: steps for conducting a rapid review. McMaster University; 2017.

  37. Moher D, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4(1):1–9.

  38. Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71. See also: Moher D, Liberati A, Tetzlaff J, Altman DG, the PRISMA Group. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: the PRISMA statement. PLoS Med. 2009;6(7):e1000097.

  39. McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6.


  40. Haby MM, Chapman E, Clark R, Barreto J, Reveiz L, Lavis JN. Designing a rapid response program to support evidence-informed decision-making in the Americas region: using the best available evidence and case studies. Implement Sci. 2016;11(1):117.


  41. Moore G, Redman S, Butow P, Haynes A. Deconstructing knowledge brokering for commissioned rapid reviews: an observational study. Health Res Policy Syst. 2018;16(1):120.


  42. Tricco AC, Zarin W, Rios P, et al. Engaging policy-makers, health system managers, and policy analysts in the knowledge synthesis process: a scoping review. Implement Sci. 2018;13(1):31.


  43. Murphy A, Redmond S. To HTA or not to HTA: identifying the factors influencing the rapid review outcome in Ireland. Value Health. 2019;22(4):385–90.


  44. PROSPERO: International Prospective Register of Systematic Reviews. Accessed 20 June 2021.

  45. Booth A. Clear and present questions: formulating questions for evidence based practice. Library Hi Tech. 2006;24:355–68.

  46. Garritty C, Stevens A. Putting evidence into practice (PEP) workshop – rapid review course. 2015; University of Alberta, Edmonton, Alberta.

  47. Garritty C, Stevens A, Gartlehner G, Nussbaumer-Streit B, King V. Rapid review workshop: timely evidence synthesis for decision makers. Paper presented at: Cochrane Colloquium; 2016; Seoul, South Korea.

  48. Open Science Framework (OSF). Accessed 20 June 2021.

  49. Tricco AC, Antony J, Zarin W, et al. A scoping review of rapid review methods. BMC Med. 2015;13:224.


  50. Nussbaumer-Streit B, Klerings I, Dobrescu AI, Persad E, Stevens A, Garritty C, et al. Excluding non-English publications from evidence-syntheses did not change conclusions: a meta-epidemiological study. J Clin Epidemiol. 2020;118:42–54.


  51. Nussbaumer-Streit B, Klerings I, Wagner G, et al. Abbreviated literature searches were viable alternatives to comprehensive searches: a meta-epidemiological study. J Clin Epidemiol. 2018;102:1–11.


  52. Rice M, Ali MU, Fitzpatrick-Lewis D, Kenny M, Raina P, Sherifali D. Testing the effectiveness of simplified search strategies for updating systematic reviews. J Clin Epidemiol. 2017;88:148–53.


  53. Spry C, Mierzwinski-Urban M. The impact of the peer review of literature search strategies in support of rapid review reports. Res Synth Methods. 2018;9(4):521–6.


  54. Waffenschmidt S, Knelangen M, Sieben W, Buhn S, Pieper D. Single screening versus conventional double screening for study selection in systematic reviews: a methodological systematic review. BMC Med Res Methodol. 2019;19(1):132.


  55. The GRADE Working Group. GRADE. Accessed 20 June 2021.

  56. Moberg J, Oxman AD, Rosenbaum S, Schunemann HJ, Guyatt G, Flottorp S, et al. The GRADE evidence to decision (EtD) framework for health system and public health decisions. Health Res Policy Syst. 2018;16:45.


  57. Parmelli E, Amato L, Oxman AD, Alonso-Coello P, Brunetti M, Moberg J, et al. GRADE evidence to decision (EtD) framework for coverage decisions. Int J Technol Assess Health Care. 2017;33(2):176–82.


  58. Stevens A, Garritty C, Hersi M, Moher D. Developing PRISMA-RR, a reporting guideline for rapid reviews of primary studies (protocol). 2018. Accessed 20 June 2021.

  59. Kelly SE, Moher D, Clifford TJ. Quality of conduct and reporting in rapid reviews: an exploration of compliance with PRISMA and AMSTAR guidelines. Syst Rev. 2016;5:79.


  60. Mijumbi-Deve R, Rosenbaum SE, Oxman AD, Lavis JN, Sewankambo NK. Policymaker experiences with rapid response briefs to address health-system and technology questions in Uganda. Health Res Policy Syst. 2017;15(1):37.


  61. Rosenbaum SE, Glenton C, Wiysonge CS, et al. Evidence summaries tailored to health policy-makers in low- and middle-income countries. Bull World Health Organ. 2011;89(1):54–61.


  62. McIntosh HM, Calvert J, Macpherson KJ, Thompson L. The healthcare improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare. Int J Evid Based Healthc. 2016;14(2):95–101.


  63. Gibson M, Fox DM, King V, Zerzan J, Garrett JE, King N. Methods and processes to select and prioritize research topics and report design in a public health insurance programme (Medicaid) in the USA. Cochrane Methods. 2015;1(Suppl 1):33–35.

  64. Department for Environment, Food and Rural Affairs. Emerging tools and techniques to deliver timely and cost effective evidence reviews. London: Department for Environment, Food and Rural Affairs; 2015.

  65. Marshall CG, J. Software tools to support systematic reviews. Cochrane Methods. 2016;10(Suppl. 1):34–35.

  66. The Systematic Review Toolbox. Accessed 20 June 2021.

  67. Pandor A, Kaltenthaler E, Martyn-St James M, et al. Delphi consensus reached to produce a decision tool for SelecTing Approaches for Rapid Reviews (STARR). J Clin Epidemiol. 2019;114:22–9.


  68. Cochrane Rapid Reviews Methods Group. Accessed 20 June 2021.



Funding

Time to produce this manuscript was donated in kind by the authors' respective organizations, and no other specific funding was received. Funders acknowledged: the Alliance for Health Policy and Systems Research; the Norwegian Government Agency for Development Cooperation; the Swedish International Development Cooperation Agency; and the Department for International Development, UK Government.

Author information




The first author drafted the manuscript and was responsible for incorporating all other authors’ comments into the final version of the manuscript. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Valerie J. King.

Ethics declarations

Competing interests

All authors are leaders or members of the Cochrane Rapid Reviews Methods Group, and all are producers of rapid reviews for their respective organizations.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. The Creative Commons Public Domain Dedication waiver applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

King, V.J., Stevens, A., Nussbaumer-Streit, B. et al. Paper 2: Performing rapid reviews. Syst Rev 11, 151 (2022).
