Rapid reviews: the pros and cons of an accelerated review process

Philip Moons, Eva Goossens, David R. Thompson, Rapid reviews: the pros and cons of an accelerated review process, European Journal of Cardiovascular Nursing, Volume 20, Issue 5, June 2021, Pages 515–519, https://doi.org/10.1093/eurjcn/zvab041

Although systematic reviews are the method of choice to synthesize scientific evidence, they can take years to complete and publish. Clinicians, managers, and policy-makers often need input from scientific evidence in a more timely and resource-efficient manner. For this purpose, rapid reviews are conducted. Rapid reviews are performed using an accelerated process. However, they should not be less systematic than standard systematic reviews, and the introduction of bias must be avoided. In this article, we describe what rapid reviews are, present their characteristics, give some examples, highlight potential pitfalls, and draw attention to the importance of evidence summaries in order to facilitate adoption in clinical decision-making.

• Knowing what rapid reviews are.
• Understanding the features and benefits of rapid reviews.
• Recognizing the limitations of rapid reviews and knowing when they are not the preferred choice.

Researchers, clinicians, managers, and policy-makers are typical consumers of empirical work published in the scientific literature. For researchers, reviewing the literature is part of the empirical cycle, in order to generate new research questions and to discuss their own study findings. When the available evidence has to be searched for, collated, critiqued, and summarized, systematic reviews are the gold standard. 1 Systematic reviews are rigorous in approach and transparent about how studies were searched, selected, and assessed. Doing so, they limit bias and random error, and hence, they yield the most valid and trustworthy evidence. Systematic reviews can be complemented by meta-analyses to compute an overall mean effect, proportion, or relationship. 2 Systematic reviews and meta-analyses are seen as the pillars of evidence-based healthcare. The rigour in the methodology of a systematic review, however, also means that it often takes between 6 months and 2 years to undertake. 3

Clinicians, managers, and policy-makers also use the literature for their decision-making. They often cannot afford to wait for 2 years to get the answer to their questions by means of a systematic review. The evidence must be synthesized without undue delays. 4 Furthermore, the synthesis and reporting of systematic reviews often fail to address the needs of the users at the point of care 5 and are considered to be too large and too complex. 3 To facilitate the uptake of research findings in clinical practice, other types of reviews with a shorter lead time are needed, and alternative evidence summaries have to be developed. 5

Rapid reviews have been proposed as a method to provide summaries of the literature in a timely and resource-efficient manner by using methods to accelerate or streamline traditional systematic review processes. 5 , 6 It is argued that rapid reviews should be conducted in less than 8 weeks. 4 The purpose of rapid reviews is to respond to urgent situations or political pressures, often in a rapidly changing field. The typical target audiences for rapid reviews are policy-makers, healthcare institutions, managers, professionals, and patient associations. 6 The first rapid reviews were published in the 1960s and proliferated in the mid-2010s. Not surprisingly, the number of rapid reviews boomed in 2020, in response to the global SARS-CoV-2/COVID-19 pandemic (see Figure 1). Indeed, this pandemic has had a huge impact on healthcare delivery, 7–9 and triggered unprecedented clinical questions that needed a prompt answer. 10 Healthcare research, too, has had to adapt swiftly to the drastically changed situation. 11

Figure 1. Number of publications in the Pubmed database (1960–2020) referring to ‘rapid review’ (search performed 16 March 2021).
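The counts behind Figure 1 can, in principle, be reproduced with NCBI's E-utilities. The sketch below is illustrative only: the exact query used for the figure is not reported, so the search term and the [Title/Abstract] and [dp] field tags are assumptions.

```python
# Minimal sketch: yearly PubMed counts for 'rapid review' via NCBI E-utilities.
# The query string is an assumption; the original search strategy is not reported.
import json
import time
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pubmed_count(term: str) -> int:
    """Return the number of PubMed records matching the query term."""
    params = urllib.parse.urlencode({"db": "pubmed", "term": term, "retmode": "json"})
    with urllib.request.urlopen(f"{EUTILS}?{params}") as response:
        result = json.load(response)
    return int(result["esearchresult"]["count"])

counts = {}
for year in range(1960, 2021):
    counts[year] = pubmed_count(f'"rapid review"[Title/Abstract] AND {year}[dp]')
    time.sleep(0.4)  # stay within NCBI's request-rate limits without an API key

print(counts)
```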

A rapid review is ‘a form of knowledge synthesis that accelerates the process of conducting a traditional systematic review through streamlining or omitting various methods to produce evidence for stakeholders in a resource-efficient manner’. 12 There is no single validated methodology for conducting rapid reviews. 13 Therefore, variation in the methodological quality of rapid reviews can be observed. 14 When applying the ‘Search, AppraisaL, Synthesis and Analysis’ (SALSA) framework to rapid reviews, it is stipulated that the completeness of the search is determined by time constraints; the quality appraisal is time-limited, if performed at all; the synthesis is narrative and tabular; and the analysis pertains to the overall quality/direction of effect of the literature. 15 In Table 1, we describe the SALSA characteristics of rapid reviews and systematic reviews. Rapid reviews should not be less systematic, and they must adhere to the core principles of systematic reviews to avoid bias in the inclusion, assessment, and synthesis of studies. 4 The typical characteristic of a rapid review is that it provides less in-depth information and detail in its recommendations. 6 It is essential, however, that deviations from traditional systematic review methods are described well in the methods section. This can, for instance, be done by explicating where the PRISMA criteria were omitted or adapted. 4 The speed with which a rapid review is conducted largely depends on the availability of human and financial resources. 4 There is also often a close interaction between the commissioners and the reviewers because the review purports to guide decision-making.
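As a compact restatement of the SALSA characteristics just described, the sketch below encodes the rapid-review column in a simple data structure; it reflects only what is stated in the text above, not the full content of Table 1.

```python
# Rapid-review characteristics along the SALSA dimensions, as described above.
SALSA_RAPID_REVIEW = {
    "Search":    "completeness determined by time constraints",
    "AppraisaL": "time-limited quality appraisal, if performed at all",
    "Synthesis": "narrative and tabular",
    "Analysis":  "overall quality/direction of effect of the literature",
}

for dimension, approach in SALSA_RAPID_REVIEW.items():
    print(f"{dimension}: {approach}")
```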

Table 1. Distinction between rapid and systematic reviews (based on Grant and Booth 15 ).

Although rapid reviews do not meet the gold standard of systematic reviews, and therefore do have their limitations (see below), they frequently provide adequate advice on which to base clinical and policy decisions. 13 A direct comparison of the findings from rapid and full systematic reviews showed that the essential conclusions did not differ extensively. 13 Given the importance of rapid reviews, the Cochrane collaboration has established the Cochrane Rapid Reviews Methods Group, which recently developed actionable recommendations and minimum standards for rapid reviews ( Table 2 ). 16

Table 2. Cochrane rapid review methods recommendations (reproduced from Garritty et al. 16 , published under the CC BY-NC-ND license).

To date, three rapid reviews have been published in the European Journal of Cardiovascular Nursing . 17–19 The first, published in 2017, assessed the efficacy of non-pharmacological interventions on psychological distress in patients undergoing cardiac catheterization. 17 A second rapid review, published in 2020 amidst the first wave of the SARS-CoV-2/COVID-19 pandemic in Asia, Europe, and North America, looked at the evidence for remote healthcare during quarantine situations to support people living with cardiovascular diseases. 18 Given the unprecedented global situation and the sense of urgency, this was a pre-eminent example for which a rapid review was appropriate. A third rapid review, published in 2021, investigated whether participation in a support-based intervention exclusively for caregivers of people living with heart failure changed their psychological and emotional wellbeing. 19 The authors explicitly chose the streamlined method of a rapid review to inform the methodological approach of a future caregiver-based intervention. 19

Although rapid and systematic reviews have been shown to yield similar conclusions, 13 , 20 there are definitely some limitations or pitfalls to bear in mind. For instance, rapidity may lead to brevity. 4 In such cases, shortcuts may include restricting the search to one database; limiting the inclusion criteria by date or language; having one person screen and another verify studies; not conducting a quality appraisal; or presenting results only as a narrative summary. 14 If only one database is used, it is recommended to search Pubmed, because rapid reviews that did not use Pubmed as a database are more likely to obtain results that differ from systematic reviews. 21 It is also recommended that a quality appraisal of the included studies is not skipped. For this purpose, appraisal tools that account for different methodologies, such as the Mixed Methods Appraisal Tool (MMAT), are very suitable. 22 It has also been observed that rapid reviews often do not explicitly define the methodology that was used. 4 , 13 Consequently, the search cannot always be replicated, and the reasons for differences between findings are difficult to comprehend. Further, it is not always clear whether the review was performed in a systematic fashion, which is also mandatory for rapid reviews. Otherwise, they may bear the risks of any other narrative review or poorly conducted systematic review. 4 Rapid reviews should not be seen as a quick alternative to a full systematic review, 13 and authors must avoid taking shortcuts that could lead to bias. 6 Therefore, a thorough evaluation of whether a rapid review methodology is appropriate, that is, whether a summary of the evidence is needed without delay, is imperative. If there is no urgent need to obtain the evidence for clinical practice or policy-making, a full systematic review would be more suitable. Furthermore, when there is a high need for accuracy, for instance for clinical guidelines or regulatory affairs, a systematic review is still the best option. 21

Transparency in the description of the methods used is of critical importance to appraise the quality of the rapid review. 4 A scoping review of rapid reviews found that the quality of reporting is generally poor. 14 This may lead to the interpretation that rapid reviews are inherently inferior to full systematic reviews, whereas this is not the case if properly conducted and reported. It is also vital to acknowledge the potential limitations of rapidity.

Since the typical reports of systematic reviews are often too long and too complex for clinicians and decision-makers, 3 new formats of evidence summaries have been developed. 5 Evidence summaries are ‘synopses that summarize existing international evidence on healthcare interventions or activities’. 5 For rapid reviews, reporting the evidence in a tabular format is indispensable if it is to be used at the point of care. Such evidence summaries can even be integrated into electronic patient records, to provide recommendations for the care of that patient, based on their specific characteristics. 5 An extensive database with evidence summaries has been developed by the Joanna Briggs Institute ( https://www.wolterskluwer.com/en/know/jbi-resources/jbi-ebp-database, last accessed 27 March 2021 ).
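To illustrate the idea of surfacing an evidence summary at the point of care based on a patient's characteristics, here is a minimal sketch; the record fields, the matching rule, and the example content are hypothetical and do not describe any existing evidence summary database or patient record system.

```python
# Hypothetical sketch: match tabular evidence summaries to patient characteristics.
from dataclasses import dataclass

@dataclass
class EvidenceSummary:
    topic: str
    applies_to: set        # patient characteristics the summary applies to
    recommendation: str
    source: str            # citation of the underlying (rapid) review

SUMMARIES = [
    EvidenceSummary(
        topic="remote follow-up during quarantine",
        applies_to={"cardiovascular disease", "quarantine"},
        recommendation="Consider structured telephone or video follow-up.",
        source="illustrative rapid review, 2020",
    ),
]

def summaries_for(patient_characteristics: set) -> list:
    """Return summaries whose criteria are all present in the patient's characteristics."""
    return [s for s in SUMMARIES if s.applies_to <= patient_characteristics]

print(summaries_for({"cardiovascular disease", "quarantine", "heart failure"}))
```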

Rapid reviews are meant to inform specific clinical or policy decisions in a timely and resource-efficient fashion. They are conducted within a timeframe of a few weeks. The rapidity refers to the accelerated process, but it should not come at the cost of losing any of the important information that could be expected from a full systematic review, and the introduction of biases that may jeopardize the validity of the conclusions must be avoided. The quality of rapid reviews is as important as that of traditional systematic reviews. Rapid reviews need to be explicit about the methodology that has been used and clearly state how the review differs from a full systematic review. Sufficient attention ought to be given to the evidence summaries, because the format of these summaries will largely determine their adoption in clinical care or decision-making.

Data availability: The article is based on a review of the literature. No specific data sources have been used.

Conflict of interest : none declared.

References

1. Munn Z, Stern C, Aromataris E, Lockwood C, Jordan Z. What kind of systematic review should I conduct? A proposed typology and guidance for systematic reviewers in the medical and health sciences. BMC Med Res Methodol 2018;18:5.
2. Ruppar T. Meta-analysis: how to quantify and explain heterogeneity? Eur J Cardiovasc Nurs 2020;19:646–652.
3. Khangura S, Konnyu K, Cushman R, Grimshaw J, Moher D. Evidence summaries: the evolution of a rapid review approach. Syst Rev 2012;1:10.
4. Schünemann HJ, Moja L. Reviews: Rapid! Rapid! Rapid! …and systematic. Syst Rev 2015;4:4.
5. Munn Z, Lockwood C, Moola S. The development and use of evidence summaries for point of care information systems: a streamlined rapid review approach. Worldviews Evid Based Nurs 2015;12:131–138.
6. Ganann R, Ciliska D, Thomas H. Expediting systematic reviews: methods and implications of rapid reviews. Implement Sci 2010;5:56.
7. Klompstra L, Jaarsma T. Delivering healthcare at distance to cardiac patients during the COVID-19 pandemic: experiences from clinical practice. Eur J Cardiovasc Nurs 2020;19:551–552.
8. Lauck S, Forman J, Borregaard B, Sathananthan J, Achtem L, McCalmont G, Muir D, Hawkey MC, Smith A, Højberg Kirk B, Wood DA, Webb JG. Facilitating transcatheter aortic valve implantation in the era of COVID-19: recommendations for programmes. Eur J Cardiovasc Nurs 2020;19:537–544.
9. Hill L, Beattie JM, Geller TP, Baruah R, Boyne J, Stolfo GD, Jaarsma T. Palliative care: essential support for patients with heart failure in the COVID-19 pandemic. Eur J Cardiovasc Nurs 2020;19:469–472.
10. Tricco AC, Garritty CM, Boulos L, Lockwood C, Wilson M, McGowan J, McCaul M, Hutton B, Clement F, Mittmann N, Devane D, Langlois EV, Abou-Setta AM, Houghton C, Glenton C, Kelly SE, Welch VA, LeBlanc A, Wells GA, Pham B, Lewin S, Straus SE. Rapid review methods more challenging during COVID-19: commentary with a focus on 8 knowledge synthesis steps. J Clin Epidemiol 2020;126:177–183.
11. Van Bulck L, Kovacs AH, Goossens E, Luyckx K, Jaarsma T, Stromberg A, Moons P. Impact of the COVID-19 pandemic on ongoing cardiovascular research projects: considerations and adaptations. Eur J Cardiovasc Nurs 2020;19:465–468.
12. Hamel C, Michaud A, Thuku M, Skidmore B, Stevens A, Nussbaumer-Streit B, Garritty C. Defining rapid reviews: a systematic scoping review and thematic analysis of definitions and defining characteristics of rapid reviews. J Clin Epidemiol 2021;129:74–85.
13. Watt A, Cameron A, Sturm L, Lathlean T, Babidge W, Blamey S, Facey K, Hailey D, Norderhaug I, Maddern G. Rapid versus full systematic reviews: validity in clinical practice? ANZ J Surg 2008;78:1037–1040.
14. Tricco AC, Antony J, Zarin W, Strifler L, Ghassemi M, Ivory J, Perrier L, Hutton B, Moher D, Straus SE. A scoping review of rapid review methods. BMC Med 2015;13:224.
15. Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Info Libr J 2009;26:91–108.
16. Garritty C, Gartlehner G, Nussbaumer-Streit B, King VJ, Hamel C, Kamel C, Affengruber L, Stevens A. Cochrane Rapid Reviews Methods Group offers evidence-informed guidance to conduct rapid reviews. J Clin Epidemiol 2021;130:13–22.
17. Carroll DL, Malecki-Ketchell A, Astin F. Non-pharmacological interventions to reduce psychological distress in patients undergoing diagnostic cardiac catheterization: a rapid review. Eur J Cardiovasc Nurs 2017;16:92–103.
18. Neubeck L, Hansen T, Jaarsma T, Klompstra L, Gallagher R. Delivering healthcare remotely to cardiovascular patients during COVID-19: a rapid review of the evidence. Eur J Cardiovasc Nurs 2020;19:486–494.
19. Carleton-Eagleton K, Walker I, Freene N, Gibson D, Semple S. Meeting support needs for informal caregivers of people with heart failure: a rapid review. Eur J Cardiovasc Nurs 2021.
20. Best L, Stevens A, Colin‐Jones D. Rapid and responsive health technology assessment: the development and evaluation process in the South and West region of England. J Clin Eff 1997;2:51–56.
21. Marshall IJ, Marshall R, Wallace BC, Brassey J, Thomas J. Rapid reviews may produce different results to systematic reviews: a meta-epidemiological study. J Clin Epidemiol 2019;109:30–41.
22. Hong QN, Gonzalez-Reyes A, Pluye P. Improving the usefulness of a tool for appraising the quality of qualitative, quantitative and mixed methods studies, the Mixed Methods Appraisal Tool (MMAT). J Eval Clin Pract 2018;24:459–467.


Rapid evidence assessment

Rapid Evidence Assessment is a process that uses a combination of key informant interviews and targeted literature searches to produce a report in a few days or a few weeks.

This process is faster and less rigorous than a full systematic review but more rigorous than ad hoc searching.

The latest REA publications are available from the UK Government.

The UK Government's REA toolkit was designed to help Government Social Researchers to carry out or commission REAs. It contained detailed guidance on choosing the right methods for each stage of an REA and offered a range of templates and sources to support the successful completion of an REA.



Systematic, scoping, and rapid reviews: An overview

What is evidence synthesis?

Evidence synthesis is "the contextualization and integration of research findings of individual research studies within the larger body of knowledge on the topic. A synthesis must be reproducible and transparent in its methods, using quantitative and/or qualitative methods" ( CIHR ). Systematic reviews, scoping reviews, and rapid reviews are all forms of evidence synthesis.

What review is right for you?

The Right Review tool might help guide your choice of an evidence synthesis method.

You can get a sense of the wide array of review types on our  Literature Reviews for Graduate Students  guide.

Below, you'll find a brief comparison of three common types of evidence synthesis: systematic reviews, scoping reviews, and rapid reviews. 

Systematic review

"A systematic review attempts to identify, appraise and synthesize all the empirical evidence that meets pre-specified eligibility criteria to answer a given research question. Researchers conducting systematic reviews use explicit methods aimed at minimizing bias in order to produce more reliable findings that can be used to inform decision making." ( Cochrane )

Scoping review

"A scoping review or scoping study is a form of knowledge synthesis that addresses an exploratory research question aimed at mapping key concepts, types of evidence, and gaps in research related to a defined area or field by systematically searching, selecting, and synthesizing existing knowledge" ( Colquhoun et al. )

Rapid review

"Rapid reviews are a form of knowledge synthesis in which components of the systematic review process are simplified or omitted to produce information in a timely manner" ( Tricco et al. )

When to use

• Systematic review: the aim is to address a focused research question with narrow parameters.
• Scoping review: the aim is to summarize "a range of evidence in order to convey the breadth and depth of a field" (Levac, Colquhoun, & O'Brien).
• Rapid review: the aim is to conduct a rigorous review with limited time and/or resources.

Time needed

• Systematic review: 9 to 18 months
• Rapid review: 1 to 6 months

Reporting guidelines

• Rapid review: PRISMA-RR (under development since 2018); one possibility is to adapt PRISMA guidelines to the constraints of your project

Frameworks/guidance

• Systematic review: Cochrane Handbook for Systematic Reviews of Interventions (2022); Joanna Briggs Institute (2020); Campbell Collaboration
• Scoping review: Cochrane Training: Scoping reviews video series (2017); Scoping studies: Advancing the methodology (Levac, Colquhoun, & O'Brien, 2010)
• Rapid review: Rapid Review Guidebook (NCCMT, 2017); Rapid reviews to strengthen health policy and systems: A practical guide (WHO, 2017)

Search strategy

• Systematic review: comprehensive searches across a range of resources with explicit strategies; typically includes grey literature
• Rapid review: as comprehensive as time and/or resource constraints permit

Considerations

• Systematic review: requires at least three team members, including expertise in the research area, systematic review methods, statistical analysis, and information retrieval; requires a focused question; involves critical appraisal.
• Scoping review: requires at least three team members, including expertise in the research area, scoping review methods, and information retrieval; requires an exploratory question; involves no critical appraisal.
• Rapid review: can be done by an individual researcher or a research team; due to time and/or resource constraints, rapid reviews are less comprehensive and more prone to bias than systematic and scoping reviews; should provide explanations for shortcuts and subsequent limitations.

This page was adapted from the What's in a Name? comparison chart created by Library Services, Unity Health Toronto. Creative Commons BY-NC-SA 4.0 .


What are ‘rapid reviews’ and why do we need them?

Posted on 20th July 2022 by Zain Douba

""

Rapid reviews are a form of knowledge synthesis that follow the systematic review process, but components of the process are simplified or omitted to produce information in a timely manner (Khangura, 2012) .

Palmatier et al. describe review papers as critical evaluations of material that has already been published, regardless of the type of study design. In essence, reviews scan the literature to answer a research question concisely.

Types of review articles

There are more than 14 types of ‘review articles’; here are some of the main types:

  • Critical review
  • Literature (Narrative) review
  • Mapping review
  • Meta-analysis
  • Mixed studies review
  • Rapid review
  • Scoping review
  • Systematic review
  • Systematized review
  • Umbrella review

So, what is a rapid review?

A rapid review is a form of knowledge synthesis that accelerates the process of conducting a traditional systematic review through streamlining or omitting specific methods to produce evidence for stakeholders in a resource-efficient manner.

The timeframe of the review depends on resource availability, the quantity and quality of the literature, and the expertise or experience of reviewers.

As a guide, the stages and timeframe of the rapid review are:

  • Timeframe: ≤ 5 weeks
  • Question: Narrow question (may use the PICO framework)
  • Searches: Sources are limited due to time constraints of searching. Must still be transparent and reproducible
  • Selection: Based on inclusion/exclusion criteria
  • Appraisal: Critical and rigorous but time-limited
  • Synthesis: Descriptive summary or categorization of data, may still be quantitative
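As a rough illustration of how the stages listed above might be laid out within the ≤5-week timeframe, here is a minimal scheduling sketch; the start date and the weeks allocated to each stage are assumptions, not part of any published guidance.

```python
# Illustrative 5-week plan for the rapid review stages listed above.
# The start date and per-stage durations are assumptions.
from datetime import date, timedelta

STAGES = [                                   # (stage, weeks allocated)
    ("Refine the question (e.g. PICO)", 0.5),
    ("Searches", 1.0),
    ("Selection against inclusion/exclusion criteria", 1.0),
    ("Time-limited critical appraisal", 1.0),
    ("Descriptive synthesis and reporting", 1.5),
]

start = date(2022, 8, 1)
for stage, weeks in STAGES:
    end = start + timedelta(days=round(weeks * 7))
    print(f"{stage}: {start} to {end}")
    start = end
```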

The main role of the rapid review

Policy-makers require valid and reliable evidence to support time-sensitive decisions, and will need to assess the quality and efficiency of that evidence.

Systematic reviews and other types of evidence syntheses are increasingly being used to inform, and lead, health policy decision-making. However, the time and cost to produce a systematic review are often barriers to its use in decision-making.

Rapid reviews are a timely and affordable approach that can provide actionable and relevant evidence to strengthen health policy and systems.

What are the key advantages of rapid reviews?

The rapid review can benefit the scientific community in many ways:

  • Provide an incorporated, synthesized overview of the currently available evidence
  • Evaluate existing methodological approaches and unique insights
  • Describe research understandings, existing gaps, and future research directions

In other words, the methodology used in a rapid review aims to limit some secondary steps compared to the systematic review, in order to produce focused research. This includes carefully focusing on the question, using broader or less sophisticated search strategies, conducting a review of reviews, restricting the amount of grey literature, extracting only key variables, and performing only ‘simple’ quality appraisal. Thus, not every review paper can offer all of these benefits, but this list represents their key contributions.

Conclusions

Clinicians, stakeholders, consumers, and policy-makers usually need to digest evidence and make health-related decisions in a timely and resource-efficient manner. In parallel, researchers produce summarized evidence to respond to the need for the most recent and valid evidence.

Many readers gravitate towards the most summarized articles, in categories such as improving patient care, health systems, decision-making, and international policies.

And now, can you tell me why and how you are planning to conduct a rapid review?

References and resources

A typology of reviews: an analysis of 14 review types and associated methodologies

Evidence summaries: the evolution of a rapid review approach

Review articles: purpose, process, and structure

Systematic Reviews & Other Review Types

Rapid Review Protocol

Cochrane Rapid Reviews: Interim Guidance from the Cochrane Rapid Reviews Methods Group



Structured Literature Reviews

Chapter 1 | Structured Literature Reviews: Background and Definitions

Structured literature reviews of the Independent Evaluation Group (IEG) aim to synthesize existing research on a given topic using systematic and transparent procedures. The concept and principles of a systematic review inspire and guide the approach. A systematic review aims to identify, appraise, and synthesize all relevant research that meets explicit prespecified eligibility criteria (Higgins et al. 2019). 1 Because systematic reviews are intended to be exhaustive, proper implementation can require a considerable amount of time, expertise, and resources. Snilstveit et al. (2017) report an average production time of 12–24 months (depending on the scope and resources available) for a systematic review involving a multi-person expert research team. 2

Comprehensively identifying all relevant studies in large volumes of literature can be a particularly challenging and time-intensive task (Thomas, Newman, and Oliver 2013). A broad and fully systematic review often requires searching a variety of literature sources, screening many thousands of studies to identify those relevant to the review. For example, searchable databases do not always use a common set of terms or keywords to index literature. Even when a standardized nomenclature for describing an issue or topic is available, searches on websites such as Google Scholar or the World Bank eLibrary can still yield many irrelevant results. Moreover, such terms likely do not capture the full breadth or nuance of a concept perfectly (Cantrell, Booth, and Chambers, forthcoming). These factors can make it challenging to ensure that a search is comprehensive, and review teams may have to search very large volumes of literature to reach saturation of coverage for the phenomenon of interest. Many systematic reviews even fall short of the mark of conducting truly exhaustive searches (as also highlighted by Evans and Popova 2016).

IEG’s structured literature review approach falls into a subset of more rapid evidence reviews inspired by the concept of a systematic review. Other common terms used for these types of literature reviews include rapid reviews , rapid evidence reviews , and rapid evidence assessments (Littell 2018; Tricco et al. 2015). For the sake of consistency, the type of literature review evaluated in this paper will be referred to as a rapid evidence review for the remainder of the paper. The Agency for Healthcare Research and Quality categorizes such reviews according to the extent of synthesis applied to the material covered in review (AHRQ 2015). In its categorization, inventories provide a list of available evidence, along with other contextual information needed to help inform decisions related to the state of research on a given subject. However, inventories do not synthesize evidence or present conclusions related to the state of the literature. Rapid responses present the user with an answer based on the best available evidence but do not attempt to formally synthesize evidence into conclusions. Rapid reviews perform a synthesis (qualitative, quantitative, or both) to provide an answer about the direction of evidence and possibly its strength.

Though ostensibly less nuanced than a full systematic review, these approaches nonetheless abide by the same basic principles as a systematic review, such as adhering to prespecified criteria for including studies and transparently reporting on the analysis of all relevant studies identified. However, the methods used may streamline the general approach and procedures used. Many rapid evidence reviews aim to deliver results within six months or less (Ganann, Ciliska, and Thomas 2010; Snilstveit et al. 2017; Varker et al. 2015). They may also use a narrower range of search techniques or sources of literature (Haby et al. 2016; Harker and Kleijnen 2012). Such restrictions ensure that the review can be delivered within shorter timeframes and resource constraints or meet deadlines required to feed into policy and decision-making processes (Varker et al. 2015; Watt et al. 2008).

Given that structured literature reviews may be limited in their coverage of the literature and depth of analysis, it is important to understand the caveats associated with applying this approach. Narrow searches of wide-ranging topics and omitting methods of critical appraisal and synthesis may limit what can reliably be said about the state of the literature and research on a particular subject. However, the processes used to conduct structured literature reviews also vary greatly (Ganann, Ciliska, and Thomas 2010; Haby et al. 2016; Hunter et al. 2020; Varker et al. 2015). Each review inevitably establishes its own set of shortcuts and heuristics, delineating the review according to a unique set of project-specific objectives.

To reflect the variety of adjustments a rapid evidence review may adopt, it is important to treat each one as unique. Omitting certain sources of literature, search methods, and analytical approaches will affect different reviews unequally: some shortcuts might be more or less important, representing different levels of risk of bias in different reviews. For example, some sources of literature may be thematically more important regarding some research topics (as would be the case for research on health-related topics, for which searching Medline and PubMed would be intuitively more important). Alternatively, some studies may find citation and reference tracking (a method of searching discussed in “Literature Search and Analysis” in chapter   2) more important for identifying all relevant literature in some contexts (see Cooper et al. 2018; Linder et al. 2015; Papaioannou et al. 2010; and Wright, Golder, and Rodriguez-López 2014). The appropriate choice between the two can depend on the efficacy of search strategies using key terms to identify relevant literature on websites and in databases.
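One round of backward reference tracking ("snowballing") can be expressed as a simple set operation once the reference lists of included studies have been obtained (for example from a bibliographic export). The sketch below uses placeholder study identifiers; it is not tied to any particular database or API.

```python
# One round of backward snowballing: gather works cited by included studies
# that have not yet been screened. Identifiers and reference lists are placeholders.
included = {"study_A", "study_B"}
already_screened = {"study_A", "study_B", "study_C"}

references = {                      # study -> works it cites
    "study_A": {"study_C", "study_D"},
    "study_B": {"study_D", "study_E"},
}

candidates = set()
for study in included:
    candidates |= references.get(study, set())

new_to_screen = candidates - already_screened
print(sorted(new_to_screen))        # ['study_D', 'study_E']
```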

Furthermore, some rapid evidence reviews may also have very good coverage and depth: evidence indicates that a more thorough systematic review does not necessarily always yield different conclusions than a more abbreviated review of the same topic (for example, AHRQ 2015; Haby et al. 2016). Hence, broad-brush statements about the rigor and limitations of rapid evidence reviews do not necessarily reflect authors’ individual work or specific applications of the approach.

This chapter discussed some of the underlying concepts and terminology; the next chapter presents an example of a structured literature review based on a case study assessing the effects of the World Bank’s Doing Business report.

  • Oya, Schaefer, and Skalidou (2018; agricultural certification), Snilstveit et al. (2015; education), Vaessen et al. (2014; microcredit), and Waddington et al. (2014; farmer field schools) provide some examples of systematic reviews.
  • New technologies are decreasing the time required to complete systematic reviews. However, a systematic review still requires some scoping to adequately delineate what should (and should not) be included in its coverage before production can begin.


Which review is that? A guide to review types.

Rapid Review


“A rapid review is a form of knowledge synthesis that accelerates the process of conducting a traditional systematic review through streamlining or omitting specific methods to produce evidence for stakeholders in a resource-efficient manner.” (Garritty et al., 2020)

Rapid reviews target high-quality and authoritative resources for time-critical decision-making or clinically urgent questions. Yet like a systematic review they aim to identify the key concepts, theories and resources in a field, and to survey the major research studies. Less time may be spent on critical appraisal, as systematic reviews, evidence briefs, and clinical guidelines are sought in preference to exhaustive coverage of primary studies.

Further Reading/Resources

• Garritty C, Hamel C, Trivella M, Gartlehner G, Nussbaumer-Streit B, Devane D, ... & King VJ. Updated recommendations for the Cochrane rapid review methods guidance for rapid reviews of effectiveness. BMJ 2024;384.
• Garritty C, Gartlehner G, Kamel C, King VJ, Nussbaumer-Streit B, Stevens A, Hamel C, Affengruber L. Cochrane Rapid Reviews. Interim Guidance from the Cochrane Rapid Reviews Methods Group. March 2020.
• Cochrane Rapid Reviews: Learning Live webinar series.
• Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Information and Libraries Journal 2009;26(2):91–108.
• Haby MM, Chapman E, Clark R, et al. What are the best methodologies for rapid reviews of the research evidence for evidence-informed decision making in health policy and practice: a rapid review. Health Res Policy Sys 2016;14:83.
• Schünemann H (Ed.). Advances in rapid reviews. BioMed Central collection. https://www.biomedcentral.com/collections/arr (retrieved June 21, 2022).
• Dobbins M. Rapid review guidebook. National Collaborating Centre for Methods and Tools; 2017:13, 25.
• Pandor A, Kaltenthaler E, Martyn-St James M, Wong R, Cooper K, Dimairo M, O'Cathain A, Campbell F, Booth A. Delphi consensus reached to produce a decision tool for SelecTing Approaches for Rapid Reviews (STARR). Journal of Clinical Epidemiology 2019.
• Ganann R, Ciliska D, Thomas H. Expediting systematic reviews: methods and implications of rapid reviews. Implementation Science 2010;5:56.
• Tricco AC, Antony J, Zarin W, et al. A scoping review of rapid review methods. BMC Medicine 2015;13:224.
• Khangura S, Konnyu K, Cushman R, Grimshaw J, Moher D. Evidence summaries: the evolution of a rapid review approach. Systematic Reviews 2012;1:10.
• Abboah-Offei M, Salifu Y, Adewale B, Bayuo J, Ofosu-Poku R, Opare-Lokko E. A rapid review of the use of face mask in preventing the spread of COVID-19. International Journal of Nursing Studies Advances 2021;3:100013.
• Nussbaumer-Streit B, Mayr V, Dobrescu AI, Chapman A, Persad E, Klerings I, ... & Gartlehner G. Quarantine alone or in combination with other public health measures to control COVID‐19: a rapid review. Cochrane Database of Systematic Reviews 2020;(9).

  • Open access
  • Published: 30 July 2022

Paper 2: Performing rapid reviews

Valerie J. King, Adrienne Stevens, Barbara Nussbaumer-Streit, Chris Kamel & Chantelle Garritty

Systematic Reviews, volume 11, Article number: 151 (2022)


Health policy-makers must often make decisions in compressed time frames and with limited resources. Hence, rapid reviews have become a pragmatic alternative to comprehensive systematic reviews. However, it is important that rapid review methods remain rigorous to support good policy development and decisions. There is currently little evidence about which streamlined steps in a rapid review are less likely to introduce unacceptable levels of uncertainty while still producing a product that remains useful to policy-makers.

This paper summarizes current research describing commonly used methods and practices that are used to conduct rapid reviews and presents key considerations and options to guide methodological choices for a rapid review.

The most important step for a rapid review is for an experienced research team to have early and ongoing engagement with the people who have requested the review. A clear research protocol, derived from a needs assessment conducted with the requester, serves to focus the review, defines the scope of the rapid review, and guides all subsequent steps. Common recommendations for rapid review methods include tailoring the literature search in terms of databases, dates, and languages. Researchers can consider using a staged search to locate high-quality systematic reviews and then subsequently published primary studies. The approaches used for study screening and selection, data extraction, and risk-of-bias assessment should be tailored to the topic, researcher experience, and available resources. Many rapid reviews use a single reviewer for study selection, risk-of-bias assessment, or data abstraction, sometimes with partial or full verification by a second reviewer. Rapid reviews usually use a descriptive synthesis method rather than quantitative meta-analysis. Use of brief report templates and standardized production methods helps to speed final report publication.

Conclusions

Researchers conducting rapid reviews need to make transparent methodological choices, informed by stakeholder input, to ensure that rapid reviews meet their intended purpose. Transparency is critical because it is unclear how or how much streamlined methods can bias the conclusions of reviews. There are not yet internationally accepted standards for conducting or reporting rapid reviews. Thus, this article proposes interim guidance for researchers who are increasingly employing these methods.


Introduction

Health policy-makers and other stakeholders need evidence to inform their decisions. However, their decisions must often be made in short time frames, and they may have other resource constraints, such as the available budget or personnel [ 1 , 2 , 3 , 4 , 5 , 6 ]. Rapid reviews are increasingly being used and are increasingly influential in the health policy and system arena [ 3 , 7 , 8 , 9 , 10 ]. One needs assessment [ 11 ] showed that policy-makers want evidence reviews to answer the right question, be completed in days to weeks, rather than months or years, be accurate and reproducible, and be affordable.

As much as policy-makers may desire faster and more efficient evidence syntheses, it is not yet clear whether rapid reviews are sufficiently rigorous and valid, compared to systematic reviews which are considered the “gold standard” evidence synthesis, to inform policy [ 12 ]. Only a few empirical studies have compared the findings of rapid reviews and systematic reviews on the same topic, and their results are conflicting and inconclusive, leaving questions about the level of bias that may be introduced because of rapid review methods [ 7 , 13 , 14 , 15 , 16 , 17 , 18 , 19 ].

A standardized or commonly agreed-upon set of methods for conducting rapid reviews had not existed until recently, [ 1 , 9 , 14 , 20 , 21 , 22 , 23 ] and while there is little empiric evidence on some of the standard elements of systematic reviews, [ 24 ] those standards are well articulated [ 25 , 26 ]. A minimum interim set of standards was developed by the Cochrane Rapid Reviews Methods Group [ 1 , 2 ] to help guide rapid review production during the COVID-19 pandemic, and other researchers have proposed methods and approaches to guide rapid reviews [ 5 , 21 , 22 , 27 , 28 , 29 , 30 , 31 , 32 , 33 , 34 , 35 , 36 ].

This article gives an overview of potential ways to produce a rapid review while maintaining a synthesis process that is sufficiently rigorous, yet tailored as needed, to support health policy-making. We present options for common methods choices, summarized from descriptions and evaluations of rapid review products and programs in Table 1 , along with key considerations for each methodological step.

The World Health Organization (WHO) published Rapid reviews to strengthen health policy and systems: a practical guide [ 5 ] in 2017. The initial work for this article was completed as a chapter for that publication and included multiple literature searches and layers of peer review to identify important studies and concepts. We conducted new searches using Ovid MEDLINE, the Cochrane Library’s methodology collection, and the bibliography of studies maintained by the Cochrane Rapid Reviews Methods Group, to identify articles, including both examples of rapid reviews and those on rapid review methodology, published after the publication of the WHO guide. We have not attempted to perform a comprehensive identification or catalog of all potential articles on rapid reviews or examples of reviews conducted with these methods. As this work was not a systematic review of rapid review methods, we do not include a flow of articles from search to inclusion and have not undertaken any formal critical appraisal of the articles we did include.

Needs assessment, topic selection, and topic refinement

Rapid reviews are typically conducted at the request of a particular decision-maker, who has a key role in posing the question, setting the parameters of the review, and defining the timeline [ 40 , 41 , 42 ]. The most common strategy for completing a rapid review within a limited time frame is to narrow its scope. This can be accomplished by limiting the number of questions, interventions, and outcomes considered in the review [ 13 , 15 ]. Early and continuing engagement of the requester and any other relevant stakeholders is critical to understand their needs, the intended use of the review, and the expected timeline and deliverables [ 15 , 28 , 29 , 40 , 41 , 42 ]. Policy-makers and other requesters may have vaguely defined questions or unrealistic expectations about what any type of review can accomplish [ 41 , 42 ]. A probing conversation or formal needs assessment is the critical first step in any knowledge synthesis approach to determine the scope of the request, the intended purpose for the completed review, and to obtain a commitment for collaboration over the duration of the project [ 28 , 30 , 41 ]. Once the request and its context are understood, researchers should fully develop the question(s), including any needed refinement with the requester or other stakeholders, before starting the project [ 5 ]. This process can be iterative and may require multiple contacts between the reviewers and the requester to ensure that the final rapid review is fit for its intended purpose [ 41 , 42 ]. In situations where a definitive systematic review might be needed, it may be useful to discuss with the requester the possibility of conducting a full systematic review, either in parallel or serially with the rapid review [ 43 ].

Protocol development

A research protocol clearly lays out the scope of the review, including the research questions and the approaches that will be used to conduct the review [ 44 ]. We suggest using the Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) 2015 statement for guidance [ 37 ]. Most reviewers use the PICO format (population, intervention, comparator, outcome), with some adding elements for time frame, setting, and study design. The PICO elements help to define the research questions, and the initial development of questions can point to needed changes in the PICO elements. For some types of research questions or data, other framework variations such as SPICE (setting, perspective, intervention, comparison, evaluation) may be used, although the PICO framework can generally be adapted [ 45 ]. Health services and policy research questions may call for more complex frameworks [ 5 ]. This initial approach assists both researchers and knowledge users to know what is planned and enables documentation of any protocol deviations; however, the customized and iterative nature of rapid reviews means that some flexibility may be required. Some rapid review producers include the concept of methods adjustment in the protocol itself [ 46 , 47 ]. However, changes made beyond the protocol stage and the rationale for making them must be transparent and documented in the final report.

The international prospective register of systematic reviews (PROSPERO) [ 44 ] ( https://www.crd.york.ac.uk/PROSPERO/ ) accepts registration of protocols that include at least one clinically or patient-relevant outcome. The Open Science Framework (OSF) [ 48 ] platform ( https://osf.io/ ) also accepts protocol registrations for rapid reviews. We advise protocol submitters to include the term “rapid review” or another similar term in the registered title, as this will assist tracking the use, validity, and value of rapid reviews [ 1 ]. Protocol registration helps to decrease research waste and allows both requesters and review authors to avoid duplication. Currently, most rapid review producers report using a protocol, but few register their protocols [ 13 , 17 ].
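A minimal sketch of how the PICO elements and planned deviations of a rapid review protocol might be captured in structured form before registration; the example question, default databases, and field names are illustrative assumptions rather than a prescribed template.

```python
# Illustrative PICO-based protocol skeleton for a rapid review (all values are placeholders).
from dataclasses import dataclass, field

@dataclass
class RapidReviewProtocol:
    title: str                       # include the term "rapid review" to aid tracking
    population: str
    intervention: str
    comparator: str
    outcomes: list
    timeframe_weeks: int
    databases: list = field(default_factory=lambda: ["PubMed", "Embase", "Cochrane Library"])
    deviations_from_systematic_review_methods: list = field(default_factory=list)

protocol = RapidReviewProtocol(
    title="Remote follow-up after cardiac surgery: a rapid review (illustrative)",
    population="adults discharged after cardiac surgery",
    intervention="remote follow-up",
    comparator="usual care",
    outcomes=["readmission", "mortality"],
    timeframe_weeks=8,
)
protocol.deviations_from_systematic_review_methods.append(
    "single-reviewer title/abstract screening with verification of a 20% sample"
)
print(protocol)
```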

Literature search

Multiple authors have conducted inventories of the characteristics of and methods used for rapid reviews, including the broad categories of literature search, study selection, data extraction, and synthesis steps [ 13 , 15 , 17 , 20 , 24 , 49 ]. PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) standards call for documentation of the full search strategy for all electronic databases used [ 38 ]. Most published rapid reviews search two or more databases, with PubMed, Embase, and the Cochrane Library mentioned frequently [ 13 , 17 , 20 , 49 ]. Rapid reviews often streamline systematic review methods by limiting the number of databases searched and the search itself by date, language, geographical area, or study design, and some rapid reviews search only for existing systematic reviews [ 13 , 15 , 17 , 20 , 49 , 50 ]. Other rapid reviews use a layered searching approach, identifying existing systematic reviews and then updating them with a summary of more recent eligible primary studies [ 13 , 15 , 18 , 20 , 36 ]. Studies of simplified search strategies have generally demonstrated acceptable retrieval characteristics for most types of rapid review reports [ 51 , 52 ]. Searching the reference lists of eligible studies (sometimes known as the “snowballing” technique) and searching the gray literature (i.e., reports that are difficult to locate or unpublished) are done in about half of published rapid reviews and may be essential for certain topics [ 13 , 15 , 20 , 49 ]. However, rapid reviews seldom report contact with authors and other experts to identify additional unpublished studies [ 13 , 15 , 20 , 49 ]. One study found that peer review of the search strategy, using a tool such as the PRESS (peer review of electronic search strategies) checklist, [ 39 ] was reported in 38% of rapid reviews, but that it was usually performed internally rather than by external information specialist reviewers [ 13 ]. Peer review of search strategies has been reported to increase retrieval of relevant records, particularly for nonrandomized studies [ 53 ].
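When two or more databases are searched, the combined exports have to be de-duplicated before screening. The sketch below shows a naive approach, matching on DOI when available and otherwise on a normalized title; real reference managers use more robust matching, and the records shown are placeholders.

```python
# Naive de-duplication of records exported from multiple databases.
# Records are placeholders; match on DOI if present, otherwise on a normalised title.
import re

records = [
    {"title": "An example trial of intervention X", "doi": "10.1000/example1", "source": "PubMed"},
    {"title": "An Example Trial of Intervention X.", "doi": "10.1000/example1", "source": "Embase"},
    {"title": "A second, unrelated study", "doi": None, "source": "Cochrane Library"},
]

def dedup_key(record):
    if record["doi"]:
        return ("doi", record["doi"].lower())
    normalised_title = re.sub(r"[^a-z0-9]", "", record["title"].lower())
    return ("title", normalised_title)

unique = {}
for record in records:
    unique.setdefault(dedup_key(record), record)   # keep the first occurrence

print(len(unique))   # 2 unique records remain
```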

Screening and study selection

Methodological standards for systematic reviews generally require independent screening of citations and abstracts by at least two researchers to arrive at a set of potentially eligible references, which are in turn subjected to dual review in full-text format to arrive at a final inclusion set. Rapid reviews often streamline this process, with up to 40% using a single researcher at each stage [ 13 , 15 , 17 , 18 , 20 , 49 ]. Some rapid reviews report verification of a sample of the articles by a second researcher or, occasionally, use of full dual screening by two independent researchers [ 13 , 17 , 20 , 49 ]. One methodological study reported that single screener selection missed an average of 5% of eligible studies, ranging from 3% for experienced reviewers to 6% for those with less experience [ 54 ]. If time and resources allow, we recommend that dual screening of all excluded studies, at both the title and full-text stages, be used to minimize the risk of selection bias through the inappropriate exclusion of relevant studies. However, there is some evidence that the use of a single experienced reviewer for particular topics may be sufficient [ 18 , 46 , 54 ].
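The "single screener with verification of a sample by a second reviewer" approach described above can be operationalized very simply; in the sketch below the 20% sample fraction and the record identifiers are assumptions.

```python
# Draw a random sample of one reviewer's exclusions for a second reviewer to verify.
# The sample fraction is an assumption; record identifiers are placeholders.
import random

def verification_sample(excluded_ids, fraction=0.2, seed=2021):
    """Return a reproducible random sample of excluded records for double-checking."""
    rng = random.Random(seed)
    k = max(1, round(len(excluded_ids) * fraction))
    return rng.sample(sorted(excluded_ids), k)

excluded = {f"rec{i:04d}" for i in range(1, 501)}      # 500 excluded records
sample = verification_sample(excluded)
print(len(sample))                                      # 100 records to verify
```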

Data extraction

As with citation screening and study selection, the number of independent reviewers who extract study data for a rapid review can vary. One study found that the most common approach is single-reviewer extraction (41%), although another 25% report verification of a sample by a second reviewer and nearly as many used dual extraction [ 13 ]. A more recent study reported that only about 10% of rapid reviews examined reported dual data extraction, although nearly twice as many simply did not report this feature [ 17 ]. Data abstraction generally includes PICO elements, although data abstraction was often limited by the scope of the review, and authors were contacted for missing data very infrequently [ 13 ].
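A minimal sketch of a standardized extraction form built around the PICO elements mentioned above, written out as a CSV template; the exact fields are assumptions and would normally be tailored to the review question.

```python
# Illustrative single-row data extraction template (field names are assumptions).
import csv

FIELDS = ["study_id", "design", "population", "intervention", "comparator",
          "outcomes", "key_results", "extracted_by", "verified_by"]

with open("extraction_template.csv", "w", newline="") as handle:
    writer = csv.DictWriter(handle, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerow({
        "study_id": "Example 2021",
        "design": "RCT",
        "population": "placeholder",
        "intervention": "placeholder",
        "comparator": "placeholder",
        "outcomes": "placeholder",
        "key_results": "placeholder",
        "extracted_by": "reviewer 1",
        "verified_by": "reviewer 2 (sample verification only)",
    })
```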

Risk-of-bias assessment

Risk-of-bias assessment, sometimes called critical appraisal or methodological quality appraisal, examines the quality of the methods employed for each included study and is a standard element of systematic reviews [ 25 ]. The vast majority of rapid review producers perform some type of critical appraisal [ 17 , 20 ]. Some rapid reviews report the use of a single assessor with verification of a sample of study assessments by another assessor [ 17 , 49 ]. There is no consensus as to which risk-of-bias assessment tools should be used, although most reviews use study design-specific instruments (e.g., an instrument designed for randomized controlled trials (RCTs) if assessing RCTs) intended for assessing internal validity [ 13 , 20 ].

Knowledge synthesis

Nearly all rapid review producers conduct a descriptive synthesis (also often called a narrative synthesis) of results, but a few perform additional meta-analyses or economic analyses [ 13 , 17 , 20 ]. The synthesis that is conducted is often limited to a basic descriptive summary of studies and their results, rather than the full synthesis that is recommended for systematic reviews [ 26 ]. Most rapid reviews present conclusions, recommendations, or implications for policy or clinical practice as another component of the synthesis. Multiple experts also recommend that rapid reviews clearly describe and discuss the potential limitations arising from methodological choices [ 5 , 9 , 13 , 15 , 23 ].
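For readers unfamiliar with what a quantitative synthesis adds beyond a narrative summary, the toy example below pools invented study effects with a fixed-effect, inverse-variance model. It is illustrative only and is not a recommendation that rapid reviews perform meta-analysis.

```python
# Toy fixed-effect, inverse-variance pooling; effect sizes and standard errors are invented.
import math

studies = [(-0.30, 0.12), (-0.18, 0.10), (-0.25, 0.15)]  # (effect estimate, standard error)

weights = [1 / se ** 2 for _, se in studies]                      # weight = 1 / variance
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"Pooled effect {pooled:.2f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.2f} to {pooled + 1.96 * pooled_se:.2f})")
```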

Many systematic review producers use the Grading of Recommendations Assessment, Development and Evaluation (GRADE) system [ 55 ] ( http://www.gradeworkinggroup.org/ ) to rate the certainty of the evidence about health outcomes. Guideline developers and others who make recommendations or policy decisions use GRADE to rate the strength of recommendations based on that evidence. The GRADE evidence to decision (EtD) framework has also been used to help decision-makers develop health system and public health [ 56 ] and coverage [ 57 ] policies. Rapid review authors can also employ GRADE to rate the certainty of synthesized evidence and to develop policy implications for decision-makers if time and resources permit. However, the GRADE system works best for interventions that have been tested in RCTs and for which at least one meta-analysis provides a single estimate of effect.

Report production and dissemination

Standard templates for each stage of the review, from protocol development to report production, can assist the review team in performing each step efficiently. Use of a report template, with minimum methodological standards, reporting requirements, and standard report sections, can assist the producer in streamlining production of the report and can also enhance transparency [ 15 , 20 , 28 , 40 ]. An extension of the PRISMA statement for rapid reviews is under development and has been registered with the EQUATOR Network [ 58 ]. Until it is available, the PRISMA checklist for systematic reviews can serve as a reporting template to increase the transparency of rapid reviews [ 8 , 40 , 59 ].

Research on the formatting and presentation of rapid reviews is being conducted, but the formats employed and tested will likely need to be adapted to individual requesters and stakeholder audiences [ 47 ]. Khangura and colleagues [ 28 ] have presented a figure showing formatted sections of a sample report, and many other rapid review producers post example reports online that can serve as formatting models. In addition, findings from research on evidence summary presentation for decision-makers in low- and middle-income countries can be translated to other settings [ 60 , 61 ].

Most rapid review producers conduct some form of peer review for the resulting reports, but such review is often internal and may include feedback from the requester [ 13 ]. Most producers disseminate their reports beyond the requester, but dissemination varies by the sensitivity or proprietary nature of the product [ 13 , 20 ]. When reports are disseminated, it is common for them to be posted online, for example, at an organizational website [ 13 , 20 ].

Operational considerations

Evaluations and descriptions of research programs that produce rapid reviews typically include some helpful pragmatic and operational considerations for undertaking a rapid review or developing a rapid review program [ 5 , 15 , 18 , 27 , 28 , 29 , 31 , 36 , 40 , 62 , 63 ]. Highly experienced, permanent staff with the right skill mix, including systematic reviewers, information specialists, methodologists, and content experts [ 15 , 18 , 30 , 40 , 49 ], are essential. It is time-consuming to assemble staff on a per-project basis, so the presence of an existing team (which may only do rapid reviews or may also do systematic reviews or other research) with review infrastructure already in place allows projects to get off to a quick start. The existence of a dedicated team also creates the potential to build relationships with requesters and to cultivate mutual trust. Staff with experience conducting systematic reviews will be familiar with standard methods and may be alert to any needed protocol changes as the review proceeds [ 49 ].

The rapid review team must understand the methodological implications of the decisions taken and must convey these implications to the requesters, so that they understand the caveats and potential limitations. Continuing relationships and longer-term contracting with requesters, allowing for a quick start and "good faith" initiation of work before a contract is in place, can speed the early development stages [ 31 , 40 ]. It is important for rapid review producers to confirm that the choices they make to streamline the review are acceptable to the requester [ 41 ]. Whether it is a decision to limit the scope to a single intervention or outcome, restrict the literature search to existing systematic reviews, or forgo a meta-analysis, the knowledge user must be aware of the implications of streamlining decisions [ 15 , 27 , 31 , 41 ]. Some programs also emphasize the need for follow-up with review requesters to develop the relationship and continuously improve knowledge products [ 28 , 63 ].

Although it is beyond the scope of this article, we note that both systematic and rapid review producers are currently using various automated technologies to speed review production. There are examples of tools to help search for references, screen citations, abstract data, organize reviews, and enhance collaboration, but few evaluations of their validity and value in report production [ 64 , 65 ]. The Systematic Review Toolbox [ 66 ] ( http://systematicreviewtools.com/ ) is an online searchable database of tools that can help perform tasks in the evidence synthesis process.
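As one small, illustrative example of the kind of automation such tools offer (not a reproduction of any specific tool in the Toolbox), the sketch below deduplicates citation records retrieved from several databases by normalizing their titles; the records are fabricated.

```python
# Fabricated records; normalization collapses trivially different renderings of one title.
import re

def normalize(title: str) -> str:
    return re.sub(r"[^a-z0-9]+", " ", title.lower()).strip()

records = [
    {"source": "PubMed",   "title": "Rapid reviews in health policy: a case study."},
    {"source": "Embase",   "title": "Rapid Reviews in Health Policy - A Case Study"},
    {"source": "Cochrane", "title": "Telehealth for heart failure: a systematic review"},
]

seen, unique = set(), []
for rec in records:
    key = normalize(rec["title"])
    if key not in seen:
        seen.add(key)
        unique.append(rec)

print(f"{len(records)} records retrieved, {len(unique)} after deduplication")
```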

Table 1 summarizes the commonly described approaches and key considerations for the major steps in a rapid review that are discussed in detail in the preceding sections.

Suggested approaches to rapid reviews

The previous sections have summarized the numerous approaches to conducting rapid reviews. Abrami and colleagues [ 27 ] summarized several methods of conducting rapid reviews and developed a brief review checklist of considerations and recommendations, which may serve as a useful parallel to Table 2 . A “one-size-fits-all” approach may not be suitable to cover the variety of topics and requester needs put forward. Watt and colleagues [ 9 ] observed over a decade ago, “It may not be possible to validate methodological strategies for conducting rapid reviews and apply them to every subject. Rather, each topic must be evaluated by thorough scoping, and appropriate methodology defined.” Plüddemann and colleagues [ 23 ] advocated for a flexible framework for what they term “restricted reviews,” with a set of minimum requirements and additional steps to reduce the risk of bias when time and resources allow. Thomas, Newman, and Oliver [ 29 ] noted that it might be more difficult to apply rapid approaches to questions of social policy than to technology assessment, in part because of the complexity of the topics, underlying studies, and uses of these reviews. The application of mixed methods, such as key informant interviews, stakeholder surveys, primary data, and policy analysis, may be required for questions with a paucity of published literature and those involving complex subjects [ 29 ]. However, rapid review producers should remain aware that streamlined methods may not be appropriate for all questions, settings, or stakeholder needs, and they should be honest with requesters about what can and cannot be accomplished within the timelines and resources available [ 31 ]. For example, a rapid review would likely be inappropriate as the foundation for a national guideline on cancer treatment due to be launched 5 years in the future. A decision tool, STARR (SelecTing Approaches for Rapid Reviews), has been published by Pandor and colleagues [ 67 ] to help guide decisions about interacting with report requesters, making informed choices regarding the evidence base, selecting methods for data extraction and synthesis, and reporting on the approaches used for the report.

Tricco and colleagues [ 21 ] conducted an international survey of rapid review producers, using a modified Delphi ranking to solicit opinions about the feasibility, timeliness, comprehensiveness, and risk of bias of six different rapid review approaches. Ranked best in terms of both risk of bias and feasibility was “approach 1,” which included published literature only, based on a search of one or more electronic databases, limited in terms of both date and language. With this approach, a single reviewer conducts study screening, and both data extraction and risk-of-bias assessment are done by a single reviewer, with verification by a second researcher. Other approaches were ranked best in terms of timeliness and comprehensiveness, [ 21 ] representing trade-offs that review producers and knowledge users may want to consider. Because the survey report was based on expert opinion, it did not provide empirical evidence about the implications of each streamlined approach [ 21 ]. However, in the absence of empirical evidence, it may serve as a resource for rapid review producers looking to optimize one of these review characteristics. Given that evidence regarding the implications of methodological decisions for rapid reviews is limited, we have developed interim guidance for those conducting rapid reviews (Table 2 ).

Rapid reviews are being used with increasing frequency to support clinical and policy decisions [ 6 , 22 , 34 ]. While policymakers are generally willing to trade some certainty for speed and efficiency, they do expect rapid reviews to come close to the validity of systematic reviews [ 51 ]. There is no universally accepted definition of a rapid review [ 2 ]. This lack of consensus is, in part, related to the grouping of products with different purposes, audiences, timelines, and resources. Although we have attempted to summarize the major choices available to reviewers and requesters of information, there are few empiric data to guide these choices. We may have missed examples of rapid reviews and methodological research that could add to the conclusions of this paper. However, our approach to this work has been pragmatic, much like a rapid review itself, and is based on our international experience as researchers involved in the Cochrane Rapid Reviews Methods Group, as well as authors who participated in the writing and dissemination of Rapid reviews to strengthen health policy and systems: a practical guide [ 5 ]. This paper has, in addition, been informed by our research about rapid reviews and our collective work across several groups that conduct rapid reviews [ 1 , 68 ]. The Cochrane Rapid Review Methods Group also conducted a methods opinion survey in 2019 and released interim recommendations to guide Cochrane rapid reviews during the SARS-CoV-2 pandemic [ 2 ]. These recommendations are specific to the needs of Cochrane reviews and offer more detailed guidance for rapid review producers than those presented in this paper. We encourage readers to sign up for the Cochrane Rapid Reviews Methods Group newsletter on the website ( https://methods.cochrane.org/rapidreviews/ ) and to check the list of methodological publications which is updated regularly to continue to learn about research pertinent to rapid reviews [ 68 ].

We have summarized the rapid review methods that can be used to balance timeliness and resource constraints with a rigorous knowledge synthesis process to inform health policy-making. Interim guidance suggestions for the conduct of rapid reviews are outlined in Table 2 . The most fundamental key to success is early and continuing engagement with the research requester to focus the rapid review and ensure that it is appropriate to the needs of stakeholders. Although the protocol serves as the starting point for the review, methodological decisions are often iterative, involving the requester. Any changes to the protocol should be reflected in the final report. Methods can be streamlined at all stages of the review process, from search to synthesis, by limiting the search in terms of dates and language; limiting the number of electronic databases searched; using one reviewer to perform study selection, risk-of-bias assessment, and data abstraction (often with verification by another reviewer); and using a descriptive synthesis rather than a quantitative summary. Researchers need to make transparent methodological choices, informed by stakeholder input, to ensure that the evidence review is fit for its intended purpose. Given that it is not clear how these choices can bias a review, transparency is essential. We are aware that an increasing number of journals publish rapid reviews and related evidence synthesis products, which we hope will further increase the availability, transparency, and empiric research base for progress on rapid review methodologies.

Abbreviations

EQUATOR: Enhancing the QUAlity and Transparency Of health Research

GRADE: Grading of Recommendations Assessment, Development and Evaluation

PICO: Population, intervention, comparator, outcomes

PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses

PRISMA-P: Preferred Reporting Items for Systematic Reviews and Meta-Analyses-Protocols

RCT: Randomized controlled trial

SPICE: Setting, Perspective, Intervention, Comparator, Evaluation

STARR: SelecTing Approaches for Rapid Reviews

PRESS: Peer review of electronic search strategies

Garritty C, Stevens A, Gartlehner G, King V, Kamel C. Cochrane Rapid Reviews Methods Group to play a leading role in guiding the production of informed high-quality, timely research evidence syntheses. Syst Rev. 2016;5(1):184.

Garritty C, Gartlehner G, Nussbaumer-Streit B, King VJ, Hamel C, Kamel C, et al. Cochrane Rapid Reviews Methods Group offers evidence-informed guidance to conduct rapid reviews. J Clin Epidemiol. 2021;130:13–21.

Peterson K, Floyd N, Ferguson L, Christensen V, Helfand M. User survey finds rapid evidence reviews increased uptake of evidence by Veterans Health Administration leadership to inform fast-paced health-system decision-making. Syst Rev. 2016;5(1):132.

Thomas J, Newman M, Oliver S. Rapid evidence assessments of research to inform social policy: taking stock and moving forward. Evid Policy. 2013;9(1):5–27.

Tricco AC, Langlois EV, Straus SE, editors. Rapid reviews to strengthen health policy and systems: a practical guide. Geneva: World Health Organization; 2017.

Langlois EV, Straus SE, Antony J, King VJ, Tricco AC. Using rapid reviews to strengthen health policy and systems and progress towards universal health coverage. BMJ Glob Health. 2019;4(1): e001178.

Hite J, Gluck ME. Rapid evidence reviews for health policy and practice. 2016; https://www.academyhealth.org/sites/default/files/rapid_evidence_reviews_brief_january_2016.pdf . Accessed 20 June 2021.

Moore GM, Redman S, Turner T, Haines M. Rapid reviews in health policy: a study of intended use in the New South Wales’ Evidence Check programme. Evidence Policy. 2016;12(4):505–19.

Watt A, Cameron A, Sturm L, et al. Rapid reviews versus full systematic reviews: an inventory of current methods and practice in health technology assessment. Int J Technol Assess Health Care. 2008;24(2):133–9.

Moore G, Redman S, Rudge S, Haynes A. Do policy-makers find commissioned rapid reviews useful? Health Res Policy Syst. 2018;16(1):17.

Gluck M. Can evidence reviews be made more responsive to policymakers? Paper presented at: Fourth Global Symposium on Health Systems Research: resilient and responsive health systems for a changing world. 2016; Vancouver.

Wagner G, Nussbaumer-Streit B, Greimel J, Ciapponi A, Gartlehner G. Trading certainty for speed - how much uncertainty are decisionmakers and guideline developers willing to accept when using rapid reviews: an international survey. BMC Med Res Methodol. 2017;17(1):121.

Abou-Setta AM, Jeyaraman M, Attia A, et al. Methods for developing evidence reviews in short periods of time: a scoping review. PLoS ONE. 2016;11(12): e0165903.

Haby MM, Chapman E, Clark R, Barreto J, Reveiz L, Lavis JN. What are the best methodologies for rapid reviews of the research evidence for evidence-informed decision making in health policy and practice: a rapid review. Health Res Policy Syst. 2016;14(1):83.

Hartling L, Guise JM, Kato E, et al. A taxonomy of rapid reviews links report types and methods to specific decision-making contexts. J Clin Epidemiol. 2015;68(12):1451-1462.e1453.

Reynen E, Robson R, Ivory J, et al. A retrospective comparison of systematic reviews with same-topic rapid reviews. J Clin Epidemiol. 2018;96:23–34.

Tricco AC, Zarin W, Ghassemi M, et al. Same family, different species: methodological conduct and quality varies according to purpose for five types of knowledge synthesis. J Clin Epidemiol. 2018;96:133–42.

Eiring O, Brurberg KG, Nytroen K, Nylenna M. Rapid methods including network meta-analysis to produce evidence in clinical decision support: a decision analysis. Syst Rev. 2018;7(1):168.

Taylor-Phillips S, Geppert J, Stinton C, et al. Comparison of a full systematic review versus rapid review approaches to assess a newborn screening test for tyrosinemia type 1. Res Synthesis Methods. 2017;8(4):475–84.

Polisena J, Garritty C, Kamel C, Stevens A, Abou-Setta AM. Rapid review programs to support health care and policy decision making: a descriptive analysis of processes and methods. Syst Rev. 2015;4:26.

Tricco AC, Zarin W, Antony J, et al. An international survey and modified Delphi approach revealed numerous rapid review methods. J Clin Epidemiol. 2016;70:61–7.

Aronson JK, Heneghan C, Mahtani KR, Pluddemann A. A word about evidence: ‘rapid reviews’ or ‘restricted reviews’? BMJ Evid-Based Med. 2018;23(6):204–5.

Pluddemann A, Aronson JK, Onakpoya I, Heneghan C, Mahtani KR. Redefining rapid reviews: a flexible framework for restricted systematic reviews. BMJ Evid-Based Med. 2018;23(6):201–3.

Robson RC, Pham B, Hwee J, et al. Few studies exist examining methods for selecting studies, abstracting data, and appraising quality in a systematic review. J Clin Epidemiol. 2019;106:121–35.

Higgins JPT, Lasserson T, Chandler J, Tovey D, Churchill R. Methodological Expectations of Cochrane Intervention Reviews (MECIR). 2016; https://community.cochrane.org/mecir-manual . Accessed 20 June 2021.

Higgins JPT, Green S. Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. The Cochrane Collaboration; 2011.

Abrami PC, Borokhovski E, Bernard RM, et al. Issues in conducting and disseminating brief reviews of evidence. Evid Policy. 2010;6(3):371–89.

Khangura S, Konnyu K, Cushman R, Grimshaw J, Moher D. Evidence summaries: the evolution of a rapid review approach. Syst Rev. 2012;1:10.

Thomas J, Newman M, Oliver S. Rapid evidence assessments of research to inform social policy: taking stock and moving forward. Evid Policy. 2013;9:5–27.

Varker T, Forbes D, Dell L, et al. Rapid evidence assessment: increasing the transparency of an emerging methodology. J Eval Clin Pract. 2015;21(6):1199–204.

Wilson MG, Lavis JN, Gauvin FP. Developing a rapid-response program for health system decision-makers in Canada: findings from an issue brief and stakeholder dialogue. System Rev. 2015;4:25.

Featherstone RM, Dryden DM, Foisy M, et al. Advancing knowledge of rapid reviews: an analysis of results, conclusions and recommendations from published review articles examining rapid reviews. Syst Rev. 2015;4:50.

Silva MT, Silva END, Barreto JOM. Rapid response in health technology assessment: a Delphi study for a Brazilian guideline. BMC Med Res Methodol. 2018;18(1):51.

Patnode CD, Eder ML, Walsh ES, Viswanathan M, Lin JS. The use of rapid review methods for the U.S. Preventive Services Task Force. Am J Prevent Med. 2018;54(1S1):S19-S25.

Strudwick K, McPhee M, Bell A, Martin-Khan M, Russell T. Review article: methodology for the ‘rapid review’ series on musculoskeletal injuries in the emergency department. Emerg Med Australas. 2018;30(1):13–7.

Dobbins M. Rapid review guidebook: steps for conducting a rapid review. McMaster University; 2017.

Moher D, et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Syst Rev. 2015;4(1):1–9.

Page MJ, McKenzie JE, Bossuyt PM, Boutron I, Hoffmann TC, Mulrow CD, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ. 2021;372:n71.

Moher D, Liberati A, Tetzlaff J, Altman DG, The PRISMA Group. Preferred Reporting Items for Systematic Reviews and Meta-Analyses: the PRISMA statement. PLOS Med. 2009;6(7):e1000097.

McGowan J, Sampson M, Salzwedel DM, Cogo E, Foerster V, Lefebvre C. PRESS peer review of electronic search strategies: 2015 guideline statement. J Clin Epidemiol. 2016;75:40–6.

Haby MM, Chapman E, Clark R, Barreto J, Reveiz L, Lavis JN. Designing a rapid response program to support evidence-informed decision-making in the Americas region: using the best available evidence and case studies. Implement Sci. 2016;11(1):117.

Moore G, Redman S, Butow P, Haynes A. Deconstructing knowledge brokering for commissioned rapid reviews: an observational study. Health Res Policy Syst. 2018;16(1):120.

Tricco AC, Zarin W, Rios P, et al. Engaging policy-makers, health system managers, and policy analysts in the knowledge synthesis process: a scoping review. Implementation science : IS. 2018;13(1):31.

Murphy A, Redmond S. To HTA or not to HTA: identifying the factors influencing the rapid review outcome in Ireland. Value Health. 2019;22(4):385–90.

PROSPERO-International prospective register of systematic reviews. https://www.crd.york.ac.uk/prospero/ . Accessed 20 June 2021.

Booth A. Clear and present questions: formulating questions for evidence based practice. Library Hi Tech. 2006;24:355–68.

Garritty C, Stevens A. Putting evidence into practice (PEP) workshop – rapid review course. 2015; University of Alberta, Edmonton, Alberta.

Garritty C, Stevens A, Gartlehner G, Nussbaumer-Streit B, King V. Rapid review workshop: timely evidence synthesis for decision makers. Paper presented at: Cochrane Colloquium; 2016; Seoul, South Korea.

Open Science Framework. https://osf.io/ . Accessed 20 June 2021.

Tricco AC, Antony J, Zarin W, et al. A scoping review of rapid review methods. BMC Med. 2015;13:224.

Nussbaumer-Streit B, Klerings I, Dobrescu AI, Persad E, Stevens A, Garritty C, et al. Excluding non-English publications from evidence-syntheses did not change conclusions: a meta-epidemiological study. J Clin Epidemiol. 2020;118:42–54.

Nussbaumer-Streit B, Klerings I, Wagner G, et al. Abbreviated literature searches were viable alternatives to comprehensive searches: a meta-epidemiological study. J Clin Epidemiol. 2018;102:1–11.

Rice M, Ali MU, Fitzpatrick-Lewis D, Kenny M, Raina P, Sherifali D. Testing the effectiveness of simplified search strategies for updating systematic reviews. J Clin Epidemiol. 2017;88:148–53.

Spry C, Mierzwinski-Urban M. The impact of the peer review of literature search strategies in support of rapid review reports. Research synthesis methods. 2018;9(4):521–6.

Waffenschmidt S, Knelangen M, Sieben W, Buhn S, Pieper D. Single screening versus conventional double screening for study selection in systematic reviews: a methodological systematic review. BMC Med Res Methodol. 2019;19(1):132.

The Grade Working Group. GRADE. http://www.gradeworkinggroup.org/ . Accessed 20 June 2021.

Moberg J, Oxman AD, Rosenbaum S, Schunemann HJ, Guyatt G, Flottorp S, et al. The GRADE evidence to decision (EtD) framework for health system and public health decisions. Health Res Policy Syst. 2018;16:45.

Parmelli E, Amato L, Oxman AD, Alonso-Coello P, Brunetti M, Moberg J, et al. GRADE evidence to decision (EtD) framework for coverage decisions. Int J Technol Assess Health Care. 2017;33(2):176–82.

Stevens A, Garritty C, Hersi M, Moher D. Developing PRISMA-RR, a reporting guideline for rapid reviews of primary studies (protocol). 2018. http://www.equator-network.org/wp-content/uploads/2018/02/PRISMA-RR-protocol.pdf . Accessed 20 June 2021.

Kelly SE, Moher D, Clifford TJ. Quality of conduct and reporting in rapid reviews: an exploration of compliance with PRISMA and AMSTAR guidelines. Syst Rev. 2016;5:79.

Mijumbi-Deve R, Rosenbaum SE, Oxman AD, Lavis JN, Sewankambo NK. Policymaker experiences with rapid response briefs to address health-system and technology questions in Uganda. Health Res Policy Syst. 2017;15(1):37.

Rosenbaum SE, Glenton C, Wiysonge CS, et al. Evidence summaries tailored to health policy-makers in low- and middle-income countries. Bull World Health Organ. 2011;89(1):54–61.

McIntosh HM, Calvert J, Macpherson KJ, Thompson L. The healthcare improvement Scotland evidence note rapid review process: providing timely, reliable evidence to inform imperative decisions on healthcare. Int J Evid Based Healthc. 2016;14(2):95–101.

Gibson M, Fox DM, King V, Zerzan J, Garrett JE, King N. Methods and processes to select and prioritize research topics and report design in a public health insurance programme (Medicaid) in the USA. Cochrane Methods. 2015;1(Suppl 1):33–35.

Department for Environment, Food and Rural Affairs. Emerging tools and techniques to deliver timely and cost effective evidence reviews. In. London: Department for Environment, Food and Rural Affairs; 2015.

Marshall CG, J. Software tools to support systematic reviews. Cochrane Methods. 2016;10(Suppl. 1):34–35.

The Systematic Review Toolbox. http://systematicreviewtools.com/ . Accessed 20 June 2021.

Pandor A, Kaltenthaler E, Martyn-St James M, et al. Delphi consensus reached to produce a decision tool for SelecTing Approaches for Rapid Reviews (STARR). J Clin Epidemiol. 2019;114:22–9.

Cochrane Rapid Reviews Methods Group. https://methods.cochrane.org/rapidreviews/ . Accessed 20 June 2021.

Time to produce this manuscript was donated in kind by the authors' respective organizations, but no other specific funding was received.

Alliance for Health Policy and Systems Research; Norwegian Government Agency for Development Cooperation; Swedish International Development Cooperation Agency; Department for International Development, UK Government.

Author information

Authors and affiliations.

The Center for Evidence-Based Policy, Oregon Health & Science University, Portland, Oregon, 97201, USA

Valerie J. King

Epidemiology and Biostatistics, Unit Head, Public Health Agency of Canada, Ottawa, Canada

Adrienne Stevens

Cochrane Austria, Danube University Krems, Krems, Austria

Barbara Nussbaumer-Streit

Canadian Agency for Drugs and Technologies in Health, Ottawa, Canada

Chris Kamel

Global Health & Guidelines Division, Public Health Agency of Canada, Ottawa, Canada

Chantelle Garritty

Contributions

The first author drafted the manuscript and was responsible for incorporating all other authors’ comments into the final version of the manuscript. The authors read and approved the final manuscript.

Corresponding author

Correspondence to Valerie J. King .

Ethics declarations

Competing interests.

All authors are leaders or members of the Cochrane Rapid Reviews Methods Group, and all are producers of rapid reviews for their respective organizations.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article.

King, V.J., Stevens, A., Nussbaumer-Streit, B. et al. Paper 2: Performing rapid reviews. Syst Rev 11 , 151 (2022). https://doi.org/10.1186/s13643-022-02011-5

Received : 10 November 2021

Accepted : 23 June 2022

Published : 30 July 2022

DOI : https://doi.org/10.1186/s13643-022-02011-5

  • Rapid review
  • Systematic review
  • Technology assessment
  • Evidence-based medicine

Rapid Evidence Assessment vs Literature Review

What is a Rapid Evidence Assessment (REA)?

There are various types of reviews. The most authoritative review, i.e. the review that presents the most valid and reliable scientific evidence, is the systematic review. The aim of a systematic review (SR) is to identify all relevant studies on a specific topic as comprehensively as possible, and to select appropriate studies based on explicit criteria. These studies are then assessed to ascertain their internal validity. A systematic approach is applied to selecting studies: the methodological quality of the studies in question is assessed by several researchers independently of each other on the basis of explicit criteria. A SR is therefore transparent, verifiable and reproducible. Because of this, the likelihood of bias is considerably smaller in a SR compared to traditional literature reviews.

A Rapid Evidence Assessment (REA) is another type of evidence summary that can inform practice. An REA applies the same methodology as a SR, and both involve the following steps:

1.    Background

2.    Question

3.    Inclusion Criteria

4.    Search Strategy

5.    Study Selection

6.    Data Extraction

7.    Critical Appraisal

8.    Results

       8.1.  Definitions

       8.2.  Causal Mechanism

       8.3.  Main Findings

       8.4.  Moderators and Mediators

9.    Synthesis

10. Limitations

11. Conclusion

12. Implications for Practice

The main ways in which these two types of summary differ are the time and resources used to produce them and the scope and depth of the results produced. In order to be ‘rapid’, an REA makes concessions in relation to the breadth, depth and comprehensiveness of the search. Aspects of the search may be limited to produce a quicker result:

· Searching: consulting a limited number of databases, and excluding unpublished research.

· Inclusion: only including specific research designs (e.g. meta-analyses or controlled studies)

· Data Extraction: only extracting a limited amount of key data, such as year, population, sector, study design, sample size, moderators/mediators, main findings, and effect sizes.

· Critical Appraisal: limiting quality appraisal to methodological appropriateness and quality. 

Due to these limitations, an REA may be more prone to bias than a SR. A SR, however, usually takes a team of academics several months (sometimes even more than a year) to produce – as it aims to identify all published and unpublished relevant studies – whereas an REA might take two skilled persons only several weeks. In general, an organization will not have time or financial means to hire a team of academics to conduct a SR on a managerial topic of interest. As a result, an REA is the most widely used method of reviewing the scientific literature within Evidence-Based Management.

  • Research article
  • Open access
  • Published: 16 September 2015

A scoping review of rapid review methods

  • Andrea C. Tricco 1 , 2 ,
  • Jesmin Antony 1 ,
  • Wasifa Zarin 1 ,
  • Lisa Strifler 1 , 3 ,
  • Marco Ghassemi 1 ,
  • John Ivory 1 ,
  • Laure Perrier 3 ,
  • Brian Hutton 4 ,
  • David Moher 4 &
  • Sharon E. Straus 1 , 5  

BMC Medicine volume  13 , Article number:  224 ( 2015 )

Rapid reviews are a form of knowledge synthesis in which components of the systematic review process are simplified or omitted to produce information in a timely manner. Although numerous centers are conducting rapid reviews internationally, few studies have examined the methodological characteristics of rapid reviews. We aimed to examine articles, books, and reports that evaluated, compared, used or described rapid reviews or methods through a scoping review.

MEDLINE, EMBASE, the Cochrane Library, internet websites of rapid review producers, and reference lists were searched to identify articles for inclusion. Two reviewers independently screened literature search results and abstracted data from included studies. Descriptive analysis was conducted.

We included 100 articles plus one companion report that were published between 1997 and 2013. The studies were categorized as 84 application papers, seven development papers, six impact papers, and four comparison papers (one was included in two categories). The rapid reviews were conducted between 1 and 12 months, predominantly in Europe (58 %) and North America (20 %). The included studies failed to report 6 % to 73 % of the specific systematic review steps examined. Fifty unique rapid review methods were identified; 16 methods occurred more than once. Streamlined methods that were used in the 82 rapid reviews included limiting the literature search to published literature (24 %) or one database (2 %), limiting inclusion criteria by date (68 %) or language (49 %), having one person screen and another verify or screen excluded studies (6 %), having one person abstract data and another verify (23 %), not conducting risk of bias/quality appraisal (7 %) or having only one reviewer conduct the quality appraisal (7 %), and presenting results as a narrative summary (78 %). Four case studies were identified that compared the results of rapid reviews to systematic reviews. Three studies found that the conclusions between rapid reviews and systematic reviews were congruent.

Conclusions

Numerous rapid review approaches were identified and few were used consistently in the literature. Poor quality of reporting was observed. A prospective study comparing the results from rapid reviews to those obtained through systematic reviews is warranted.

Systematic reviews are a useful tool for decision-makers because they can be used to interpret the results of individual studies within the context of the totality of evidence and provide the evidence-base for knowledge translation products, such as patient decision aids, clinical practice guidelines or policy briefs [ 1 ]. However, due to the high level of methodological rigour, systematic reviews take from 0.5 to 2 years to conduct [ 2 ] and require considerable skill to execute. According to the Cochrane Collaboration, all procedures including screening citations (titles and abstracts), screening full-text articles, data abstraction, and risk of bias appraisal, should be conducted by two individuals, independently [ 3 ]. In addition, technical expertise from librarians, research coordinators, content experts, and statisticians is required.

Health decision-makers (including clinicians, patients, managers, and policy-makers) often need timely access to health information. Although this information can be obtained through a systematic review, these research endeavours require enormous resources to complete and the timeframe required to conduct a systematic review may not suit the needs of some decision-makers. For example, it has been estimated that systematic reviews take, on average, 1,139 hours (range 216–2,518 hours) to complete and usually require a budget of at least $100,000 [ 4 ]. Consequently, decision-makers may be forced to rely on less robust evidence, such as expert opinion or the results of a single small study [ 5 ], leading to suboptimal decision-making.

Rapid reviews are a form of knowledge synthesis in which components of the systematic review process are simplified or omitted to produce information in a timely manner [ 2 ]. Yet rapid reviews might be susceptible to biased results as a consequence of streamlining the systematic review process [ 6 ]. Although numerous rapid review programs exist internationally [ 7 ], few studies have examined their methodology. We aimed to examine rapid review approaches, guidance, impact, and comparisons through a scoping review.

Definition of a rapid review

A formal definition for a rapid review does not exist. As such, we used the following working definition, ‘a rapid review is a type of knowledge synthesis in which components of the systematic review process are simplified or omitted to produce information in a short period of time’ [ 2 ].

A scoping review protocol was compiled using guidance from Arksey and O’Malley [ 8 ], and revised upon feedback received from the Canadian Institutes of Health Research peer review panel. It is available from the corresponding author upon request.

Information sources and literature search

To identify potentially relevant studies for inclusion, the following electronic databases were searched: MEDLINE; EMBASE; and the Cochrane Library. Since two systematic reviews had already been published on rapid reviews [ 6 , 7 ], we limited our search from 2008 until May 2013. An experienced librarian (LP) drafted the literature searches based on the previous reviews, and these were refined through team discussion. The MEDLINE search strategy is presented in Additional file 1 : Appendix 1 and the other searches are available from the corresponding author upon request.

Our literature search was supplemented by targeted internet searches for unpublished rapid review reports posted on the websites of rapid review producers. For this search, we took a random 10 % sample of the unpublished rapid reviews available on the producers’ websites. Often only the title was available for these rapid reviews, so we limited inclusion to the full rapid review, where available. The reference lists of relevant reviews were scanned [ 6 , 7 ], as were the reference lists of all included rapid reviews.
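Purely for illustration, drawing such a 10 % random sample could look like the sketch below; the report titles, list size, and seed are placeholders rather than details taken from the review.

```python
# Placeholders throughout: report titles, list size, and seed are invented.
import random

random.seed(2013)                                             # fixed seed for reproducibility
website_reports = [f"Report {i:03d}" for i in range(1, 241)]  # e.g. 240 reports listed online
sample = random.sample(website_reports, k=max(1, len(website_reports) // 10))
print(len(sample), sample[:5])
```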

Inclusion criteria

Articles, papers, books, and reports were included if they evaluated, compared, used or described a rapid review according to the authors.

Screening process

The screening criteria were established a priori (as outlined in our protocol) and calibrated amongst the team through a series of pilot tests. After >90 % agreement was observed, pairs of reviewers screened the literature search results independently, and discrepancies were resolved through discussion. All screening was performed using our online tool, synthesi.sr [ 9 ].

Data items and data abstraction process

A data abstraction form was developed a priori and the draft form was calibrated amongst the team using a random sample of ten included studies. After this exercise, the data abstraction form was revised and all included studies were abstracted by two reviewers working independently. Discrepancies were resolved through discussion.

Data items included study characteristics (for example, first author, year of publication), terminology used to describe the rapid review, full citation of previous methods papers that were used to guide the rapid review design, timeframe (in months) for completing the rapid review, and operationalized steps of the rapid review, if reported. The rapid review type was categorized as an application (for example, a rapid review report), development (paper attempts to further refine the rapid review method), impact (examines the impact of rapid reviews) or comparison (compares the results of a rapid review to a systematic review). We abstracted the assessment of the rapid review approach, including accuracy of results, comprehensiveness, potential for risk of bias, timeliness, cost-effectiveness, and feasibility as reported by the publication authors. We also abstracted the skills or knowledge required to conduct the rapid review as reported by the authors.

To synthesize the descriptive results, we conducted qualitative analysis using NVivo 10 [ 10 ]. Content analysis was conducted by one team member (WZ) and verified by another team member (ACT) to synthesize common methodologies used across the included rapid reviews using a framework. The framework was developed by the review team and presented in Additional file 1 : Appendix 2. The framework focused on the following steps for a rapid review: literature search (number of databases and grey literature); inclusion criteria (limited by date, language, and study design); screening (title/abstract and full-text); data abstraction; risk of bias/quality appraisal; and data synthesis. In order to depict the frequency of the terms used to describe the rapid reviews, a word cloud was created using Wordle, which is software that generates ‘word clouds’ from text that the user provides and places more emphasis on words that occur with greater frequency [ 11 ].
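Wordle itself is a web tool; as a rough open-source equivalent (not what the authors used), the sketch below builds a word cloud from term frequencies with the Python wordcloud package, using for illustration the counts reported in the Results section below and omitting the less frequent terms.

```python
# Assumes the open-source "wordcloud" package; counts follow the frequencies
# reported in the Results below, and less frequent terms are omitted for brevity.
from wordcloud import WordCloud

term_counts = {
    "rapid review": 34,
    "rapid evidence assessment": 11,
    "rapid systematic review": 10,
    "health technology assessment": 6,
}

cloud = WordCloud(width=800, height=400, background_color="white")
cloud.generate_from_frequencies(term_counts)  # more frequent terms are drawn larger
cloud.to_file("rapid_review_terms.png")
```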

Literature search

A total of 3,397 citations and 262 potentially relevant full-text papers were screened. Subsequently, 100 articles [ 2 , 12 – 110 ] plus one companion report [ 111 ] fulfilled the eligibility criteria and were included [ 31 ] (Fig.  1 ). Forty-seven of the included papers were unpublished rapid reviews posted on websites [ 13 , 24 , 29 , 31 – 36 , 39 , 45 , 47 , 50 , 52 – 57 , 62 , 63 , 66 , 68 , 70 , 73 – 75 , 77 , 81 – 83 , 86 – 94 , 99 , 100 , 103 , 104 , 107 , 109 , 112 ].

Study flow diagram

Rapid review characteristics and assessment

The rapid reviews were published between 1997 and 2013, and 58 were conducted in Europe, while 20 were conducted in North America (Table  1 , Additional file 1 : Appendix 3). The type of articles included 84 application papers (two did not report any methods), seven development papers, six impact papers, and four comparison papers; one article [ 20 ] was categorized in two categories. Ten of the rapid reviews were reported in 5 pages or less, suggesting that they were brief reports or research letters. Most of the articles (73 %) did not report the duration of conduct for the rapid review. For the minority that reported this, the duration ranged from less than 1 month to 12 months, and 18 were between 1 and 6 months. For the application articles, 74 % examined interventions, 12 % charted the frequency of literature (for example, regarding outcomes or frameworks), 5 % examined associations between exposure and disease, 5 % assessed diagnosis or screening techniques, and 2 % examined the patient experience or barriers/facilitators.

Sixty-five articles assessed rapid review characteristics (Table  2 ) [ 2 , 12 , 14 – 22 , 24 , 26 – 30 , 32 , 37 – 39 , 41 – 43 , 45 – 49 , 51 – 59 , 61 , 63 , 64 , 66 , 69 , 72 – 76 , 78 – 80 , 84 , 86 , 88 – 94 , 100 , 103 – 105 , 110 ]. Sixty percent of the authors reported that the report was timely, 29 % believed that the method had potential risk of bias, 23 % deemed that the approach was accurate compared to a full systematic review, 8 % believed the approach was comprehensive, 5 % reported that the approach was cost-effective, and 6 % believed it was a feasible approach.

Terminology used to describe the rapid review method

The most frequent term used to describe the rapid review approaches was ‘rapid review’, used in 34 of the included articles (Fig.  2 ). This was followed by ‘rapid evidence assessment’, which was used in 11 papers, ‘rapid systematic review’ in ten papers, and ‘health technology assessment’ or ‘rapid health technology assessment’ in six papers. All of the other terms occurred two times or less.

Word cloud for the frequency of terms

Citation analysis

Twenty-six [ 2 , 12 , 13 , 17 , 20 – 22 , 27 , 28 , 30 , 40 , 42 – 44 , 48 , 49 , 61 , 76 , 78 – 80 , 84 , 88 , 103 , 105 , 110 ] articles provided citations of previous methods papers that were used to guide the rapid review method (Fig.  3 , Additional file 1 : Appendix 4). The citations were Ganann and colleagues [ 6 ] (cited in eight papers), Watt and colleagues [ 7 , 111 ] (cited in seven papers), a Civil Service paper [ 113 ] (cited in four papers), Ehlers and colleagues [ 114 ] (cited in one paper), Armitage and colleagues [ 14 ] (cited in one paper), and Grant and colleagues [ 115 ] (cited in one paper).

Citation analysis. *Twenty-six papers referenced another seminal paper to establish their rapid review framework

Skills and knowledge required to conduct the rapid reviews

Thirteen [ 16 , 32 , 39 , 42 , 46 , 48 , 49 , 52 , 79 , 84 , 88 , 90 , 94 ] of the included papers reported the skills and knowledge required to conduct the rapid reviews (Table  3 ). These were content experts in seven articles [ 16 , 32 , 42 , 48 , 49 , 79 , 90 ], information specialists in five articles [ 39 , 49 , 52 , 84 , 88 ], systematic review methodologists in four papers [ 16 , 42 , 48 , 79 ], staff experienced in conducting reviews in four papers [ 46 , 48 , 49 , 84 ], and knowledge users in three papers [ 32 , 79 , 94 ].

Operationalized steps to conduct the rapid review applications

The 84 rapid review applications were categorized using our framework (Additional file 1 : Appendix 2) and 50 unique methods were observed. Of these, only 16 occurred more than once; three approaches occurred five times [ 21 , 36 , 40 , 44 , 45 , 47 , 53 , 54 , 56 , 57 , 65 , 75 , 83 , 91 , 92 ], another two occurred four times [ 18 , 37 , 39 , 64 , 86 , 93 , 99 , 107 ], three approaches were used three times [ 49 , 51 , 58 , 61 , 62 , 69 , 73 , 76 , 81 ], and eight approaches occurred two times [ 14 , 16 , 20 , 25 , 27 , 30 , 31 , 66 – 68 , 70 , 79 , 82 , 96 , 100 , 104 ]. The characteristics of the rapid review approaches that occurred more than four times were analyzed (Table  4 ). Rapid Approach 1 had the most details reported, with 5/5 papers mentioning that it was accurate and timely (but did not report the amount of time it took to conduct their rapid review), and had limited comprehensiveness.

Many of the steps used in the rapid reviews were not fully reported (Table  5 , Additional file 1 : Appendix 5). For example, 40 % (33/82) did not report whether reference lists were scanned and 67 % (55/82) did not report whether authors were contacted to obtain further material or information.

Streamlined methods that were used in the 82 rapid reviews included limiting the literature search to published literature (24 %) or one database (2 %), limiting inclusion criteria by date (68 %) or language (49 %), having one person screen and another verify or screen excluded studies (6 %), having one person abstract data and another verify (23 %), not conducting risk of bias/quality appraisal (7 %) or having only one reviewer conduct the quality appraisal (7 %), and presenting results as a narrative summary (78 %) (Fig.  4 ).

Streamlined steps used across the rapid reviews (n = 82 studies reporting this information). SR, systematic review

Comparing results from rapid reviews to systematic reviews

Four studies were comparisons, providing details on differences in results between rapid reviews and systematic reviews [ 20 , 31 , 34 , 106 ]. Cameron and colleagues identified rapid reviews from health technology assessment (HTA) organization websites and then conducted a literature search to identify systematic reviews on the same topic [ 31 ]. Eight rapid review products were identified on four different topics. However, the authors did not appraise the methodological quality of the systematic reviews, so it is unclear whether shortcuts were also taken in the included systematic reviews. The authors noted that the conclusions did not differ substantially between the rapid and systematic reviews. Corabian and colleagues compared six rapid review products (called ‘technotes’) with their final peer-reviewed publications [ 34 ]. The authors found that the conclusions differed only in 1/6 cases. Van de Velde and colleagues compared the results from their rapid review to a systematic review that was conducted by another group and published on the same topic [ 106 ]. Despite having literature searches that were conducted for the same dates, conflicting results were observed; the rapid review concluded that potato peel was effective for burns, while the systematic review concluded that potato peel was not effective for treating burns. Finally, Best and colleagues noted that two of the rapid reviews they conducted were in agreement with systematic reviews published at a later point in time on the same topic [ 20 ].

Development papers on rapid reviews

Seven papers proposed methods to refine the rapid review approach [ 2 , 12 , 16 , 20 , 46 , 79 , 80 ]. Best and colleagues (1997) described their experience conducting 63 rapid reviews for decision-making beginning in 1991, through the Development and Evaluation Committee in the UK [ 20 ]. Abrami and colleagues (2010) described ways to produce brief reviews efficiently, and presented a checklist for the conduct and reporting of brief reviews [ 12 ]. Bambra and colleagues (2010) described their experience conducting nine rapid reviews for the Secretary of State for Health [ 16 ]. Jahangirian and colleagues (2011) described their experience conducting five rapid reviews for the Research into Global Healthcare Tools consortium and proposed a framework for the conduct of rapid reviews [ 46 ]. Khangura and colleagues (2012) described their approach to the conduct of 11 rapid reviews through the collaboration between the Ottawa Hospital Research Institute and the Champlain Local Health Integrated Network [ 2 ]. Thigpen and colleagues (2012) described their experience conducting rapid reviews using the 6-step Prevention Synthesis and Translation System process for the Division of Violence Prevention, National Center for Injury Prevention and Control at the Centers for Disease Control and Prevention [ 79 ]. Thomas and colleagues (2013) described their experience of conducting two rapid reviews for the UK Treasury to inform the 2006/07 Comprehensive Spending Review [ 80 ].

Guidance to streamline the rapid review process varied, yet some consistencies were observed (Table  6 ). For example, four papers suggested using integrated knowledge translation, in which researchers work closely with the knowledge users to complete the rapid review [ 2 , 16 , 19 , 79 ]. Four papers suggested the use of a research question with a limited scope [ 12 , 16 , 80 , 110 ]. Seven publications recommended streamlining the literature search [ 2 , 12 , 16 , 46 , 79 , 80 , 110 ] and three suggested restricting the eligibility criteria [ 2 , 12 , 80 ]. Two papers provided suggestions for efficiently appraising risk of bias [ 2 , 80 ] and none suggested conducting a meta-analysis as part of the rapid review.

Articles assessing the impact and use of rapid reviews

Six papers examined the impact of rapid reviews on decision-making [ 41 – 43 , 60 , 85 , 110 ]. Hailey and colleagues (2000) examined the impact of 20 rapid review products [ 43 ] and found that 14 had an influence on policy decision-making, four provided guidance, and two had no perceived impact. McGregor and Brophy (2005) evaluated the success of the conduct of 16 rapid reviews for a hospital rapid review service [ 60 ]. The results of all 16 products were directly implemented in the hospital, saving approximately $3 million per year. Hailey (2006) wrote a paper summarizing the impact of HTA in general, as well as related to rapid HTA. Overall, it was concluded that these reports can influence decision-making. Hailey (2009) conducted a survey of HTA organizations to examine the use of rapid reviews for decision-making [ 42 ]. Fifteen rapid review products were included; all influenced a decision, including using the rapid review for reference material (67 %) and directly using the rapid review’s conclusions for the decision (53 %). Zechmeister (2012) examined the impact of 58 rapid assessments and observed that 56 of these products were directly used for reimbursement decisions and two were used for disinvestment decisions [ 85 ]. Finally, Batten (2012) wrote an editorial discussing how rapid reviews can be used by school nurses [ 110 ].

Our results suggest that the conduct of rapid reviews is highly variable across the literature. Through our study, 50 different rapid review approaches were identified and only 16 occurred more than once. Furthermore, many different terms were used to describe a rapid review, making the identification of these types of knowledge synthesis products difficult.

Using a framework of rapid review methods, we observed numerous strategies employed to conduct reviews in a streamlined manner. These included not using a protocol, limiting the literature search, limiting inclusion criteria, only having one person screen the literature search results, not conducting quality appraisal, and not conducting a meta-analysis. In general, combining multiple shortcuts led to a timelier conduct of the review.

Only four of the included studies compared the results of rapid reviews to systematic reviews. Three of these found that the results for both knowledge synthesis products were in agreement. However, the results of these studies should be interpreted with caution because a very small sample of reviews were included (ranging from 1 to 8) and none of these were prospectively conducted. The latter is of particular importance, since it is unclear whether the authors of the full systematic reviews used the rapid review as a starting point to identify articles for inclusion (or vice versa). Interestingly, none of the included studies compared the results across rapid reviews on the same topic. Such a study may provide further clarity into the impact of streamlining different steps on the risk of bias and comprehensiveness of the review.

Seven papers provided recommendations on making rapid reviews more efficient. Consistent guidance included using an integrated knowledge translation approach, limiting the scope of the question and literature search, and not conducting a meta-analysis. Furthermore, six papers examined the impact of rapid reviews on decision-making and all found that they were valuable products. These results suggest that decision-makers are currently using rapid reviews to inform their decision-making processes. Further supporting this observation was the recent Canadian Agency for Drugs and Technologies in Health Rapid review summit [ 116 ], for which a large number of international decision-making organizations were in attendance.

Across the application papers, many of the methods were poorly reported, suggesting that improvement in the reporting of rapid reviews is warranted. Thorough reporting of the methods is important because it is difficult to judge the potential for bias in these reports without fully understanding what shortcuts were taken. As well, transparent reporting allows the reproducibility of research. It is important to note that 10 % of the included papers were reported in 5 pages or less, suggesting that there may have been insufficient room to report the methods fully.

Before reporting guidelines for rapid reviews can be established, a common terminology and definition are required [117]. Some of the team members are currently involved in research attempting to tackle this issue. At a minimum, one of the included papers provided a checklist for examining the reporting of rapid reviews [12], which producers of rapid reviews can use to ensure consistent reporting.

We have also conducted other research on rapid reviews that builds on this scoping review [118]. Specifically, we conducted an international survey of 40 rapid review producers, who identified several rapid review approaches, such as updating the literature search of previous reviews and limiting the search strategy by date of publication. Most of the rapid review products were completed within 12 weeks. A modified Delphi approach was used to obtain input from 113 stakeholders (for example, researchers, policy-makers, industry, journal editors, and healthcare providers) and agree upon a preferred rapid review method for use in a future comparative study. The stakeholders ranked the following method as the most feasible and timely, with a low perceived risk of bias: a literature search limited by date and language; study selection by one reviewer only; and data abstraction and quality appraisal conducted by one reviewer and verified by a second reviewer. We are currently seeking funding for a comparative study to test the accuracy of this rapid review approach against the gold standard, the systematic review.
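
To make the stakeholder-preferred approach concrete, the sketch below records those methodological choices as a hypothetical Python data structure; the field names and the example cut-off date are illustrative assumptions, not part of the study protocol.

from dataclasses import dataclass
from typing import List

@dataclass
class RapidReviewProtocol:
    """Hypothetical record of the shortcuts chosen for a rapid review."""
    search_date_limit: str       # earliest publication date searched
    search_languages: List[str]  # e.g. English only
    screening_reviewers: int     # 1 = single-reviewer study selection
    abstraction_verified: bool   # second reviewer verifies data abstraction
    appraisal_verified: bool     # second reviewer verifies quality appraisal

# The approach ranked most feasible and timely by the Delphi panel,
# expressed in this illustrative structure (the date is an assumed example):
delphi_preferred = RapidReviewProtocol(
    search_date_limit="2008-01-01",
    search_languages=["English"],
    screening_reviewers=1,
    abstraction_verified=True,
    appraisal_verified=True,
)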

A recent project on rapid reviews was commissioned by the Agency for Healthcare Research and Quality in the United States [119, 120]. The authors summarized evidence from 12 review articles about rapid reviews [120], as well as 35 different rapid reviews produced by 20 different organizations [119]. This information was obtained through literature searches and key informant interviews with 18 individuals who had experience in conducting rapid reviews. The authors are currently conducting interviews with policy-makers to obtain their perceptions of rapid reviews, including their utility and importance.

Our scoping review has some limitations. To make the review feasible, we were only able to include a random sample of rapid reviews from the websites of rapid review producers. A further limitation is that many rapid reviews contain proprietary information and are not publicly available; as such, our results are likely generalizable only to rapid reviews that are publicly available. Furthermore, this scoping review was an enormous undertaking, and our results are up to date only as of May 2013. Nevertheless, we believe that our results provide important information on rapid reviews, and ours is, to our knowledge, the most comprehensive scoping review of rapid review methods to date.

In conclusion, numerous rapid review approaches were identified, and few were used consistently in the literature. Poor quality of reporting was observed. Further research on rapid reviews is warranted; in particular, the consequences of various methodological shortcuts should be investigated. This could be examined through a prospective study comparing the results of rapid reviews with those obtained through systematic reviews on the same topic. Team members are currently seeking funding to conduct such a study, and it is hoped that its results will provide pertinent information on the utility and risk of bias of rapid reviews.

Abbreviations

HTA: Health technology assessment

NR: Not reported

ROB: Risk of bias

SR: Systematic review

Tricco AC, Tetzlaff J, Moher D. The art and science of knowledge synthesis. J Clin Epidemiol. 2011;64:11–20. doi: 10.1016/j.jclinepi.2009.11.007 .

Khangura S, Konnyu K, Cushman R, Grimshaw J, Moher D. Evidence summaries: the evolution of a rapid review approach. Syst Rev. 2012;1:10. doi: 10.1186/2046-4053-1-10 .

Higgins JPT, Green S, editors. Cochrane handbook for systematic reviews of interventions. Version 5.1.0 (updated March 2011). Oxford: The Cochrane Collaboration; 2011.

Petticrew M, Roberts H. Systematic reviews in the social sciences: a practical guide. Malden, MA: Blackwell Publishing Co.; 2006.

Antman EM, Lau J, Kupelnick B, Mosteller F, Chalmers TC. A comparison of results of meta-analyses of randomized control trials and recommendations of clinical experts. Treatments for myocardial infarction. JAMA. 1992;268:240–8.

Ganann R, Ciliska D, Thomas H. Expediting systematic reviews: methods and implications of rapid reviews. Implement Sci. 2010;5:56. doi: 10.1186/1748-5908-5-56 .

Watt A, Cameron A, Sturm L, Lathlean T, Babidge W, Blamey S, et al. Rapid reviews versus full systematic reviews: an inventory of current methods and practice in health technology assessment. Int J Technol Assess Health Care. 2008;24:133–9. doi: 10.1017/S0266462308080185 .

Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. 2005;8:19–32. doi: 10.1080/1364557032000119616 .

synthesi.sr. http://knowledgetranslation.ca/sysrev/login.php .

NVivo 10 for Windows. http://www.qsrinternational.com/products_nvivo.aspx .

Feinberg J. Wordle. http://www.wordle.net .

Abrami PC, Borokhovski E, Bernard RM, Wade CA, Tamim R, Persson T, et al. Issues in conducting and disseminating brief reviews of evidence. Evid Policy. 2010;6:371–89. doi: 10.1332/174426410X524866 .

Adi Y, Bayliss S, Taylor R. Systematic review of clinical effectiveness and cost-effectiveness of radiofrequency ablation for the treatment of varicose veins. Birmingham: West Midlands Health Technology Assessment Collaboration; 2004. http://www.birmingham.ac.uk/Documents/college-mds/haps/projects/WMHTAC/REPreports/2004/varicoseveins.pdf .

Armitage A, Keeble-Ramsay D. The rapid structured literature review as a research strategy. US-China Education Review. 2009;6:27–38.

Attree P, French B, Milton B, Povall S, Whitehead M, Popay J. The experience of community engagement for individuals: a rapid review of evidence. Health Soc Care Comm. 2011;19:250–60. doi: 10.1111/j.1365-2524.2010.00976.x .

Bambra C, Joyce KE, Bellis MA, Greatley A, Greengross S, Hughes S, et al. Reducing health inequalities in priority public health conditions: using rapid review to develop proposals for evidence-based policy. J Public Health (Oxf). 2010;32:496–505. doi: 10.1093/pubmed/fdq028 .

Barnighausen T, Tanser F, Dabis F, Newell ML. Interventions to improve the performance of HIV health systems for treatment-as-prevention in sub-Saharan Africa: the experimental evidence. Curr Opin HIV AIDS. 2012;7:140–50. doi: 10.1097/COH.0b013e32834fc1df .

Beck CR, Sokal R, Arunachalam N, Puleston R, Cichowska A, Kessel A, et al. Neuraminidase inhibitors for influenza: a review and public health perspective in the aftermath of the 2009 pandemic. Influenza Other Respir Viruses. 2013;7:14–24. doi: 10.1111/irv.12048 .

Best A, Greenhalgh T, Lewis S, Saul JE, Carroll S, Bitz J. Large-system transformation in health care: a realist review. Milbank Q. 2012;90:421–56. doi: 10.1111/j.1468-0009.2012.00670.x .

Best L, Stevens A, Colin‐Jones D. Rapid and responsive health technology assessment: the development and evaluation process in the South and West region of England. J Clin Effec. 1997;2:51–6.

Blank L, Coster J, O'Cathain A, Knowles E, Tosh J, Turner J, et al. The appropriateness of, and compliance with, telephone triage decisions: a systematic review and narrative synthesis. J Adv Nurs. 2012;68:2610–21. doi: 10.1111/j.1365-2648.2012.06052.x .

Boycott N, Schneider J, McMurran M. Additional interventions to enhance the effectiveness of individual placement and support: a rapid evidence assessment. Rehabil Res Pract. 2012;2012:382420. doi: 10.1155/2012/382420 .

Brearley SG, Stamataki Z, Addington-Hall J, Foster C, Hodges L, Jarrett N, et al. The physical and practical problems experienced by cancer survivors: a rapid review and synthesis of the literature. Eur J Oncol Nurs. 2011;15:204–12. doi: 10.1016/j.ejon.2011.02.005 .

Brown A, Coyle D, Cimon K, Farrah K. Hip protectors in long-term care: a clinical and cost-effectiveness review and primary economic evaluation. Ottawa, ON: Canadian Agency for Drugs and Technologies in Health; 2008. http://www.cadth.ca/media/pdf/I3015_Hip_Protectors_Long_Term_Care_tr_e.pdf .

Bryant SL, Gray A. Demonstrating the positive impact of information support on patient care in primary care: a rapid literature review. Health Info Libr J. 2006;23:118–25. doi: 10.1111/j.1471-1842.2006.00652.x .

Bullock SH, Jones BH, Gilchrist J, Marshall SW. Prevention of physical training-related injuries recommendations for the military and other active populations based on expedited systematic reviews. Am J Prev Med. 2010;38:S156–81. doi: 10.1016/j.amepre.2009.10.023 .

Bungay H, Vella-Burrows T. The effects of participating in creative activities on the health and well-being of children and young people: a rapid review of the literature. Perspect Public Health. 2013;133:44–52. doi: 10.1177/1757913912466946 .

Burls A, Clark W, Stewart T, Preston C, Bryan S, Jefferson T, et al. Zanamivir for the treatment of influenza in adults: a systematic review and economic evaluation. Health Technol Assess. 2002;6:9.

Butler G, Hodgkinson J, Holmes E, Marshall S. Evidence based approaches to reducing gang violence: a rapid evidence assessment for Aston and Handsworth Operational Group. Birmingham: Government Office West Midlands and Home Office Regional Research Team; 2004. http://www.civilservice.gov.uk/wp-content/uploads/2011/09/rea_gang_violence_tcm6-7377.pdf .

Butt S, Chou S, Browne K. A rapid systematic review on the association between childhood physical and sexual abuse and illicit drug use among males. Child Abuse Rev. 2011;20:6–38. doi: 10.1002/car.1100 .

Cameron A, Watt A, Lathlean T, Sturm T. Rapid versus full systematic reviews: an inventory of current methods and practice in Health Technology Assessment. ASERNIP-S report number 60. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S); 2007. http://www.surgeons.org/media/297941/rapidvsfull2007_systematicreview.pdf .

Clark W, Jobanputra P, Barton P, Burls A. The clinical and cost-effectiveness of anakinra for the treatment of rheumatoid arthritis in adults. Birmingham: West Midlands Health Technology Assessment Collaboration; 2003. http://www.nice.org.uk/guidance/ta72/documents/assessment-report-the-clinical-and-costeffectiveness-of-anakinra-for-the-treatment-of-rheumatoid-arthritis-in-adults-2 .

Coomber R, Millward L, Chambers J, Warm D. A rapid interim review of the ‘grey’ literature on risky behaviour in young people aged 11–18 with a special emphasis on vulnerable groups. London: Health Development Agency; 2004.

Corabian P, Harstall C. Rapid assessments provide acceptable quality advice. Annu Meet Int Soc Technol Assess Health Care. 2002;18:Abstract 70.

Cummins C, Connock M, Fry-Smith A, Burls A. A rapid review of new drug treatments for juvenile idiopathic arthritis: Etanercept. Birmingham: West Midlands Development and Evaluation Service, University of Birmingham; 2001. http://www.nice.org.uk/guidance/ta35/documents/assessment-report-for-etanercept-for-juvenile-idiopathic-arthritis-2 .

De Laet C, Obyn C, Ramaekers D, Van De Sande S, Neyt M. Hyperbaric oxygen therapy: a rapid assessment. KCE reports 74C. Brussels: Health Technology Assessment (HTA) and Belgian Health Care Knowledge Centre (KCE); 2008. https://kce.fgov.be/sites/default/files/page_documents/d20081027315.pdf .

Dixon-Woods M, McNicol S, Martin G. Ten challenges in improving quality in healthcare: lessons from the Health Foundation's programme evaluations and relevant literature. BMJ Qual Saf. 2012;21:876–84. doi: 10.1136/bmjqs-2011-000760 .

Fitzpatrick-Lewis D, Ganann R, Krishnaratne S, Ciliska D, Kouyoumdjian F, Hwang SW. Effectiveness of interventions to improve the health and housing status of homeless people: a rapid systematic review. BMC Public Health. 2011;11:638. doi: 10.1186/1471-2458-11-638 .

Foerster V, Murtagh J, Fiander M. Pulsed dye laser therapy for port wine stains. Technology report number 78. Ottawa, ON: Canadian Agency for Drugs and Technologies in Health; 2007. http://www.cadth.ca/media/pdf/I3008_tr_Port-Wine-Stains_e.pdf .

Geddes R, Frank J, Haw S. A rapid review of key strategies to improve the cognitive and social development of children in Scotland. Health Policy. 2011;101:20–8. doi: 10.1016/j.healthpol.2010.08.013 .

Hailey D. Health technology assessment. Singapore Med J. 2006;47:187–92. quiz 93.

Hailey D. A preliminary survey on the influence of rapid health technology assessments. Int J Technol Assess Health Care. 2009;25:415–8. doi: 10.1017/S0266462309990067 .

Hailey D, Corabian P, Harstall C, Schneider W. The use and impact of rapid health technology assessments. Int J Technol Assess Health Care. 2000;16:651–6.

Hildon Z, Neuburger J, Allwood D, van der Meulen J, Black N. Clinicians’ and patients’ views of metrics of change derived from patient reported outcome measures (PROMs) for comparing providers’ performance of surgery. BMC Health Serv Res. 2012;12:171. doi: 10.1186/1472-6963-12-171 .

Hulstaert F, Thiry N, Eyssen M, Vrijens F. Pharmaceutical and non-pharmaceutical interventions for Alzheimer’s Disease, a rapid assessment. KCE reports 111C. Brussels: Health Technology Assessment (HTA) and Belgian Health Care Knowledge Centre (KCE); 2009. https://kce.fgov.be/sites/default/files/page_documents/d20091027329.pdf .

Jahangirian M, Eldabi T, Garg L, Jun GT, Naseer A, Patel B, et al. A rapid review method for extremely large corpora of literature: Applications to the domains of modelling, simulation, and management. Int J Inf Manag. 2011;31:234–43. doi: 10.1016/j.ijinfomgt.2010.07.004 .

Jolliffe D, Farrington DP. A rapid evidence assessment of the impact of mentoring on re-offending: a summary. London: Home Office; 2007. http://www.crim.cam.ac.uk/people/academic_research/david_farrington/olr1107.pdf .

Kelly BJ, Perkins DA, Fuller JD, Parker SM. Shared care in mental illness: A rapid review to inform implementation. Int J Ment Health Syst. 2011;5:31. doi: 10.1186/1752-4458-5-31 .

Konnyu KJ, Kwok E, Skidmore B, Moher D. The effectiveness and safety of emergency department short stay units: a rapid review. Open Med. 2012;6:e10–6.

Legrand M, Coudron V, Tailleu I, Rooryck N, Dupont K, Boudrez P, et al. Videoregistratie van endoscopische chirurgische interventies: rapid assessment. KCE reports 101A. Brussels: Health Technology Assessment (HTA) and Belgian Health Care Knowledge Centre (KCE); 2008. https://kce.fgov.be/sites/default/files/page_documents/d20081027397.pdf .

Lewis R, Whiting P, ter Riet G, O’Meara S, Glanville J. A rapid and systematic review of the clinical effectiveness and cost-effectiveness of debriding agents in treating surgical wounds healing by secondary intention. Health Technol Assess. 2001;5:1–131.

Low N, Bender N, Nartey L, Redmond S, Shang A, Judith S. Revised rapid review of evidence for the effectiveness of screening for genital chlamydial infection in sexually active young women and men. London: The National Institute for Health and Care Excellence (NICE); 2006. https://www.nice.org.uk/guidance/ph3/evidence/review-2-review-of-evidence-for-the-effectiveness-of-screening-for-genital-chlamydial-infection-in-sexually-active-young-women-and-men2 .

Maddern G, Cooter R, Lee I. Rapid review: clinical treatments for wrist ganglia. ASERNIP-S report number 63. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S). http://www.surgeons.org/media/310861/Clinical_treatment_for_wrist_ganglia.pdf .

Maddern G, Fitridge R, Woodruff P, Leopardi D, Hoggan B. Rapid review: treatments for varicose veins. ASERNIP-S report number 66. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S). http://www.surgeons.org/media/300551/Treatments_for_varicose_veins.pdf .

Maddern G, Fitridge R, Woodruff P, Leopardi D, Hoggan B. Rapid review: upper airway surgery for the treatment of adult obstructive sleep apnoea. ASERNIP-S report number 67. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S). https://www.surgeons.org/media/300727/Upper_airway_surgery_for_adult_OSA.pdf .

Maddern G, Morrison G, Lathlean T. Rapid review: diagnostic arthroscopy for conditions of the knee. ASERNIP-S report number 64. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S). http://www.surgeons.org/media/311101/Diagnostic_arthroscopy_for_conditions_of_the_knee.pdf .

Maddern G, Bridgewater F, Perera C. Rapid review: male non-therapeutic circumcision. ASERNIP-S report number 65. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S); 2008. http://www.surgeons.org/media/292210/Male_non-therapeutic_circumcision.pdf .

Mann R, Gilbody S. Validity of two case finding questions to detect postnatal depression: a review of diagnostic test accuracy. J Affect Disord. 2011;133:388–97. doi: 10.1016/j.jad.2010.11.015 .

Marsh K, Fox C. The benefit and cost of prison in the UK. The results of a model of lifetime re-offending. J Exp Criminol. 2008;4:403–23. doi: 10.1007/s11292-008-9063-3 .

McGregor M, Brophy JM. End-user involvement in health technology assessment (HTA) development: a way to increase impact. Int J Technol Assess Health Care. 2005;21:263–7.

McMurran M. Individual-level interventions for alcohol-related violence: a rapid evidence assessment. Crim Behav Ment Health. 2012;22:14–28. doi: 10.1002/cbm.821 .

McRobbie H, Hajek P, Bullen C, Feigin V. Rapid review of non NHS treatments for smoking cessation. Smoking Cessation Programme. London: The National Institute for Health and Care Excellence (NICE); 2006. http://www.nice.org.uk/guidance/ph10/documents/evidence-summary-nonnhs-treatments2 .

Middleton P, Simpson B, Maddern G. Spinal cord stimulation (neurostimulation): an accelerated systematic review. ASERNIP-S report number 43. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S); 2003. http://www.surgeons.org/media/17785/SCSaccelreview0603.pdf .

Mitchell MD, Williams K, Kuntz G, Umscheid CA. When the decision is what to decide: using evidence inventory reports to focus health technology assessments. Int J Technol Assess Health Care. 2011;27:127–32. doi: 10.1017/S0266462311000031 .

Moran R, Davidson P. An uneven spread: a review of public involvement in the National Institute of Health Research’s Health Technology Assessment program. Int J Technol Assess Health Care. 2011;27:343–7. doi: 10.1017/S0266462311000559 .

Murphy G, Prichett-Pejic W, Severn M. Non-emergency telecardiology consultation services: rapid review of clinical and cost outcomes. Technology report number 134. Ottawa, ON: Canadian Agency for Drugs and Technologies in Health; 2010. http://www.cadth.ca/media/pdf/H0501_Telecardiology_Report_e.pdf .

Nasser M. Evidence summary: is smoking cessation an effective and cost-effective service to be introduced in NHS dentistry? Br Dent J. 2011;210:169–77. doi: 10.1038/sj.bdj.2011.117 .

Ndegwa S, Prichett-Pejic W, McGill S, Murphy G, Prichett-Pejic W, Severn M. Teledermatology services: rapid review of diagnostic, clinical management, and economic outcomes. Technology report number 135. Ottawa, ON: Canadian Agency for Drugs and Technologies in Health; 2010. http://www.cadth.ca/media/pdf/H0502_Teledermatology_Report_e.pdf .

O’Meara S, Riemsma R, Shirran L, Mather L, ter Riet G. A rapid and systematic review of the clinical effectiveness and cost-effectiveness of orlistat in the management of obesity. Health Technol Assess. 2001;5:1–81.

Obyn C, Mambourg F. Rapid assessment van enkele nieuwe behandelingen voor prostaatkanker en goedaardige prostaathypertrofie: High-Intensity Focused Ultrasound (HIFU) voor prostaatkanker. Photoselective Vaporization of the Prostate (PVP) en holmium laser voor goedaardige prostaathypertrofie. KCE reports 89A. Brussels: Belgian Health Care Knowledge Centre (KCE); 2008. https://kce.fgov.be/sites/default/files/page_documents/d20081027361.pdf .

Saborido CM, Hockenhull J, Bagust A, Boland A, Dickson R, Todd D. Systematic review and cost-effectiveness evaluation of ‘pill-in-the-pocket’ strategy for paroxysmal atrial fibrillation compared to episodic in-hospital treatment or continuous antiarrhythmic drug therapy. Health Technol Assess. 2010;14:iii–iv, 1–75. doi: 10.3310/hta14310 .

Schnell-Inderst P, Hunger T, Hintringer K, Schwarzer R, Seifert-Klauss VR, Gothe H, et al. Individual health services. GMS Health Technol Assess. 2011;7:Doc05. doi: 10.3205/hta000096 .

Singh D. Transforming chronic care. Evidence about improving care for people with long-term conditions. Birmingham: University of Birmingham Health Services Management Centre; 2005. http://www.download.bham.ac.uk/hsmc/pdf/transforming_chronic_care.pdf .

Singh D. Making the shift: key success factors. A rapid review of best practice in shifting hospital care into the community. Birmingham: University of Birmingham Health Services Management Centre; 2006. http://www.birmingham.ac.uk/Documents/college-social-sciences/social-policy/HSMC/publications/2006/Making-the-Shift-Key-Success-Factors.pdf .

Singh D. Improving care for people with long-term conditions: A review of UK and international frameworks. Birmingham: University of Birmingham Health Services Management Centre; 2006. http://www.improvingchroniccare.org/downloads/review_of_international_frameworks__chris_hamm.pdf .

Smith J, Cheater F, Bekker H. Parents’ experiences of living with a child with a long-term condition: a rapid structured review of the literature. Health Expect. 2013. doi: 10.1111/hex.12040 .

Stordeur S, Gerkens S, Roberfroid D. Interspinous implants and pedicle screws for dynamic stabilization of lumbar spine: rapid assessment. KCE reports 116C. Brussels: Health Technology Assessment (HTA) and Belgian Health Care Knowledge Centre (KCE); 2009. https://kce.fgov.be/sites/default/files/page_documents/d20091027346.pdf .

Sutton A, Grant MJ. Cost-effective ways of delivering enquiry services: a rapid review. Health Info Libr J. 2011;28:249–55. doi: 10.1111/j.1471-1842.2011.00965.x .

Thigpen S, Puddy RW, Singer HH, Hall DM. Moving knowledge into action: developing the rapid synthesis and translation process within the interactive systems framework. Am J Community Psychol. 2012;50:285–94. doi: 10.1007/s10464-012-9537-3 .

Thomas J, Newman M, Oliver S. Rapid evidence assessments of research to inform social policy: taking stock and moving forward. Evid Policy. 2013;9:5–27. doi: 10.1332/174426413X662572 .

Tsakonas E, Moulton K, Spry C. FDG-PET to assess infections: a review of the evidence. Ottawa, ON: Canadian Agency for Drugs and Technologies in Health; 2008. http://www.cadth.ca/media/pdf/I3016_FDG-PET_Assess_Infections_htis-3_e.pdf .

Van Brabandt H, Neyt M. Endobronchial valves in the treatment of severe pulmonary emphysema: a rapid Health Technology Assessment. KCE reports 114C. Brussels: Health Technology Assessment (HTA) and Belgian Health Care Knowledge Centre (KCE); 2009. https://kce.fgov.be/sites/default/files/page_documents/d20091027339.pdf .

Vlayen J, Camberlin C, Paulus D, Ramaekers D. Rapid assessment van nieuwe wervelzuil technologieën: totale discusprothese en vertebro/ballon kyfoplastie. KCE reports 39A. Brussels: Belgian Health Care Knowledge Centre (KCE); 2006. https://kce.fgov.be/sites/default/files/page_documents/d20061027338.pdf .

York A, Crawford C, Walter A, Walter JAG, Jonas WB, Coeytaux R. Acupuncture research in military and veteran populations: a rapid evidence assessment of the literature. Med Acupuncture. 2011;23:229–36. doi: 10.1089/acu.2011.0843 .

Zechmeister I, Schumacher I. The impact of health technology assessment reports on decision making in Austria. Int J Technol Assess Health Care. 2012;28:77–84. doi: 10.1017/S0266462311000729 .

Sturm L, Cameron AL. Brief review: Fast-track surgery and enhanced recovery after surgery (ERAS) programs. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S); 2009. http://www.surgeons.org/media/299206/RPT_2009-12-09_Enhanced_Patient_Recovery_Programs.pdf .

Birmingham and Black Country Strategic Health Authority. Reducing unplanned hospital admissions. What does the literature tell us? Birmingham: Birmingham and Black Country Strategic Health Authority; 2008. http://www.birmingham.ac.uk/Documents/college-social-sciences/social-policy/HSMC/publications/2006/Reducing-unplanned-hospital-admissions.pdf .

Brunton G, Paraskeva N, Caird J, Bird KS, Kavanagh J, Kwan I, et al. Psychosocial predictors, assessment and outcomes of cosmetic interventions. A systematic rapid evidence review. London: EPPI-Centre Social Science Research Unit, Institute of Education, University of London; 2013. http://eppi.ioe.ac.uk/cms/LinkClick.aspx?fileticket=Ge_RehINz8Q%3D .

Caird J, Hinds K, Kwan I, Thomas J. A systematic rapid evidence assessment of late diagnosis. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London; 2012. http://eppi.ioe.ac.uk/cms/LinkClick.aspx?fileticket=qbwCNWu8qHw%3D .

Carr SC, Leggatt-Cook C, Clarke M, MacLachlan M, Papola TS, Pais J, et al. What is the evidence of the impact of increasing salaries on improving the performance of public servants, including teachers, doctors/nurses, and mid-level occupations, in low- and middle-income countries: Is it time to give pay a chance? London: EPPI-Centre Social Science Research Unit, Institute of Education, University of London; 2011. http://eppi.ioe.ac.uk/cms/LinkClick.aspx?fileticket=bFX9uXkOyaI%3d&tabid=3208&mid=5983 .

Doran C. The costs and benefits of interventions in the area of mental health: a rapid review. An Evidence Check review brokered by the Sax Institute for the Mental Health Commission of NSW. Sax Institute: Haymarket; 2013. https://www.saxinstitute.org.au/wp-content/uploads/MH-costs-and-benefits-of-intervention_FINAL2.pdf .

Phillipson L, Larsen-Truong K, Jones S, Pitts L. Improving cancer outcomes among culturally and linguistically diverse communities: a rapid review. An Evidence Check review brokered by the Sax Institute for the Cancer Institute NSW. Sax Institute: Haymarket; 2012. https://www.saxinstitute.org.au/wp-content/uploads/Improving-cancer-outcomes-among-CALD-communities-230413v2.pdf .

Rissel C, Curac N, Greenaway M, Bauman A. Key health benefits associated with public transport: a rapid review: An Evidence Check review brokered by the Sax Institute for the NSW Ministry of Health. Sax Institute: Haymarket; 2012. https://www.saxinstitute.org.au/wp-content/uploads/05_Key-health-benefits-associated-with-public-transport.pdf .

Tripney J, Bird KS, Kwan I, Kavanagh J. The impact of post-abortion care family planning counselling and services in low-income countries: a systematic review of the evidence. London: EPPI-Centre, Social Science Research Unit, Institute of Education, University of London; 2011. http://eppi.ioe.ac.uk/cms/LinkClick.aspx?fileticket=X_JbXQTVDEQ%3D&tabid=3062&mid=5694 .

Casadesus D. Surgical resection of rectal adenoma: a rapid review. World J Gastroenterol. 2009;15:3851–4.

London Health Commission. Health inequalities and equality impact assessment of ‘Healthcare for London: consulting the capital’. Scientific Annex 2: rapid evidence review and appraisal. London: London Health Commission; 2008.

Curson JA, Dell ME, Wilson RA, Bosworth DL, Baldauf B. Who does workforce planning well? Workforce review team rapid review summary. Int J Health Care Qual Assur. 2010;23:110–9. doi: 10.1108/09526861011010712 .

De Alwis KLNSK, Dunt D, Bennett N, Bull A. Increasing vaccination among healthcare workers - Review of strategies and a study of selected Victorian hospitals. Healthcare Infection. 2010;15:63–9.

Government Social Research Unit. The Magenta Book: guidance notes for policy evaluation and analysis. Background paper 2: what do we already know? Harnessing existing research. London: Government Social Research Unit; 2007. http://resources.civilservice.gov.uk/wp-content/uploads/2011/09/the_complete_magenta_book_2007_edition2.pdf .

Ontario Ministry of Health and Long-Term Care. Endovascular repair of abdominal aortic aneurysms in low surgical risk patients: rapid review. Toronto, ON: Medical Advisory Secretariat; 2010.

Moyad M. Vitamin D: a rapid review. Urol Nurs. 2008;28:5.

Moyad M. Heart health = urologic health and heart unhealthy = urologic unhealthy: rapid review of lifestyle changes and dietary supplements. Urol Clin North Am. 2011;38:359–67.

Parker H. Making the shift: a review of NHS experience. Birmingham: University of Birmingham Health Services Management Centre; 2006. http://www.birmingham.ac.uk/Documents/college-social-sciences/social-policy/HSMC/research/making-the-shift.pdf .

Thavaneswaran P. Rapid review: robotic-assisted surgery for urological, cardiac and gynaecological procedures ASERNIP-S report number 75. Adelaide: Australian Safety and Efficacy Register of New Interventional Procedures – Surgical (ASERNIP-S); 2009. http://www.surgeons.org/media/299238/RPT_2009-12-09_Robotic-assisted_Surgery.pdf .

Tonmukayakul U, Velasco RP, Tantivess S, Teerawattananon Y. Lessons drawn from research utilization in the maternal iodine supplementation policy development in Thailand. BMC Public Health. 2012;12:391. doi: 10.1186/1471-2458-12-391 .

Van de Velde S, De Buck E, Dieltjens T, Aertgeerts B. Medicinal use of potato-derived products: conclusions of a rapid versus full systematic review. Phytother Res. 2011;25:787–8. doi: 10.1002/ptr.3356 .

Van Brabandt H, Neyt M. Percutaneous heart valve implantation in congenital and degenerative valve disease: a rapid Health Technology Assessment. KCE reports 95. Brussels: Health Technology Assessment (HTA) and Belgian Health Care Knowledge Centre (KCE); 2008. https://kce.fgov.be/sites/default/files/page_documents/d20081027381.pdf .

van Swieten JC, Heutink P. Mutations in progranulin (GRN) within the spectrum of clinical and pathological phenotypes of frontotemporal dementia. Lancet Neurol. 2008;7:965–74. doi: 10.1016/S1474-4422(08)70194-7 .

World Health Organization (WHO). WHO rapid advice guidelines on pharmacological management of humans infected with avian influenza A (H5N1) virus. Geneva: WHO; 2006. http://www.who.int/medicines/publications/WHO_PSM_PAR_2006.6.pdf?ua=1 .

Batten J. Letter to the Editor: Comment on editorial literature reviews as a research strategy. J Sch Nurs. 2012;28:409.

Watt A, Cameron A, Sturm L, Lathlean T, Babidge W, Blamey S, et al. Rapid versus full systematic reviews: validity in clinical practice? ANZ J Surg. 2008;78:1037–40. doi: 10.1111/j.1445-2197.2008.04730.x .

London Health Commission. Health inequalities and equality impact assessment of ‘Healthcare for London: consulting the capital’. Scientific annex II rapid evidence review and appraisal. Birmingham: West Midlands Public Health Observatory; 2008. http://www.apho.org.uk/resource/item.aspx?RID=52757 .

Civil Service. What is a rapid evidence assessment? London: Civil Service; 2011. http://www.civilservice.gov.uk/networks/gsr/resources-and-guidance/rapid-evidence-assessment/what-is .

Ehlers L, Vestergaard M, Kidholm K, Bonnevie B, Pedersen PH, Jørgensen T, et al. Doing mini–health technology assessments in hospitals: A new concept of decision support in health care? Int J Technol Assess Health Care. 2006;22:295–301. doi: 10.1017/S0266462306051178 .

Grant MJ, Booth A. A typology of reviews: an analysis of 14 review types and associated methodologies. Health Information Libraries J. 2009;26:91–108. doi: 10.1111/j.1471-1842.2009.00848.x .

Canadian Agency for Drugs and Technologies in Health (CADTH). Rapid review summit: then, now and in the future. 3–4 February 2015. CADTH Summit Series. CADTH: Vancouver, BC; 2015. http://www.cadth.ca/cadth-summit-series .

Moher D, Schulz KF, Simera I, Altman DG. Guidance for developers of health research reporting guidelines. PLoS Med. 2010;7:e1000217.

Tricco AC, Zarin W, Antony J, Hutton B, Moher D, Sherifali D, et al. An international survey and modified Delphi approach revealed numerous rapid review methods. J Clin Epidemiol. 2015; pii: S0895-4356(15)00388-1.

Hartling L, Guise JM, Kato E, Anderson J, Aronson N, Belinson S, et al. EPC methods: an exploration of methods and context for the production of rapid reviews. Rockville MD: Agency for Healthcare Research and Quality (US); 2015.

Featherstone RM, Dryden DM, Foisy M, Guise JM, Mitchell MD, Paynter RA, et al. Advancing knowledge of rapid reviews: an analysis of results, conclusions and recommendations from published review articles examining rapid reviews. Syst Rev. 2015;4:50. doi: 10.1186/s13643-015-0040-4 .

Acknowledgements

The study was funded by a Canadian Institutes of Health Research (CIHR) Operating Grant (grant # DC0190GP, application # 294284). ACT and BH hold a CIHR/Drug Safety and Effectiveness Network New Investigator Award, DM holds a University of Ottawa Research Chair, and SES holds a Tier 1 Canada Research Chair in Knowledge Translation.

We thank Drs Donna Ciliska and Diana Sherifali who provided support and expertise in rapid reviews and knowledge translation on our systematic review protocol. We also thank Ana Guzman for formatting the paper.

Author information

Authors and affiliations.

Li Ka Shing Knowledge Institute of St Michael’s Hospital, 209 Victoria Street, East Building, Room 716, Toronto, ON, M5B 1 W8, Canada

Andrea C. Tricco, Jesmin Antony, Wasifa Zarin, Lisa Strifler, Marco Ghassemi, John Ivory & Sharon E. Straus

Epidemiology Division, Dalla Lana School of Public Health, University of Toronto, 6th Floor, 155 College Street, Toronto, ON, M5T 3 M7, Canada

Andrea C. Tricco

Institute for Health Policy Management and Evaluation, University of Toronto, 4th Floor, 155 College Street, Toronto, ON, M5T 3 M6, Canada

Lisa Strifler & Laure Perrier

Ottawa Hospital Research Institute, Ottawa Methods Centre, Ottawa, ON, Canada

Brian Hutton & David Moher

Department of Medicine, Faculty of Medicine, University of Toronto, 27 King’s College Circle, Toronto, ON, M5S 1A1, Canada

Sharon E. Straus

Corresponding author

Correspondence to Sharon E. Straus .

Additional information

Competing interests.

The authors declare that they have no competing interests.

Authors’ contributions

ACT conceived the study, obtained funding for the study, participated in all pilot tests of study eligibility and data abstraction, helped develop the framework of rapid reviews, interpreted the data, and wrote the manuscript. JA coordinated the study, screened citations and full-text articles for inclusion, abstracted, coded, analyzed the data, and edited the manuscript. WZ verified and coded the data, conducted content analysis, helped develop the framework, and edited the manuscript. LS screened citations and full-text articles, abstracted data, and edited the manuscript. MG abstracted and verified the data, and edited the manuscript. JDI abstracted data and edited the manuscript. LP screened citations and full-text articles, abstracted some data, conducted the literature search, and edited the manuscript. BH and DM helped obtain funding for the study, helped conceive the study, and edited the manuscript. SES conceived the study, obtained funding for the study, participated in pilot tests of eligibility criteria, and edited the manuscript. All authors read and approved the final manuscript.

Additional file

Additional file 1:.

Includes five appendices with supplementary data. (PDF 603 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article.

Tricco, A.C., Antony, J., Zarin, W. et al. A scoping review of rapid review methods. BMC Med 13, 224 (2015). https://doi.org/10.1186/s12916-015-0465-6

Received : 06 July 2015

Accepted : 28 August 2015

Published : 16 September 2015

DOI : https://doi.org/10.1186/s12916-015-0465-6

  • Rapid review
  • Scoping review

  • Research article
  • Open access
  • Published: 02 November 2015

Rapid Evidence Assessment of the Literature (REAL © ): streamlining the systematic review process and creating utility for evidence-based health care

  • Cindy Crawford 1 ,
  • Courtney Boyd 1 ,
  • Shamini Jain 2 ,
  • Raheleh Khorsan 2 &
  • Wayne Jonas 1  

BMC Research Notes volume 8, Article number: 631 (2015)

Systematic reviews (SRs) are widely recognized as the best means of synthesizing clinical research. However, traditional approaches can be costly and time-consuming and can be subject to selection and judgment bias. It can also be difficult to interpret the results of an SR in a meaningful way in order to make research recommendations, clinical or policy decisions, or practice guidelines. Samueli Institute has developed the Rapid Evidence Assessment of the Literature (REAL) SR process to address these issues. REAL provides up-to-date, rigorous, high-quality SR information on health care practices, products, or programs in a streamlined, efficient and reliable manner. This process is a component of the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™) program developed by Samueli Institute, which aims to answer the question of “What works?” in health care.

Methods/design

The REAL process (1) tailors a standardized search strategy to a specific and relevant research question, developed with various stakeholders, to survey the available literature; (2) evaluates the quantity and quality of the literature using structured tools and rulebooks to ensure objectivity, reliability and reproducibility of reviewer ratings in an independent fashion; and (3) obtains formalized, balanced input from trained subject matter experts on the implications of the evidence for future research and current practice.

Online tools and quality assurance processes are utilized for each step of the review to ensure a rapid, rigorous, reliable, transparent and reproducible SR process.

Conclusions

The REAL is a rapid SR process developed to streamline and aid the rigorous and reliable evaluation and review of claims in health care so that evidence-based, informed decisions can be made, and it has been used by a variety of organizations aiming to gain insight into “what works” in health care. Using the REAL system facilitates recommendations on appropriate next steps in policy, funding, and research and supports clinical and field decisions in a timely, transparent, and cost-effective manner.

Evidence is the basis from which we tell truth from fiction in the natural world and determine value in health care claims. Millions of articles are published in thousands of biomedical journals worldwide [1]. PubMed, a free resource developed and maintained by the US National Library of Medicine (NLM) at the National Institutes of Health (NIH), comprises over 20 million citations for biomedical literature from MEDLINE, life science journals, and online books [2]. With the emergence of other freely available journal citation resources, health care providers, consumers, researchers, and policy makers find themselves inundated with unmanageable amounts of new information from health care research. Most individuals do not have the time, skills and resources to find, appraise and interpret this evidence, nor to incorporate their findings into health care decisions in an appropriate manner. Even in special interest areas that are smaller and more narrowly focused (e.g. liver disease), it is still challenging to stay abreast of all relevant information. Consequently, despite the need for evidence to clearly inform clinical practice and policy, the best evidence is not always used, owing to a lack of the knowledge, time, skills and resources needed to quickly synthesize such information and translate it into meaningful knowledge that can inform practice decisions.

From clinical judgment to systematic evidence evaluation

Effective health care decisions should be evidence-based rather than rely solely on clinical judgment. Such judgments are often made under conditions of uncertainty [3] and use informal methods that can be fraught with bias and inaccuracy, producing shifting or misleading recommendations in practice. For example, as of 2012, 48 documented controlled trials and seven high-quality systematic reviews (SRs) examining the effects of acupuncture on approximately 7433 total participants with substance abuse (e.g., alcohol, cocaine, crack, and nicotine dependencies and other addictions) existed in the peer-reviewed literature. Since acupuncture is widely used for substance abuse and many studies have been done on this topic, Samueli Institute in 2012 conducted a review of SRs to summarize this evidence and concluded that, based on the available literature, needle acupuncture was not effective in treating these conditions [4]. The implication of this review is that acupuncture is not recommended as a therapy for these conditions at this time. A now classic example of the limitations of clinical judgment and the need for best-evidence synthesis is the use of hormone replacement therapy (HRT). Extensively used for years in post-menopausal women, HRT was claimed by clinicians to benefit heart disease, sexual function, hot flashes, reduction of bone loss and prevention of cognitive decline. Subsequent randomized controlled trials (RCTs) and SRs, however, demonstrated that not only were the vast majority of these claims false, but the routine use of HRT was likely harmful [5]. Similarly, invasive laser procedures continue to be widely used for the treatment of angina from coronary artery disease (CAD), yet SRs of RCTs have shown no benefit of such procedures compared to sham controls and have reported infrequent but serious adverse events and/or interactions [6–10]. Should clinicians continue to perform these procedures? Clinical judgment used by itself is often misleading or false, and the need to integrate best-evidence syntheses, together with a method for translating the evidence to support judgments, is apparent. Without rigorous, transparent and reproducible SR processes to synthesize the best evidence, however, it is difficult to judge the efficacy, effectiveness and safety of a health care claim, identify where the gaps lie to improve the science, and make appropriate decisions concerning clinical practice.

From information to knowledge

Mastering and managing the recent explosion of medical information is a difficult task, and evidence-based problem-solving skills are essential for responsible decision-making, maintaining quality health care and ensuring good outcomes. As stated, SRs form the foundation of evidence-based medicine by collating all empirical evidence that fits pre-specified eligibility criteria in order to answer a specific research question. While expert opinions and narrative reviews are popular means of organizing data, and can be informative and produced faster and more easily than SRs, they are often subjective and prone to bias. Thus, at a time characterized by large amounts of information and the critical need to make evidence-based decisions, the shift from these analyses towards SRs is not only becoming prominent but also necessary. Indeed, high-quality SRs that clearly summarize evidence have become a crucial component in helping clinicians, patients, and policymakers make accurate decisions about clinical care [3]. SR methodology holds a key position in summarizing the state of current knowledge and disseminating findings of available evidence [3, 11]. In fact, multiple groups such as the Institute of Medicine, the Agency for Healthcare Research and Quality (AHRQ), and Cochrane, as well as professional associations, insurance agencies and licensing bodies that provide health care guidelines and recommendations, often use SR methodology as a basis for such recommendations. Having access to and sharing high-quality, evidence-based SR reports within a particular subject area can help all parties be better informed about the safety, efficacy and effectiveness of treatment claims and make sound, informed decisions. SRs are an important step in moving from data to information to knowledge, provided they are conducted in a transparent, rigorous and meaningful fashion.

Challenges with current systematic review methodology

Inconsistent review standards and processes.

SR methodology used to assess the quality of available literature has gradually improved over the years, with several groups receiving international attention for the development of standards and advancing the science in SRs. Despite this progress, SR methodologies can still present challenges. First, many still vary considerably, and as such, outside reviewers often have difficulty replicating such methodologies. There is a need for improved standardized and reliable protocols and procedures to ensure transparency and produce meaningful information. Second, research questions and data extraction can be chosen without the input of diverse stakeholders, resulting in a narrow scope of the review, and sometimes minimal relevance or utility for making clinical decisions. Third, the subjective nature of quality assessment of research can leave SRs open to bias, resulting in unreliable results. Finally, while SRs help to provide a summary of the evidence, not all provide informative syntheses, perhaps because they lack a structured approach for obtaining expert input on the implications of the evidence for recommendations.

SRs can be cumbersome to execute and quite costly, requiring large amounts of personnel time and budget. Many people grossly underestimate the amount of time needed to perform a comprehensive, rigorous, and evidence-based SR, and subsequently choose to rely on less reliable methods such as expert opinions or narrative reviews. Protocol development, search strategy formation and literature searching, quality assessment and data extraction, discussion of disagreements for study inclusion, coding and quality assessments, acquisition of missing data from authors, and data analysis are all time consuming steps requiring specific skills, training and effort. A large team trained in specific roles/responsibilities at each phase of the review is needed to perform a SR most efficiently. Because lack of resources is sometimes a challenge, training, explicit processes, and the application of online systems can enhance efficiency and decrease cost. The methodology described below incorporates such methods and in turn reduces costs while enhancing the quality of the review.

Addressing challenges of systematic review methodology

Samueli Institute’s Rapid Evidence Assessment of the Literature

In order to overcome these challenges and maximize efficiency in the execution and dissemination of good evidence, there is a need for more objective, high quality and up-to-date syntheses provided in a more streamlined manner regarding health care interventions. To fill this need, Samueli Institute has developed a SR process known as the Rapid Evidence Assessment of the Literature (REAL © ). This method utilizes specific tools (e.g., automated online software) and standard procedures (e.g., rulebooks) to rigorously deliver more reliable, transparent and objective SRs in a streamlined fashion, without compromising quality and at a lower cost than other SR methods.

Specifically, the REAL SR process involves (1) the rapid identification of literature relevant to a particular subject matter area (usually related to an intervention for a particular outcome); (2) the use of one or more grading systems to assess the quality and strength of evidence for the topic; (3) a summary of that evidence; and (4) input from subject matter experts (SMEs) and their assessments of the implications for the current use of the intervention in practice. This rapid methodology requires a team-based approach to capitalize on resources and ensure maximum meaning, impact and utility; efficient and consistent review methodologies aimed at reducing time while maintaining quality; careful creation of objective protocols describing how to execute SR processes to ensure both reliability and reproducibility; as well as thoughtful synthesis and interpretation of the data to form a foundation for future work. Consequently, SRs that utilize this more streamlined process (i.e., “REALs”) are more efficient and reliable than some other traditional SR methods. Figure 1 depicts the steps involved in the REAL SR process, also detailed in the remainder of this paper. The REAL process can be used to evaluate interventions or claims in many fields, including conventional medicine, complementary and alternative medicine (CAM), integrative health care (also known as integrative medicine, IM), wellness, health promotion, and resilience and performance enhancement, among others. In fact, to date, the REAL process has been applied to several topical areas [4, 12–16], with more recent published work including a Department of Defense (DoD) funded SR of reviews on acupuncture for the treatment of trauma spectrum response (TSR) components [4], self-care and integrative health care practices for stress management [15], self-care and integrative practices for the management of pain [14] and warm-up exercises for physical performance [13].

Fig. 1 Basic steps of a Rapid Evidence Assessment of the Literature (REAL©)

REAL methodology and design

Following a team-based approach to capitalize on resources.

Efficiency is of great importance when stakeholders need immediate, evidence-based answers for “what works”. Many review teams are small in size and reviews can take years to complete. Conversely, to maximize efficiency, Samueli Institute REALs are executed by several well-trained team members, each with specific roles and responsibilities, and often take approximately 3–6 months, from question development to manuscript delivery.

Specifically, a REAL Review Team includes: (1) a Principal Investigator to oversee the entire project; (2) a Review Manager with SR methodology expertise to guide the review process from start to finish; (3) a Search Expert to assist with literature search strategy development and execution; (4) at least two trained Reviewers to screen, extract data and review the quality of the literature; (5) a Reference Manager/Research Assistant to provide administrative and project support; (6) a Statistician to provide guidance on the interpretation of complex results or meta-analyses; and (7) at least two SMEs with diverse perspectives related to the review topic to provide guidance and synthesize the overall literature pool. It is important to note that while Samueli Institute has designed the REAL process to be executed by individuals within these roles/responsibilities, some organizations and entities may be more limited in terms of available personnel. As such, it is reasonable for individuals to be trained to take on multiple roles, although doing so may delay the review process. The division of labor allows for more efficient, accurate and reliable execution of the review steps and reduces the time needed from any one individual. Further, it allows for better compliance with the Institute of Medicine (IOM) recommendations for managing bias and conflicts of interest (COI) when producing reviews and recommendations [3]. The REAL process follows these IOM recommendations, applies strict criteria at each review step to guard against bias, and excludes team members with COIs from portions of the review where objectivity or balance may be compromised.

Involving stakeholders to ensure maximum relevance and translatability

One of the most frequent complaints by clinicians and patients about systematic reviews is that their conclusions have little relevance to daily clinical decisions and so are not of much use. The REAL has a built-in process to obtain continuous input from any stakeholder involved in these decisions. In addition to the Review Team, REALs also include a Steering Committee comprised of 4–6 diverse stakeholders (e.g., clinicians, researchers, policy makers, patients and various other relevant stakeholders), chosen by the client and Principal Investigator, who provide guidance throughout the review process. This ensures that the review’s focus stays relevant to the end-user of the SR results and allows for more effective translation to practice. The Steering Committee seeks to address the “so what” question that so often follows a standard SR in which the conclusion is simply that “more and better research is needed.” Though integral to the review, the Steering Committee is not involved in the review’s technical steps; this guards against bias during the independent evidence assessment process. Once the Steering Committee and the SMEs have reviewed and approved the team’s plans and progress at each review phase, the Review Team focuses solely on conducting the review and analyses in an independent and objective fashion.

Once assembled, it is imperative that the Review Team and Steering Committee work together to formulate the review’s research question, scope, definitions, and eligibility criteria using the PICO(S) process (i.e., Population, Intervention, Control or Comparison, Outcomes and Study Design) [2], as well as to identify relevant data extraction points for synthesizing the literature. Assembling various stakeholders to pre-define the review’s research question and eligibility criteria sets the tone for the review, ensures that different perspectives are represented, and requires that all subsequent steps and processes are conducted with this information in mind. This is a critical part of the REAL and ensures the results will have sufficient meaning and utility for stakeholders. Although involving a large group of voices at the outset to deliberate and agree upon all elements of the SR may seem counterintuitive to increasing efficiency, outlining a clear methodological process up front is imperative to streamlining the remaining systematic processes and so saves time overall. In addition, this reduces the chance that the team will have to redefine its research question or processes once the review is underway. Revisions made while the quality assessment of the literature is underway not only cost time and resources but also open the process to bias; both risks are reduced in the REAL process.
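
As a purely illustrative sketch (the field names and example values below are assumptions, not taken from the REAL documentation), a PICO(S) question agreed with stakeholders can be captured as a simple structured record before any searching or screening begins:

from dataclasses import dataclass
from typing import List

@dataclass
class PICOS:
    """Illustrative container for a pre-specified review question."""
    population: str
    intervention: str
    comparison: str
    outcomes: List[str]
    study_designs: List[str]

# Hypothetical example question agreed with the Steering Committee up front
question = PICOS(
    population="adults with chronic low back pain",
    intervention="acupuncture",
    comparison="sham acupuncture or usual care",
    outcomes=["pain intensity", "physical function"],
    study_designs=["randomized controlled trial"],
)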

Enhancing the efficiency and consistency of review methodologies

Utilizing specific search protocols to reduce quantity and improve quality.

The REAL process requires search expertise to build robust literature search strategies as well as iterative input from both the SMEs and Steering Committee members for guidance. REALs do not “exhaustively” search the literature by including grey and non-English language literature, unless essential to the specific research question (e.g., searching Chinese herbal therapy). Instead, they usually include only peer-reviewed literature published in the English language. While the traditional SR considers the inclusion of only English-language studies as a limitation, doing so rarely compromises the outcome or implication for the majority of interventions and claims [ 17 ]. There has been debate, moreover, around the importance of including grey (unpublished) literature. While including such literature can reduce publication bias, it can also result in the overestimation of an intervention’s effects, since unpublished studies are usually more difficult to find, smaller and of lower quality compared to those published in the English language literature [ 18 , 19 ]. Therefore, despite the inherent differences in methods as well as time and cost associated with these processes, the conclusions of a REAL and a SR are usually comparable, and result in the same “bottom line” conclusions about the evidence [ 20 ]. In fact, the synthesis involved in a REAL is often more informative and rigorous than some SR efforts due to the additional assessment systems employed in a REAL compared to standard SRs [ 4 , 21 ] (see Adapting and Developing of Quality Assessment Tools ).
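
For illustration only (this is a hypothetical query, not a REAL search strategy), a PubMed-style search of this kind might combine topic terms with language and publication-date restrictions, assembled for example as follows:

# Hypothetical PubMed-style query illustrating a deliberately restricted search:
# topic terms combined with an English-language filter and a publication-date window.
topic = '("acupuncture"[tiab]) AND ("chronic pain"[tiab] OR "low back pain"[tiab])'
filters = 'english[lang] AND ("2008/01/01"[dp] : "2013/05/31"[dp])'
query = f"{topic} AND {filters}"
print(query)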

Automating the review to enhance the review process

REALs are more efficient not only because of their focus on English-language, peer-reviewed literature, but also because of their use of readily available software systems to automate the review process. These systems have been customized for use with a REAL and streamline many review steps, including automated article processing and management (eliminating the need for data transcription), automated reliability estimation, real-time error and quality checking, and reduced post-review data collation. Using a dedicated review system and rulebooks allows researchers to deliver results faster, with improved accuracy and reliability, and provides a complete audit trail of all changes to ensure transparency. Such systems can also be accessed remotely and include messaging features that allow the review team to interact virtually, considerably decreasing costs associated with travel, materials, supplies, and meeting facilities.
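The software a REAL uses is custom-built, so as a generic illustration of one automated step, the sketch below de-duplicates retrieved records by DOI (falling back to a normalized title) before screening; the record fields are assumptions.

```python
def normalize_title(title: str) -> str:
    """Lower-case and strip punctuation/whitespace so near-identical titles match."""
    return "".join(ch for ch in title.lower() if ch.isalnum())

def deduplicate(records: list) -> list:
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or normalize_title(rec.get("title", ""))
        if key and key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"doi": "10.1000/xyz123", "title": "Trial A"},
    {"doi": "10.1000/xyz123", "title": "Trial A (duplicate export)"},
    {"doi": None, "title": "Trial B"},
]
print(len(deduplicate(records)))  # -> 2
```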

Ensuring objectivity to reduce bias

Adapting and developing quality assessment tools.

Most groups using SRs to develop recommendations and guidelines rely on subject matter experts (SMEs) to evaluate the quality of the research. However, SMEs almost always have a particular point of view (a potential source of bias) and are rarely trained in the proper use of quality assessment tools. REALs therefore do not use SMEs to apply quality assessment tools and instead rely on trained review teams, which yields higher standards of accuracy and reliability. There are many well-accepted rating systems available to researchers for evaluating quality and risk of bias. These tools typically focus on internal validity, that is, whether the results can be attributed to the intervention rather than to bias. They are also often quite subjective, and their quality criteria are interpreted variably. Samueli Institute has adapted some of these rating systems to improve their usability and objectivity. In addition, we have developed, validated, and incorporated an External Validity Assessment Tool (EVAT © ) [ 16 ] into the REAL process to assess the “real-world” relevance of the research questions being asked. While many SRs evaluate only internal validity, the REAL uses quality assessment tools to evaluate internal, external, and model validity. Thus, every REAL delivers a database of ratings for gauging the attribution (internal validity), generalizability (external validity), and relevance (model validity) of every included study. This database has multiple uses for clients even after the specific REAL is completed.
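The EVAT and the Institute’s adapted tools are not reproduced here, but the shape of the ratings database a REAL delivers might resemble the hypothetical records below, which can then be filtered for later analyses; the scores, scales, and field names are illustrative only.

```python
# One record per included study, with ratings on three validity dimensions.
study_ratings = [
    {"study": "Smith 2012", "internal_validity": 4, "external_validity": 3, "model_validity": 4},
    {"study": "Lee 2014",   "internal_validity": 2, "external_validity": 4, "model_validity": 3},
    {"study": "Chen 2013",  "internal_validity": 5, "external_validity": 2, "model_validity": 4},
]

# Example reuse of the database: restrict a sensitivity analysis to studies
# rated 3 or higher on internal validity.
high_internal = [r["study"] for r in study_ratings if r["internal_validity"] >= 3]
print(high_internal)
```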

Detailing and applying quality criteria

Because interpreting research results is inherently subjective, Samueli Institute has created rulebooks to ensure that review teams: (1) objectively evaluate and “score” each included article for quality; and (2) extract data in a specific, consistent format, thereby reducing the time needed for post-review data cleaning. Reviewers using these rulebooks provide transparent data extraction and consistently high inter-rater reliability (e.g., a Cohen’s kappa of 0.90 or above), indicating a low level of conflict and a high level of agreement between reviewers. These rulebooks are essential for managing and minimizing bias and for ensuring the quality of any review. For example, should someone question the basis for any result in a SR, the team can refer to the rulebooks to explain and demonstrate exactly why and how particular articles were scored.
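Inter-rater reliability of the rulebook-guided ratings can be checked with Cohen’s kappa; the sketch below, which uses scikit-learn (assumed to be available) and invented ratings, shows the kind of agreement check that might trigger reconciliation when a preset threshold such as 0.90 is not met.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings of one quality item across eight articles.
reviewer_1 = ["yes", "yes", "no", "unclear", "yes", "no", "yes", "yes"]
reviewer_2 = ["yes", "yes", "no", "unclear", "yes", "no", "yes", "no"]

kappa = cohen_kappa_score(reviewer_1, reviewer_2)
print(f"Cohen's kappa = {kappa:.2f}")
# If kappa falls below the agreed threshold, reviewers reconcile their
# disagreements against the rulebook before ratings are finalized.
```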

Maintaining transparent reporting

Just as the criteria and parameters whereby reviewers conduct the review are explicitly detailed in rulebooks, all decisions, processes and outcomes relating to each step of the review are maintained in a Review Documentation Checklist throughout the review process. Because this Checklist was developed to adhere to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) Guidelines [ 22 ], it not only aids with transparent reporting of results and replication of methods, but can also be used as a guide for how the results can be synthesized into a report and disseminated through peer-reviewed journal publications or other venues. Using this checklist for a manuscript outline also streamlines manuscript preparation as authors have all methodological processes and decisions housed in one place, rather than having to dig through files to find the details from various phases of the review [ 22 ].
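As a rough illustration of keeping all methodological decisions “housed in one place,” the sketch below logs decisions against PRISMA-style items and writes them to a single file; the item names, fields, and file name are assumptions, not the Institute’s actual Review Documentation Checklist.

```python
import csv

decision_log = [
    {"item": "Eligibility criteria", "decision": "English, peer-reviewed RCTs, 2000 onward",
     "decided_by": "Steering Committee", "date": "2015-02-01"},
    {"item": "Information sources", "decision": "PubMed, CINAHL, PsycINFO",
     "decided_by": "Review Team", "date": "2015-02-08"},
]

with open("review_documentation_checklist.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["item", "decision", "decided_by", "date"])
    writer.writeheader()
    writer.writerows(decision_log)
```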

Synthesizing and interpreting the data to find meaning

The REAL process is designed to provide a basis for SMEs to identify current implications for research and practice based on the evidence as a whole. Once all individual studies included in the review have been evaluated, SMEs assess the overall literature pool against the outcomes relevant to the research question in order to: (1) determine the quality of the research as a whole; (2) identify gaps in the literature; (3) assess the effectiveness of the intervention or claim, and the confidence in that effectiveness estimate; and (4) judge the appropriateness of the intervention for clinical use. This is done by convening a roundtable with the review team, Steering Committee, and SMEs to evaluate the review’s results, the analyses of the overall literature pool, and the identified gaps, and to outline next steps for the particular field of research. Several tools are used to organize the goals and discussion at this roundtable. A synthesis report is produced from the roundtable and is reviewed and modified by the REAL team based on feedback from all participants. These syntheses give researchers, clinicians, and patients a better-informed picture of the current state of the science for an intervention and help determine the next steps needed in research and practice. The Grading of Recommendations, Assessment, Development and Evaluation (GRADE) working group has developed methods for synthesizing the literature as a whole that should be applied consistently across systematic reviews [ 23 ].
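GRADE itself involves structured judgements rather than arithmetic, but as a simplified illustration of its logic, bodies of randomized-trial evidence start at “high” certainty and are downgraded for concerns in five domains; the tally below is a sketch, not the working group’s official algorithm.

```python
LEVELS = ["very low", "low", "moderate", "high"]

def certainty(start: str, downgrades: dict) -> str:
    """Drop one level per serious concern; floor at 'very low'."""
    idx = LEVELS.index(start) - sum(downgrades.values())
    return LEVELS[max(idx, 0)]

print(certainty(
    start="high",  # body of randomized trials
    downgrades={"risk of bias": 1, "inconsistency": 0, "indirectness": 0,
                "imprecision": 1, "publication bias": 0},
))  # -> "low"
```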

Laying the foundation for evidence based decisions

REALs are constructed to lay a foundation for stakeholders to use quality evidence for decision-making in research, practice, personal, and policy contexts. These foundational elements include the evaluated dataset (which can be updated and added to), effect size estimates, meta-analyses (when possible), and other elements that go into the report, such as the quality tools described above and the synthesis and interpretation assessments.

Conducting meta-analyses

Meta-analyses combine the actual quantitative results (e.g., collect and pool effect sizes) of separate studies included in a review, use statistical techniques to determine the overall effect size and confidence in the effect of the intervention, and employ analytic techniques to quantify possible publication bias. They are often costly and time-consuming, and only appropriate when the existing literature suggests that there are sufficient studies with enough homogeneity in outcomes. REALs are designed to form the foundation for subsequent meta-analyses to be conducted, if appropriate. REALs can therefore be utilized as an effective tool for rapidly determining the current state of the literature, and what gaps should be addressed to conduct an effective meta-analysis.
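The pooling step can be illustrated with a minimal fixed-effect (inverse-variance) calculation: each study’s effect estimate is weighted by the inverse of its variance, and a pooled estimate with a 95% confidence interval follows. The effect sizes and standard errors below are invented, and a real meta-analysis would also examine heterogeneity and publication bias.

```python
import math

studies = [   # (effect size, standard error), invented values
    (0.30, 0.10),
    (0.45, 0.15),
    (0.20, 0.12),
]

weights = [1 / se ** 2 for _, se in studies]  # inverse-variance weights
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
se_pooled = math.sqrt(1 / sum(weights))
ci_low, ci_high = pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

print(f"Pooled effect = {pooled:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```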

Bridging the gap between evidence and knowledge

There is a considerable barrier to rapidly translating evidence into decision-making for clinicians, patients, researchers, and policy makers. Although authors of SRs disseminate results through various publication routes, the results often do not reach all parties in ways that allow them to make medical decisions, and so the impact of reviews is not maximized. The REAL process is one of three components of Samueli Institute’s Scientific Evaluation and Review of Claims in Health Care (SEaRCH) program and is a key step in forming the foundation upon which the other two SEaRCH components determine the clinical impact and relevance of evidence. SEaRCH comprises the REAL, the Claim Assessment Profile (CAP), and the Expert Panel (EP) processes (e.g., Clinical, Research, and/or Policy Expert Panels), as described in this journal issue [ 24 , 25 ]. Together these three segments of SEaRCH can be integrated to answer the question of “what works” in health care by providing: (1) a clear description of the intervention and claim being evaluated, and its feasibility for future research, through the CAP; (2) a rigorous summary of the current evidence for the claim, gathered through the REAL process and shared with the other components of SEaRCH; (3) a balanced, expert assessment of the appropriateness of use of the intervention through the Clinical EP, evidence-based policy judgments needed to direct implementation of a practice claim through the Policy EP, and the value of the research for patient-centered care through the Patient EP; and (4) the next research steps needed to move the evidence base about the claim forward through the Research EP. The methods used for the CAP and the EP process are described in subsequent articles in this set.

Similar to the REAL, the expert panels and the CAP employ specific processes and safeguards to reduce variability and bias and to promote collaboration and the efficient delivery of meaningful results. The CAP can be conducted prior to the REAL to inform it with specific definitions about a particular claim, or in tandem with the REAL to inform the expert panel process. While expert panels can be organized once the REAL process is completed, it is important that the review and expert panel processes remain independent of each other to manage bias and maintain a focus on clinical and patient relevance. To do this properly, SME input and the REAL process need to be carefully managed even as they are linked to the expert panel process. The SEaRCH program is designed to allow full interaction between the SR and expert panel processes in a manner that remains both impartial and informative in the exchange between SMEs and the trained reviewers [ 3 ]. This creates distinct, independent teams who not only conduct the literature review and expert panel processes, but also “cross-talk” (under the supervision of a SEaRCH Program Manager and the Steering Committee chair) to ensure that relevant research questions are being addressed and the rigor of the research is maintained. Specifically, when an expert panel is convened, the Expert Panel Manager [ 26 ] and the REAL Review Manager collaborate to ensure that the panel’s topic of interest is sufficiently addressed by the REAL. Panelists, based on their expertise, can expand upon the gaps or clinical issues identified through the REAL. REALs can assist expert panels in determining the appropriateness, clinical guidelines, implementation policies, and patient-centeredness of the evidence, or in establishing research agendas. Recommendations that emerge through the SEaRCH process can then be shared with stakeholders for maximum impact.

The REAL is a process that streamlines and organizes many elements of systematic reviews in order to insert high quality, rigorous evidence in a more rapid, objective, relevant and cost-efficient manner into decision making processes. Specifically, the REAL (1) follows a team-based approach; (2) utilizes specific search strategies; (3) automates review processes to ensure efficient use of time and skill; (4) involves key stakeholders to guarantee the right questions are being asked and addressed; (5) outlines and adheres to a transparent protocol to ensure objectivity and the management of bias; and (6) forms a foundation for subsequent analyses and expert panels to guide gaps and relevance, particularly when tied into other elements of the SEaRCH process. These features not only increase efficiency, but also assure adherence to reliable and reproducible protocols that provide a more consistent, transparent SR process for evidence-based medicine and decision making by the multiple stakeholders in health care.

By providing background and information on the existing literature, research gaps, and the weaknesses and strengths of the current evidence, systematic reviews using the REAL process provide a solid and consistent foundation for making clinical, patient, and policy decisions. The objectivity and efficiency of the REAL process make it a valuable tool for a variety of organizations and entities that need good evidence for decisions about products, practices, or programs currently in use or being considered for use. Examples include a health insurer or regulatory agency wanting to know what the evidence is for an intervention in order to decide whether it should be covered, or a clinical practice wanting to know whether implementing a certain practice would benefit its patients.

Training and support for conducting REALs

Samueli Institute has shared its REAL methodology with others in the SR field and continues to extend outreach and support to those interested in using this approach for evidence assessment. The Institute has developed a workshop that teaches participants how to conduct SRs in the step-wise fashion used by the REAL. This workshop is currently offered 2–3 times a year and provides participants with a comprehensive workbook covering theoretical material (e.g., the role and purpose of different types of SRs, their place in delivering evidence-based medicine, and the role of bias), practical instructions and guidelines on conducting SRs using the REAL process, and individual coaching on review projects participants are developing or conducting. The course and assistance are also offered through an online, self-paced platform (Blackboard) complemented by didactics, mentorship, and in-person workshops. Samueli Institute also collaborates with other organizations wishing to evaluate a topic using the REAL methodology, and offers guidance and mentorship throughout the review process. These workshops have been delivered to government and private groups and can be customized for any organization interested in applying evidence to health care decision-making.

There is a need for reliable, rapid, and transparent evidence to guide effective health care decision-making. The REAL approach was developed to ensure high quality SRs are conducted in a rapid, streamlined, transparent and valid fashion. It has been shown to: (1) reduce the cost of generating reviews for those making informed decisions regarding health care; and, (2) inform the public in a time sensitive, cost-effective and objective manner about the state of the evidence for any health care area.

Detailing the challenges of current SR methodology and the ways in which this rapid SR process addresses them highlights the need for investigators to ensure that reviews are objective, transparent, and scientifically valid, and that they follow a common language and structure for characterizing the strength of evidence across reviews. Adopting an approach like the REAL into current SR processes will not only decrease variability and improve the quality of SRs, but also allow health care decision makers, including clinicians, patients, and policy makers, to play a crucial role in developing relevant research questions and in making sound, evidence-based decisions across health care.

For those interested in utilizing the REAL approach and learning more about conducting SRs, training workshops and collaboration opportunities, please visit the Samueli Institute website [ 27 ].

Abbreviations

AHRQ: Agency for Healthcare Research and Quality
CAD: coronary artery disease
CAP: Claim Assessment Profile
COI: conflict of interest
DoD: Department of Defense
EP: expert panel
EVAT: External Validity Assessment Tool
HRT: hormone replacement therapy
IOM: Institute of Medicine
NIH: National Institutes of Health
NLM: US National Library of Medicine
PICO(S): population, intervention, control or comparison, outcomes, study design
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
RCTs: randomized controlled trials
REAL: Rapid Evidence Assessment of the Literature
SEaRCH: Scientific Evaluation and Review of Claims in Health Care
SME: subject matter expert
SR: systematic review
TSR: trauma spectrum response

References

1. Mulrow C, Chalmers I, Altman D. Rationale for systematic reviews. BMJ. 1994;309:597–9.
2. Higgins J, Green S (eds.). Cochrane Handbook for Systematic Reviews of Interventions Version 5.1.0 [updated March 2011]. West Sussex, England: The Cochrane Collaboration; 2011.
3. Graham R, Mancher M, Wolman D, Greenfield S, Steinberg E; Institute of Medicine. Clinical Practice Guidelines We Can Trust. Washington, DC: The National Academies Press; 2011.
4. Lee C, Crawford C, Wallerstedt D, York A, Duncan A, Smith J, Sprengel M, Welton R, Jonas W. The effectiveness of acupuncture research across components of the trauma spectrum response (TSR): a systematic review of reviews. Syst Rev. 2012;1:46.
5. Moyer VA. Menopausal hormone therapy for the primary prevention of chronic conditions: U.S. Preventive Services Task Force recommendation statement. Ann Intern Med. 2013;158:47–54.
6. Cobb LA, Thomas GI, Dillard DH, Merendino KA, Bruce RA. An evaluation of internal-mammary-artery ligation by a double-blind technic. N Engl J Med. 1959;260:1115–8.
7. Dimond EG, Kittle CF, Crockett JE. Comparison of internal mammary artery ligation and sham operation for angina pectoris. Am J Cardiol. 1960;5:483–6.
8. Leon MB, Kornowski R, Downey WE, Weisz G, Baim DS, Bonow RO, Hendel RC, Cohen DJ, Gervino E, Laham R, et al. A blinded, randomized, placebo-controlled trial of percutaneous laser myocardial revascularization to improve angina symptoms in patients with severe coronary disease. J Am Coll Cardiol. 2005;46:1812–9.
9. Salem M, Rotevatn S, Stavnes S, Brekke M, Pettersen R, Kuiper K, Ulvik R, Nordrehaug JE. Release of cardiac biochemical markers after percutaneous myocardial laser or sham procedures. Int J Cardiol. 2005;104:144–51.
10. Salem M, Rotevatn S, Stavnes S, Brekke M, Vollset SE, Nordrehaug JE. Usefulness and safety of percutaneous myocardial laser revascularization for refractory angina pectoris. Am J Cardiol. 2004;93:1086–91.
11. Linde K. Systematic reviews and meta-analyses. In: Lewith G, Jonas W, Walach H, editors. Clinical research in complementary therapies: principles, problems and solutions. London: Churchill Livingstone; 2002. p. 187–97.
12. York A, Crawford C, Walter A, Walter J, Jonas W, Coeytaux R. Acupuncture research in military and veteran populations: a rapid evidence assessment of the literature. Med Acupunct. 2011;23:229–36.
13. Zeno S, Purvis D, Crawford C, Lee C, Lisman P, Deuster P. Warm-ups for military fitness testing: rapid evidence assessment of the literature. Med Sci Sports Exerc. 2013;45:1369–76.
14. Buckenmaier C, Crawford C, Lee C, Schoomaker E. Special issue: Are active self-care complementary and integrative therapies effective for management of chronic pain? A rapid evidence assessment of the literature and recommendations for the field. Pain Med. 2014;15(Suppl 1):S1–113.
15. Crawford C, Wallerstedt D, Khorsan R, Clausen S, Jonas W, Walter J. Systematic review of biopsychosocial training programs for the self-management of emotional stress: potential applications for the military. Evid Based Complement Altern Med. 2013;2013:747694. doi:10.1155/2013/747694.
16. Khorsan R, Crawford C. External validity and model validity: a conceptual approach for systematic review methodology. Evid Based Complement Altern Med. 2014;2014:694804. doi:10.1155/2014/694804.
17. Moher D, Pham B, Klassen TP, Schulz KF, Berlin JA, Jadad AR, Liberati A. What contributions do languages other than English make on the results of meta-analyses? J Clin Epidemiol. 2000;53:964–72.
18. Egger M, Juni P, Bartlett C, Holenstein F, Sterne J. How important are comprehensive literature searches and the assessment of trial quality in systematic reviews? Empirical study. Health Technol Assess. 2003;7:1–76.
19. Hopewell S, McDonald S, Clarke M, Egger M. Grey literature in meta-analyses of randomized trials of health care interventions. Cochrane Database Syst Rev. 2007;2:MR000010.
20. Watt A, Cameron A, Sturm L, Lathlean T, Babidge W, Blamey S. Rapid versus full systematic reviews: validity in clinical practice? ANZ J Surg. 2008:1037–40.
21. Davidson J, Crawford C, Ives J, Jonas W. Homeopathic treatments in psychiatry: a systematic review of randomized placebo-controlled studies. J Clin Psychiatry. 2011;72(6):795–807.
22. Moher D, Liberati A, Tetzlaff J, Altman DG, PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ. 2009;339:b2535.
23. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) Working Group. http://www.gradeworkinggroup.org/. Accessed 15 Jan 2015.
24. Hilton L, Jonas W. Claim Assessment Profile Methodology: a method for capturing health care evidence in the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™). To be published in BMC Res Notes. 2015.
25. Jonas W, Crawford C, Hilton L, Elfenbaum P. Scientific Evaluation and Review of Claims in Health care (SEaRCH™): a streamlined, systematic, phased approach for determining “what works” in health care. To be published in BMC Res Notes. 2015.
26. Coulter I, Elfenbaum P, Jain S, Jonas W. SEaRCH Expert Panel Process: streamlining the link between evidence and practice. To be published in BMC Res Notes. 2015.
27. Samueli Institute: Research Services. 2015. https://www.samueliinstitute.org/research-areas/research-services/search-services. Accessed 15 Jan 2015.


Authors’ contributions

CC and WJ developed and designed the Rapid Evidence Assessment of the Literature (REAL) methodology and the Scientific Evaluation and Review of Claims in Healing (SEaRCH) process. Both were involved in drafting the manuscript and revising it for important intellectual content. CL, SJ and RK have made substantial contributions to the conception and design of the REAL process and SEaRCH, and have been involved in the drafting and critical review of the manuscript for important intellectual content. All authors have given final approval of the version to be published and take public responsibility for the methodology being shared in this manuscript. All authors read and approved the final manuscript.

Acknowledgements

The authors would like to acknowledge Mr. Avi Walter for his assistance with the overall SEaRCH process developed at Samueli Institute, and Ms. Viviane Enslein for her assistance with manuscript preparation.

Funding and disclosures

This project was partially supported by award number W81XWH-08-1-0615-P00001 (United States Army Medical Research Acquisition Activity). The views expressed in this article are those of the authors and do not necessarily represent the official policy or position of the US Army Medical Command or the Department of Defense, nor those of the National Institutes of Health, Public Health Service, or the Department of Health and Human Services.

Competing interests

The authors declare that they have no competing interests.

Author information

Authors and affiliations

Samueli Institute, 1737 King Street, Suite 600, Alexandria, VA, 22314, USA

Cindy Crawford, Courtney Boyd & Wayne Jonas

Samueli Institute, 2101 East Coast Hwy., Suite 300, Corona del Mar, CA, 92625, USA

Shamini Jain & Raheleh Khorsan


Corresponding author

Correspondence to Cindy Crawford .

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article.

Crawford, C., Boyd, C., Jain, S. et al. Rapid Evidence Assessment of the Literature (REAL © ): streamlining the systematic review process and creating utility for evidence-based health care. BMC Res Notes 8 , 631 (2015). https://doi.org/10.1186/s13104-015-1604-z


Received : 01 May 2015

Accepted : 19 October 2015

Published : 02 November 2015

DOI : https://doi.org/10.1186/s13104-015-1604-z


  • Rapid Evidence Assessment of the Literature (REAL)
  • Methodology
  • Systematic review process
  • Meta-analysis
  • Evidence-based medicine
  • Scientific Evaluation and Review of Claims in Health Care (SEaRCH)

BMC Research Notes

ISSN: 1756-0500


Rapid evidence assessment / review


Description

Rapid evidence assessments (or rapid reviews) allow for a structured and rigorous search, as well as a quality assessment of the uncovered evidence, but are not as extensive and exhaustive as a systematic review. They often provide a brief summary of the evidence discovered so that informed, evidence-based conclusions can be drawn. They tend to be used by policy makers to inform decisions, or to justify the need for further research.

Common characteristics

  • Applies systematic review methodology within a short timeframe
  • Provides a quick overview of the available evidence on a chosen topic
  • Acknowledges the inherent weaknesses of conducting a fast, rapid review, including bias
  • Useful for finding evidence quickly to support rapid decision-making
  • Purposely restricted to searching a limited number of key resources or databases
  • Limited in the types of studies that might be included (e.g. randomised controlled trials)
  • Places less focus on the overall quality of the original source material or evidence
  • Conducts only limited data extraction
  • Uses a smaller team over several weeks, catering for tight deadlines

Additional resources

Crawford, C., Boyd, C., Jain, S., Khorsan, R. and Jonas, W. (2015) 'Rapid Evidence Assessment of the Literature (REAL©): Streamlining the systematic review process and creating utility for evidence-based health care.' BMC Research Notes , 8(1) pp. 631-640.

More information

Department for International Development. (2015)  Rapid Evidence Assessments . [Online] [Accessed on 6th August] https://www.gov.uk/government/collections/rapid-evidence-assessments

  • Last Updated: Jan 23, 2024 10:52 AM
  • URL: https://plymouth.libguides.com/systematicreviews

Rapid Evidence Assessment of the Literature (REAL(©)): streamlining the systematic review process and creating utility for evidence-based health care

  • PMID: 26525982
  • PMCID: PMC4630849
  • DOI: 10.1186/s13104-015-1604-z

Background: Systematic reviews (SRs) are widely recognized as the best means of synthesizing clinical research. However, traditional approaches can be costly and time-consuming and can be subject to selection and judgment bias. It can also be difficult to interpret the results of a SR in a meaningful way in order to make research recommendations, clinical or policy decisions, or practice guidelines. Samueli Institute has developed the Rapid Evidence Assessment of the Literature (REAL) SR process to address these issues. REAL provides up-to-date, rigorous, high quality SR information on health care practices, products, or programs in a streamlined, efficient and reliable manner. This process is a component of the Scientific Evaluation and Review of Claims in Health Care (SEaRCH™) program developed by Samueli Institute, which aims at answering the question of "What works?" in health care.

Methods/design: The REAL process (1) tailors a standardized search strategy to a specific and relevant research question developed with various stakeholders to survey the available literature; (2) evaluates the quantity and quality of the literature using structured tools and rulebooks to ensure objectivity, reliability and reproducibility of reviewer ratings in an independent fashion and; (3) obtains formalized, balanced input from trained subject matter experts on the implications of the evidence for future research and current practice.

Results: Online tools and quality assurance processes are utilized for each step of the review to ensure a rapid, rigorous, reliable, transparent and reproducible SR process.

Conclusions: The REAL is a rapid SR process developed to streamline and aid in the rigorous and reliable evaluation and review of claims in health care in order to make evidence-based, informed decisions, and has been used by a variety of organizations aiming to gain insight into "what works" in health care. Using the REAL system allows for the facilitation of recommendations on appropriate next steps in policy, funding, and research and for making clinical and field decisions in a timely, transparent, and cost-effective manner.



The Difference Between a Rapid Review vs Systematic Review


Health policymakers and system implementers often face situations that require critical decisions within the shortest possible time, which makes full systematic reviews less practical. Rapid review methods help to streamline this process. In addition to rapid reviews, several other types of review can help move the review and approval process along; understanding the differences between them (for example, an integrative review versus a systematic review) is essential to making the right choice for your research. Each review type comes with its own advantages and drawbacks, and the choice depends on the research needs of the author and the time and setting of the intended research.

Systematic Review

A systematic review employs reproducible, analytical approaches to identify, collect, select, and critically evaluate data from multiple studies for inclusion in a scientific review.

A systematic review seeks to answer a specific, predefined research question that should be carefully formulated to guide the review; the PICO model is most often used to formulate a concise question. The research question in turn determines the eligibility criteria, how information is gathered from the specified studies, and how the findings are presented.


Rapid Review

A rapid review is a synthesis of evidence designed to provide more timely data for speedy decision-making. Compared with a systematic review, a rapid review takes much less time to complete; although the approaches used in rapid reviews vary greatly, they usually take less than five weeks. Rapid reviews meet their short deadlines by omitting or streamlining several phases of the review process that are standard in systematic reviews, and this time saving is what makes them an attractive alternative.

A rapid review is mostly used to:

  • explore a new or developing research topic
  • update a previous review, or
  • evaluate a critical topic

It is also used to re-evaluate existing conclusions about a policy or practice that was based on systematic review methods. In rapid reviews, several strategies are used to simplify or omit steps of the systematic review process, including searching fewer databases, allocating a single reviewer to each review stage, omitting or minimizing the use of grey literature (information produced outside traditional publishing and distribution channels), and narrowing the scope of the review.

In terms of impartiality, rapid reviews may be more prone to bias than systematic reviews. The shortcuts described above may lead to the exclusion of studies that would have shaped the conclusions, and they narrow the scope of the review, which constrains its results to that scope; the extent of this restriction is still unknown. Although many health policymakers and system implementers have embraced rapid reviews, some stakeholders in academia have expressed reservations, arguing that rapid reviews are “quick and dirty”. This should not negate their usefulness, as there is a time and place where a rapid review is exactly what is needed.


Rapid Review Protocol

  • What is a rapid review
  • Step 1: Form/refine question
  • Step 2: Define parameters
  • Step 3: Identify biases
  • Step 4: Plan & execute search
  • Step 5: Screen & select
  • Step 6: Quality appraisal
  • Step 7: Evidence synthesis
  • Rapid review workbook

For articles that will be included in your review, keep track of your findings with a review matrix. A sample health sciences review matrix is provided in the guide, and the template can also be downloaded as a Microsoft Excel file.

How can the Health Sciences Library Help?

Health Sciences librarians can assist you with:

  • Expert literature searches
  • Finding protocols
  • Citation management assistance
  • Organizing your rapid review findings

If your database search results in too many or too few citations, please contact us! We are trained on how to craft efficient searches.

Contact your liaison librarian to schedule a consultation.

Additional Resources

  • Bibliography References cited in this research guide.
  • Supplemental Resources Other resources that can assist with your rapid review.

A rapid review (or rapid evidence assessment) is a variation of a systematic review that balances time constraints with considerations of bias.


Consider your research question. Is it focused and well-defined?

After taking into account basic considerations such as the biology and physiology of the problem, its epidemiology, and the unsatisfactory clinical performance and patient outcomes that lead to interest in the topic, Haynes 3 suggests the following to further develop a research topic:

  • What is the appropriate stage for evaluation?
  • Can internal validity be achieved?
  • To what extent is external validity achievable?
  • What will your circumstances permit?
  • What can you afford?
  • What is the best balance between "idea" and "feasibility"?

Alternatively, Farrugia 4  summarizes two frameworks for refining research questions, FINER and PICOT.


Determine the parameters of your literature search by answering the following questions:

What resources will you search?

Databases commonly searched at VCU include PubMed , CINAHL , Web of Science , and  PsycInfo . Embase is another good resource for institutions with access to it. Popular "grey literature" resources include  clinicaltrials.gov , NIH RePORTER ,  Dissertations and Theses , and professional associations' conference proceedings. Check our  research guides for additional resources.

What will be your inclusion/exclusion criteria?

Some criteria to consider include: time period, language, location, age range, animal or human studies, and type of published materials (e.g., randomized controlled trials, cohort studies, etc.).
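As a small sketch of how such criteria might be applied consistently during screening (the record fields and thresholds below are hypothetical, not part of this guide):

```python
def passes_screen(record: dict) -> bool:
    """Apply the pre-defined inclusion criteria to one retrieved record."""
    return (
        record.get("year", 0) >= 2010
        and record.get("language") == "English"
        and record.get("population") == "human"
        and record.get("design") in {"randomized controlled trial", "cohort"}
    )

candidates = [
    {"title": "Trial A", "year": 2015, "language": "English",
     "population": "human", "design": "randomized controlled trial"},
    {"title": "Trial B", "year": 2005, "language": "English",
     "population": "human", "design": "randomized controlled trial"},
]
print([r["title"] for r in candidates if passes_screen(r)])  # -> ['Trial A']
```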

What will be your screening protocol?

Things to consider include:

How many reviewers will you have and who will they be? (The IOM recommends a team of 2+ reviewers for systematic reviews.) 5

If you use multiple reviewers, how will disagreements between them be settled (e.g. consensus, third-party)?

The Cochrane Handbook  (7.2.3) lists specific steps to take in the screening and selection process  that could be adapted for a rapid review. 2

How will you appraise the quality of selected studies? What  tool/rubric will you use?

Many reviews employ a system similar to that developed by the Cochrane Handbook for assessing bias in interventional studies ( Section 8.5, Table 8.5a ). 2

Many recent studies also analyze and suggest more efficient and reliable ways to assess the quality of quantitative, qualitative, and mixed methods studies.   See supplemental resources .

Critical appraisal worksheets may be useful  for a small number of studies. Some examples of these can be found on the Oxford Centre for Evidence-Based Medicine's (CEBM) website , and Duke's EBP research guide . Note whether you decide to modify these worksheets in order to save time; this may create some bias in your conclusions.

As a result of your choices in Step 2 , what biases will be introduced into your protocol? Are these biases acceptable given your time constraints?


Plan your search.

  • Consult with a health sciences librarian . Some studies have shown that librarian involvement can improve the quality of reported systematic review and meta-analysis search strategies. 9,10
  • Determine the best method for documenting your search (e.g. spreadsheet, etc.). 
  • Select a citation management tool to use. VCU librarians can provide instruction and troubleshooting for Mendeley and Zotero, which are free to the VCU community.

Execute your search and store your citations.

Screen search results based on the criteria defined in Step 2 .

A table or worksheet is often used to keep track of the screening and review process. Sample screening worksheets: 1 , 2 , 3 .

Apply appraisal tool/rubric selected in Step 2 to identify high quality studies that  will be included in your evidence synthesis. The simplest way to track the final quality judgment will vary by tool, e.g.  1 , 2 .

Evidence summary tables are used to track important characteristics of appraised studies, including the reference, study design, sample size, and quality score. Examples of tables used to present the studies included in a review differ by the aspects listed above as well as others, e.g. 1 , 2 , 3 . Consider creating your own review matrix (sample Excel file) to take notes on papers that will be included in your study, as in the sketch below.
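A minimal review-matrix sketch is shown here using pandas (assumed to be installed); the column names and entries are illustrative rather than the library's template.

```python
import pandas as pd

matrix = pd.DataFrame([
    {"reference": "Smith 2012", "design": "RCT", "n": 120,
     "quality": "high", "key_outcome": "pain at 12 weeks"},
    {"reference": "Lee 2014", "design": "cohort", "n": 480,
     "quality": "moderate", "key_outcome": "function at 6 months"},
])
print(matrix.to_string(index=False))
```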

A narrative synthesis of studies that made it through the screening and quality appraisal phases is a simple, efficient way to set the stage for your own work. At a minimum, the synthesis should include:

  • Study problem / purpose
  • Why is the research important?
  • Has it been done before?
  • How will the study benefit patients, increase knowledge, or influence policy?
  • What methods are most commonly used in previous studies?
  • What are the most common outcomes analyzed?
  • Is there a significant patient population that is not well-studied?
  • What limitations were present in the existing body of research?
  • What sources of bias could have been introduced in your review of literature?
  • Last Updated: Nov 9, 2023 2:15 PM
  • URL: https://guides.library.vcu.edu/rapidreview


Rapid evidence assessments

DFID rapid evidence assessments provide rigorous and policy relevant syntheses of evidence, carried out in 3-6 months.

Rapid evidence assessments provide a more structured and rigorous search and quality assessment of the evidence than a literature review but are not as exhaustive as a systematic review. They can be used to:

  • gain an overview of the density and quality of evidence on a particular issue
  • support programming decisions by providing evidence on key topics
  • support the commissioning of further research by identifying evidence gaps


