Background
What is a rapid response program and what types of programs already exist?
What are the main products of rapid response programs?
Objective of this research
Methods
Rapid reviews
Identifying supplementary evidence for potential shortcuts
Systematic review step | Possible “shortcuts” | Potential impact on the validity of the results | Relevant AMSTAR question and potential impact of shortcut on AMSTAR score |
---|---|---|---|
Preparation of a protocol | • Omit protocol | Unknown | Q1. Loss of one point if a protocol is not prepared and/or not mentioned in report |
Question formulation | • Limit the number of questions and sub-questions • Limit the scope of the question/s | None expected | |
Selecting relevant studies | • One reviewer screens titles and abstracts • One reviewer screens full text | Unknown, though one reviewer could miss up to 9 % of eligible randomized controlled trials [42] | Q2. Loss of one point if only one reviewer does screening and/or only one reviewer does data extraction |
Data extraction | • One reviewer extracts data • One reviewer extracts data with checking by a second reviewer • Data extraction limited to key characteristics, results, conflicts of interest | Unknown | |
Literature search | • Limit number of databases searched • Limit or omit hand searching of reference lists and relevant journals • Eliminate consultation with experts to find additional studies | | Q3. Loss of one point if less than two databases searched and/or no supplementary strategies |
Inclusion criteria | | | |
Gray literature | • Limit or omit gray literature | | Q4. Loss of one point if gray literature omitted |
Language | • English only | | |
Dates | • Narrow time frame, e.g., last 5 or 10 years | None expected | |
Study types | • Restrict study types to systematic reviews (and economic evaluations) • Restrict study types to randomized controlled trials or controlled clinical trials (and economic evaluations) | | |
Quality assessment | • Limit or omit quality assessment • Omit “a priori” specification • Done by one reviewer | Unknown | Q7 and Q8. Loss of two points if not assessed, documented and used in formulation of conclusions |
Data synthesis | • Narrative synthesis only (no meta-analysis) | Unknown – meta-analysis can increase power and precision but also has potential to mislead if not applied appropriately and done correctly [19] | Q9. None if explained that meta-analysis not possible due to heterogeneity. If not, loss of one point |
Assessment of publication bias | • Omit | Unknown | Q10. Loss of one point if omitted |
Assessment of conflict of interest | • Omitted for individual studies and/or for systematic review | Unknown | Q11. Loss of one point if omitted |
Report | • Information included is limited | | Q1–11. Potential large loss of points if key AMSTAR questions not covered |
External peer review | • Omit or limit | Unknown | |
Case studies of existing rapid response programs
Rapid response program | Reason for selection | References |
---|---|---|
Cochrane response by Cochrane Innovations, Cochrane Collaboration^a | Has a potential global reach and is supported by the expertise and experience of the Cochrane Collaboration, though is still in development | |
McMaster Health Forum Rapid Response Program, McMaster University, Canada | Comes from a developed country with a strong history in knowledge translation, potentially national reach | |
Regional East African Community Health (REACH) Policy Initiative, Uganda | Comes from a low-income economy and has a published evaluation | [35] |
Sax Institute Evidence Check Program, NSW, Australia | Has a long history (since 2007) and is a state-level program | |
Supplementary literature to inform the design and operationalization of the program
Results
Question 1—methodologies for rapid reviews
Key findings from the rapid review
Key findings from supplementary literature in relation to ‘shortcuts’
Question 2—strategies to facilitate evidence-informed decision-making
Key findings from the rapid review
Question 3—how best to operationalize the program
Key findings from the case studies
Model | Started | Reach | How funded | Time to complete a RR | External review of RR | RRs made publicly available | Lag period before publication |
---|---|---|---|---|---|---|---|
Cochrane response | 2013 | Potentially global | Service: not clear. Reviews: “user-pays” | ≈8 weeks (first review took 12 weeks) | Yes | All | Yes |
McMaster Health Forum Rapid Response Program | 2012 | Potentially national (Canada) | Service: Ontario government. Reviews: free in Ontario, “user-pays” for the rest of Canada | Max 6 weeks | Yes | All | Yes |
REACH Policy Initiative | 2010 | National (Uganda) | Service and reviews: donor funds | Max 4 weeks | Yes | Not reported | Not reported |
Sax Institute Evidence Check Program | 2006 | State, potentially national (Australia) | Service: state government plus other funds. Reviews: “user-pays” | ≈12–16 weeks | No | Most (some kept confidential if requested by funder) | Yes |
Issue | Description of issue |
---|---|
Contracts and intellectual property | Under a “user-pays” model, the use of a contract can slow the process down. However, the impact can be minimized by operating in good faith and starting the review before the contract is signed (e.g., Cochrane Innovations, Sax Institute). Where a contract is used, the intellectual property is usually owned by the funder, but there seems to be general acceptance of joint publication of completed reviews. This is not always the case, however; under the Sax Institute model, for example, not all reviews are made publicly available if confidentiality is requested by the funder. |
External review of the rapid review | External review or “merit review” has the potential to slow the process down if reviewers do not respond quickly, but the different services all seem to have found ways to manage this well, e.g., approaching another reviewer if the first cannot commit to a quick response. |
Staffing | Recruiting staff with the right mix of skills and qualifications was noted as an issue for the REACH Policy Initiative model. The other three models used mentoring or internal training to address this issue, with the Sax Institute key informant noting that the Institute also had plans to develop a formal training program for researchers. |
Evaluation | None of the models have formally evaluated the impact of the service on the uptake of research evidence for policy and/or practice—though there are plans to do this for the McMaster service [27]. |
Issues particular to developing countries | A fast and reliable internet connection, as well as access to databases and full-text papers, were noted as issues for the REACH Policy Initiative model. |
Key findings from supplementary literature
- A continuous, close relationship with a specific end user, maintained iteratively throughout the work, to ensure that the product meets the end user’s needs
- A high reliance on maintaining highly trained staff who can conduct the reviews in a short time frame and who understand the type of product that might meet the decision-maker’s needs [11]
- Include high-level representation from all relevant stakeholders, including policymakers and researchers, on a steering committee to govern the service
- Implement minimum training standards and provide ongoing mentorship for staff contributing to the program
- Ensure funding is long-term and covers both program delivery and ongoing evaluation of the program
Insights for designing a rapid response program for the Americas region
The product — rapid reviews
- Making the process more efficient, e.g., using specialized software for the reviews such as DistillerSR®
- Using a larger, highly skilled staff, who are part of a reserve capacity
- Updating an existing high-quality review