Background
Objectives
Methods
Stage I: development and population of the framework of methods
Search methods
Eligibility criteria
Study selection
Data extraction, coding and analysis
Stage II: Identification and mapping of evaluations of methods
Search methods
Eligibility criteria
Study selection
Data extraction
Data extracted | Description |
---|---|
Study characteristics | Citation details |
Primary objective | |
Search filter evaluation details | Type of search filter evaluation (categorised as single search filter evaluation, comparative search filter evaluation, comparative database evaluation) |
Health field filter designed for | |
Number of filters evaluated | |
Number of filters developed by author | |
Databases filters tested in and the interface(s) | |
Technique to identify and/or create gold standard | |
Sample size of the gold standard set or validation set | |
Performance measures (e.g. sensitivity/recall, specificity) | |
Search dates of the gold standard or validation set | |
Name of filters evaluated | |
Risk of bias criteria | Existence of a protocol; validation on a data set distinct from the derivation set |
Assessment of the risk of bias
Analysis
Results
Results of the search
Stage I: development and population of the framework of methods
Steps in the conduct of an overview | ||||||
---|---|---|---|---|---|---|
Citation | Type of study | Summary description of the article | Purpose, objectives, scope | Eligibility criteria | Search methods | Data extraction |
Baker 2014 [29] The benefits and challenges of conducting an overview of systematic reviews in public health: a focus on physical activity. | Article describing methods for overviews | • Describes the usefulness of overviews for decision makers and summarises some procedural steps to be undertaken • Provides a case study of an overview on public health interventions for increasing physical activity | ✓✓ | ✓✓ | ✓✓ | |
Becker 2008 [1] Overviews of reviews. | Guidance for undertaking overviews | • Early guidance providing the structure and procedural steps for the production of an overview • Details the different purposes of an overview, providing examples and describes how to present findings through tables and figures | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
Bolland 2014 [30] A case study of discordant overlapping meta-analyses: vitamin D supplements and fracture. | Article describing methods for overviews | • Describes criteria for explaining differences in overlapping M-As with discordant conclusions • Builds on the guide to interpret discordant SRs proposed by Jadad 1997 • Suggests reporting items for when there are overlapping trials in M-As | ✓ | ✓✓ | ✓✓ |
Caird 2015 [31] Mediating policy-relevant evidence at speed: are systematic reviews of systematic reviews a useful approach? | Article describing methods for overviews | • Describes the methodological challenges in the production of overviews that mediate existing synthesised knowledge to policy makers • Describes the trade-offs between producing a rapid overview and its comprehensiveness and reliability | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
Chen 2014 [2] Scientific hypotheses can be tested by comparing the effects of one treatment over many diseases in a systematic review. | Study examining methods used in a cohort of overviews | • Identifies possible aims of an overview as being to detect unintended effects, improve the precision of effect estimates, or explore heterogeneity of effect across disease groups • Describes the value and pitfalls of synthesis of M-As using three case studies | ✓✓ | ✓✓ | ||
CMIMG 2012 [4] Review type and methodological considerations. | Guidance for undertaking overviews | • Provides updated Cochrane guidance on the purpose and conduct of overviews • Builds on the Cochrane guidance for overviews by Becker 2008 • Describes the factors in the decision to conduct an overview vs. an SR | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
Cooper 2012 [32] The overview of reviews: unique challenges and opportunities when research syntheses are the principal elements of new integrative scholarship. | Article describing methods for overviews | • Describes steps in the conduct of an overview and methods to address challenges (for example dealing with overlap in primary studies) • Describes methods for second order meta-analysis | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
Flodgren 2011 [33] Challenges facing reviewers preparing overviews of reviews.a | Article describing methods for overviews | • Mentions the issue of missing or inadequately reported data • Mentions the challenges in summarising and evaluating large amounts of heterogeneous data | ✓ |||
Foisy 2011 [34] Mixing with the ‘unclean’: Including non-Cochrane reviews alongside Cochrane reviews in overviews of reviews.a | Article describing methods for overviews | • Describes some challenges inherent in the eligibility criteria process (defining AMSTAR scoring as inclusion criteria, inclusion of non-Cochrane reviews alongside Cochrane reviews) • Develops inclusion criteria to minimise overlap in primary studies | ✓✓ | ✓✓ ||
Hartling 2012 [35] A descriptive analysis of overviews of reviews published between 2000 and 2011. | Study examining methods used in a cohort of overviews | • Describes steps in the conduct of overviews and methods used • Describes methodological standards for SRs (MECIR) and their applicability to overviews • Describes PRISMA reporting standards and their applicability to overviews | ✓ | ✓✓ | ✓✓ | ✓✓ |
Hartling 2013 [36] Generating empirical evidence to support methods for overviews of reviews.a | Study examining methods used in a cohort of overviews | • Mentions challenges relating to the eligibility criteria process in terms of SR quality, search dates, the strength of the evidence to include, etc. | ✓ | ✓ ||
Hartling 2014 [37] Systematic reviews, overviews of reviews and comparative effectiveness reviews: a discussion of approaches to knowledge synthesis. | Article describing methods for overviews | • Briefly defines overviews, mentions the purposes in conducting an overview, and discusses some methodological challenges | ✓ | |||
Ioannidis 2009 [38] Integration of evidence from multiple meta-analyses: a primer on umbrella reviews, treatment networks and multiple treatments meta-analyses. | Article describing methods for overviews | • Defines umbrella reviews as a pre-step to network meta-analysis • Describes challenges of overviews and a checklist of pitfalls | ✓✓ | |||
James 2014 [39] Informing the methods for public health overview reviews: a descriptive analysis of Cochrane and non-Cochrane public health overviews.a | Study examining methods used in a cohort of overviews | • Briefly describes several steps in the conduct of overviews including determining the eligibility criteria and search methods • Compares Cochrane and non-Cochrane reviews in terms of restrictions on inclusion criteria | ✓ | ✓ | ✓ |
JBI 2015 [40, 41] Methodology for JBI umbrella reviews. | Guidance for undertaking overviews | • Provides guidance as to what methods should be used at which step in the conduct of an overview • Provides stylistic conventions for overviews to meet publication and reporting criteria for the JBI Database of Systematic Reviews and Implementation Reports | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
Kovacs 2014 [42] Overviews should meet the methodological standards of systematic reviews. | Commentary or editorial that discuss methods for overviews | • Mentions four methodological shortcomings of one overview on surgical interventions as a letter to the editor | ✓ | |||
Kramer 2009 [43] Preparing an overview of reviews: lessons learned.a | Article describing methods for overviews | • Mentions the challenges encountered when the authors conducted three overviews including missing information when extracting data | ✓ | ✓ ||
Li 2012 [44] Quality and transparency of overviews of systematic reviews. | Article describing methods for overviews | • Presents a pilot reporting/quality checklist • Evaluates a cohort of overviews using the pilot tool, with the mean number of items but no details of the items | ✓ | ✓ | ✓ | ✓ |
Pieper 2012 [6, 45] Overviews of reviews often have limited rigor: a systematic review. | Study examining methods used in a cohort of overviews | • Describes the methods used in a cohort of overviews • Recommends using validated search filters for retrieval of SRs • Discusses whether to update the overview by including primary studies published after the most recent SR | ✓ | ✓✓ | ✓✓ | ✓ |
Pieper 2014 [46] Methodological approaches in conducting overviews: current state in HTA agencies. | Article describing methods for overviews | • Describes the methods recommended in 8 HTA guideline documents related to overviews • Compares the Cochrane Handbook guidance to guidance produced by HTA agencies | ✓ | ✓ | ||
Pieper 2014 [47] Up-to-dateness of reviews is often neglected in overviews: a systematic review. | Study examining methods used in a cohort of overviews | • Describes the process of searching for primary studies in an overview • Presents decision rules for when to search for primary studies • Outlines search methods in terms of sequential versus parallel searching for SRs and primary studies | ✓✓ | |||
Integrating bodies of evidence: existing systematic reviews and primary studies. | Article describing methods for overviews | • Describes the steps to undertake a complex review that includes multiple SRs, which is similar to overviews • Discusses challenges inherent in the production of complex reviews that include SRs | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
Ryan 2009 [53, 54] Building blocks for meta-synthesis: data integration tables for summarising, mapping, and synthesising evidence on interventions for communicating with health consumers. | Article describing methods for overviews | • Presents tabular methods to deal with the preparation of overview evidence • Discusses the data extraction process and organisation of data • Presents a table of taxonomy of outcomes from the included SRs, and a data extraction table based on this taxonomy | ✓✓ | ✓✓ ||
Salanti 2011 [3] Evolution of Cochrane intervention reviews and overviews of reviews to better accommodate comparisons among multiple interventions. | Guidance for undertaking overviews | • Provides Cochrane guidance on the definition of an overviews and as compared to SRs • Suggests broadening the search in an overview to include individual studies • Suggests missing data should be retrieved from original reports | ✓✓ | ✓✓ | ✓✓ | ✓ |
Silva 2015 [55] Overview of systematic reviews - a new type of study. | Study examining methods used in a cohort of overviews | • Examines a cohort of Cochrane reviews for methods used • Documents the sources and types of search strategies conducted | ✓✓ ||||
Singh 2012 [56] Development of the Metareview Assessment of Reporting Quality (MARQ) Checklist. | Article describing methods for overviews | • Presents a pilot reporting/quality checklist • Evaluates four case studies using the pilot tool, with the mean number of items but no details of the items | ✓✓ | ✓✓ | ||
Smith 2011 [57] Methodology in conducting a systematic review of systematic reviews of healthcare interventions. | Article describing methods for overviews | • Describes some steps and challenges in undertaking an overview, namely search methods, study selection, quality assessment, and presentation of results • Presents tabular methods for the preparation of an overview | ✓✓ | ✓✓ | ✓✓ | ✓ |
Thomson 2010 [58] The evolution of a new publication type: Steps and challenges of producing overviews of reviews. | Article describing methods for overviews | • Describes some steps in undertaking an overview and the challenges inherent in production of overviews • Discusses that gaps or lack of currency in included evidence will weaken the overview findings | ✓✓ | ✓✓ | ||
Thomson 2013 [59] Overview of reviews in child health: evidence synthesis and the knowledge base for a specific population. | Study examining methods used in a cohort of overviews | • Describes the process of including trials in overviews • Discusses the challenge of overview topics differing from the topics of the included SRs • Provides potential solutions as to what to do when mixed populations are reported in SRs and how to extract age subgroup data | ✓✓ | ✓✓ | ✓✓ |
Step | Sub-step | Methods/approaches | Sources ▪ Examples |
---|---|---|---|
1.0 Determine stakeholder involvement in planning the overview | |||
1.1 Agree on who is responsible for setting the overall purpose and objectives | |||
1.1.1 Commissioners of the overview | |||
1.1.2 Researcher or author team | |||
1.1.3 Multiple/all stakeholders in collaboration | |||
1.2 Determine the extent and approach to stakeholder involvement in defining the purpose, objectives and scope of the overview (i.e. who, on what aspects, at what stage(s), how) | |||
2.0 Define the purpose, objectives and scope | |||
2.1 Define the purpose of the overview | |||
2.1.1 Map the type and quantity of available evidence (e.g. types of interventions, outcomes, populations/settings, study designs but not effects) | |||
2.1.2 Compare multiple interventions with the intent of drawing inferences about the comparative effectiveness of the interventions for the same condition, problem or population | Becker 2008 [1]; CMIMG 2012 [4]; Cooper 2012 [32]; Hartling 2012 [35]; Hartling 2014 [37]; Ioannidis 2009 [38]; Ryan 2009 [53, 54]; Salanti 2011 [3]; Smith 2011 [57] ▪ An overview of interventions for nocturnal enuresis (Becker 2008 [1]) ||
2.1.3 Summarise the effects of an intervention for the same condition, problem or population where different outcomes are addressed in different SRs | Becker 2008 [1]; CMIMG 2012 [4]; Cooper 2012 [32]; Hartling 2012 [35]; Hartling 2014 [37]; Ryan 2009 [53, 54]; Salanti 2011 [3]; Smith 2011 [57] ▪ An overview of hormone replacement therapy for menopause where outcomes may include bone density, menopausal symptoms, cardiovascular risk/ events, cognitive function etc. (Becker 2008 [1]) | ||
2.1.4 Summarise the effects of an intervention across conditions, problems or populations (e.g. “borrowing strength” when there is sparse data for a single condition and a similar mechanism of action for the intervention is predicted across conditions) | Becker 2008 [1]; Chen 2014 [2]; CMIMG 2012 [4]; Cooper 2012 [32]; Hartling 2012 [35]; Hartling 2014 [37]; Ryan 2009 [53, 54]; Salanti 2011 [3]; Smith 2011 [57] ▪ An overview of vitamin A for different populations and conditions (Becker 2008 [1]) | ||
2.1.5 Summarise unexpected (including adverse) effects of an intervention across conditions, problems or populations | Becker 2008 [1]; Chen 2014 [2]; CMIMG 2012 [4]; Cooper 2012 [32]; Hartling 2012 [35]; Ioannidis 2009 [38]; Salanti 2011 [3]; Smith 2011 [57] ▪ An overview of adverse effects of NSAIDs when used for osteoarthritis or rheumatoid arthritis or menorrhagia (Becker 2008 [1]) | ||
2.1.6 Identify and explore reasons for heterogeneity in the effects of an intervention (e.g. by examining reasons for discordant results or conclusions across SRs) | Bolland 2014 [30]; Caird 2015 [31]; Chen 2014 [2]; Cooper 2012 [32]; JBI 2015 [40, 41]; Singh 2012 [56]; Smith 2011 [57] ▪ Overview investigating differences between the meta-analyses of vitamin D for prevention of fracture (Bolland 2014 [30]) | ||
2.1.7 Other purposes | |||
2.2 Confirm that an overview is the appropriate type of study for addressing the purpose and objectives, as opposed to other types of reviews (i.e. intervention review, network meta-analysis) | |||
2.2.1 Use a decision algorithm | ▪ CMIMG editorial decision tree which covers decision points for choosing between an overview or a new or updated SR (with or without network meta-analysis) (Salanti 2011 [3]) | ||
2.2.2 Use other reasoning (triggers), for example, a new or updated SR might be more appropriate than an overview when SRs: (i) are not available, or have insufficient overlap with the overview question/PICO, (ii) have methodological shortcomings (including not being up-to-date), (iii) are discordant and the reason for discordance cannot be identified (e.g. by methodological differences), and (iv) need independent confirmation (or disconfirmation) (e.g. where SR authors have conflicts of interest such as industry ties or funding) | |||
2.3 Determine any constraints that will restrict the scope of the overview (e.g. time, staffing, skill set) | |||
2.4 Define the scope of the overview taking into account 2.1–2.3 | |||
2.4.1 Narrow scope - based on a well-defined question (specific PICOs) or methodological criteria restrictions (i.e. date range of eligible literature, sources searched, publication types and study designs, extent and quality of data extracted, type of synthesis undertaken) | Baker 2014 [29]; Chen 2014 [2]; CMIMG 2012 [4]; Cooper 2012 [32]; JBI 2015 [40, 41]; Pieper 2012 [6, 45]; Ryan 2009 [53, 54]; Salanti 2011 [3]; Thomson 2010 [58] ▪ Interventions restricted to a specific intervention for a specific condition/population (e.g. smoking cessation therapies for reducing harmful effects of smoking during pregnancy) ||
2.4.2 Broad scope - based on a broadly defined question with diverse and multiple PICOs elements, or no methodological restrictions | Baker 2014 [29]; Caird 2015 [31]; Chen 2014 [2]; CMIMG 2012 [4]; Cooper 2012 [32]; JBI 2015 [40, 41]; Pieper 2012 [6, 45]; Pieper 2014 [46]; Ryan 2009 [53, 54]; Salanti 2011 [3]; Smith 2011 [57]; Thomson 2010 [58] ▪ Interventions of broad policy relevance (e.g. any intervention to reduce the harmful effects of smoking, including cessation therapies, mass media, and pricing policies.) | ||
2.5 Define the objectives using PICO elements (or equivalent) to develop an answerable question |
Step | Sub-step | Methods/approaches | Sources ▪ Examples |
---|---|---|---|
1.0 Plan the eligibility criteria | |||
1.1 Determine PICO eligibility criteria for the overview (and setting and timing if applicable) | |||
1.2 Determine PICO eligibility criteria for SRs | |||
1.2.1 Select only SRs that are similar (or narrower) in scope to the overview PICO elements (i.e. exclude SRs that include out-of-scope interventions/populations in addition to the intervention/population addressed by the overview) | |||
1.2.2 Select all SRs that address the PICO elements, including those broader in scope than the overview (i.e. SRs that include the intervention/ population addressed by the overview, plus other out-of-scope interventions/ populations). This may involve selecting: (i) any SR, irrespective of whether separate data are available for the subgroup of interest or (ii) limiting to SRs that present separate data for the subgroup of interest | |||
1.3 Determine criteria (mechanisms) to select outcomes where there are multiple | |||
1.3.1 Include all outcomes reported in included SRs | |||
1.3.2 Select one or more outcomes using pre-specified criteria, for example: (i) outcomes judged important by subject specialists (e.g. consumers, policy makers), (ii) primary outcomes, and (iii) outcomes common to more than one SR | |||
1.3.3 Select one or more outcomes using pre-specified decision rules (e.g. combine selection criteria in an algorithm) | Inferred method | ||
1.4 Determine methodological eligibility criteria for SRs | |||
1.4.1 Include all SRs that meet the PICO criteria (i.e. no methodological criteria applied) | Caird 2015 [31] | ||
1.4.2 Select SRs that meet minimum quality criteria or take a particular methodological approach. Minimum criteria include: (i) meets definition of an SR, (e.g. explicit search) (ii) up-to-date (iii) quality of the SR (e.g. based on selected criteria; cutoffs derived from AMSTAR score) (iv) use of best practice methods (e.g. specific RoB tools; Cochrane or AHRQ’s EPC methods) (v) free of conflicts of interest (e.g. no industry funding) (vi) reports sufficient primary study characteristics to interpret results (e.g. PICO elements, RoB assessment) Methodological approaches include: (vii) type of included primary studies (viii) type of data (ix) type of synthesis (e.g. meta-analysis, narrative) | |||
1.5 Determine eligibility criteria to deal with SRs with overlap | |||
1.5.1 Include all SRs that meet the PICO, irrespective of overlap | |||
1.5.2 Select one SR from multiple addressing the same question using pre-specified methodological criteria as outlined in 1.4.2 | ▪ Select the highest quality SR (Cooper 2012 [32]) | ||
1.5.3 Select one SR from multiple addressing the same question using pre-specified decision rules (e.g. combine one or more eligibility criteria in an algorithm) | Cooper 2012 [32] ▪ Select the SR with the most complete information, and if these are equivalent, the M-A with the greatest number of primary studies (Cooper 2012 [32]) | ||
1.5.4 Exclude SRs that do not contain any unique primary studies, when there are multiple SRs | Pieper 2014 [46] | ||
1.6 Determine whether to consider additional primary studies for inclusion | |||
1.6.1 Do not include primary studies | |||
1.6.2 Include primary studies if pre-specified eligibility criteria are met, for example: (i) when an SR is not up-to-date, (ii) when an SR is inconclusive (i.e. new studies may overturn the findings of an SR), (iii) when the included SRs provide incomplete coverage of evidence in relation to the overview PICO (e.g. missing one or more interventions, population subgroup, study design), and (iv) when there are concerns about the methods SRs used to identify and select studies | Baker 2014 [29]; Caird 2015 [31]; Cooper 2012 [32]; Pieper 2014 [46]; Pieper 2014 [47]; Thomson 2013 [59]; White 2009 [48‐52] ▪ Include primary studies if the evidence in the SRs is inconclusive (e.g. when addition of a new primary study may overturn the findings) (Pieper 2014 [47]) ▪ Include primary studies if the SRs are assessed as low quality (Pieper 2014 [47]) ||
1.6.3 Include primary studies using pre-specified decision rules to determine eligibility (e.g. combine one or more eligibility criteria in an algorithm for selection) | Pieper 2014 [47] | ||
2.0 Plan the study selection process | |||
2.1 Determine the number of overview authors required to select studiesa |||
2.1.1 Independent screening at all stages by 2 or more authors |||
2.1.2 One author screening at all stages | |||
2.1.3 One author screening titles/abstracts, 2 or more screening full text | Hartling 2012 [35] | ||
2.1.4 One screened at all stages, 2nd confirmed | Hartling 2012 [35] | ||
2.1.5 One screened at all stages, 2nd confirms if uncertainty | Hartling 2012 [35] |
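Pre-specified decision rules such as the one in sub-step 1.5.3 (select the SR with the most complete information, breaking ties by the number of primary studies, per Cooper 2012 [32]) can be written down as an explicit algorithm. A minimal sketch, assuming hypothetical field names (`completeness`, `n_primary_studies`) that are not from the source:

```python
# Illustrative encoding of a pre-specified SR selection rule (sub-step 1.5.3).
# The scoring fields are hypothetical; a real protocol would define them.

def select_sr(candidate_srs):
    """Pick one SR from several addressing the same question.

    Rule (illustrative): prefer the SR with the most complete
    information; break ties by the number of primary studies.
    """
    return max(
        candidate_srs,
        key=lambda sr: (sr["completeness"], sr["n_primary_studies"]),
    )

srs = [
    {"id": "SR-A", "completeness": 2, "n_primary_studies": 14},
    {"id": "SR-B", "completeness": 3, "n_primary_studies": 9},
    {"id": "SR-C", "completeness": 3, "n_primary_studies": 12},
]
print(select_sr(srs)["id"])  # SR-C: ties SR-B on completeness, has more studies
```

The tuple in the `key` mirrors the ordering of the decision rule: completeness is compared first, and the study count is only consulted when completeness is tied.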
Step | Sub-step | Methods/approaches | Sources ▪ Examples |
---|---|---|---|
1.0 Plan the sources to search | |||
1.1 Determine the type of sources to search | |||
1.1.1 Select the types of databases to search (e.g. SR databases (e.g. Cochrane, Epistemonikos), prospective SR registers (e.g. PROSPERO), or general bibliographic databases (e.g. EMBASE, PubMed), or grey literature databases (e.g. conference databases, government websites)) | |||
1.1.2 Select other types of sources (e.g. reference checking, forward citation searching, handsearching key journals)a |||
1.1.3 Select a combination of 1.1.1–1.1.2 | |||
2.0 Plan the search strategy for retrieval of SRs | |||
2.1 Determine the search filter to use in general databases | |||
2.1.1 Select a published SR filter (e.g. EMBASE, MEDLINE, PubMed) | ▪ Montori 2006 SR filter (Cooper 2012 [32]) | ||
2.1.2 Develop a new search filter based on a conceptual approach or a textual analysis approach | |||
3.0 Plan how primary studies will be retrieved, if the eligibility criteria determine that primary studies should be included |||
3.1 Determine the sequence for searching | |||
3.1.1 Run a parallel search strategy for both SRs and primary studies simultaneously | |||
3.1.2 Run a sequential search strategy first for SRs and second for primary studies (i.e. either develop a strategy to search for primary studies, or use the search strategies of the included SRs to search for primary studies) | Pieper 2014 [47] | ||
3.2 Use pragmatic/expedient approaches to retrieve primary studies | Caird 2015 [31] ▪ Consult experts (Caird 2015 [31]) | ||
3.3 Select a combination of 3.1–3.2 |
Step | Sub-step | Methods/approaches | Sources ▪ Examples |
---|---|---|---|
1.0 Plan the data elements to extract | |||
1.1 Determine the data to extract on the characteristics of SRsa |||
1.2 Determine the data required to assess which SRs address the overview question and allow assessment of the overlap across SRsa | Smith 2011 [57] ||
1.3 Determine data to extract about the results from the SRs for each relevant primary outcome | |||
1.3.1 Extract M-A results | |||
1.3.2 Extract numeric trial results | Thomson 2013 [59] | ||
1.3.3 Extract narrative results | |||
1.3.4 Extract a combination of 1.3.1–1.3.3 | |||
1.3.5 Extract risk of bias assessment (overall assessment, or domain/item level data, or both) and certainty of the evidence | |||
1.4 Determine the data to extract from primary studiesa |||
1.4.1 Extract numerical trial results | Caird 2015 [31] | ||
1.4.2 Extract data required to assess risk of bias for each domain or item | Hartling 2012 [35] | ||
1.5 Develop a data extraction forma |||
2.0 Plan the data extraction process | |||
2.1 Determine the sources from which data will be obtained |||
2.1.1 SRs | |||
2.1.2 Primary studies | |||
2.1.3 Registry entries (for SRs and/or trials) | Inferred method | ||
2.1.4 A combination of the above | |||
2.2 Determine how overlapping information across SRs will be handled | |||
2.2.1 Extract information from all SRs | |||
2.2.2 Extract information from only one SR based on a priori eligibility criteria | Cooper 2012 [32]; CMIMG 2012 [4]; Foisy 2011 [34]; Hartling 2014 [37]; Pieper 2012 [6, 45]; Pieper 2014 [47]; Thomson 2013 [59] ▪ SR with the greatest number of trials (Cooper 2012 [32]) | ||
2.3 Determine how discrepant data across SRs will be handled in data extraction | |||
2.3.1 Extract all data, recording discrepancies | |||
2.3.2 Extract data from only one SR based on a priori eligibility criteria | ▪ Most recent SR and SR of the highest quality (Pieper 2014 [47]) ▪ Highest quality SR (Cooper 2012 [32]) | ||
2.3.3 Extract data element (e.g. effect estimates, quality assessments) from the SR which meets decision rule criteria | ▪ SR that reports the most complete information on effect estimates (Bolland 2014 [30]) | ||
2.3.4 Reconcile discrepancies through approaches outlined in 2.4 | |||
2.4 Determine additional steps to deal with missing data from SRs, or when there is variation in information reported across SRs | |||
2.4.1 Retrieve reports of the primary studies | |||
2.4.2 Contact SR or trial authors, or both, for missing information and/or clarification |||
2.4.3 Search SR or trial registry entries for information | Inferred method | ||
2.4.4 A combination of the above approaches | |||
2.4.5 Do not take additional steps to deal with missing data or discrepancies | |||
2.5 Pilot the data extraction forma |||
2.6 Determine the number of overview authors required to extract dataa |||
2.6.1 Single, double, or more | |||
2.6.2 Data extraction versus data checking | |||
2.7 Determine if authors (co-)authored one or several of the reviews included in the overview, and if yes, plan safeguards to avoid bias in data extraction | Büchter 2015 [60] ▪ Overview authors do not extract data from their co-authored SRs
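Sub-steps 1.2 and 2.2 both hinge on knowing which primary studies the candidate SRs share. A minimal sketch of that overlap check, using invented study identifiers, shows how tabulating included primary studies per SR exposes both overlap and unique contributions (the basis for exclusion rules such as eligibility sub-step 1.5.4):

```python
# Illustrative overlap check: which primary studies does each SR
# contribute that no other SR covers? SR names and trial IDs are made up.

srs = {
    "SR-A": {"trial1", "trial2", "trial3"},
    "SR-B": {"trial2", "trial3"},
    "SR-C": {"trial3", "trial4"},
}

def unique_contributions(srs):
    """Return, per SR, the primary studies found in no other SR."""
    unique = {}
    for name, studies in srs.items():
        others = set().union(*(s for n, s in srs.items() if n != name))
        unique[name] = studies - others
    return unique

print(unique_contributions(srs))
# SR-B contributes no unique primary study, so a rule like 1.5.4
# (exclude SRs with no unique primary studies) would drop it.
```

A full overlap assessment would typically also report the degree of overlap (e.g. shared study counts per SR pair), but the unique-contribution set is the quantity the exclusion rule needs.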
Methods/approaches proposed in the literaturea |||
---|---|---|---|
Scenario for which authors need to plan | Eligibility criteria (Table 4) | Data extraction (Table 6) | |
1 | Reviews include overlapping information and data (e.g. arising from inclusion of the same primary studies) | 1.4.2 1.5 (1.5.1–1.5.4) | 1.2 2.2 (2.2.1, 2.2.2) |
2 | Reviews report discrepant information and dataa | 1.4.2 1.6.2, 1.6.3 | 2.3 (2.3.1–2.3.4) 2.2.1, 2.2.2 2.4 (2.4.1–2.4.5)
3 | Data are missing or reviews report varying information (e.g. information on risk of bias is missing or varies across primary studies because reviews use different tools) | 1.6.2, 1.6.3 | 2.4 (2.4.1–2.4.5) |
4 | Reviews provide incomplete coverage of the overview question (e.g. missing comparisons, populations) | 1.6.2, 1.6.3 | 1.2 2.1.2, 2.1.4 2.4 |
5 | Reviews are not up-to-date | 1.4.2 1.6.2, 1.6.3 | 2.1.2, 2.1.4 |
6 | Review methods raise concerns about bias or quality | 1.4.2 1.6.2, 1.6.3 | 1.2 |
Characteristics of included articles
Specification of purpose, objectives and scope
Specification of eligibility criteria
Search methods
Data extraction
Addressing common scenarios unique to overviews
Stage II: identification and mapping of evaluations of methods
First Author Year Title | Primary objective | Existence of a protocol | Study design | Health field the filter designed for | # of filters evaluated (# filters developed by the author) | Database (interfaces) | Technique to identify and/or create a gold standard | Sample size of the gold standard set or validation set (n) | Validation on a data set distinct from the derivation data | Performance measures used | Search dates for the gold standard or validation set | Name of filters evaluated (number of filters) |
---|---|---|---|---|---|---|---|---|---|---|---|---|
Boluyt 2008 [13] Usefulness of systematic review search strategies in finding child health systematic reviews in MEDLINE. | Assess search filters for child health SRs in PubMed | NR | Comparative search filter evaluation | Child health | 9 | PubMed | Handsearching, Developed based on database searches | 387 | Yes | Sensitivity/recall, precision | Handsearch 1994, 1997, 2000, 2002, and 2004; DARE up to 2004, and year 2006 | PubMed filter 2006 Shojania 2001 Boynton 1998 White 2001 (two) Montori 2005 (four) |
Boynton 1998 [15] Identifying systematic reviews in MEDLINE: developing an objective approach to search strategy design. | Develop and evaluate a range of search strategies to identify SRs in MEDLINE | NR | Search filter evaluation, Comparative search filter evaluation | Medicine (general and internal) | 15 (11) | MEDLINE (Ovid) | Handsearching | 288 | No | Sensitivity/recall, precision | 1992 and 1995 | Boynton 1998 (eleven) Hunt 1997 (two) CRD - Oxman 1994 (two)
Eady 2008 [19] PsycINFO search strategies identified methodologically sound therapy studies and review articles for use by clinicians and researchers. | Evaluate search strategies for finding SRs in PsycINFO | NR | Search filter evaluation | Psychology | N/A | PsycINFO | Handsearching | 58 | No | Sensitivity/recall, precision, specificity, accuracy | 2000 | Eady 2008
Golder 2006 [20] Identifying systematic reviews of the adverse effects of health care interventions. | Identify SRs of adverse effects in two major databases | NR | Search filter evaluation | Adverse effects | N/A | DARE (CDSR and CRD) | Developed based on database searches | 270 | No | Sensitivity/recall, precision | 1994 to 2005 | Golder 2006 |
Lee 2012 [16] An optimal search filter for retrieving systematic reviews and meta-analyses. | Develop and validate the health-evidence.ca SR filter and compare its performance to other filters | NR | Search filter evaluation, Comparative search filter evaluation | Public health | 31 (3) | MEDLINE, EMBASE, and CINAHL | Handsearching, Developed based on database searches | 219 | Yes | Sensitivity/recall, precision, specificity, NNR | 2004/2005 | health-evidence.ca SR filter - Lee 2012 (three) Montori 2005 (four) Hunt 1997 (two) Shojania 2001 Boynton 1998 (two) BMJ Clin Evidence n.d. CRD Ciliska 2007 (four) SIGN n.d. Wilczynski 2007 (four) McKibbon 1998
Montori 2005 [17] Optimal search strategies for retrieving systematic reviews from Medline: analytical survey. | Develop optimal search strategies in Medline for retrieving SRs | NR | Search filter evaluation, Comparative search filter evaluation | Medicine, family practice, nursing, mental health | 10 (4) | MEDLINE | Handsearching | 753 | Yes | Sensitivity/recall, specificity, precision | 2000 | Montori 2005 (four) White 2001 (three) Hunt 1997 (two) Shojania 2001
Rathbone 2016 [12] A comparison of the performance of seven key bibliographic databases in identifying all relevant systematic reviews of interventions for hypertension. | Evaluate seven databases to determine their coverage of SRs of hypertension | NR | Comparative database evaluation | Hypertension | N/A | Cochrane, DARE, EMBASE, Epistemonikos, MEDLINE, PubMed, and TRIP | Developed based on database searches | 440 | N/A | Sensitivity/recall, precision | 2003–2015 | SR filters incorporated into the databases; MEDLINE used Montori 2005 |
Shojania 2001 [21] Taking advantage of the explosion of systematic reviews: an efficient MEDLINE search strategy. | Evaluate a search strategy for identifying SRs | NR | Search filter evaluation | Treatment, diagnosis, prognosis, causation, quality improvement, or economics | N/A | MEDLINE (PubMed) | Handsearching, Developed based on database searches | 104 | No | Sensitivity/recall, precision | 1999–2000 | PubMed n.d. |
White 2001 [18] A statistical approach to designing search filters to find systematic reviews: objectivity enhances accuracy. | Improve methods to derive a more objective search strategy to identify SRs in MEDLINE | NR | Search filter evaluation, Comparative search filter evaluation | Treatment, diagnosis, prognosis, causation | 7 (5) | MEDLINE (Ovid) | Handsearching journals | 110 | No | Sensitivity/recall, precision | 1995 and 1997 | White 2001 (five) Boynton 1998 CRD - Wolf 1996 |
Wilczynski 2007 [22] EMBASE search strategies achieved high sensitivity and specificity for retrieving methodologically sound systematic reviews. | Develop search strategies that optimize the retrieval of SRs from EMBASE | NR | Search filter evaluation | Internal medicine, general practice, mental health, nursing practice | N/A | EMBASE | Handsearching journals | 220 | No | Sensitivity/recall, specificity, precision, accuracy | 2000 | Wilczynski 2007 |
Wilczynski 2009 [23] Consistency and accuracy of indexing systematic review articles and meta-analyses in Medline. | Determine the consistency and accuracy of indexing SRs and meta-analyses in MEDLINE | NR | Search filter evaluation | Medicine | N/A | MEDLINE | Developed based on database searches | NA | No | Sensitivity/recall, specificity, precision, accuracy | 2000 | Wilczynski 2009 |
Wilczynski 2011 [24] Sensitive Clinical Queries retrieved relevant systematic reviews as well as primary studies: an analytic survey. | Determine how well the previously validated broad and narrow Clinical Queries retrieve SRs | NR | Search filter evaluation | Therapy, diagnosis, prognosis, etiology | N/A | MEDLINE, EMBASE, CINAHL, and PsycINFO | Developed based on database searches | NA | No | Sensitivity/recall, specificity, precision | 2000 | Wilczynski 2011 |
Wong 2006 [14] Comparison of top-performing search strategies for detecting clinically sound treatment studies and systematic reviews in MEDLINE and EMBASE. | Compare sensitivity and specificity of search strategies for detecting reviews in MEDLINE and EMBASE | NR | Comparative search filter evaluation | Medicine | 7 | MEDLINE, EMBASE | Handsearching journals | 753 in MEDLINE, 220 in EMBASE | N/A | Sensitivity/recall, specificity, precision | 2000 | Montori 2005 (three) Wilczynski 2007 (four) |
Wong 2006 [26] Optimal CINAHL search strategies for identifying therapy studies and review articles. | Design optimal search strategies for locating review articles in CINAHL | NR | Search filter evaluation | Nursing and allied health | N/A | CINAHL | Handsearching journals | 127 | No | Sensitivity/recall, specificity, precision, accuracy | 2000 | Wong 2006 |
Zacks 1998 [25] Developing search strategies for detecting high quality reviews in a hypertext test collection. | Determine whether sensitive and specific search strategies exist to select SRs | NR | Search filter evaluation | Etiology, prognosis, therapy, diagnosis | N/A | SWISH v.1.1.1 | Developed based on database searches | 209 | No | Sensitivity/recall, specificity | Not reported | Zacks 1998 |
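The performance measures reported across these evaluations (sensitivity/recall, specificity, precision, accuracy, NNR) are all derived from a 2×2 contingency table that cross-classifies the filter's retrieval against the gold standard set. A minimal sketch of those calculations, using hypothetical counts rather than data from any included study:

```python
# Standard search filter performance measures, computed from a 2x2
# contingency table of filter retrieval vs. a gold standard set.
# Counts below are illustrative only, not taken from any study in the table.

def filter_performance(tp, fp, fn, tn):
    """Return retrieval metrics for a search filter.

    tp: gold-standard records retrieved by the filter (true positives)
    fp: non-relevant records retrieved (false positives)
    fn: gold-standard records missed (false negatives)
    tn: non-relevant records correctly excluded (true negatives)
    """
    sensitivity = tp / (tp + fn)   # recall: share of gold-standard records found
    specificity = tn / (tn + fp)   # share of non-relevant records excluded
    precision = tp / (tp + fp)     # share of retrieved records that are relevant
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    nnr = 1 / precision            # number needed to read per relevant record
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "precision": precision,
        "accuracy": accuracy,
        "NNR": nnr,
    }

# Hypothetical example: a filter retrieves 90 of 100 gold-standard SRs,
# plus 210 non-relevant records, from a database of 10,000 records.
metrics = filter_performance(tp=90, fp=210, fn=10, tn=9690)
print({k: round(v, 3) for k, v in metrics.items()})
```

This also shows why the evaluations report precision (or NNR) alongside sensitivity: a filter can achieve high sensitivity while retrieving so many non-relevant records that screening the results becomes impractical.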