Introduction
Item 7: Describe all information sources (e.g., databases with dates of coverage, contact with study authors to identify additional studies) in the search and date last searched.
Item 8: Present full electronic search strategy for at least one database, including any limits used, such that it could be repeated.
Item 17: Give numbers of studies screened, assessed for eligibility, and included in the review, with reasons for exclusions at each stage, ideally with a flow diagram.
Part 1: Developing the Checklist
Part 2: Checklist
SECTION/TOPIC | ITEM # | CHECKLIST ITEM |
---|---|---|
INFORMATION SOURCES AND METHODS | | |
Database name | 1 | Name each individual database searched, stating the platform for each. |
Multi-database searching | 2 | If databases were searched simultaneously on a single platform, state the name of the platform, listing all of the databases searched. |
Study registries | 3 | List any study registries searched. |
Online resources and browsing | 4 | Describe any online or print source purposefully searched or browsed (e.g., tables of contents, print conference proceedings, web sites), and how this was done. |
Citation searching | 5 | Indicate whether cited references or citing references were examined, and describe any methods used for locating cited/citing references (e.g., browsing reference lists, using a citation index, setting up email alerts for references citing included studies). |
Contacts | 6 | Indicate whether additional studies or data were sought by contacting authors, experts, manufacturers, or others. |
Other methods | 7 | Describe any additional information sources or search methods used. |
SEARCH STRATEGIES | | |
Full search strategies | 8 | Include the search strategies for each database and information source, copied and pasted exactly as run. |
Limits and restrictions | 9 | Specify that no limits were used, or describe any limits or restrictions applied to a search (e.g., date or time period, language, study design) and provide justification for their use. |
Search filters | 10 | Indicate whether published search filters were used (as originally designed or modified), and if so, cite the filter(s) used. |
Prior work | 11 | Indicate when search strategies from other literature reviews were adapted or reused for a substantive part or all of the search, citing the previous review(s). |
Updates | 12 | Report the methods used to update the search(es) (e.g., rerunning searches, email alerts). |
Dates of searches | 13 | For each search strategy, provide the date when the last search occurred. |
PEER REVIEW | | |
Peer review | 14 | Describe any search peer review process. |
MANAGING RECORDS | | |
Total records | 15 | Document the total number of records identified from each database and other information sources. |
Deduplication | 16 | Describe the processes and any software used to deduplicate records from multiple database searches and other information sources. |
Part 3: Explanation and Elaboration
Item 1. Database name
Example
“The following electronic databases were searched: MEDLINE (Ovid), CINAHL (EBSCOhost), PsycINFO (Ovid), Cochrane Central Register of Controlled Trials (Ovid), SPORTDiscus (EBSCOhost), EMBASE (Ovid) and ProQuest Dissertations and Theses Global (ProQuest).” [38]
Explanation
Browse: Browsing is the practice of scanning for information by reviewing content. This may include using tables of contents, indices in books or other materials, web directories, full journal issues, specific web pages, or other types of information scanned without using a formal search strategy.
Citation index: A type of database or database function that enables searchers to analyze relationships between publications through citations, including which publications a given publication cites and which publications cite it. Common examples include Science Citation Index, Scopus, and Google Scholar.
Cited reference: A publication referenced in a given publication.
Citing reference: A publication that references a given publication.
Database: Within PRISMA-S, this refers to a literature database designed to search journal literature. Databases may be multidisciplinary or specialized. Many include specialized search features, subject headings, and structured data designed to facilitate easy and comprehensive searching. Examples include MEDLINE, EconLit, and PsycINFO.
Digital object identifier: Also called a DOI, a digital object identifier is a unique code assigned to a publication, dataset, or other online item or collection that will remain constant over time.
Field code: Unique to each database platform and database, field codes are used to specify where a term is searched for in a database record. In PubMed, for instance, the field code [tiab] is placed after a search term to tell the database to search only within the title and abstract fields.
Filter: Filters are predefined combinations of search strategies designed to locate references meeting certain criteria, usually publication type, topic, age group, or other categorization. Filters generally are combinations of keywords, subject headings or thesaurus terms, logical operators, and database-specific syntax. Many filters are validated and offer sensitivity and specificity information that allows searchers to determine their usefulness for a given search. Filters may also be called hedges or optimal search strategies and are designed for other searchers to use and reuse.
Indexing: Application of standard terminology to a reference to describe the contents of the full article. Depending on the database or other information source, indexers may add subject headings or thesaurus terms as well as list age groups, language, human studies, study design, publication type, or other descriptive terms. Examples of indexing terminology include MEDLINE’s Medical Subject Headings (MeSH) and Embase’s EMTREE.
Information source: Any database or other resource (e.g., a web site, journal table of contents, email alert, web directory, contact with authors or industry, study registry, or preprint server) searched or browsed as part of the search.
Literature search: Here, an overall term for the entire information retrieval process as part of a systematic review. This includes the full range of searching methods and information sources, including databases, study registries, regulatory datasets, web searches, government documents, unpublished data, and much more.
Limits: Features built into databases to allow searchers to quickly restrict their search by one or more categories. Common limits built into databases include publication date ranges, language, gender, age group, and publication type. Limits are different from filters (see above) and are also not the inclusion/exclusion criteria used in the screening process.
Multi-database search: Many database platforms offer more than one database on the same platform. Some platforms allow users to search these multiple databases at one time, for example using the Ovid platform to simultaneously search MEDLINE, Embase, and the Cochrane Database of Systematic Reviews.
Peer review: In PRISMA-S, this refers to the peer review of search strategies prior to executing the search. Peer review is used to identify errors, missing keywords or subject headings, and other issues within a search strategy. One commonly used tool for search strategy peer review is the Peer Review of Electronic Search Strategies (PRESS) Guideline [1].
Platform: Many databases are available on multiple different systems, each of which has its own specifications for how a search strategy can be constructed. The location or host system of the database is the platform. Platform is sometimes referred to as the interface or vendor. Common examples include Ovid, EBSCOhost, ProQuest, and Web of Science.
Records: Individual items retrieved from any type of search, though most commonly used in conjunction with database searches. Records may also be referred to as references or hits.
Repository: An online archive for varying types of electronic files, including text documents, data files, and more. Repositories may be hosted by an institution or more broadly available.
Rerun: Re-executing the same search strategy in the same database one or more times after the original search was conducted. See Updating a search strategy.
Search: Overall term for the entire information retrieval process as part of a systematic review. It can also refer to searching a specific database, web site, or other information source.
Search strategy: The structure of terms, logical operators, and syntax elements (field codes (see above), adjacency operators, phrases, etc.) that is used to search a database or other information source. A search strategy may be very simple or very complex, depending on the information source and requirements of the search.
Sensitivity: A measure of how well a search strategy finds relevant articles, sensitivity (usually expressed as a percentage) is the number of relevant records found with a search strategy divided by the total number of relevant records in a given information source. Highly sensitive search strategies or filters detect most or all records that are relevant. Together with specificity, sensitivity is a measure used to assess the performance of filters. Sensitivity may also be called recall.
Specificity: A measure of how well a search strategy omits irrelevant articles, specificity (usually expressed as a percentage) is the number of irrelevant records not found with (or excluded by) a search strategy divided by the total number of irrelevant records in a given information source. Search strategies or filters with high specificity will find few irrelevant articles. Together with sensitivity, specificity is often used to assess the performance of filters.
Study registry: A database of records of research studies in progress. Originally designed for clinical trials as a location for patients to find clinical trials to join, study registries have spread beyond biomedical research to other fields. Study registries may also contain research results, posted after study completion.
Supplementary materials: Additional content for a study that does not fit in the main manuscript text. For a systematic review, supplementary materials should include full search strategies for all information sources and a more complete description of the search methods. Supplementary materials are generally submitted with the manuscript for peer review.
Syntax: Search structure and organization, based on a set of rules governing how a search operates in a specific database and platform. Rules might include field codes, phrase and adjacency searching, Boolean operators, and truncation, amongst others.
Systematic review: For the purposes of PRISMA-S, systematic review is used for the entire family of methods-based reviews. This includes rapid reviews, scoping reviews, meta-narrative reviews, realist reviews, meta-ethnography, and more.
Updating a search strategy: To ensure currency, authors often search for additional information throughout the systematic review process or before submitting a report. The search may be updated by running the exact same search (rerunning the search) or by conducting a new or modified search to locate additional references. |
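The sensitivity and specificity definitions above reduce to simple ratios over retrieval counts. A minimal sketch with hypothetical counts (all numbers below are invented for illustration, not drawn from any review):

```python
# Hypothetical counts for one database; assumed values for illustration only.
relevant_total = 120      # relevant records known to exist in the database
irrelevant_total = 9880   # irrelevant records in the database
relevant_found = 114      # relevant records retrieved by the search strategy
irrelevant_found = 450    # irrelevant records retrieved by the search strategy

# Sensitivity (recall): share of all relevant records the strategy retrieves.
sensitivity = relevant_found / relevant_total

# Specificity: share of all irrelevant records the strategy excludes.
specificity = (irrelevant_total - irrelevant_found) / irrelevant_total

print(f"sensitivity = {sensitivity:.1%}")  # → sensitivity = 95.0%
print(f"specificity = {specificity:.1%}")  # → specificity = 95.4%
```

A validated filter reported with figures like these lets a searcher weigh completeness (sensitivity) against the screening burden that low specificity implies.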
Suggested location for reporting
Item 2. Multi-database searching
Examples
“The MEDLINE and Embase strategies were run simultaneously as a multi-file search in Ovid and the results de-duplicated using the Ovid de-duplication tool.” [51]
“A systematic literature search was performed in Web of Knowledge™ (including KCI Korean Journal Database, MEDLINE, Russian Science Citation Index, and SciELO Citation Index)….” [52]
Explanation
Suggested location for reporting
Item 3. Study registries
Example
“[We] searched several clinical trial registries (ClinicalTrials.gov, Current Controlled Trials (www.controlled-trials.com), Australian New Zealand Clinical Trials Registry (www.actr.org.au), and University Hospital Medical Information Network Clinical Trials Registry (www.umin.ac.jp/ctr)) to identify ongoing trials.” [53]
Explanation
Suggested location for reporting
Item 4. Online resources and browsing
Examples
“We also searched the grey literature using the search string: “public attitudes” AND “sharing” AND “health data” on Google (in June 2017). The first 20 results were selected and screened.” [60]
“The grey literature search was conducted in October 2015 and included targeted, iterative hand searching of 22 government and/or research organization websites that were suggested during the expert consultation and are listed in S1 Protocol. Twenty two additional citations were added to the review from the grey literature search.” [61]
“To locate unpublished studies, we searched Embase [via Embase.com] for conference proceedings since 2000 and hand-searched meeting abstracts of the Canadian Conference on Physician Health and the International Conference on Physician Health (2012 to 2016).” [62]
Explanation
Web search engines and specific web sites
Conference proceedings
General browsing
Suggested location for reporting
Item 5. Citation searching
Examples
“Reference lists of included articles were manually screened to identify additional studies.” [75]
“[W]e used all shared decision making measurement instruments that were identified in Gärtner et al’s recent systematic review (Appendix A). We then performed a systematic citation search, collecting all articles that cited the original papers reporting on the development, validation, or translation of any of the observational and/or self-reported shared decision making measurement instruments identified in that review. An experienced librarian (P.J.E.) searched Web of Science [Science Citation Index] and Scopus for articles published between January 2012 and February 2018.” [76]
“We [conducted] citation tracking of included studies in Web of Science Core Collection on an ongoing basis, using citation alerts in Web of Science Core Collection.” [77]
Explanation
Suggested location for reporting
Item 6. Contacts
Examples
“We contacted representatives from the manufacturers of erythropoietin-receptor agonists (Amgen, Ortho-Biotech, Roche), corresponding or first authors of all included trials and subject-area experts for information about ongoing studies.” [79]
“We also sought data via expert requests. We requested data on the epidemiology of injecting drug use and blood-borne viruses in October, 2016, via an email distribution process and social media. This process consisted of initial emails sent to more than 2000 key experts and organisations, including contacts in the global, regional, and country offices of WHO, UNAIDS, Global Fund, and UNODC (appendix p 61). Staff in those agencies also forwarded the request to their colleagues and other relevant contacts. One member of the research team (SL) posted a request for data on Twitter, which was delivered to 5525 individual feeds (appendix p 62).” [80]
Explanation
Suggested location for reporting
Item 7. Other methods
Examples
“We also searched… our personal files.” [84]
“PubMed’s related articles search was performed on all included articles.” [85]
Suggested location for reporting
Item 8. Full search strategies
Examples
Database search. Methods section description. “The reproducible searches for all databases are available at DOI:10.7302/Z2VH5M1H.” [88]
Database search. One of the full search strategies from supplemental materials in online repository. “Embase.com (692 on Jan 19, 2017)
1. 'social media'/exp OR (social NEAR/2 (media* OR medium* OR network*)):ti OR twitter:ti OR youtube:ti OR facebook:ti OR linkedin:ti OR pinterest:ti OR microblog*:ti OR blog:ti OR blogging:ti OR tweeting:ti OR 'web 2.0':ti
2. 'professionalism'/exp OR 'ethics'/exp OR 'professional standard'/de OR 'professional misconduct'/de OR ethic*:ab,ti OR unprofessional*:ab,ti OR professionalism:ab,ti OR (professional* NEAR/3 (standard* OR misconduct)):ab,ti OR ((professional OR responsib*) NEAR/3 (behavi* OR act OR conduct*)):ab,ti
Online resources and browsing. Methods section description. “The approach to study identification from this systematic review is transparently reported in the Electronic Supplementary Material Appendix S1.” [89]
Online resources and browsing. One of the full online resource search strategies reported in supplement. “Date: 12/01/16. Portal/URL: Google. https://www.google.co.uk/webhp?hl=en. Search terms: ((Physical training) and (man or men or male or males) and (female or females or women or woman) and (military)). Notes: First 5 pages screened on title (n=50 records).” [89]
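Beyond copying and pasting strategies exactly as run, the reporting fields from the checklist (database, platform, date searched, limits, filters, record counts) can also be kept as structured data, which makes a supplementary file easy to check for completeness before submission. A minimal sketch; the record below is a hypothetical example, and the field names are illustrative, not a PRISMA-S-mandated schema:

```python
# One search's documentation as a plain dictionary; field names loosely
# mirror PRISMA-S items. All values are invented for illustration.
search_record = {
    "database": "MEDLINE",
    "platform": "Ovid",
    "date_searched": "2017-01-19",
    "strategy": "exp Social Media/ OR (twitter or facebook).ti,ab.",
    "limits": "none",
    "filters_cited": [],          # empty list is fine: "no filters" is reportable
    "records_retrieved": 692,
}

# Completeness check: every required reporting field must be present
# and non-empty before the documentation is uploaded.
required = {"database", "platform", "date_searched", "strategy",
            "limits", "records_retrieved"}
missing = required - {k for k, v in search_record.items() if v not in ("", None)}
print("missing fields:", sorted(missing))  # → missing fields: []
```

One such record per database or information source, stored alongside the verbatim strategies, covers most of Items 1–13 and 15 in a form that is trivial to audit.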
Explanation
Fully documenting a search will require publication of supplementary materials. Due to the instability of supplementary materials published as part of a journal article, uploading complete documentation to a secure and permanent archive is recommended.
Options for secure and permanent archives
Many options exist for uploading documentation. Ideally, use an archive or repository that will provide a digital object identifier (DOI) for any uploaded materials (Table 2). These are a few of the many options available:
- Institutional repository: Many institutions or their affiliated libraries host online repository systems for their faculty, staff, and students. An example is the University of Michigan’s Deep Blue Data system (https://deepblue.lib.umich.edu/).
- Open Science Framework (http://osf.io/): The Open Science Framework (OSF) platform enables the storage of any documentation associated with a research study. It is possible to create DOIs for individual files or groups of files. OSF is openly and freely available.
- figshare (https://figshare.com/): figshare is a commercial platform that allows researchers to share any type of data or research output. It is possible to create DOIs for individual files or collections.
- Zenodo (https://zenodo.org/): Zenodo is a general-purpose, freely available open access repository from CERN for research data and associated materials. Uploaded materials are assigned DOIs.
What documentation to upload
Materials related to all PRISMA-S checklist items can be included in supplementary materials. Sufficient information should be uploaded to enable an interested reader to replicate the search. Specifically, it is recommended that documentation relating to the full search strategies for all information sources and methods be included in supplementary materials. Optionally, authors may wish to upload additional supplementary information, including files of all references retrieved, all references after deduplication, and all references to included studies. Authors who wish to share these files should note that abstracts are copyrighted materials and should be removed from files before sharing them publicly.
For an example of supplementary materials related to a systematic review search, see: MacEachern, M. (2017). Literature search strategies for "Substance Use Education in Schools of Nursing: A Systematic Review of the Literature" [Data set]. University of Michigan - Deep Blue. https://doi.org/10.7302/Z24X560Q. In this example, the materials shared include a Read Me file to explain the files, EndNote (.enlx) files of screened references, the original files imported into EndNote, and the complete, reproducible search strategies for all information sources.
Suggested location for reporting
Item 9. Limits and restrictions
Examples
No limits. “We imposed no language or other restrictions on any of the searches.” [95]
Limits described without justification. “The search was limited to the English language and to human studies.” [96]
“The following search limits were then applied: randomized clinical trials (RCTs) of humans 18 years or older, systematic reviews, and meta-analyses.” [97]
Limits described with justification. “The search was limited to publications from 2000 to 2018 given that more contemporary studies included patient cohorts that are most reflective of current co-morbidities and patient characteristics as a result of the evolving obesity epidemic.” [98]
Limits described, one with justification. “Excluded publication types were comments, editorials, patient education handouts, newspaper articles, biographies, autobiographies, and case reports. All languages were included in the search result; non-English results were removed during the review process…. To improve specificity, the updated search was limited to human participants.” [99]
Explanation
Suggested location for reporting
Item 10. Search filters
Example
Explanation
Suggested location for reporting
Item 11. Prior work
Example
“We included [search strategies] used in other systematic reviews for research design [111], setting [112, 113], physical activity and healthy eating [114‐116], obesity [111], tobacco use prevention [117], and alcohol misuse [118]. We also used a search [strategy] for intervention (implementation strategies) that had been employed in previous Cochrane Reviews [119, 120], and which was originally developed based on common terms in implementation and dissemination research.” [121]
Explanation
Suggested location for reporting
Item 12. Updates
Examples
“Ovid Auto Alerts were set up to provide weekly updates of new literature until July 09, 2012.” [123]
“Two consecutive searches were conducted and limited by publication type and by date, first from January 1, 1990, to November 30, 2012, and again from December 1, 2012, to July 31, 2015, in an updated search…. The original search strategy was used to model the updated search from December 1, 2012, to July 31, 2015. The updated search strategy was consistent with the original search; however, changes were required in the ERIC database search because of a change in the ERIC search algorithm. Excluded publication types were identical to the initial search. To improve specificity, the updated search was limited to human participants.” [99]
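Whether a search is updated by alerts or by rerunning it, the records that actually need screening in the update are those retrieved now but not originally. That bookkeeping is a set difference over record identifiers; a minimal sketch with invented IDs (e.g., PMIDs):

```python
# Record IDs retrieved by the original search and by the rerun.
# All IDs below are hypothetical.
original_run = {"100001", "100002", "100003", "100004"}
updated_run = {"100002", "100003", "100004", "100005", "100006"}

# New records to screen in the update: in the rerun, absent from the original.
new_records = updated_run - original_run
print(sorted(new_records))  # → ['100005', '100006']
```

Reporting the count of new records alongside the update method makes the flow-diagram numbers for each search round straightforward to reconstruct.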
Explanation
Suggested location for reporting
Item 13. Dates of searches
Example
“A comprehensive literature search was initially run on 26 February 2017 and then rerun on 5 February 2018….” [127]
Explanation
Suggested location for reporting
Item 14. Peer review
Example
Explanation
Suggested location for reporting
Item 15. Total records
Examples
Explanation
Suggested location for reporting
Item 16. Deduplication
Example
“Duplicates were removed by the librarians (LP, PJE), using EndNote's duplicate identification strategy and then manually.” [134]
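The quoted example used EndNote's duplicate identification followed by manual checking. The underlying idea can be sketched by keying each record on a normalized form of its metadata; this is an illustrative approach, not the EndNote algorithm, and the sample records are invented:

```python
import re

def dedupe(records):
    """Keep the first record seen for each (normalized title, year) key.

    Normalization lowercases the title and strips non-alphanumeric
    characters, so punctuation and case differences between database
    exports do not block a match. When DOIs are present, matching on
    DOI first is usually more reliable.
    """
    seen, unique = set(), []
    for rec in records:
        key = (re.sub(r"[^a-z0-9]", "", rec["title"].lower()), rec["year"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

# Hypothetical exports: the same study from two databases, plus one other.
records = [
    {"title": "Exercise and Mood: A Trial", "year": 2016, "source": "MEDLINE"},
    {"title": "Exercise and mood - a trial", "year": 2016, "source": "Embase"},
    {"title": "Sleep and Mood: A Cohort", "year": 2016, "source": "Embase"},
]
print(len(dedupe(records)))  # → 2
```

Whatever the tool, Item 16 asks that the process (automated criteria plus any manual pass) and the software be named, so the drop from total records to unique records is accountable.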