Systematic Reviews and Meta-Analysis
Systematic review data extraction: cross-sectional study showed that experience did not increase accuracy

https://doi.org/10.1016/j.jclinepi.2009.04.007

Abstract

Objective

This study assessed the impact of systematic review and data extraction experience on the accuracy and efficiency of data extraction in systematic reviews.

Study Design and Setting

We conducted a prospective cross-sectional study from October to December 2006. Participants were classified as having minimal, moderate, or substantial experience in systematic reviews and data extraction, and each extracted data from three studies on insomnia treatment. Our primary outcome was the accuracy of data extraction. The data sets produced at each experience level were analyzed for extraction errors and for the resulting meta-analysis estimates. Additionally, the time required to complete data extraction was compared across groups.
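This design implies a pipeline in which each group's extracted data sets feed identical meta-analyses, so that any extraction error propagates into the pooled estimate. As a rough illustration of that idea only (not the authors' actual analysis, and with made-up trial values), the sketch below pools hypothetical per-study effect estimates with a simple inverse-variance fixed-effect model:

    # Illustrative only: minimal inverse-variance fixed-effect pooling of
    # per-study effect estimates, showing how errors in extracted values
    # (means, SDs, sample sizes) would carry through to the pooled result.
    # The effects and variances below are hypothetical, not data from the paper.

    def pool_fixed_effect(effects, variances):
        """Return the inverse-variance weighted pooled effect and its variance."""
        weights = [1.0 / v for v in variances]
        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        return pooled, 1.0 / sum(weights)

    # Hypothetical extracted effects (e.g., standardized mean differences)
    # from three insomnia trials, as one participant might have recorded them.
    effects = [-0.45, -0.30, -0.52]
    variances = [0.04, 0.06, 0.05]

    pooled, var = pool_fixed_effect(effects, variances)
    print(f"pooled effect = {pooled:.3f}, 95% CI half-width = {1.96 * var ** 0.5:.3f}")

A single inaccuracy in an extracted mean or sample size changes the corresponding effect or variance and therefore shifts the pooled estimate, which is why meta-analysis results were compared across experience groups.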

Results

Error rates were similar across experience levels, ranging from 28.3% to 31.2%. Mean rates for errors of omission (11.3–16.4%) were generally lower than those for errors of inaccuracy (13.9–17.9%). There were no significant differences between groups in error rates or in the accuracy of meta-analysis results. The difference in time required approached statistical significance, with minimally experienced participants requiring the most time.
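The abstract distinguishes errors of omission (a required item not extracted at all) from errors of inaccuracy (an item extracted with the wrong value). The paper does not reproduce its scoring scheme, but a minimal sketch of how such rates could be tallied against a reference ("gold standard") extraction, assuming one value per field and hypothetical field names, is:

    # Minimal sketch of scoring one participant's extraction against a
    # reference extraction. Field names and values are hypothetical; the
    # study's actual scoring scheme may differ.

    reference = {"n_randomized": 120, "mean_sleep_latency": 32.5, "dropouts": 14}
    extracted = {"n_randomized": 120, "mean_sleep_latency": 35.0}  # "dropouts" omitted

    omissions = sum(1 for field in reference if field not in extracted)
    inaccuracies = sum(
        1 for field, value in reference.items()
        if field in extracted and extracted[field] != value
    )

    total_fields = len(reference)
    print(f"omission rate:   {omissions / total_fields:.1%}")      # 33.3%
    print(f"inaccuracy rate: {inaccuracies / total_fields:.1%}")   # 33.3%
    print(f"overall error:   {(omissions + inaccuracies) / total_fields:.1%}")

Under a scheme of this kind, the reported omission and inaccuracy rates are simply the two error counts divided by the total number of data points to be extracted.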

Conclusion

Overall, there were high error rates by participants at all experience levels; however, time required for extraction tended to decrease with experience. These results illustrate the need to develop strategies aimed at mastery of data extraction, rather than reliance on previous data extraction experience alone.

Section snippets

Background

There is currently no recommended standard for the level of systematic review and data extraction experience that reviewers should have before extracting data for a systematic review. To our knowledge, there is no empirical evidence regarding the types and magnitude of errors made during data extraction by reviewers with various levels of experience, the impact of these errors on the results of meta-analysis, or the efficiency of data extraction.

Participant recruitment

The participants of this study were recruited through The Cochrane Collaboration, the Evidence-based Practice Center program of the US Agency for Healthcare Research and Quality, and relevant departments at the University of Alberta. A letter of invitation was sent to the members and students of these entities, directing them to an online screening questionnaire. Individuals with prior knowledge of the systematic review process through education and/or experience were eligible for participation.

Participants

Two hundred and forty individuals responded to the invitation to participate in the study and completed the screening questionnaire (Fig. 1). One hundred and fifty-four individuals were eligible to participate based on their completion of the screening questionnaire and were categorized according to their level of experience in systematic reviews and data extraction. One hundred and twenty-one individuals began the data extraction process with variable completion rates across studies.

Discussion

This is one of the first studies to examine, in a controlled manner, the effect of systematic review and data extraction experience on the accuracy of data extraction in systematic reviews. Overall, we found that level of experience did not result in measurable differences in error rates. Of note is the generally high level of errors, with an overall error rate of 28.7%, which is higher than that found in previous research [4]. We found that errors were more often due to inaccuracies than omissions.

Conclusion

We found that data extraction accuracy did not vary significantly with level of data extraction and systematic review experience, in terms of either error rates or meta-analysis results. Overall, we found high error rates in all experience-level groups, which underscores the importance of adequate instruction, training, and care in data extraction for systematic reviews. The familiarity of the data extractors with the terminology and outcomes specific to a field of research may also play a role in the accuracy of data extraction.

Acknowledgments

The authors would like to acknowledge Joseph Lau, David Moher, and Lina Santaguida (on behalf of Parminder Raina) who provided expert input on the development of the participant experience classification scheme. We also thank Marilyn Josefsson and Kelley Bessette for their assistance in handling participant inquiries and records.

We gratefully acknowledge the Canadian Agency for Drugs and Technologies in Health, which provided the funding for this research.

Author contributions: J.H. participated in …

Cited by (50)

  • Resource use during systematic review production varies widely: a scoping review

    2021, Journal of Clinical Epidemiology
    Citation Excerpt:

    Because the goal of this scoping review was to descriptively map the resource use of SR production steps rather than to derive cause–effect relationships, we did not apply a formal certainty of evidence or risk of bias (RoB) assessment. We included 34 studies (32 quantitative primary studies, 1 qualitative primary study, 1 SR) that were published in 38 publications [9,23–59] (Fig. 2: PRISMA study flow). In the following sections, we first summarize the characteristics of the included studies.

  • Few studies exist examining methods for selecting studies, abstracting data, and appraising quality in a systematic review

    2019, Journal of Clinical Epidemiology
    Citation Excerpt:

    Mathes et al. [16] recently published a methodological review of data extraction errors and methods to increase data extraction quality. Our study identified the same three methods studies as Mathes et al. [35–37] and several others covering other extraction approaches. Notably, our findings and conclusions are consistent.

  • Meta-analysis of Nutrition Studies

    2019, Analysis in Nutrition Research: Principles of Statistical Methodology and Interpretation of the Results