
01.12.2017 | Research | Issue 1/2017 | Open Access

Implementation Science 1/2017

Does access to a demand-led evidence briefing service improve uptake and use of research evidence by health service commissioners? A controlled before and after study

Journal:
Implementation Science > Issue 1/2017
Authors:
Paul M Wilson, Kate Farley, Liz Bickerdike, Alison Booth, Duncan Chambers, Mark Lambert, Carl Thompson, Rhiannon Turner, Ian S Watt

Abstract

Background

The Health and Social Care Act mandated research use as a core consideration of health service commissioning arrangements in England. We undertook a controlled before and after study to evaluate whether access to a demand-led evidence briefing service improved the use of research evidence by commissioners compared with less intensive and less targeted alternatives.

Methods

Nine Clinical Commissioning Groups (CCGs) in the North of England received one of three interventions: (A) access to an evidence briefing service; (B) contact plus an unsolicited push of non-tailored evidence; or (C) unsolicited push of non-tailored evidence. Data for the primary outcome measure were collected at baseline and 12 months using a survey instrument devised to assess an organisation's ability to acquire, assess, adapt and apply research evidence to support decision-making. Documentary and observational evidence of the use of the outputs of the service was also sought.

Results

Over the course of the study, the service addressed 24 topics raised by participating CCGs. At 12 months, the evidence briefing service was not associated with increases in CCG capacity to acquire, assess, adapt and apply research evidence to support decision-making, individual intentions to use research findings or perceptions of CCG relationships with researchers. Regardless of intervention received, participating CCGs indicated that they remained inconsistent in their research-seeking behaviours and in their capacity to acquire research. The informal nature of decision-making processes meant that there was little traceability of the use of evidence. Low baseline and follow-up response rates and missing data limit the reliability of the findings.

Conclusions

Access to a demand-led evidence briefing service did not improve the uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives. Commissioners appear well intentioned but ad hoc users of research. Further research is required on the effects of interventions and strategies to build individual and organisational capacity to use research.