Background
Using research in policy-making
Building capacity to use research in policy-making
Aims
Our approach: a realist scoping review
Mechanisms, process effects and outcomes
Methods
Focus of the review: what and who?
Search strategy
Inclusion criteria
- Interventions. The study reported on an intervention designed to improve or increase policy-makers’ capacity to use research in their work. We took a broad view of capacity-building that included strategies for supporting research use as well as strategies for advancing it, so studies were included if they were designed to enhance access to research; skills in appraising research, generating research (commissioning or conducting it) or applying research; organisational systems for developing and supporting research use; and/or connections with researchers (including partnership work and co-production). Any strategy that employed one or more capacity-building activities aimed at individuals or at organisational structures and systems was eligible, irrespective of whether the strategy was part of a research study, initiated/implemented by policy agencies, or a combination of the two. Studies were excluded if they focused on policy development capabilities in which the use of research was a minor component (e.g. [83]); evaluated one-off attempts to get policy-makers to use research for a specific policy initiative; or focused on specific fields other than health such as education or climate change (e.g. [84]). Studies that addressed the use of research by policy-makers in general were included (e.g. [41]).
- Populations. The intervention targeted policy-makers, by which we mean either (1) non-elected civil servants working in some aspect of policy or program development, funding, implementation or evaluation within government agencies such as Departments of Health, and/or (2) senior health services decision-makers, e.g. executives in regional or federal health authorities who have responsibility for large-scale service planning and delivery, and/or (3) elected government ministers. We included studies in which participants comprised policy-makers and other groups (e.g. frontline clinicians, NGO officers or researchers) provided that some disaggregated data for the different groups were reported, but excluded those in which intervention effects/outcomes were reported only in aggregated form, for instance, where health staff at all levels took part in the intervention but data for senior health services decision-makers were not reported separately from those of frontline staff (e.g. [25, 85‐91]).
- Study design. The intervention was evaluated. This includes process evaluations and reports of ‘soft’ proximal outcomes such as satisfaction and awareness, which we conceptualise in our analysis as mechanisms. As mentioned, opportunistic evaluations of initiatives that were planned outside the auspices of a research study were included, but studies were excluded if they described achievements or ‘lessons learnt’ without explaining how such information was captured as part of an evaluation strategy (e.g. [32, 84, 92‐94]).
- Publication. The evaluation results were published in English between 1999 and 2016. This date range was judged by the authors as likely to encompass the vast majority of relevant publications in this field, and included the earliest relevant study of which we were aware [95].
- Settings. We included studies from all countries, including low- and middle-income countries. These settings are likely to differ considerably from those of high-income countries (e.g. less developed infrastructure, fewer resources, different health priorities and a poorer local evidence-base [96]) but, together, the studies provide insights into creative interventions and produce findings that may have implications across the country income divide in both directions.
- Quality. Studies were excluded if they were “fatally flawed” according to the appraisal criteria used in critical interpretive synthesis [97]. Following Dixon-Woods et al. [97], we applied a low quality threshold prior to analysis because of the diversity of methodologies in our final sample, and because our method of synthesising data would itself include judgements about the credibility and contribution of studies.
Analysis
Concept | Definition of concept in this review | How this was identified in the analysis |
---|---|---|
Intervention strategy | Intervention strategies work by changing participants’ reasoning and resources; importantly, they do not work in a vacuum, but interact with contextual features to generate change | The intervention strategies listed in the results were identified from authors’ descriptions of attempts to support or advance policy-makers’ capacity to use research in their work (listed by study in Additional file 2) |
Context | Context is any condition that affects how people respond to intervention strategies (i.e. if and how mechanisms are activated); they include settings, structures, circumstances, and the attributes and attitudes of those delivering and receiving the intervention | Contextual features were identified primarily from authors’ accounts of intervention settings and circumstances before and during implementation. On occasion, they were inferred from information about responses to the intervention. In the SCMO tables that follow, we focus on aspects of context that relate specifically to each mechanism; a more general overview of context is provided first |
Mechanism | Mechanisms are how an intervention works; they are responses to the resources, opportunities or challenges offered by the intervention. Mechanisms are activated to varying extents (or not at all) depending on interactions between intervention strategies and contexts. Although mechanisms are not observable, their impacts are, so they can be inferred, tested and refined | With one arguable exception [101], none of the studies explicitly identified or tested causal mechanisms so we inferred mechanisms from authors’ accounts of how the intervention was conceived, designed, delivered and received (i.e. how it was meant to function and how it actually functioned); this was supplemented with similar information from some of the non-eligible studies that were identified in this search, and from the wider theoretical and implementation literature. Mechanisms posited in the results tables that follow include hypotheses about (1) how each intervention strategy worked (where mechanisms were activated successfully) and (2) how strategies would have worked if they had been appropriate in that context (where mechanisms were not activated and their absence may account for poor outcomes) |
Outcome | These are intended or unintended impacts generated by mechanisms; as described in the previous section, in this review outcomes may be proximal (process effects) or more distal study outcomes | Outcomes of interest were explicit in most of the reviewed studies; where they were vague, we inferred them from the studies’ research questions, interview foci and reported results (described in Additional file 2) |
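The strategy–context–mechanism–outcome (SCMO) coding defined above can be pictured as a simple record per hypothesis. The following is a minimal illustrative sketch in Python; the class name, field names and the example values are hypothetical conveniences for exposition, not drawn from any reviewed study or from the authors' actual coding instrument:

```python
from dataclasses import dataclass


@dataclass
class SCMOConfiguration:
    """One strategy-context-mechanism-outcome hypothesis, as coded in a realist analysis."""
    strategy: str                      # the intervention strategy, e.g. a dissemination activity
    contexts: list[str]                # conditions shaping how people respond to the strategy
    mechanism: str                     # the inferred (unobservable) causal response
    outcomes: list[str]                # proximal process effects or more distal study outcomes
    mechanism_activated: bool = True   # whether the evidence suggested the mechanism fired


# Hypothetical configuration for illustration only
example = SCMOConfiguration(
    strategy="Disseminating tailored policy briefs",
    contexts=["briefs matched agency priorities", "staff had time to read them"],
    mechanism="Perceived relevance increases engagement with the findings",
    outcomes=["greater awareness of the evidence", "use of findings in planning"],
)
```

Representing configurations this way makes explicit that a mechanism is hypothesised relative to particular contexts, and that the same strategy may appear in several configurations with different contexts and outcomes.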
Results
Search results
Study design
Domains of capacity-building and support
Research utilisation domain | Intervention strategies (and number of studies that used it) |
---|---|
Access to research | 1. Providing access to research articles or syntheses via an online database (5) 2. Disseminating tailored syntheses summaries or reports, including policy briefs (7) 3. Commissioning research and reviews (2) 4. Seminars or other forums in which research findings are presented (4) 5. Facilitated access using a knowledge broker or other intermediary (3) |
Skills improvement | 6. Skills development workshops (10) 7. Intensive skills training programs (4) 8. Training or support for managers in championing and modelling research use (4) 9. Mentoring (includes using knowledge brokers to build skills) (7) 10. Goal-orientated mentoring (with presentations or assessment) (4) |
Systems improvement | 11. Improving infrastructure, e.g. library, new research portals, data sharing software (5) 12. Improving organisational tools, resources and processes, e.g. procedures, toolkits, knowledge management protocols, funds for commissioning research (2) 13. Workforce development, e.g. research-related positions and incentives (1) 14. Establishing internal research support bodies, e.g. research units and committees (3) |
Interaction | 15. One-off or periodic interactive forums, e.g. roundtables, cross-sector retreats, policy dialogues (4) 16. Platforms for ongoing interactivity, e.g. community of practice, cross-sector committees (4) 17. Collaboration in the development of a research report or policy brief/dialogue (2) 18. Partnership projects: research co-production (3) |
No. | Study reference (chronological order) | Intervention strategies as listed in Table 2 | Domain: Access | Domain: Skills | Domain: Systems | Domain: Interaction | Design: Needs-based tailoring | Design: Multi-component |
---|---|---|---|---|---|---|---|---|
1. | Dobbins et al., 2001 [95] | 2 | ✓ | |||||
2. | Pappaioanou et al., 2003 [102] | 1, 7, 8, 9, 11, 12 | ✓ | ✓ | ✓ | ✓ | ✓ | |
3. | Kothari et al., 2005 [91] | 2, 17 | ✓ | ✓ | ||||
4. | Dobbins et al., 2009 [106] | 1, 2, 5 | ✓ | ✓ | ✓ | |||
5. | Wehrens et al., 2010 [138] | 4, 18 | ✓ | ✓ | ✓ | ✓ | ||
6. | Brownson et al., 2011 [202] | 2 | ✓ | ✓ | ||||
7. | Campbell et al., 2011 [29] | 3, 5 | ✓ | ✓ | ||||
8. | Rolle et al., 2011 [101] | 7, 8, 9, 10 | ✓ | ✓ | ||||
9. | Peirson et al., 2012 [104] | 3, 6, 11, 12, 13, 14 | ✓ | ✓ | ✓ | ✓ | ✓ | |
10. | Uneke et al., 2012 [132] | 6 | ✓ | |||||
11. | Dagenais et al., 2013 [129] | 2, 4, 15 | ✓ | ✓ | ✓ | |||
12. | Hoeijmakers et al., 2013 [127] | 6, 11, 18 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
13. | Waqa et al., 2013 [134] | 1, 6, 9, 10, 11 | ✓ | ✓ | ✓ | ✓ | ✓ | |
14. | Kothari et al., 2014 [62] | 16, 18 | ✓ | ✓ | ||||
15. | Traynor et al., 2014 [133] | 1, 2, 5, 6, 8, 9, 12 | ✓ | ✓ | ✓ | ✓ | ||
16. | Brennan et al., 2015 [110] | 1, 6, 16 | ✓ | ✓ | ✓ | ✓ | ||
17. | Dwan et al., 2015 [128] | 4, 15 | ✓ | ✓ | ✓ | |||
18. | Shroff et al., 2015 [103] | 2, 6, 14, 15 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
19. | Uneke et al., 2015 [130] | 6, 9, 10 | ✓ | ✓ | ||||
20. | Uneke et al., 2015 [137] | 6, 7, 8, 9, 10, 16, 17 | ✓ | ✓ | ✓ | |||
21. | Hawkes et al., 2016 [41] | 4, 6, 11, 14, 15 | ✓ | ✓ | ✓ | ✓ | ✓ | ✓ |
22. | Langlois et al., 2016 [131] | 7, 9, 16 | ✓ | ✓ | ✓ | ✓ |
Intervention participants and settings
Program theories
Intervention strategies, contexts, causal mechanisms and outcomes
Overarching contextual considerations and their implications
- Research characteristics: Policy-makers’ use of research was influenced by the degree to which they were able to obtain research that was relevant, applicable and easy to read. This suggests the need for strategies that increase the availability and accessibility of fit-for-purpose research findings. Credibility of research and researchers is also a consideration.
- Individual characteristics: Policy-makers’ use of research was affected by their existing research-related knowledge, skills and self-confidence, and by their views about the value of research. The former clearly indicates the need for task-appropriate skills development and support, but the latter suggests that, in some cases, it will be necessary to influence beliefs.
- Interpersonal characteristics: Although neither community is homogeneous, there were common differences between policy-makers and researchers in terms of language, values, expectations and incentives. This suggests the need for strategies that either bridge these communities or form more cohesive connections between them, potentially blurring their boundaries.
- Organisational characteristics: Research use in policy agencies was shaped by organisational culture. Agency remits, resources and constraints further influenced how research was prioritised and used. This suggests the need to activate structural mechanisms that increase expectations of, and facilitate, research use in day-to-day practice. Underlying values and assumptions may need to be influenced. Leadership by managers and opinion leaders is likely to be key.
- Environmental characteristics: Policy-making environments were complex, political and responsive, affecting what research could be used for what purposes, and the time available for this. The way that research is (or can be) interpreted in this argumentative arena is likely to determine its role in policy processes; thus relevance, applicability and credibility are not fixed research characteristics but are determined in relation to circumstances. This suggests that tailored research (in terms of both methodology and presentation), rapid production of findings and responsive dialogue with researchers may be valuable. Methods for supporting this are likely to include commissioning and/or research–policy partnerships and/or internal generation of research by policy agencies.
Access to research (Table 4)
Skills improvement (Table 5)
Systems improvement (Table 6)
Interaction with researchers (Table 7)
Whole-of-intervention design strategies (Table 8)
Intervention strategies | Contextual factors | Hypothesised mechanisms | Potential process effects/outcomes^a |
---|---|---|---|
Needs-based tailoring | Each intervention site has unique features that interact with the implemented strategies [79]; policy-makers have existing strengths and skills, and their needs will vary | • Tailoring based on high quality needs/situational analysis maximises intervention compatibility, including its ability to build on local strengths and tackle areas of real need • Where participants collaborate in tailoring they feel respected/heard and have increased ownership of and investment in the intervention outcomes | • Greater acceptance of the intervention’s local fit and utility • Active support of the intervention and its goals |
Multi-component design | Research use is multi-factorial; capacity exists in different forms and at different levels (individual, interpersonal, organisational and wider environmental), and supports and constraints in one area or at one level affect responses in others | • Strengthened capacity in one area supports capacity growth in other areas • Strategies interact synergistically to shape a conducive environment for research use | • Greater, and more sustainable, change in using research |
Discussion
Access
Skills
Systems
Interaction
Key considerations in intervention design
Implications for future interventions
- Understanding research use in context: Several studies in our review concluded that they had insufficient understanding of the local practices and contexts that were being addressed. This aligns with wider arguments that we continue to have a limited understanding of how policy-makers engage with research ideas and integrate them with other forms of evidence [26], which affects how we conceive of and design interventions, and interpret findings. Designing interventions that are “close to practice” [182] in terms of fit with local research needs and context seems to be essential, but we may also require further investigation of the ‘irrational’ (aka differently rational [6]) and non-linear uses of research that dominate the use of research in policy more generally. One of the reviewers of this paper pointed out that the consistency of contextual factors between our findings and other studies (which we attribute here to enduring contextual regularities in policy-makers’ use of research) may, in fact, be an artefact of an enduring research paradigm. The reviewer states, “it could also be because much of the research into evidence use is conducted from identical premises (more research should be used) and using identical methods (surveys or interviews asking why more research isn't used).” This is an important reminder of the need to ensure that the theories, models and methods we use in investigating research use are sensitive to real world policy practices and contexts rather than perpetuating a ‘barriers and enablers’ framework that risks masking complex interactions, identities and processes [183‐185].
- Researchers’ capacity: Research-informed policy-making requires that researchers have the skills to produce policy-relevant research, present findings accessibly and work productively with policy-makers, but these skills are often lacking [6]. Six of the reviewed studies attempted to build some aspect of researchers’ capacity in conjunction with that of policy-makers [62, 127, 131, 132, 137, 138]; however, a cursory scan of the literature suggests that capacity-building for researchers in this field is less developed than for policy-makers, with very few intervention trials; the onus remains on policy-makers. This appears to be a gap that would benefit from further attention. It would likely require that policy-makers be involved in designing the content of such interventions. Many of the mechanisms suggested in our analysis are likely to be relevant.
- Leadership: Findings reinforced the role of impassioned and strategic leadership as a crucial driver of organisational change (e.g. [103, 104, 128, 133]), including leadership by high profile external experts in the wider policy environment [102]. However, there seemed to be few attempts to target or harness internal leadership in intervention activities. The pivotal role of organisational leaders, champions and opinion leaders in driving change is well established both in practice settings [57, 186‐188] and within policy agencies [18, 44, 50, 51], but we know little about how leadership dynamics within hierarchical and procedure-focused policy agencies function and effect change in relation to research utilisation capacity-building. Recent arguments about the strengths of distributed or collective leadership for knowledge mobilisation, including cross-sector partnerships, suggest that our conceptualisation of leadership may need to expand [32, 100, 170]. This area could benefit from further investigation.
- Audit and feedback: With the exception of Peirson et al. [104], none of the studies reported using organisational-level progress feedback as a strategy, and none used audit and feedback – a process that gathers information about current practice and presents it to participants to facilitate learning, develop goals, create motivation for change and focus attention on change tasks [189]. Audit and feedback is well established as a catalyst for professional practice change [189], including the uptake and use of research [144, 190]. There is mixed evidence for its effectiveness, but a recent systematic review found that it generally leads to small yet potentially important improvements in professional practice [191], and it may be more successful than change techniques such as persuasion [192]. It seems a potentially valuable strategy within research utilisation interventions, particularly in the light of systems-influenced implementation frameworks that emphasise the need to establish performance feedback loops in organisational change processes [10, 100, 108].
- Commissioning research syntheses: Findings of the two studies that looked at commissioned research syntheses suggest that the value policy-makers attribute to syntheses is affected by the commissioning process and/or their involvement in the conduct of the review [29, 91]. A contribution mapping review of 30 studies found that research was most likely to be used when it was initiated and conducted by people who were in a position to use the results in their own work [193]. However, an evaluation of health policy-makers’ use of a briefing service provided by academics found that access to the service did not improve the policy-makers’ capacity, nor their uptake and use of research [194]. What critical factors are at play in these scenarios? We would benefit from greater understanding of how commissioning models can best support policy-makers’ capacity development and use of research, including the contribution that researchers and knowledge brokers can make.
- Sustainability: The concept of capacity-building is linked to that of sustainability [70], but sustainability itself was seldom mentioned in the reviewed studies. As bureaucracies, policy organisations are characterised by their adherence to protocol, but there appeared to be few attempts to embed strategies within existing work systems (with some notable exceptions, e.g. [104, 134]). The need for continuous active participation in knowledge mobilisation practices [70] was evident in few studies. Several tried to embed new knowledge and skills in practice (e.g. via mentored assessment), but this targets individual knowledge rather than organisationally owned processes, which is an important consideration in organisations known for their high turnover [40]. Sustainability may depend on different mechanisms from those posited here. For example, self-efficacy may be critical for initiating new patterns of behaviour, but have a limited impact on the decision to maintain that behaviour over time [195]. Greater consideration of organisational learning and the use of measures to prevent capacity initiatives from being ‘washed out’ [196] may be required. Longer-term evaluation would help, but organisational change, like relationship building, is a lengthy and evolving process, often taking years to reach intended goals [62, 104].
- Underpinning assumptions about research-informed policy-making: Despite the lack of clear theoretical drivers in most studies, the conceptual basis of attempts to address research use in policy-making seems to be maturing – rational linear models of research are being supplanted by ideas from political science, organisational change, systems thinking and other bodies of work that disrupt the evidence-based policy ideal. The field is also making use of opportunities to evaluate capacity-building endeavours that are initiated outside of academia, using creative methods to learn from complex real world projects and refusing to be cowed by the entanglement of change strategies, process indicators and outcomes. However, there is still evidence of “theoretical naivety” as described by Oliver et al. [26]; for example, focusing on research as exemplary evidence rather than on policy-makers’ use of diverse information and ideas within which research must function; the belief that a reconfiguration of barriers and enablers to accessing research would lead to greater impact; and a general lack of understanding about policy processes. Oliver et al. [26] provide advice about the direction that future research can take to address these issues.