Background
Supporting Policy In Health with Research: an Intervention Trial (SPIRIT)
Intervention components | Subcomponents | Change principles underpinning each subcomponent
---|---|---
1. Audit, feedback and goal setting | a. Feedback forum | • Engages agencies in owning and driving the program • Provides feedback about current practice • Provides a clear rationale for change
 | b. Intervention selection | • Develops agreement about concrete and specific change goals • Is tailored to focus on the agency's priorities
 | c. Identification of other strategies | • Develops agreement about concrete and specific change goals
 | d. Mid-intervention feedback | • Engages agencies in owning and driving the program • Monitors and provides feedback about change during the intervention
 | e. SPIRIT newsletter | • Monitors and provides feedback about change during the intervention
2. Leadership program | a. Supporting organisational use of evidence | • Addresses systems, operations, structures and relations • Engages agencies in owning and driving the program • Develops agreement about concrete and specific change goals
 | b. Leading organisational change | • Provides self-education opportunities and access to resources • Recognises the expertise of participants • Is interactive with a focus on shared reflection and problem solving • Uses credible, dynamic experts as presenters
3. Organisational support for research | a. Quarterly email endorsement from CEO | • Engages agencies in owning and driving the program • Uses champions to model and promote the use of evidence from research • Monitors and provides feedback about change during the intervention
 | b. Access to WebCIPHER | • Provides self-education opportunities and access to resources • Uses champions to model and promote the use of evidence from research
 | c. Resources for improving agency's use of research | • Provides self-education opportunities and access to resources • Provides opportunity for rehearsal and practice
4. Opportunity to test systems for accessing research and reviews (brokered services) | a. Brokered commission of a rapid review, evaluation plan or linked data analysis | • Provides opportunity for rehearsal and practice • Is tailored to focus on the agency's priorities • Recognises the expertise of participants • Provides self-education opportunities and access to resources
5. Research exchange | a. Interactive forum | • Is tailored to focus on the agency's priorities • Recognises the expertise of participants • Is interactive with a focus on shared reflection and problem solving • Provides self-education opportunities and access to resources • Uses champions to model and promote the use of evidence from research • Uses credible, dynamic experts as presenters
 | b. Summary of systematic reviews | • Is tailored to focus on the agency's priorities • Provides self-education opportunities and access to resources
6. Educational symposia for staff | a. Valuing research symposium | • Recognises the expertise of participants • Is interactive with a focus on shared reflection and problem solving • Provides self-education opportunities and access to resources • Uses champions to model and promote the use of evidence from research • Uses credible, dynamic experts as presenters
 | b. Agencies can choose two symposia from: Access to research / Appraising research / Evaluation / Working with researchers | • Recognises the expertise of participants • Is interactive with a focus on shared reflection and problem solving • Provides self-education opportunities and access to resources • Uses champions to model and promote the use of evidence from research • Uses credible, dynamic experts as presenters
Process evaluation of interventions to increase the use of research in health policy and program agencies
Aims and objectives
Methods/Design
Process evaluation design
Process evaluation domains | Research questions | Core information sought | Data type | Data source | Record kept
---|---|---|---|---|---
Domain 1: Implementation | 1. How was the intervention implemented in each agency? Including: a. What components were delivered? b. To what extent were the essential elements implemented? | Intervention components selected and how agencies asked that they be tailored | Updates | SPIRIT staff implementing the intervention | Fieldnotes and memos; data indexed in NVivo
 | | What was delivered in each agency, including: which components; delivery format; provider recruitment and preparation; learning materials; the fidelity with which essential elements were delivered; any changes to the plan; and follow-up activities | Structured observation | Intervention sessions | Completed checklists; data input in spreadsheet
 | | | Email information | SPIRIT trial coordinator's records | Collated spreadsheet data on email delivery
 | | | Knowledge brokering records | SPIRIT staff delivering brokered services | Brokered service assessment form
Domain 2: Participation and response | 2. How did people interact with the intervention? What were their levels of participation and satisfaction? | Session participation and responses, including: roles of attendees; proportion of invitees who attended; nature of participation (types and extent of interaction) | Pre-session sign-in and consent process | Participants attending each session | Completed sign-in/consent sheets; numbers and roles input in spreadsheet
 | | | Semi-structured and structured observation | Intervention sessions | Completed checklist, fieldnotes and memos; data indexed in NVivo
 | 3. What effects that are not captured by the outcome measures did the intervention have (including unexpected effects)? | Participants' evaluation of intervention sessions | Self-reported evaluation feedback | Participants attending each session | Completed feedback forms; data input in spreadsheet
 | | | Informal conversations after sessions | Liaison Person and ad hoc participants | Fieldnotes; data indexed in NVivo
 | | How participants and the organisational system responded to the intervention overall (including unexpected effects) | Interviews | Purposively sampled participants, Liaison Person and CEO | Audio recordings, transcripts and memos; data indexed in NVivo
 | | | Interviews, meetings and informal conversations | SPIRIT intervention staff and providers | Fieldnotes and memos; data indexed in NVivo
Domain 3: Context | 4. What was the context of the agencies in which the intervention was implemented? | Immediate characteristics of session delivery context (site, facilities, etc.) | Structured observation | Intervention sessions | Completed checklist and fieldnotes; data input in spreadsheet
 | | Organisational context: (i) agency culture; (ii) agenda-setting and prioritisation; (iii) leadership styles and perceptions of leaders; (iv) how research and other information is valued, accessed and used; (v) barriers and enablers to using research; (vi) other contextual factors that may affect outcomes | Semi-structured observation | Intervention sessions | Audio recordings, fieldnotes and memos; data indexed in NVivo
 | | | Interviews | Purposively sampled participants | Audio recordings, transcripts and memos; data indexed in NVivo
 | | | Interviews, meetings and informal conversations | SPIRIT staff implementing the intervention | Audio recordings, fieldnotes and memos; data indexed in NVivo
Across domains | 5. How might the relationships between the program, the people and the context in each agency have shaped variations in these effects? | Analytic synthesis of above data | | |
 | 6. What lessons can we derive from this study that might be relevant for other interventions and settings? | | | |
Data collection
Domain 1: Implementation
Intervention implementation questions used to guide the documentation of intervention implementation | SPIRIT team: engagement and design | SPIRIT team: implementation | Intervention session observations | Participant feedback | Participant interviews
---|---|---|---|---|---
Intervention design | | | | |
1. What was the planned number & length of intervention sessions/activities, and their distribution and duration over time? Who were the targeted participants? | ✓ | | | |
2. What theoretical model/theory-of-change were the strategies based on? | ✓ | | | |
3. What essential elements were to be delivered in each component? | ✓ | ✓ | | |
Intervention providers | | | | |
4. What selection process was used to identify providers? What were the credentials of providers? | ✓ | | | |
5. What training, guidance or information did providers receive (what was the content and format, and were there any changes over time)? | | ✓ | | |
Recruitment | | | | |
6. How were agencies recruited as participants? | ✓ | | | |
7. What was the nature of the relationship between these agencies and the researchers/institutions involved in the trial?* | ✓ | | | | ✓
Tailoring | | | | |
8. Which intervention components did agency leaders select? | ✓ | ✓ | | |
9. What reasons did they give for that selection (what goals were they targeting)? | ✓ | ✓ | | |
10. What, if any, other goals and strategies were nominated by the agency leaders for supporting research use? | ✓ | ✓ | | | ✓
Intervention delivery | | | | |
11. What method was used to specify and direct implementation? | ✓ | ✓ | | |
12. How long was each session? | | | ✓ | |
13. What was the type, number and distribution of intervention sessions/activities? | | ✓ | | |
14. To what extent were the essential elements delivered? How were they monitored/measured? | | | ✓ | |
15. Were there any planned changes made to R4P while it was in progress? Why? | ✓ | ✓ | ✓ | |
16. Were there any unplanned changes? What happened? | | ✓ | ✓ | |
Context | | | | |
17. What was the culture and overarching context of the participating agencies at the start of the intervention?* | | ✓ | | | ✓
18. Were there any changes/initiatives during the intervention that may have affected responses to the intervention?* | | ✓ | | | ✓
19. What were the immediate contextual conditions around intervention sessions? | | ✓ | ✓ | |
Participation | | | | |
20. Who was invited to participate (numbers and professional roles)? | | ✓ | ✓ | |
21. How many potential participants attended sessions? Who were they? | | | ✓ | |
22. What proportion of targeted participants attended (approximately)? | | | ✓ | |
23. Did key people (agency leaders or topic specialists) attend? | | | ✓ | |
Responses to intervention sessions | | | | |
24. How did people participate in intervention sessions? | | | ✓ | ✓ | ✓
25. How satisfied were participants with sessions and SPIRIT overall?* | | | ✓ | ✓ | ✓
26. Did participants identify or anticipate any changes in using research in response to intervention sessions/activities?* | | | ✓ | ✓ | ✓
Intervention improvements | Analysis of data above | | | |
27. What improvements to the intervention design and/or implementation are suggested by this data?* | | | | |
28. What lessons might be relevant to other interventions and settings?* | | | | |