Background
Methods
Setting
| Intervention | Description | Service Provider | Recipients per yr (% BiBBS)^b | Main Outcome / Domain | Implementation | Before & After | Effectiveness | Proposed Method for Evaluation |
|---|---|---|---|---|---|---|---|---|
| **Antenatal Support** | | | | | | | | |
| Personalised Midwifery | Continuous midwife care | BDHFT Midwifery Services | 500 (300) | Maternal mental health | X | | X | Propensity score (control: BiBBS women receiving standard midwifery care) |
| Family Links Antenatal | Universal antenatal parenting skills programme | Local Authority | 200 (120) | Maternal mental health (PHQ-8) | X | X | | Pre- and post-study of difference in main outcome for participants |
| ESOL+ | English language course for women with little or no English during pregnancy | Shipley FE College | 90 (54) | Socio-emotional / language | X | | | Validation of logic model |
| **Antenatal & Postnatal Support** | | | | | | | | |
| Family Nurse Partnership^a | Intensive home visiting for vulnerable women aged < 25 | BDCT | 100 (60) | Monitoring only | | | | National evaluation currently underway |
| Baby Steps | Parent education programme for vulnerable parents | VCS – Action For Children | 100 (60) | Parent–infant relationship | X | X | X | Propensity score (control: BiBBS women receiving standard midwifery care) |
| Doula | Late pregnancy, birth and postnatal support for vulnerable women | VCS – Action For Community Ltd | 82 (50) | Implementation | X | X | | Implementation evaluation using monitoring data and interviews with women |
| HAPPY | Healthy eating and parenting course for mums with a BMI over 25 | VCS – Barnardo's | 120 (72) | BMI age 2 | X | X | X | Trial within Cohorts (TwiCs) (control: eligible BiBBS women not selected to receive HAPPY) |
| Perinatal Support Service | Perinatal support for women at risk of mild/moderate mental health issues | VCS – Family Action | 140 (84) | Maternal mental health (PHQ-9) | X | X | | Implementation evaluation |
| **Postnatal Support** | | | | | | | | |
| Breastfeeding (BF) support service | Universal practical and emotional support to breastfeeding mums and their families | VCS – Health For All (Leeds) | TBC | BF duration | X | | | Validation of logic model |
| **Early Years Support** | | | | | | | | |
| Home-Start | Peer support for vulnerable women | VCS – Home-Start | 45 (27) | Socio-emotional | X | | | Validation of logic model |
| Little Minds Matter | Support and nurturing of parent–infant relationships for those at risk of relationship problems | BDCT / Family Action | 40 (24) | Socio-emotional | X | | | Validation of logic model |
| HENRY | Universal group programme to improve healthy eating and physical activity in young children | VCS & Schools / HENRY | 186 (111) | BMI age 5 | X | X | X | Propensity score (control: matched BiBBS women not attending HENRY) |
| Incredible Years Parenting^a | Universal parenting programme for parents with toddlers | VCS – Barnardo's | 160 (96) | Socio-emotional | X | X | X | Propensity score (control: matched BiBBS women not attending) |
| Cooking for a Better Start | Universal cook-and-eat sessions | VCS – HENRY | 72 (43) | Implementation | X | | | Validation of logic model |
| Pre-schoolers in the Playground | Pre-schoolers' physical activity in the playground | Schools | 108 (65) | Physical activity / obesity | X | | X | Trial within Cohorts (cluster randomised) |
| Forest Schools | Outdoor play in the natural environment for young children and parents | VCS – Get Out More CIC | 90 (54) | Physical activity / obesity | | | X | Trial within Cohorts (cluster randomised) |
| Better Start Imagine | Book gifting and book-sharing sessions | VCS – BHT Early Education and Training | 1015 (609) | Parent attitudes and behaviours at 2 yrs | | | | Validation of logic model for sharing sessions; acceptability of book gifting in different cultures |
| I CAN Early Talk | Strengthening parents' and practitioners' knowledge in improving language development | VCS – BHT Early Education and Training | 115 (69) | Staff / parental knowledge | X | | | Implementation evaluation |
| Talking Together | Universal screening for language delay of 2-year-olds; in-home programme for parents of children at risk of delay | VCS – BHT Early Education and Training | 954 (572) | Language assessment at 3-month follow-up | X | X | X | Trial within Cohorts (control: waiting-list comparison group) |
Strategy and tool development
| Challenge | Possible Causes | Strategies |
|---|---|---|
| 1. Researchers, communities and stakeholders often have different priorities and timeframes for research outputs. | Differing areas of expertise. A lack of shared understanding of each group's main concerns. | • Identify and involve relevant communities and stakeholders at all stages. • Establish community and/or stakeholder groups and host consultation events. • Have a presence in the community and in practice by hosting regular outreach and education events. |
| 2. It can be difficult to accommodate the requirements of evaluation, implementation and delivery of the intervention within service design. | The demands of scientific rigour in data collection and intervention development may be at odds with the practical needs of delivery. | • Use and adapt the toolkits presented in this paper to aid service design and ensure the needs of commissioners, providers and evaluators are all considered in a structured and efficient way. |
| 3a. There may be gaps in the collection or entry of routine data required for evaluation. | The use of clinical data for evaluation is not usually considered by practitioners. | • Develop training sessions and manuals for practitioners to empower them to collect data that is useful for research. • Work with data teams to modify databases to make it easier to collect required data. • Work with commissioners to modify service-level specifications regarding data collection. |
| 3b. Services may use non-validated measures to assess outcomes. | Validated measures can be complex and burdensome to participants. Measures may not appear relevant to practitioners. | • Co-production/selection of validated measures involving practitioners, service providers, community members and researchers. |
| 3c. Organisations may be concerned about sharing data. | Organisations may have different interpretations of the same laws and acts. | • Build good relationships with key stakeholders. • Share findings promptly with organisations to support their practice and planning. • Develop consent and privacy notices with stakeholders and the community. |
| 4. It may be difficult to identify early successes and challenges in intervention implementation. | Service providers/commissioners capture too much information and/or information that is not appropriate for monitoring/evaluation. | • Use the toolkits presented in this paper to ensure the right data are collected. • Co-produce key progression criteria to allow early identification of success and/or areas of potential concern that can then result in adaptations to enhance performance. |
| 5. Service providers and commissioners are pressured to find quick answers, but rigorous evaluation can take much longer. | Differing areas of expertise and priorities. Many interventions require in-depth implementation evaluations before they are ready for effectiveness evaluations. Long-term evaluations can seem daunting to service providers. | • Use the evaluation framework presented in this paper to set expectations, ensure that the necessary groundwork is completed, and answer important implementation questions before embarking on effectiveness evaluations. |
Findings
Community and stakeholder engagement
Intervention design
Optimising the use of routinely collected data
Data quality
National Institute for Health and Clinical Excellence guidelines [1] recommend that the Whooley questions [2] are completed to assess maternal mental health, with a full mood assessment completed if the woman answers positively. In the health data system in Bradford we discovered that the code for the Whooley questions is present if the questions were asked, but the response to the questions is not recorded. This is very challenging for evaluations because we can only infer the outcome of the assessment from subsequent actions: for example, if no further action was taken we assume a negative response to the Whooley questions, but this might not be the case. We are working with the systems provider and NHS Trust data specialists to amend the Whooley data fields so that the actual response to the questions can be captured. Early access to and exploration of routine data is advisable to ensure that any data capture issues are identified and addressed.

1. National Institute for Health and Clinical Excellence. NICE guidelines [CG192]: antenatal and postnatal mental health: clinical management and service guidance. NICE, 2014. http://www.nice.org.uk/guidance/cg192/chapter/1-recommendations#recognising-mental-health-problems-in-pregnancy-and-the-postnatal-period-and-referral-2. Accessed 22 March 2018.
2. Whooley M, Avins A, Miranda J, et al. Case-finding instruments for depression: two questions are as good as many. J Gen Intern Med 1997;12.
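The inference rule described above can be sketched in a few lines of code. This is a hypothetical illustration only: the field names (`whooley_asked`, `mood_assessment_done`) are invented for the example and do not correspond to the actual clinical system codes.

```python
# Illustrative sketch (not the actual clinical data schema): the Whooley
# code records only that the questions were asked, so the response must
# be inferred from whether a follow-up mood assessment occurred.
import pandas as pd

records = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "whooley_asked": [True, True, False],
    "mood_assessment_done": [False, True, False],
})

def infer_whooley_response(row):
    """Infer the (unrecorded) Whooley response from subsequent actions."""
    if not row["whooley_asked"]:
        return "not asked"
    # Assumption flagged in the text: no follow-up implies a negative
    # response, although this may not always hold in practice.
    return "assumed positive" if row["mood_assessment_done"] else "assumed negative"

records["whooley_response"] = records.apply(infer_whooley_response, axis=1)
```

The point of the sketch is that the derived variable is an assumption, not an observation, which is exactly why amending the data fields to capture the actual response is preferable.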
Valid and meaningful outcome measures
In Bradford, Health Visitors complete a visit at 3–4 months to assess the mother–child relationship. National Institute for Health and Clinical Excellence guidance in the UK [1] recommends that the mother–child relationship is assessed but does not recommend any particular measure for use with babies; consequently the assessment in Bradford (and in other places across the UK) is based on subjective observations. To allow us to evaluate the impact of interventions on attachment we needed to implement an objective, validated measure. We have worked with Health Visitors, their managers and commissioners to pilot the use of the Maternal Postnatal Attachment Scale (MPAS) [2]. The pilot is exploring the utility and acceptability of this measure as a swift, inexpensive screening tool for Health Visitors. It will also examine whether the measure helps identify attachment issues for appropriate referrals. Based on the results of this pilot, the tool will either be implemented or adapted (and validated) using a co-design method to provide a feasible tool for practitioners and mothers.

1. National Institute for Health and Clinical Excellence. NICE guidelines [PH40]: social and emotional wellbeing: early years. NICE, 2012.
2. Condon JT, Corkindale CJ. The assessment of parent-to-infant attachment: development of a self-report questionnaire instrument. Journal of Reproductive and Infant Psychology 1998;16(1):57–76.
Stage 1, Review the intervention's logic model: Discuss the key programme outcomes identified in the logic model and review the current measures for these. This will help to identify whether all outcomes in the logic model are being measured in a way that will demonstrate the effectiveness of the intervention. Explore how additional measures could also be useful for practice.

Stage 2, Identifying the optimal measure: The research team should identify all relevant validated measures that map onto the outcomes in the logic model, with a focus on free or low-cost options with easy administration procedures. These measures are then shared with the organisation leads, presented to the team of practitioners for discussion, and a selection made for piloting. Where small changes to measures are possible without overtly affecting validity, these should be considered in response to practitioners' preferences.

Stage 3, Operational considerations: This will include implementing database changes and ensuring that reports can be generated from databases. Translation of measures should also be considered when working in ethnically diverse communities: relying on interpreters or bilingual practitioners to translate an outcome measure can result in inconsistent use of terminology and meanings in complex assessments, which may negatively influence the validity of the outcome.

Stage 4, Training of practitioners: This should involve careful planning of training at a time that is convenient for practitioners, as well as ensuring support and buy-in from senior managers. Training is likely to work best when delivered by someone regarded as an expert and fellow practitioner, and when supported by a clear and comprehensive manual.

Stage 5, Piloting: The introduction of the new measures should be negotiated with the team, with consideration of how the measures would affect the practitioners' work (e.g. time spent with clients, development of rapport with new clients, administration time) as well as evaluation needs (e.g. baseline measures, consistency of administration). A period of piloting the new measures to assess their feasibility and acceptability for practitioners and families should be completed. Throughout this time, the research team should check in with the team to consider ongoing changes and challenges.

Stage 6, Implementation: Feedback from the pilot and consideration of the quantitative performance of the measures should be completed before the measures are introduced as part of standard practice.
Data sharing and linkage
Bradford health and education organisations use local data to inform their planning and their work. The Better Start Bradford programme has encouraged a breakdown of data at ward level and a search for more up-to-date local data. In the past, ethnicity prevalence has been taken at a city-wide population level from the UK Census completed in 2011. The Better Start Bradford work gave us access to maternity records that indicated a different ethnicity prevalence for pregnant women and for young children in the Better Start Bradford areas than that reported in the Census. Similarly, the maternity data highlighted that one-third of pregnant women had little or no English. This has informed practice across the city and has led to a focus on enhancing service provision and accessibility for these women within the service design and monitoring processes.
Monitoring implementation
One of the Better Start Bradford interventions is a locally developed project that offers universal language screening of two-year-olds in the Better Start Bradford area, and an in-home intervention for those identified as at risk of language delay. The progression criteria were agreed with the service provider and commissioner. Early review of these criteria revealed a higher demand for the in-home intervention than originally anticipated, prompting an early review of the project's capacity and resources to ensure successful delivery. The reach criteria indicated challenges in engaging one particular ethnic group, which encouraged the service provider to focus engagement activities on this group and to ensure interpreting services were available.
Evaluation
One of the Better Start Bradford interventions is a personalised midwifery model adapted from the evidence-based continuity of care model [1]. The adaptation was the removal of continuity at delivery, owing to local concerns about the high burden on midwives. The removal of a key component means there is no existing evidence of implementation or effect for this intervention. The first stage of evaluation we undertook was an implementation evaluation examining the feasibility, fidelity and acceptability of the model using midwifery data, complemented by structured interviews with midwives and women who had recently received midwifery care. The implementation results helped us to demonstrate that the intervention was feasible and acceptable, and also helped to identify the key components and outcomes that can be rolled out to other midwifery teams in the area. The next step will be an effectiveness evaluation using routinely collected data to explore the benefits of continuity of care without the birth element. This evaluation will use propensity score matching within the BiBBS cohort. To assess whether an intervention is ready for an effectiveness evaluation we use an evaluability checklist [2], which considers numerous important factors including: availability of good-quality data, use of validated outcome measures, continuous good recruitment and engagement, delivery of the intervention with fidelity, and indication of promise from previous evaluations.

1. Sandall J, Soltani H, Gates S, Shennan A, Devane D. Midwife-led continuity models versus other models of care for childbearing women. Cochrane Database of Systematic Reviews 2016, Issue 4. Art. No.: CD004667. DOI: https://doi.org/10.1002/14651858.
2. Davies R. Planning Evaluability Assessments: A Synthesis of the Literature with Recommendations. Report of a study commissioned by the Department for International Development, 2013.
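As a rough illustration of the propensity score matching approach mentioned above, the sketch below estimates each woman's probability of receiving the intervention from baseline covariates and then pairs each intervention woman with the nearest-scoring control. It uses simulated data; the covariates, model and 1:1 nearest-neighbour matching are illustrative assumptions, not the BiBBS analysis plan.

```python
# Minimal propensity score matching sketch on simulated data.
# Covariates and matching choices are illustrative, not the BiBBS plan.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 3))            # simulated baseline covariates
treated = rng.integers(0, 2, size=n)   # 1 = intervention, 0 = standard care

# Step 1: propensity model, P(treated | covariates).
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Step 2: 1:1 nearest-neighbour matching on the propensity score.
controls = np.where(treated == 0)[0]
matches = {i: controls[np.argmin(np.abs(ps[controls] - ps[i]))]
           for i in np.where(treated == 1)[0]}
```

Outcomes would then be compared between the matched pairs; a real analysis would also check covariate balance after matching and consider calipers or matching without replacement.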
Implementation evaluations
Before and after evaluations
Effectiveness evaluations
Conclusion
- Members of the local community and service providers should be involved at each stage of intervention development and evaluation.
- Researchers, the local community and stakeholders need to work together and understand each other's worlds.
- Use and adapt the toolkits presented here [23] to aid intervention design and ensure the needs of commissioners, providers and evaluators are all considered.
- Conduct effective and focussed monitoring using progression criteria agreed by commissioners and providers. This will allow early identification of success and/or areas of potential concern that can then result in adaptations to enhance performance.
- Researchers should harness the use of routine outcome measures in research, and service providers should recognise the value and requirements of their data for evaluation as well as for clinical practice.
- Implement validated outcome measures through a co-production method to ensure they are valid, feasible and useful in practice within the intended population.
- Use the evaluation framework presented here [23] to set expectations, ensure that the necessary groundwork is completed and answer important implementation questions before embarking on ambitious effectiveness evaluations.