Background
Methods
The PEACH™ program
Translation of evaluation from PEACH™ RCT to PEACH™ QLD
| Stage of translation | Stage 1: Randomised controlled trial (CONSORT): PEACH™ RCT [7] | Stage 2: Small-scale community trial (COMMUNITY I): PEACH™ IC [33] | Stage 3: Large-scale community intervention (COMMUNITY II): PEACH™ QLD |
|---|---|---|---|
| n participants | n = 169 families; n = 169 children | n = 62 families; n = 78 children | n = 919 families; n = 1122 children |
| n groups | n = 6 | n = 8 | n = 105 |
| n facilitators | n = 3 delivered sessions | n = 12 delivered sessions; n = 54 trained | n = 52 delivered sessions; n = 80 trained |
| Program overview and setting | ▪ Single-blinded RCT with 2 intervention groups: (1) 12 × parenting (P) and healthy lifestyle (HL) group sessions, or (2) 8 × HL group sessions; 90- to 120-min group sessions, both with 4 × one-to-one phone calls, delivered over 6 months with tapered frequency (weekly, fortnightly, then monthly) ▪ Multi-site (Sydney and Adelaide) | ▪ 10 × 90-min fortnightly face-to-face group HL sessions incorporating P skills, with 3 × one-to-one phone calls over 6 months ▪ South Australia | ▪ 10 × 90-min face-to-face group HL sessions incorporating P skills, with 3 × one-to-one phone calls over 6 months ▪ Sessions 1–9 initially held fortnightly, changed to weekly during program delivery ▪ Queensland |
| Evaluation | | | |
Trial registration | ✓ ACTR00001104; ACTRN12606000120572 | ✗ | ✓ ACTRN12617000315314 |
Program delivery (logistics) | ✓ | ✓ | ✓ |
Demographics | ✓ family, parent and child | ✓ family, parent and child | ✓ family, parent and child |
Anthropometry | ✓ parent and child | ✓ child | ✓ child |
Child diet | ✓ parent-reported | ✓ parent-reported | ✓ parent-reported |
Child activity | ✓ parent-reported | ✓ parent-reported | ✓ parent-reported |
Child quality of life | ✓ parent- and child-reported | ✗ | ✓ child-reported |
Parenting | ✓ parent-reported | ✗ | ✓ parent-reported |
Program satisfaction | ✓ parent-reported | ✓ parent-reported | ✓ parent- and child-reported |
Child body image | ✓ child-reported | ✓ child-reported | ✗ |
Follow-up | ✓ up to 5 years | ✓ only to 6 months | ✓ only to 6 months |
Program fidelity | ✓ independently assessed from audio recordings of sessions | ✓ informal only | ✓ facilitator-reported |
Facilitator training/delivery | ✓ | ✓ pre- and post-training, post-delivery | ✓ pre- and post-training, post-delivery |
Clinical biochemistry | ✓ child | ✗ | ✗ |
Follow-up parent interviews | ✓ 12 months | ✗ | ✗ |
Service-level evaluation | ✗ | ✓ | ✓ |
System-level evaluation | ✗ | ✗ | ✓ |
PEACH™ QLD

| RE-AIM dimension and definition | Level (source) of data | Data collected (O/I/P)ᵃ | Tool used/items generated | Further detail and references |
|---|---|---|---|---|
| REACH: proportion of the target population that participated in the intervention | Individual (Family) | Number of families enrolled (P) | Purpose-developed database | Recruitment and enrolment databases developed; a unique nine-digit ID was allocated at enrolment |
| | | Family demographics (P) | Questionnaire | Family demographics include family composition, parent education, ethnic background and income level; adapted from a previously used data collection form [34] |
| EFFICACY/EFFECTIVENESS: success rate if implemented as in the guidelines, defined as positive outcomes minus negative outcomes | Individual (Facilitator) | Changes in knowledge, skills and confidence (I) | Purpose-developed questionnaire | Self-rated on a Likert scale for the practice areas of family-focussed weight management, lifestyle support and behaviour modification |
| | | Satisfaction with program training and resources (P) | Purpose-developed questionnaire | Parent facilitator satisfaction with the program training workshop and program resources was collected pre- and post-training, and post-delivery |
| | Individual (Child) | Child anthropometric measures (O) | Standardised measures for weight, height and waist circumference | |
| | | Parent-reported child diet (O) | Children's Dietary Questionnaire (CDQ) | CDQ scores for 1) fruits and vegetables; 2) sweetened beverages; 3) fat from dairy products; 4) discretionary foods; and 5) food behaviours |
| | | Core food group serves for 1) fruits; 2) vegetables; 3) grains; 4) meats and alternatives; and 5) dairy and alternatives (O) | Ten-item, parent-completed questionnaire | Assesses intake of the five core food groups of the Australian Guide to Healthy Eating (AGHE); validated in a sample of 45 [31] |
| | | Parent-reported child physical activity and sedentary behaviours (O) | Children's Leisure Activities Study Survey (CLASS) | CLASS questionnaire [42], modified to focus on active pastimes and screen time only; provides a quantitative estimate of children's daily time spent in moderate, vigorous and total physical activity, and in screen-based sedentary activities. The parent-completed version was used because it is equal in validity and reliability to the child-completed questionnaire [42] and allowed consistent administration of the diet and PA surveys |
| | | Child-reported health-related quality of life (I) | Child Health Utility 9D (CHU9D) | |
| | | Child program satisfaction (P) | Purpose-developed group activity and questionnaire | Children's views of their group sessions were captured via a brief questionnaire and informal group discussion in the last session |
| | Individual (Family) | Parenting self-efficacy (I) | Parenting self-efficacy scale | Four-item questionnaire from the Longitudinal Study of Australian Children [45] |
| | | Parent barriers, confidence and health beliefs (I) | Purpose-developed questionnaire | Five-item purpose-developed tool assessing parent beliefs about their child's health, and perceived (pre-program) or actual (post-program) barriers to changing their child's and family's health. A further 3 items ask parents to report their confidence to 1) make healthy changes to child and family eating and activity patterns; 2) set limits regarding child food and eating; and 3) set limits regarding child activity/inactivity patterns. Conceptually based on the Health Belief Model [46, 47] |
| | | Attendance rates (P) | Program sign-in sheets | Purpose-developed sign-in sheets for parents at each session |
| | | Satisfaction with program and materials (P) | Purpose-developed questionnaire | Completed by parents at the end of program delivery; includes satisfaction with program delivery and the changes the family has made during the program |
| ADOPTION: proportion of settings, practices and plans that will adopt this intervention | Organisation (Facilitator) | Number of facilitators trained (P) | Purpose-developed database and questionnaire | PEACH™ parent facilitator training logs |
| | | Demographics (facilitators and services) (P) | | Facilitator descriptors include gender, age, education, current employment status, and experience in adult and child weight management in groups and one-to-one |
| | | Number of health services/other organisations engaged (P) | | Purpose-developed database containing details of each PEACH™ group, including organisational setting |
| | | Stakeholder interviews (P) | Purpose-developed interviews | Semi-structured interviews with facilitators, organisations and stakeholders |
| IMPLEMENTATION: extent to which the intervention is implemented as intended in the real world | Organisation (Facilitator) | Number of facilitators who delivered groups, and number of groups (P) | Purpose-developed database | Purpose-developed database tracking facilitator involvement in the program (including demographics, training and program delivery) |
| | | Adherence to program protocol and session outlines (fidelity) (P) | Purpose-developed questionnaire and session monitoring forms | Facilitators self-rate the quality of group facilitation and content fidelity for each session; based on a checklist developed for the NOURISH RCT [48] |
| MAINTENANCE: extent to which a program is sustained over time | Organisation (Facilitator) | Workforce capacity change | Beyond the scope of the PEACH™ delivery stage | To be determined |
| | Organisation (Health System) | Funding committed | | |
| | Individual (Family) | Long-term family impact | | |
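The attendance-rate measure listed under EFFICACY/EFFECTIVENESS (purpose-developed sign-in sheets, one per session) reduces to a simple aggregation per family. The sketch below is illustrative only: the tuple layout, the nine-digit IDs shown and the use of the 10-session program length as the denominator are assumptions, not the project's actual schema or scoring rule.

```python
from collections import defaultdict

# PEACH(TM) QLD delivered 10 group sessions; used here as the denominator
# for an attendance proportion (an assumption about the scoring rule).
SESSIONS_PER_PROGRAM = 10

def attendance_rates(signins):
    """signins: iterable of (family_id, session_number) tuples,
    one per sign-in sheet entry. Returns {family_id: proportion attended}."""
    attended = defaultdict(set)
    for family_id, session in signins:
        attended[family_id].add(session)  # a set ignores duplicate sign-ins
    return {fid: len(sessions) / SESSIONS_PER_PROGRAM
            for fid, sessions in attended.items()}

rates = attendance_rates([
    ("100000001", 1), ("100000001", 2), ("100000001", 2),  # duplicate entry
    ("100000002", 1),
])
# family 100000001 attended 2 distinct sessions -> 0.2
```

Using a set per family means a parent signing in twice at the same session is still counted as one attendance, which matters once sign-in sheets are transcribed by many facilitators.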
Results
| Experience | Response | Key learnings |
|---|---|---|
| Ethics: ethics committees appeared to approach the Project from an RCT paradigm | | |
| ● Early requests from ethics committees included the addition of a control group and the de-identification of data prior to it being shared with the team. ● There was an apparent misunderstanding that ethics approval was required for the delivery of the program, as would be the case for an RCT, rather than for the collection of evaluation data, which is more appropriate for community program participants. | ● Effort was made to develop relationships with ethics committees to enhance understanding of the Program and its implementation research approach. | ● At ethics review, a distinction needs to be drawn between research-based practice and practice-based research such as program evaluation research. |
| Evaluation design: engagement challenges experienced during implementation required changes to inclusion criteria, which would be avoided in an RCT | | |
| ● In later stages of the program, inclusion criteria were expanded to include healthy-weight children in addition to overweight and obese children, in an effort to reduce the stigma of participation in a program for overweight/obese children. | ● Questionnaires were updated to reflect the new criteria, and changes in anthropometry needed to be reported separately for healthy-weight children and the target population of children above a healthy weight. Data cleaning processes, data analysis syntax and feedback letters to families were tailored as needed. | ● Anticipate, and make concessions for, the evaluation changes that become necessary when delivery of an upscaled program changes responsively. |
| Evaluation length and the consent process may have been unanticipated and burdensome for participants who signed up for a community program, not an RCT | | |
| ● Participants enrolled in a community healthy lifestyle program and may not have considered themselves enrolled in a research project (cf. RCT participants). Correspondingly, the lengthy participant information sheet and associated consent form required by ethics may have affected participant engagement with the evaluation and/or the program. ● In addition to the consent procedures required by ethics, evaluation questionnaires were lengthy. | ● An ethics modification was made in order to use data collected with implicit consent prior to and at sessions, without a signed consent form. ● The instrument for measuring physical activity was changed to a much shorter tool, and onerous process evaluation items were omitted from questionnaires when further program changes were out of scope. | ● The collection of some data for program monitoring without explicit participant consent (analogous to health service performance monitoring) should be considered reasonable, and opt-out consent may be suitable for upscaled programs. ● Use a 'minimalist or bare essentials' lens when designing evaluation. |
| Data collection: research conducted in the real world has a level of incomplete, unusable and missing data that is higher than research in a more tightly controlled RCT setting | | |
| ● Child facilitators conducted the anthropometric measures following training, using standardised equipment and protocols. These facilitators had various backgrounds (e.g. health professionals, teachers, team sport coaches); some had limited experience in research and in taking child measurements, and may not have appreciated the implications for data collection. Consequently, there were some inaccuracies. ● Despite training and support on evaluation, parent facilitators may have held different perspectives on their role in the Program, particularly if they were operating in a service delivery paradigm rather than practice-based research. Hence, assisting with or ensuring data collection at sessions was not always seen as a priority (establishing rapport with parents was), and engagement with, and completion of, the evaluation varied. | ● A height test to ensure correct assembly of the stadiometer improved the error rate, and protocols for handling unreliable anthropometry data were established as part of quality assurance. ● All parent and child facilitators were trained in the importance of data collection and evaluation processes, and the Evaluation Team monitored the return of evaluation data, sending reminders to facilitators to collect outstanding data at sessions or return outstanding questionnaires post-program. | ● Where anthropometry is a key outcome, consider experienced or accredited personnel (e.g. International Society for the Advancement of Kinanthropometry) to take measures. ● Consider central and/or pre-program collection of evaluation data to reduce the burden on facilitators to collect data at early program sessions (especially for large groups). |
| Program re-enrolments violate rigorous RCT protocols but are desirable in an upscaled program | | |
| ● Families were able to enrol more than once, which had a cascade effect: multiple ID numbers were issued to the same child when their family re-enrolled. Multiple ID numbers were also issued when parents in a split family enrolled in two separate groups while the same child participated in the program. | ● Where a child had multiple enrolments and hence multiple study ID numbers, they had to be manually screened and excluded in data analysis so that each child was counted once. | ● Flexibility to re-enrol in upscaled programs held in the community is desirable; however, it can lead to duplication of work. Resources and time should be allocated to these cases, to manually exclude duplicates or to reconcile data sources when incomplete data are collected. |
| Process evaluation: process evaluation data are nice to have in an RCT but crucial for successful implementation of upscaled programs, and are challenging to identify and capture in real time | | |
| ● The Project Implementation Team wanted 'real-time' feedback from programs to inform decision making during implementation, and was driven by meeting contracted enrolment targets. ● Rich program monitoring data were collected from a variety of sources during the project; these data sat outside the formal evaluation framework and were not formally captured in databases in situ and analysed for reporting. | ● The Evaluation Team could provide only limited process and outcome data in real time outside the formal, contracted reporting schedule, as data were collected only at program end and data cleaning/analysis processes were time-intensive. ● It took considerable time to reconcile data from multiple sources and prepare them for analysis at the end of the project term. | ● The identification, systematic capture and analysis of process evaluation data from a range of sources may be better managed by the project delivery team, who are in tune with program challenges and best equipped to respond to real-time feedback by making changes to delivery. ● Plan a priori for the expertise and budget needed to collect, manage, analyse and interpret process evaluation data in real time. |
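The duplicate-screening step described under "Data collection" (one child holding several study IDs after re-enrolment or split-family enrolment) can be sketched as a simple deduplication pass. Everything specific below is an illustrative assumption: the record fields, matching on name plus date of birth, and keeping the earliest enrolment were not necessarily the project's actual rules.

```python
def deduplicate_children(records):
    """Keep one record per child so each child is counted once.

    records: list of dicts with 'study_id', 'name', 'dob', 'enrolled'
    (ISO date strings). Children are matched on (name, dob) - an
    assumed matching key - and the earliest enrolment is retained.
    """
    by_child = {}
    # Sorting by enrolment date first means setdefault keeps the
    # earliest record for each child and ignores later duplicates.
    for rec in sorted(records, key=lambda r: r["enrolled"]):
        key = (rec["name"].strip().lower(), rec["dob"])
        by_child.setdefault(key, rec)
    return list(by_child.values())

# Hypothetical example: 'Sam' re-enrolled and holds two study IDs.
unique = deduplicate_children([
    {"study_id": "100000001", "name": "Sam",  "dob": "2010-03-01", "enrolled": "2016-02-01"},
    {"study_id": "100000417", "name": "Sam",  "dob": "2010-03-01", "enrolled": "2016-09-01"},
    {"study_id": "100000218", "name": "Alex", "dob": "2009-11-12", "enrolled": "2016-03-15"},
])
# Sam is counted once, under the study ID from the earliest enrolment.
```

In practice, as the paper notes, such cases still needed manual screening; an automated pass like this only flags candidates, since name variants and data entry errors defeat exact-key matching.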