Background
Implementation fidelity has, until recently, been defined as the degree to which a given program is implemented as originally planned, and its assessment has mostly focused on measuring the deployment of the evidence-based intervention under study [1, 2]. Yet, in complex interventions, the program includes both the evidence-based intervention and the supporting implementation strategies aimed at facilitating the adoption of this intervention by those responsible for delivering it [3, 4]. Consequently, factors related to adherence to the planned implementation strategy, dose received (i.e., the extent to which the recipient was exposed to the implementation strategy), participant responsiveness and actual involvement, as well as the modifications made and the role of context, become central to understanding the impact of implementation initiatives that aim to improve real-world clinical practice [3, 5]. As the field of implementation science expands and the use of hybrid trial designs grows [6], the distinction between intervention-level fidelity and implementation strategy-level fidelity is becoming increasingly important.
Evaluating the degree to which implementation strategies are operated as designed within implementation trials is key to determining the internal and external validity of implementation studies [1, 3–5]. Such evaluations are necessary to investigate both the receipt and the scope of an implementation strategy intended to improve the adoption of evidence-based practice in routine settings [4, 7]. Additionally, they help in the interpretation of the outcomes of interventions translated to real-world practice, and they inform the optimization of the clinical intervention and/or the implementation strategy to favor adoption and future scale-up in other contexts and settings [8–11]. Fidelity evaluation is especially necessary in multisite trials, where the “same” implementation strategy may be enacted and received in different ways [12, 13].
However, despite the importance of implementation fidelity evaluation, there is currently no framework explicitly establishing either a set of procedures or specific requirements to guide the evaluation of the fidelity of an implementation strategy [4, 14]. Likely linked to this gap, under-reporting of the fidelity of implementation strategies is the rule rather than the exception in the implementation research literature [4]. Among the general frameworks available to guide fidelity evaluations [2, 3, 5, 7, 8], the framework proposed by Dane et al. [8] and its adaptation by Dusenbury et al. [5] have been successfully used by others for rating the fidelity of implementation strategies [4]. This framework points to five main fidelity dimensions: a) “adherence”, which strategies were actually used during implementation versus which were planned; b) “dose”, the quantity of the implementation strategy delivered and the extent to which the recipients actually received it; c) “quality of program delivery”, how well the components of the implementation strategy were delivered and whether modifications occurred; d) “participant responsiveness”, how the delivery of an implementation strategy is received by its recipients; and e) “program differentiation”, the degree to which the procedures or strategies enacted in one condition can be distinguished from those in the other condition [4, 5, 8].
The aim of the present study is to evaluate the fidelity of two procedures being compared for deploying the PVS-PREDIAPS implementation strategy to optimize type 2 diabetes (T2D) prevention in primary care (PC) [15]. Briefly, the PVS-PREDIAPS implementation strategy consists of an externally facilitated collaborative modeling process, conducted through a set of planned implementation actions with PC professionals, to adapt and integrate an evidence-based healthy lifestyle promotion intervention for preventing T2D within routine PC services. This fidelity evaluation will help to gauge the quality of the implementation of the PVS-PREDIAPS strategy, specifically regarding the two procedures compared for engaging professionals and deploying the implementation actions. This in turn will help to explain and interpret future results of the PREDIAPS trial (i.e., to reject a possible type III implementation failure error), will identify what was changed from the original implementation plan and how those changes may affect outcomes, and will inform how future dissemination and scale-up can be improved. Specifically, the objectives of the present study were:
- To describe the process indicators of the PREDIAPS trial and the deployment procedures of the PVS-PREDIAPS implementation strategy [15], including a clear description of the context in which it has been implemented
- To assess the fidelity of the delivery of the PREDIAPS implementation strategy among the groups compared, in terms of the dose of the strategy delivered and that actually received by the professionals and centers involved; the planned and unplanned modifications that took place (i.e., adherence); and the degree to which elements can reliably differentiate one procedure for engagement and deployment of the strategy from the other (i.e., program differentiation)
- To assess the perceived usefulness of the implementation strategy by the professionals involved (responsiveness)
Results
Of the 12 PC centers put forward by the management of the 4 participating Osakidetza integrated healthcare organizations, 9 were recruited to the project. Of these nine centers, six were classified as being located in an “Urban city” area, three of which were allocated to each comparison group; the only center classified as “Residential”, which had the lowest mean Deprivation Index, was allocated to the Global strategy group; and the only two “Urban-rural” centers were allocated to the Sequential strategy group. Centers described as “large” on the basis of catchment population and number of practitioner lists were more represented in the Global than in the Sequential group, which contained the two smallest centers.
Across the 9 centers, 137 physicians and nurses originally gave written consent and agreed to collaborate (70% of all the physicians and 82% of all the nurses assigned to these centers) (Table 1). The initial collaboration rate among nurses was higher in the Sequential (94%) than in the Global (69%) group. The overall exposure rate to the programmed implementation strategy actions (% of hours received out of those delivered; maximum = 20 h) was slightly higher in the Sequential than in the Global group for both categories, with nurses (89.4% vs 85%) having a higher rate of exposure than physicians (82.6% vs 75%). Regarding the exposure of collaborating professionals in each category to the implementation actions, the percentage of nurses exposed to at least 80% of the actions was again higher in the Sequential group (70% vs 60% in the Global group).
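The two exposure indicators reported above (the exposure rate, and the percentage of professionals exposed to at least 80% of the strategy) can be illustrated with a minimal sketch. The function names and the hours listed are hypothetical examples for illustration only, not PREDIAPS trial data:

```python
# Illustrative calculation of the two exposure indicators used in Table 1.
# All names and figures are hypothetical, not trial data.

DELIVERED_HOURS = 20  # maximum hours of implementation actions delivered


def exposure_rate(hours_received: float) -> float:
    """Percent of the delivered strategy hours a professional actually received."""
    return 100 * hours_received / DELIVERED_HOURS


def pct_exposed_to_at_least(hours_by_professional: list[float],
                            threshold_pct: float = 80) -> float:
    """Percent of professionals exposed to at least `threshold_pct` of the strategy."""
    exposed = [h for h in hours_by_professional
               if exposure_rate(h) >= threshold_pct]
    return 100 * len(exposed) / len(hours_by_professional)


# Hypothetical hours received by five nurses in one center:
hours = [18, 20, 14, 17, 19]
print(exposure_rate(18))               # 90.0
print(pct_exposed_to_at_least(hours))  # 80.0 (4 of 5 nurses at >= 80%)
```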
Table 1
Collaborating primary care centers’ characteristics and professionals’ participation and exposure rates
Characteristic | Global 1 | Global 2 | Global 3 | Global 4 | Global total | Sequential 1 | Sequential 2 | Sequential 3 | Sequential 4 | Sequential 5 | Sequential total |
Type of PC center | Sub-urban | City center | City center | City center | NA | City center | City center | City center | Town-rural | Town-rural | NA |
Deprivation Index of the population, mean | 1.5 | 3.9 | 3.5 | 2.9 | NA | 3.9 | 3.8 | 2.8 | 2.8 | 2.9 | NA |
Catchment population, n | 15,201 | 16,552 | 22,438 | 14,897 | 69,088 | 12,348 | 17,877 | 12,948 | 10,324 | 7374 | 60,871 |
Initial collaboration in the centers | | | | | | | | | | | |
Physicians, n/total | 10/12 | 13/14 | 8/15 | 6/11 | 37/52 | 6/7 | 7/9 | 6/9 | 4/5 | 5/5 | 28/35 |
Nurses, n/total | 8/12 | 8/13 | 14/15 | 5/11 | 35/51 | 7/7 | 9/9 | 9/9 | 4/6 | 5/5 | 34/36 |
Practitioner lists, n | 10 | 12 | 7 | 6 | 35 | 8 | 8 | 9 | 6 | 5 | 36 |
Exposure rate* | | | | | | | | | | | |
Physicians, % | 72% | 66% | 78% | 84% | 75% | 85% | 82% | 80% | 77% | 89% | 82.6% |
Nurses, % | 90% | 89% | 79% | 82% | 85% | 90% | 81% | 86% | 98% | 92% | 89.4% |
% of professionals exposed to ≥ 80% of the strategy | | | | | | | | | | | |
Physicians, % | 30% | 15% | 37% | 83% | 41.2% | 33% | 37% | 14% | 25% | 80% | 37.8% |
Nurses, % | 62% | 75% | 43% | 60% | 60% | 86% | 55% | 55% | 75% | 80% | 70.2% |
*Exposure rate: % of hours of the programmed implementation strategy actions received out of those delivered (maximum = 20 h)
Tables 2 and 3 describe the actual execution of the actions of the implementation strategy and its components, including modifications (planned or reactive), over the 2.5 years during which the deployment of the strategy and the setting up of the intervention program were due to take place (March 2017–May 2019). All nine centers held the three leader training sessions and the 10 sessions led by the local leader supported by the external facilitator, which comprise the core strategies and actions of the implementation phase. The rates of participation in each session of the strategy were over 50% of the staff in each category in almost all cases and confirm differing patterns of execution of the actions by center and by group (Global vs Sequential) (see Appendix Fig. 1). In the post-implementation phase, five sessions were held to support and monitor the setting up of the intervention program. As a component of the strategy, before each of the sessions held in their center, 11 coordination and preparation sessions were run for the leaders, led by the facilitator, five of which were for following up the setting up of the program and monitoring progress.
Table 2
PVS-PREDIAPS implementation strategy and action timing mapping
YEAR 1 (March 2017–February 2018) | MONTH |
Implementation strategy component | Actions conducted | 1 Mar | 2 Apr | 3 May | 4 June | 5 July | 6 Aug | 7 Sept | 8 Oct | 9 Nov | 10 Dec | 11 Jan | 12 Feb |
Strengthening of Local Leadership Goals: to provide the local coordinator with interpersonal and organizational skills to support the implementation Discrete implementation strategies: Recruit, designate and train for leadership (ERIC 57); Conduct educational meetings (ERIC 15); Ongoing support for implementation (ERIC 55) | 3-day leader training workshop (Lt i) (5 h/session) | X Lt1-3 | | | | | | | | | | | |
Ongoing leader support meetings (Lsmi) (4 h/meeting) | | XLsm1 | XLsm2 | | | | XLsm3 | | XLsm4 | XLsm5 | | |
Training in the clinical intervention Goals: to provide initial training in the recommended type 2 diabetes (T2D) prevention clinical intervention and the ICT support tool in the electronic health record Specific implementation strategies: Conduct educational and skill development meetings (ERIC 15 and 19); Changes in record systems (ERIC 12) | Session 1: Primary prevention of T2D in PC: evidence and recommended practice (90 min/session) | Xs1 M1 Xs2 | | | | | | | | | | | |
Session 2: ICT application for the promotion of healthy habits in the EHR (6 h/session) |
Collaborative planning of the intervention program Goals: to plan the local program based on shared decision-making: objectives, actions, agents, work flow, organization and sharing out of tasks Discrete implementation strategies: Conduct local needs assessment (ERIC 18); Conduct educational and outreach meetings (ERIC 15); Conduct local consensus discussions (ERIC 17); Intervention mapping (ERIC 48, 51, and 59); Conduct ongoing training (ERIC 19); Plan-Do-Study-Act cycles (ERIC 14); Develop a formal implementation blueprint (ERIC 23). | Session 3 – Needs assessment and prioritization of areas for improvement (90 min); | | | Xs3 | | | | | | | | | |
Session 4/5 – Planning T2D prevention program (180 min); | | | Xs4 | Xs5 | | | | | | | | |
Session 6/7 – Plan-Do-Study-Act cycles 1 and 2 (90 min each); | | | | | | | Xp1 | Xp2 | | |
Session 8 – Refresher training (180 min); | | | | | | | | | | | Xs8 |
Session 9 – Plan-Do-Study-Act cycle 3, plus engagement of physicians in the Sequential group (90 min); | | | | | | | | | | | | Xp3 M2 |
YEAR 2 (March 2018–March 2019) | MONTH |
Implementation strategy component | Actions conducted | 13 March | 14 April | 15 May | 16 June | 17 July | 18 Aug | 19 Sept | 20 Oct | 21 Nov | 22 Dec | 23 Jan | 24 Feb |
Continuation of … Strengthening of local leadership | Ongoing support meetings (4 h/meeting) | XLsm6 | | XLsm7 | | | | XLsm8 | | XLsm9 | | XLsm10 | |
Continuation of …Collaborative planning of the intervention program | Session 10 – Standardization of the local program (90 min) | Xs10■ | | | | | | | | | | | |
Ongoing sustainability Goals: to continually support and assess innovation being put into practice Discrete implementation strategies: Develop quality monitoring systems (ERIC 27); Audit and provide Feedback (ERIC 5); | Regular audits and ongoing facilitation sessions (6 sessions, 90 min each) | | | Xms1 M4.1 | | | | Xms2 | | Xms3 M4.2 | | Xms4 M4.3 |
Ongoing supportive training | | | M3 | | | | | X M5 | | | |
YEAR 3 (March 2019–May 2019) | MONTH |
Implementation strategy component | Actions conducted | 25 March | 26 April | 27 May | 28 June | | | |
Continuation of …Strengthening of local leadership | Ongoing support meetings (4 h/meeting) | | | | | XLsm11 | | | | | |
Continuation of … Ongoing sustainability | Regular audits and ongoing facilitation sessions (6 sessions, 90 min each) | | | | | Xms5 M6 | | | | |
Table 3
Reporting modifications of the PVS-PREDIAPS implementation strategy
1. Added actions |
Modification/Description | Additional training session in the recommended clinical intervention and in the ICT tool (M1) | Additional session for physicians’ involvement in the Sequential Group (M2) | Additional training session in the recommended clinical intervention and in the ICT for new professionals (M3) |
When did the modification occur? | Month 1 of the implementation phase | Month 11 of the implementation phase | Month 20–21 of the implementation phase |
Were modifications planned? | Unplanned and reactive | Unplanned and reactive | Unplanned and reactive |
Who participated in the decision to modify? Who made the ultimate decision? | Implementation team, formed by healthcare professionals of the center and the external facilitator | Local leader and the external facilitator | Implementation team, formed by healthcare professionals of the center and the external facilitator |
What was modified? | The dose of the training component, by repeating the previously planned action/session | The dose and the actors involved, as the session for “involvement of physicians” (PDSA session #3) was repeated with the presence of the integrated healthcare organization management staff | An additional training session was offered, scheduled and performed for newly incorporated professionals |
At what level of delivery (for whom/what is the modification made)? | Healthcare professionals of one PC center of the Sequential group (Sv) | One PC center of the Sequential group (Sv) | A few new professionals from 4 centers: one center in the Global group (A, n = 3) and three in the Sequential group (Sv, n = 1; Z, n = 2; So, n = 1) |
What is the nature of the content modification? | An additional 90-min session was performed on the recommended clinical health promotion intervention and the ICT healthy lifestyle promotion tool | An additional 90-min session with all PC professionals with the participation of representatives of the management of the integrated healthcare organization | An additional 90-min session was performed on the recommended clinical health promotion intervention and the ICT healthy lifestyle promotion tool |
Relationship fidelity/core elements? | Fidelity consistent/core elements or functions preserved | Fidelity consistent/core elements or functions preserved | Unknown |
What was the goal? | Increase healthcare professional’s knowledge and skills to provide the recommended clinical intervention | Maximize number of physicians engaged | Train newly incorporated professionals to provide the recommended clinical intervention |
What are the reasons for the modifications? | Perceived difficulty of providing the recommended intervention through the ICT healthy lifestyle promotion tool | Lack of attendance at the planned session | Staff turnover |
2. Actions not or only partially delivered |
Modification/Description | Failed to organize and conduct the planned ongoing supportive training session (M5) | Failed to attend to regular audits and ongoing facilitation sessions (M4) | Failed to organize and conduct the planned number of regular audits and ongoing facilitation sessions (M6) |
When did the modification occur? | Month 20 in the post-implementation phase | Months 15, 22 and 24 in the post-implementation phase | Post-implementation phase |
Were adaptations planned? | Unplanned and reactive | Unplanned and reactive | Unplanned and reactive |
Who participated in the decision to modify? Who made the ultimate decision? | Local leader and healthcare professionals of the center | Local leader and healthcare professionals of the center | Implementation team, formed by healthcare professionals of the center and the external facilitator |
What was modified? | Planned training session suspended | Planned and scheduled sessions suspended | Planned sessions suspended |
At what level of delivery (for whom/what is the modification made)? | 2 centers in the Sequential group (Eg, So) and 2 in the Global group (A, Z) failed | 2 PC in the Sequential group (Eg, Z) and one in the Global group (I) failed once; 1 PC in the Sequential group (Er) failed twice | All centers |
What is the nature of the content modification? | No modification required | No modification required | No modification required |
Relationship fidelity/core elements? | Fidelity inconsistent as this was a core element within the planned strategy | Fidelity inconsistent as this was a core element within the planned strategy | Fidelity inconsistent as this was a core element within the planned strategy |
What was the goal? | Increase healthcare professionals’ knowledge and skills to provide the recommended clinical intervention | Provide ongoing support and assess progress in the implementation process | Provide ongoing support and assess progress in the implementation process |
What are the reasons for the modifications? | Difficulty scheduling meetings due to work overload or other competing priorities | Difficulty scheduling meetings due to work overload or other competing priorities | Difficulty scheduling meetings due to work overload or other competing priorities |
In the process of executing the strategy, various unplanned and reactive modifications were made (see Table 3). Specifically, three actions were carried out in addition to those planned: two were related to training, one being a session for one center and the other for new nursing staff from several centers; and one was a repeat of the session for encouraging the participation of medical staff in one of the Sequential strategy centers, on this occasion involving managerial staff of the integrated healthcare organization to boost the involvement of physicians. All of these modifications were requested by the centers themselves via the local leader. On the other hand, three of the planned actions were not carried out as intended. Specifically, of the six planned ongoing monitoring and facilitation sessions, a maximum of five were held, with fewer in some centers, mainly due to scheduling constraints and heavy workloads. Additionally, not all centers managed to organize and run one of the planned ongoing supportive training sessions.
Table 2 displays the original implementation plan, involving 11 ERIC discrete implementation strategies plus intervention mapping, which consisted of a combination of 3 ERIC discrete strategies, yielding a total of 14 discrete strategies. In the group surveys/interviews regarding the perceived usefulness of the implementation strategies, the healthcare professionals recognized half of these 14 planned strategies. Appendix Table 5 lists the 14 planned strategies, as well as 3 additional ERIC implementation strategies that the healthcare professionals perceived to be part of the implementation process. In general, the strategies that were identified were rated positively, with 10 ERIC strategies described by the healthcare professionals as useful for their own professional development, valuable for the optimization process, and/or specifically helpful for building competence, engaging professionals, and/or enhancing inter-professional collaboration. Room for improvement was noted for only two of the ERIC strategies delivered: audit and feedback (ERIC 5) and conducting educational meetings (ERIC 15). While centers assigned to both groups identified the need for increased frequency, length, or quality of educational sessions, only one center (assigned to the Global arm) indicated a need for more audit and feedback sessions. Few differences were observed in the perception of strategies received between the centers assigned to each arm. Healthcare providers in the centers assigned to the Global strategy identified the importance of conducting cyclical small tests of change (ERIC 14) and revising professional roles (ERIC 59), while none in the Sequential group identified these strategies. In contrast, one Sequential group center noted the value of promoting adaptability (ERIC 51), while no Global group center did. Facilitation (ERIC 33) and the need to mandate change (ERIC 44) were identified as important by more Global than Sequential arm centers, while providing audit and feedback (ERIC 5) and conducting local needs assessments (ERIC 18) were identified by more Sequential than Global arm centers (Appendix Table 5).
Discussion
Fidelity evaluation is necessary to advance knowledge and understanding of the effectiveness of implementation strategies designed to facilitate the adoption of evidence-based interventions and practices [2, 4]. The PREDIAPS project seeks to generate scientific evidence concerning the optimization of healthcare practice in the primary prevention of T2D in Osakidetza PC centers, applying implementation science to achieve a feasible, sustainable and effective translation of the recommended evidence-based clinical intervention into clinical practice [15]. A first step in this process is to ascertain whether the implementation strategy and the procedures for putting it into practice in the groups and centers compared have been executed as planned, or whether there have been modifications or variations that could affect the outcomes of interest and the future reproducibility of the study [4, 5, 8]. Considering that we lack a specific framework to guide implementation strategy-level fidelity evaluations, we adopted the fidelity evaluation framework of Dane and Dusenbury to evaluate and report on the fidelity of the PVS-PREDIAPS implementation strategy and the two procedures for its deployment [15]. This framework considers the following elements in fidelity evaluations: adherence, dose, quality of delivery, participant responsiveness and program differentiation. The results of the present study seem to indicate that the PVS-PREDIAPS implementation strategy to improve T2D primary prevention has been carried out with a high degree of fidelity. Despite some differential exposure to the overall strategy within the nursing staff of the compared groups, the professionals involved were notably exposed to the implementation strategy, and the planned program differentiation related to the engagement of professionals and the deployment of the implementation strategy was attained.
Part of the present evaluation of the fidelity of the PVS-PREDIAPS implementation strategy involves examining the implementation strategy dosage, that is, the degree of passive exposure to the planned implementation strategies and actions. In general, we can state that the professionals involved in both comparison groups had a notably high degree of exposure to the implementation strategy and that, as planned, the procedures for involving the professional groups and delivering the PVS-PREDIAPS strategy actions were executed differently in the two arms. Additionally, the exposure indicators suggest that professionals assigned to execute the strategy through the sequential engagement of colleagues (starting with nurses, who later sought the engagement of the physicians) had an unexpectedly higher degree of exposure to the implementation strategy actions, particularly in the case of nurses, the staff responsible for providing the active element of the clinical intervention. As described in the literature, commonly reported obstacles to physicians fully engaging in implementation actions include heavy workload, staff turnover, difficulties in investing time and effort in improvement initiatives beyond providing care, and existing practice priorities [21–23].
Dosage can also be evaluated subjectively, in terms of the perceived receipt of the implementation strategies by the healthcare professionals. Although all healthcare professionals participating in the PREDIAPS trial were exposed to the 14 ERIC discrete implementation strategies with minor modifications, our qualitative evaluation of their experience indicates that they only recognized having received half of them. Many of the strategies that did not emerge from the surveys and group discussion related to “more structural” implementation actions, such as developing a formal implementation blueprint, changing record systems, or developing and organizing quality monitoring systems. This may be because healthcare professionals are not implementation specialists and did not pay attention to or notice these changes. They also failed to identify several “ongoing” implementation activities, such as ongoing training and ongoing support, and local discussion and consensus sessions (collaborative modeling sessions). It is possible that these activities were seen as typical, or standard, implementation tools that were too obvious to mention in the evaluation session. It is also possible that these strategies were not strong enough to be detectable. In any case, differential participation in or exposure to the strategy could compromise the future implementation of the clinical intervention that we are seeking to promote [24]. Therefore, future analysis and interpretation of the main outcomes of interest should be adjusted for the degree of participation of professionals in general and/or exposure to the actions of the implementation strategy as a function of study arm and professional group [13, 23].
A second factor of interest in evaluating the fidelity of the execution of an implementation strategy is the extent of the changes and adjustments that may have been made in the process of putting it into practice. An adequate fit between the “fidelity” of the strategy and the necessary “adaptability” to the local context of centers remains a great challenge in implementation trials [25]. Few modifications were made to the PVS-PREDIAPS implementation strategy. With respect to the planned implementation strategy [15], it proved unfeasible to carry out some activities, for example, the total number of planned monitoring and ongoing facilitation sessions. As noted previously, difficulties in scheduling meetings due to work overload or other competing priorities are common barriers faced by facilitators in improvement initiatives [23, 26]. There was demand among professionals for additional actions related to specific core strategies within the overall implementation strategy, such as training actions regarding the clinical intervention. Contextual factors, for example, site characteristics, needs and priorities, are considered to be among the main drivers for tailoring implementation strategies [27], and one of the approaches used is to permit flexibility in order to enhance alignment and involvement while offering support and guidance towards change [26, 27].
Regarding participant responsiveness, all the ERIC discrete implementation strategies identified by the healthcare professionals in the evaluation session were described as beneficial. Specifically, 10 ERIC discrete strategies were perceived to be useful by at least one center. Such a positive evaluation is indicative of the high quality of the delivery of all the strategies identified. Notably, however, two of these 10 strategies (conducting educational meetings and audit and feedback) were also mentioned as needing improvement by at least one individual in five of the nine centers. Nonetheless, the same two strategies were the most frequently mentioned overall, and therefore also the most positively evaluated. The high frequency of mention in the evaluation session likely reflects high exposure to these strategies, which may in turn have allowed more time for critical thinking about these specific strategies, an important part of the implementation plan.
Interestingly, healthcare professionals also perceived the receipt of ERIC strategies that were not specified a priori in the implementation plan or incorporated as part of a planned modification. These previously unidentified strategies included creating a learning collaborative, facilitation, and the mandating of change. Given the emphasis of the implementation strategy on facilitation to create a learning collaborative in order to develop and adapt the implementation strategy to an individual center’s context, it is not surprising that these two strategies were identified by at least half of the centers. The positive evaluation of these strategies seems to indicate the perceived value of external support and team building in the successful implementation of complex interventions in the healthcare setting [26, 28]. Moreover, the participants felt that mandating change (socially and/or organizationally) fostered engagement in half the centers in the Global arm and built competence in one center in the Sequential arm. In the PREDIAPS trial, the perception of these additional discrete implementation strategies provides further evidence of an appropriate dose having been received and suggests further ecological validity of the overall implementation plan. Lastly, the lack of large qualitative differences between the centers assigned to the Global and Sequential collaborative processes in the perceived receipt of implementation strategies provides subjective evidence that exposure to the planned implementation strategies was fairly similar, regardless of the deployment procedure.
The present study has some important limitations. The first, and possibly most important, is the small number of participating centers, which limits the generalizability of the findings to other PC centers in the Basque Country or other health systems. Second, the emphasis on the beneficial aspects of the implementation strategies identified in the qualitative inquiry may have biased our evaluation of participant responsiveness. The qualitative evaluation was carried out exclusively with the local leaders, who, though best positioned to provide the relevant information, might have a different perspective from that of their colleagues. Furthermore, most of the open-ended questions posed in the evaluation session tended to be phrased positively in terms of usefulness. Future studies of participant responsiveness should consider researcher/interviewer bias in the design and conduct of the evaluation. In interpreting our results, a potential social desirability bias in participant responses should also not be ruled out.
Despite these limitations, a major strength of this study is the nature of the results obtained regarding fidelity, as they demonstrate that the professionals involved were capable of identifying and rating the implementation actions conducted. Moreover, the data presented in this manuscript provide a practical example from the PREDIAPS trial that brings to life the core components of fidelity assembled from various existing frameworks: adherence to the planned implementation strategies, dose/exposure to the strategies, quality of delivery, participant responsiveness to the strategies received, modifications made, and program differentiation. Given the lack of operational definitions and frameworks for evaluating the fidelity of implementation strategies, this paper helps advance scientific research on fidelity.
The present fidelity evaluation has fulfilled some of its most important goals. It appears to confirm the high quality of the implementation of the PVS-PREDIAPS strategy and of the two procedures for its deployment. Further, it will help to explain and interpret future results of the PREDIAPS trial, by ruling out implementation failure and by identifying potential confounding factors arising from differences observed between comparison groups, which may be associated with both exposure and outcomes. Lastly, it points to some core elements of the implementation strategy that should be improved for future dissemination and scale-up, such as the training component. In this sense, there was a missed opportunity regarding the assessment of the training component of the strategy (e.g., overall quality, satisfaction), which could have informed improvements to ensure that professionals of any background and skill level are appropriately trained to deliver the intervention in a standardized way. Indeed, the majority of the modifications requested by participants concerned additional training in the clinical intervention.
Acknowledgements
The Prediaps Group:
ALANGO PC center (OSI Uribe): Nurses: Amaia Bengoetxea, Olga Galarza, Elsa Martínez, Itziar Zalduegi, Dorothea Chausson, Agurtzane Gorroño, Alicia Pollán, Marisol Bernabéu, MªYolanda Calvo, Ander Artiagoitia, Nerea Zaramillo, Lidia Gonzalez, and Asier Aurrekoetexea; Family Physicians: Jon Azkarate, María Muñoa, Mar Bilbao, Vicki Camineiro, Gonzalo Gómez de Iturriaga, Fernando Gago, Iciar Ochoa de Retana, Ana Zorrilla, MªLuisa Gutiérrez, Jone Capetillo Serra, and Mª Nieves Lopez.
ERANDIO PC center (OSI Uribe): Nurses: Nekane Iguerregui, Dolores López, Maite Gastañaga, Antonia Flores, Marcos Pereda, Amaya González, Ana Castresana, Laura Saiz, Nerea Regulez, and Estibaliz Peciña; Family Physicians: Jasone De la Plaza, Lucía Irastoza, Jose Contreras, Idoia Etxebarria, Begoña Oleaga, Cristina Herrero, Nora Cabezón, Fátima Calvo, J Manuel Llamazares, Mª Ángeles Gutierrez, and Monica Prieto.
ZUAZO PC center (OSI Barakaldo-Sestao): Nurses: Concepción Estébanez, María José Cordovilla, Alicia Domínguez, Isabel Lázaro, Elena Resines, Yolanda Villalba, and Begoña Ayerdi; Family Physicians: Florencia Martín, Magdalena Presmanes, Floreal Crespo, Araceli Benito, Mª Belén Molina, Mª Mar García, Mª Gracia Díaz, Mª Luisa Rodriguez Ortiz de Zarate, Rebeca San Cristóbal, and M Zugazaga Prieto.
SAN VICENTE PC center (OSI Barakaldo-Sestao): Nurses: Pedro Martínez, Mercedes Crespo, Estíbaliz Albitre, Adelina García-Roldán, Amaia García, Maite Castro, Iñaki Gorospe, Amelia V. Hernández, Maite López, and Mirian Sainz; Family Physicians: Irene Marín, María Jesús Aragón, Leire Ulayar, Encarnación Santamaría, Carmen Sánchez, Javier Bayo, Begoña Urkullu, Ana Inés Pereda, and Mercedes Garcia.
PORTUGALETE PC center (OSI Ezkerraldea-Ekarterri-Cruces): Nurses: Pilar Blanco, Silvia Soler Valverde, Jose I. Atela, Hiart Trespalacios, Anabel Llarena, and Verónica Ruiz; Family Physicians: Begoña Cabieces, Concepción Ugarte, Guadalupe Icaza, Edurne Zubeldia, Idoia González, and Ángeles Gayo.
ZALLA PC center (OSI Ezkerraldea-Ekarterri-Cruces): Nurses: Itxaso Arévalo, Gloria Intxausti, Esther García, Teresa Sánchez, Igone Lobato, Noelia Fuente, Naiara Ortolachipi, and Edelweiss Sánchez; Family Physicians: Victoria Cosgaya, Ángeles Gayo, Arantza Azazeta, Patricia Zaballa, Ana Isabel Ramila, and Teresa Rodeño.
SODUPE PC center (OSI Ezkerraldea-Ekarterri-Cruces): Nurses: Inmaculada Rodríguez, Teresa Vázquez, Raquel Ruíz, Rosa Herrero, María Valvanera, and Saioa Setién; Family Physicians: Begoña Ruíz, Juan José Casas, Joana Clemente, Javier Amiama, and Javier Angulo.
IZTIETA PC center (OSI Donostialdea): Nurses: Belén Aramendia, Soledad Asenjo, Mª Sonia Mayoral, Remedios Oyarzun, Bergoi Calvar, Belinda Zulueta, Edurne Elola, Josu Egaña, Gemma Díaz, Mª Ángeles Sola, Laura Balague, Mª Ángeles Ganzarain, Arantxa Aramburu, Ana Mª Guinea, Edurne Lizarazu, Inma Valverde, Nekane Arenas, and Susana Alonso; Family Physicians: Rosa Salaberria, Javier Merino, Mercedes Álvarez, Ester Lázaro, Juncal Izcara, Leire González, Ainhoa García Leunda, Idoia Sánchez, Esther Usandiaga, and Eluska Yetano.
EGIA PC center (OSI Donostialdea): Nurses: Jaione Larrea, Inés Mendinueta, Asún Uria, Ana Belén Gaztañaga, Eva Mayo, Onintza Aranzadi, Eulalia Medina, Rosa González, Ione Gutiérrez, Arantxa Perez, and Mª Ángeles Izquierdo; Family Physicians: Alejandro García, Ainhoa Ugarte, Mª Teresa Zubeldia, Bingen Uriondo, Mª Carmen Aranegui, Arantxa Mendiguren, Yolanda Fernández, Maite Zapirain, Mª Jose Garín, Aitziber Ayerbe, and Jon Urkia.
COORDINATING RESEARCH TEAM: Director Team: Alvaro Sánchez, Josep Cortada (Deusto PC center), Esther Gorostiza (Matiena PC center), Susana Pablo, Heather Lynn Rogers, Arturo García-Alvarez, Gonzalo Grandes; Clinical Committee: Alicia Cortazar (Cruces Hospital), Virginia Bellido (Cruces Hospital), Patxi Ezkurra (Zumaia PC center), and Rafa Rotaetxe (Alza PC center).