
Open Access 01.12.2019 | Debate

The Value Equation: Three complementary propositions for reconciling fidelity and adaptation in evidence-based practice implementation

Authors: Ulrica von Thiele Schwarz, Gregory A. Aarons, Henna Hasson

Published in: BMC Health Services Research | Issue 1/2019

Abstract

Background

There has long been debate about the balance between fidelity to evidence-based interventions (EBIs) and the need for adaptation for specific contexts or particular patients. The debate is relevant to virtually all clinical areas. This paper synthesises arguments from both fidelity and adaptation perspectives to provide a comprehensive understanding of the challenges involved, and proposes a theoretical and practical approach for how fidelity and adaptation can optimally be managed.

Discussion

There are convincing arguments in support of both fidelity and adaptation, representing the perspectives of intervention developers and internal validity on the one hand, and users and external validity on the other. Instead of characterizing fidelity and adaptation as mutually exclusive, we propose that they may better be conceptualized as complementary, representing two synergistic perspectives that can increase the relevance of research and provide a practical way to approach the goal of optimizing patient outcomes. The theoretical approach proposed, the “Value Equation,” provides a method for reconciling the fidelity and adaptation debate by putting both in relation to the value (V) that is produced. The equation involves three terms: intervention (IN), context (C), and implementation strategies (IS). Fidelity and adaptation determine how these terms are balanced and, in turn, the end product: the value produced for patients, providers, organizations, and systems. The Value Equation summarizes three central propositions: 1) the end product of implementation efforts should emphasize overall value rather than only intervention effects, 2) implementation strategies can be construed as a method to create fit between EBIs and context, and 3) transparency is vital, not only for the intervention but for all four terms of the equation.

Summary

There are merits to the arguments for both fidelity and adaptation. We propose a theoretical approach, the Value Equation, for reconciling the fidelity and adaptation debate. Although there are complexities in the equation and the propositions, we suggest that the Value Equation be used to develop and test hypotheses that can help implementation science move toward a more granular understanding of the roles of fidelity and adaptation in the implementation process and, ultimately, the sustainability of practices that provide value to stakeholders.
Abbreviations
EBI/EBIs: Evidence-based intervention(s)
V: Value
Vs: Value and fit of intervention in the system context
Vo: Value and fit of intervention in the organizational context
Vpr: Value and fit of the intervention for the provider
Vpt: Value and fit of the intervention for the patient
IN: Intervention
INf: Extent to which the intervention is carried out as it was described (fidelity)
INfc: Fidelity-consistent adaptations
INfi: Fidelity-inconsistent adaptations
C: Context
Cs: System context
Co: Organizational context
Cpr: Provider context (e.g., professional discipline, training, attitudes toward the EBI)
Cpt: Patient context (e.g., target group)
IS: Implementation strategy
ISc: Implementation strategy optimizing the context
ISi: Implementation strategy optimizing the intervention

Background

Implementation science is defined as the scientific study of methods to promote the uptake of research findings into routine healthcare in clinical, organizational, or policy contexts [1]. The goal is to close the gap between what has been shown to be effective in rigorous trials (i.e., evidence-based interventions [EBIs], such as diagnostic tools and treatments) and what is done in clinical practice, so that patient and population health is improved.
If patients and ultimately populations are to benefit from the best available evidence, fidelity (also denoted adherence or treatment integrity), defined as the degree to which an intervention is carried out as it was described and originally tested and/or as the developer intended [2–8], is important in all steps of the research-to-practice pathway. During efficacy and effectiveness trials, when the main purpose is to separate effects of the intervention from other factors, high fidelity ensures that it is the intervention, not other factors, that produces the effect. In implementation studies, EBI fidelity is often a primary implementation outcome for determining whether the methods used to promote uptake were successful [9]. In routine care, the degree to which EBIs are delivered as originally designed ultimately determines whether patients and populations indeed receive the interventions that correspond with the best available evidence [2, 5, 10]. Overall, this makes understanding fidelity a central issue for implementation science.
However, throughout all the steps of the research-to-practice pathway, there are forces that pull away from fidelity. This has drawn increased attention to the role of adaptations, defined as changes made to an intervention based on deliberate considerations to increase fit with patient or contextual factors at the system, organization, team, and individual clinician level [11–15]. Their deliberate nature distinguishes adaptations from drift [16]. The central role of adaptations in implementation is increasingly being acknowledged, as evident from recent special issues [17] and themes in recent scientific meetings such as the US National Institutes of Health and Academy Health 9th Annual Conference on the Science of Dissemination and Implementation [10] and the 2018 Nordic Implementation Conference [18]. Yet, although there is global concern about fidelity-adaptation questions, the related conceptual and methodological issues are far from resolved.
Starting from routine care, there is ample evidence that adaptation of EBIs is the rule rather than the exception in real-world practice [12, 19, 20]. Factors at multiple levels (system, organization, provider, and patient) can all influence the degree to which EBIs might require adaptation [14, 21]. For example, reasons for adaptation can include system- and organization-level concerns such as workforce readiness, organizational context, and the cost of purchasing EBIs from intervention developers and purveyors. In line with this, it has been noted that adaptability is an important factor that implementation strategies should address and that adaptation is likely to be needed to promote uptake [22]. This follows Everett Rogers’ seminal research stipulating that an innovation (e.g., an EBI) will almost always be reshaped to fit the organization or context in which it is used [23]. In a concurrent development, research on cultural adaptations has highlighted the need to tailor interventions to the culture of the target populations [24], as well as the need to increase our understanding of cultural influences on implementation strategies and outcomes [25].
Recently, there has also been more discussion about the role that adaptations play earlier along the research-to-practice pathway [26, 27]. This includes questioning the assumption that fidelity automatically maximizes effectiveness [28]. It also includes showing that adaptation happens not only when EBIs are used in practice but also during trials, indicating that intervention researchers also need to attend to issues related to fidelity and adaptation [27]. There have been a number of efforts to improve the reporting of fidelity and adaptation (e.g., [12, 13, 29–31]) (see also Roscoe et al. (2019) [32] for a comparison of four classification taxonomies), but to date, neither adaptations nor the intervention as planned (i.e., fidelity) are sufficiently described or documented in effectiveness trials [33–36]. This leaves a gap in understanding the full scope of the fidelity and adaptation dilemma earlier in the research-to-practice pathway. Thus, fidelity and adaptation are concepts that implementation science by necessity needs to acknowledge and address. Yet this is hindered by the plethora of terms in use and the lack of clarity about how the constructs can be conceptually organized. In Table 1, we propose a taxonomy that further refines the definitions of fidelity and adaptation according to the subcomponents and dimensions to which they can refer. This may aid in identifying relevant constructs for assessment and measurement.
Table 1
Definitions of subcomponents that represent dimensions suggested in the literature that fidelity and adaptation can refer to^a

Sub-components and defining questions

Characteristics of the intervention:
• Content of intervention: What intervention activities and processes are performed? [4, 37]
• Intervention delivery: How is the intervention delivered? [2, 37]
• Format: In which format and through which channels is the intervention delivered? [13, 24]

Characteristics of the context of delivery:
• Conditions of the delivery context: What are the precise conditions related to the system, organization and providers under which the intervention is delivered? [38] What are the target population specifics? [7]

Suggested dimensions to consider
• The intervention core components, the specific active ingredients that make an EBI effective [7]; fidelity-consistent components [39]
• The logic or theory that explains how the intervention is intended to work (i.e., the intervention’s deep structure [40], intervention strength [38] and program model [7]; its function [41])
• Components that are part of the intervention but not central to producing the outcomes (surface structure [40], or adaptable periphery)
• Components not part of the intervention: fidelity-consistent [5, 7, 39] or fidelity-inconsistent [39]
• Components that make the intervention uniquely distinguishable (program differentiation) [2]
• Number, length and frequency of sessions
• Density: how the intervention is spaced out in time; the intensity of the intervention [28]
• The quality of delivery [2, 4]
• What the treatment provider plans and believes is delivered (dose delivered)
• What the recipient perceives they have received (dose received) [42]
• Timing: when the various parts of the intervention are delivered in relation to the other parts [43]
• Format of delivery, e.g. one-to-one or group
• Channels of delivery, e.g. phone, internet or face-to-face
• Location of delivery, e.g. school, non-profit organization or church
• Interventionist specifics (e.g. who is delivering the intervention; their training, competence level, personal attributes and skills)
• Setting, e.g. primary care, hospital, community-based, workplace-based [13]
• Organizational factors, e.g. climate, leadership, mandate, history [44]
• System characteristics, e.g. reimbursement models, contracts, laws, policies, regulations, political climate [44]
• The health conditions that the intervention targets
• Cultural characteristics
• Age groups (e.g., children, youth, adults)
• The patient’s own health goals and specific needs
• Comorbidities

^a Fidelity and adaptation sometimes refer to implementation strategies rather than interventions, that is, to the extent to which the strategies chosen to facilitate use of the intervention are adhered to or adapted. The focus of the current paper is on fidelity to, and adaptations of, interventions.
The issue of fidelity and adaptation has been controversial for decades (e.g., [45]). The debate reflects the longstanding tension between achieving internal and external validity [46]. Whereas some scholars emphasize the importance of drawing valid conclusions about the effects of an intervention, thereby prioritizing internal validity, others highlight the need for interventions to fit and function in the daily operations of different systems and organizations, thus emphasizing the virtue of external validity. However, there has been little theoretical development to guide how fidelity and adaptations should be managed and documented across the research-to-practice pathway and how this relates to implementation science [21]. The debate and research on fidelity and adaptations have been split and fragmented across multiple fields and journals representing the various clinical fields, disciplines, and settings in which EBIs are implemented. With some notable exceptions (e.g., [13, 28]), the debate has taken place in parallel silos, and there is currently no overview of the main arguments for fidelity on one side and adaptations on the other. This hampers the more comprehensive understanding needed to move toward a theoretical approach for reconciling the fidelity and adaptation debate. This paper aims to synthesise the main arguments for fidelity and adaptation and, on that basis, to propose a theoretical and applied approach for how adaptation and fidelity can optimally be managed.

Five reasons fidelity is vital – and five reasons adaptations are also vital

As outlined above, the logic of the research-to-practice pathway stipulates that EBIs should be used as they were described and intended to be provided. This approach implies that fidelity to the intervention is central and that any deviation is problematic. However, there are also strong arguments for why adaptations are needed. Table 2 summarizes the most pervasive reasons and justifications found in the literature for fidelity and adaptations, respectively.
Table 2
Arguments for Fidelity and Adaptation

Argument 1
Fidelity … is vital for drawing valid conclusions by:
- increasing internal validity through the transparent and adherent use of EBIs [35]
- separating implementation failure from theory failure, i.e., distinguishing a lack of effects due to insufficient implementation from a lack of effects due to an ineffective intervention [47]
- avoiding type III errors: concluding that an intervention is not effective when it was actually poorly implemented [48]
Adaptation … improves intervention–context fit by:
- ensuring that EBIs can be implemented and used in/for systems, organizations, providers, or patients different from those in which the EBI was originally tested [49]
- increasing the acceptability, feasibility, and applicability of an EBI to a given context [29]
- increasing practical and/or value fit (philosophical and cultural) [12], e.g., creating fit with practical circumstances by changing from individual treatment to a group format to align with a funding contract [50]

Argument 2
Fidelity … makes accumulation of knowledge possible by:
- making replication possible by ensuring the intervention remains the same across studies, thereby distinguishing between random and robust results [35, 51]
- allowing results from multiple studies to be synthesised in systematic reviews and meta-analyses
Adaptation … balances different outcomes by:
- focusing on a broader spectrum of objectives, e.g., not only the specific clinical outcome an EBI is evaluated against (e.g., symptom reduction, improved functioning) but also outcomes at other levels (patient, provider, organization and system), such as reach, relevance, and costs [29, 49, 52]
- focusing on optimizing benefits over time rather than on sustained delivery (i.e., sustainment) [14, 28, 53, 54]

Argument 3
Fidelity … assures EBI effectiveness by:
- relying on studies, across different types of interventions and settings, showing that high fidelity can improve outcomes (e.g., [27, 37, 55–58]), at least in comparison to drift
Adaptation … assures EBI effectiveness by:
- relying on studies, across different types of interventions and settings, showing that adaptations can improve outcomes (e.g., [20, 59]), at least when there is large variation in client and provider characteristics

Argument 4
Fidelity … provides transparency and confidence by:
- ensuring that users, patients and their families, care providers, and funders (health systems, governments, insurers, and foundations) get what they are promised [47, 60]:
 … patients know that the EBI offered is also the EBI delivered, facilitating informed choices
 … subsequent providers can deliver appropriate care, trusting that the treatment documented in clinical records is also the treatment delivered
 … funders get what they are paying for
 … systems allow fair comparison between organizations in a competitive market, and fair benchmarking of treatment outcomes
Adaptation … is necessary to address multiple diagnoses by:
- acknowledging that comorbidity is the rule rather than the exception in clinical practice, and that most EBIs have only been indicated (shown to be effective) for a very limited group of patients, primarily those without comorbidities [61]
- allowing an “indication shift” so that EBIs can be used for groups for which evidence is lacking

Argument 5
Fidelity … provides equal care and reduces disparities by:
- decreasing unwanted variation between providers, organizations, geographical regions, and different target groups or individuals, e.g., between men and women [60]
- ensuring that decisions about adoption and use are made systematically, reducing the risk of gender and cultural biases
Adaptation … optimises the benefit for each patient by:
- translating mean effects into what is best for each individual in the group [62]
- taking individual patient variability into account by detecting the individuals who are likely to improve less (i.e., the tails of the distribution of effects), consistent with the personalized medicine movement [63]

Discussion

There are valid and reasonable arguments in support of fidelity, and there are valid and reasonable arguments in support of adaptation. However, many of the arguments seem to be contradictory and sometimes mutually exclusive. Much of the debate over the years has taken one or the other position, but the possibility that adaptation and fidelity can coexist has also been raised. These suggestions note that they can co-exist as long as the EBI core components are adhered to (e.g., [11, 24, 30]), and, more recently, that adaptation can improve fidelity by ensuring adherence to the key principles or elements underlying the EBI [64, 65]. Recent advancements in the conceptualization, measurement and documentation of adaptations have moved the field forward by aiding the empirical exploration of the relationship between fidelity, adaptations and outcomes (e.g., [14, 29, 31]).
Yet, with some notable exceptions (e.g., Chambers and Norton’s “Adaptome” model [26]), there have been few attempts to make theoretical propositions that address how fidelity and adaptation can be reconciled. In the following, we deconstruct the arguments for fidelity and adaptation to get at their underlying assumptions, and then make three propositions that reconcile fidelity and adaptation. The propositions and equation terms are summarized in the Value Equation, shown in Table 3. The Value Equation states that the optimal value (V) is a product of the intervention (IN), the nature of the context (C) in which the intervention is being implemented, and the implementation strategies (IS). The terms of the Value Equation (V = IN * C * IS) are described in detail below.
Table 3
The Value Equation: V = IN * C * IS

Value (V)
 Vs: Value and fit of intervention in the system context
 Vo: Value and fit of intervention in the organizational context
 Vpr: Value and fit of the intervention for the provider
 Vpt: Value and fit of the intervention for the patient
Intervention (IN)
 INf: Extent to which the intervention is carried out as it was described (fidelity)
 INfc: Fidelity-consistent adaptations
 INfi: Fidelity-inconsistent adaptations
Context (C)
 Cs: System context
 Co: Organizational context
 Cpr: Provider context
 Cpt: Patient context
Implementation strategy (IS)
 ISc: Implementation strategy optimizing the context
 ISi: Implementation strategy optimizing the intervention

Building the value equation

Table 3 summarizes elements of the Value Equation. Written as a simple mathematical equation, its starting point is an assumption that it is (only) the EBI that produces the effect:
$$ Intervention\ (IN) = Effect\ (E). $$
Implicit here is that by adhering to the intervention as it was designed, 1) the effect is maximized; 2) it is clear what is being delivered; 3) there is little unwanted EBI variation between organizations, professionals, and patients; and 4) it is possible to accumulate knowledge across studies. Nevertheless, as described previously, adaptation happens. Thus, there is a need to specify the intervention in terms of the extent to which it was carried out as described (fidelity, INf), as well as fidelity-consistent (INfc) and fidelity-inconsistent (INfi) adaptations [39] (see Table 3).
As the EBI moves along the research-to-practice pathway, the influence of contextual factors is increasingly recognizable. Thus, a second term is added to the equation: Context (C).
$$ IN\ast C=E. $$
Because many implementation efforts take place in complex systems, with influences at the system, organization, provider, and patient levels, context needs to be further specified. Thus, we suggest that context be delineated as system context (Cs), organizational context (Co), provider context (e.g., professional discipline, training, attitudes toward the intervention) (Cpr), and patient context (e.g., target group) (Cpt).
The Value Equation proposes that once context is acknowledged as a term in the equation, the effects of an intervention must, by necessity, be understood in relation to the context in which it is implemented. For example, even in efficacy trials, contextual factors influence the outcome (e.g., highly trained staff delivering the intervention, urban settings). Thus, an EBI is not effective in isolation; it is more or less effective for a certain group, in certain settings, and under certain conditions. When the EBI is used beyond that, the context term changes, and so does the expected effect. High fidelity may increase effects in certain contexts, and adaptation in others. The optimal answer lies in the configuration of both terms of the equation.

Implementation strategies create intervention–context fit

Implementation strategies are systematic processes used to adopt and integrate EBIs into clinical care [22]. Implementation strategies can be simple (e.g., clinical reminders) or complex and multicomponent (e.g., training + coaching + audit and feedback), and they vary with EBIs and contexts. We build on this notion to derive our first proposition: that implementation strategies are ways to create fit (i.e., appropriateness [9]) between an intervention and a specific context. We therefore add a third term to the equation: Implementation Strategy (IS).
$$ IN\ast C\ast IS=E $$
We argue that implementation strategies can optimize the effect of interventions in two ways: 1) by optimizing the outer system or inner organizational context so that it fits the intervention (ISc) [44], or 2) by optimizing the intervention so that it fits the context (ISi) (Table 3). In the first case, implementation strategies are concerned with increasing fidelity by enabling appropriate changes in the context (e.g., by increasing competence among staff and/or creating opportunities for the target behaviours through environmental restructuring, such as changing the reimbursement system to allow clinicians the time they need [66, 67]). In the second case, implementation strategies promote adaptations to achieve fit (e.g., removing components because they are perceived as culturally inappropriate, or tailoring based on patient preferences [12, 68]).
This proposition builds on the first argument for adaptation, which holds that intervention–context fit is a necessary condition for implementation, but it also invokes Elliott and Mihalic’s (2004) [69] point that the need for intervention–context fit does not necessarily mean adaptation of the intervention; it may equally mean adaptation of the context to facilitate fidelity to the intervention. Thus, we build on previous work suggesting that adaptation and fidelity can co-exist (e.g., [11, 24, 30]), and add to it by explicitly proposing implementation strategies as the activities that optimize fit and reconcile fidelity and adaptation, whether those activities are concerned with modifying the intervention, the context, or both.
Viewing implementation strategies as ways to create fit between an intervention and a specific context opens up innovative approaches to choosing and matching implementation strategies, which has proven challenging [70]. The proposition aligns with recent suggestions to use user-centred design principles and community-academic partnerships to create fit between interventions and context by engaging intervention developers and/or implementers and practitioners in a collaborative redesign process [71–73]. The Value Equation can aid this process by explicating which strategies are used, why (whether the aim is to achieve fit by changing the context or the intervention), and to what effect.

Moving from effect to multilevel value proposition

A compelling argument for both fidelity and adaptation is the potential to increase the effectiveness and public health impact of an intervention. Here, we make our second proposition: an intentional shift from focusing on the effect of an intervention to focusing on the value (V) it creates, making a final adjustment to the equation by exchanging effect for value. Expressed mathematically, the complete Value Equation becomes the following:
$$ IN\ast C\ast IS=V $$
Value is broader than intervention effects alone. It reflects the optimization of a configuration of patient (Vpt), provider (Vpr), organization (Vo), and system (Vs) values and outcomes. Thus, value is a multicomponent, multilevel construct that represents the perceived or real benefit for each stakeholder and for stakeholders combined: a multilevel value proposition. For example, value for a service system may be increased population health, while for an organization, it may be optimized service delivery and decreased costs. Concurrently, a clinical professional may view value as being able to consider individual patient needs and outcomes, and patients may value their own improved functioning in daily life and/or clinical outcomes.
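To make the multilevel structure of the equation explicit, the terms can also be written out with the subscripted components from Table 3. The expansion below is our illustrative notation rather than a literal arithmetic product; the function f simply denotes that the value configuration is jointly determined by the fidelity and adaptation components of the intervention, the context levels, and the two types of implementation strategies.
$$ V=\left({V}_s,{V}_o,{V}_{pr},{V}_{pt}\right)=f\left({IN}_f,{IN}_{fc},{IN}_{fi};\ {C}_s,{C}_o,{C}_{pr},{C}_{pt};\ {IS}_c,{IS}_i\right) $$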
But what, then, is the success of an EBI? By focusing on value, we suggest that implementation success can be defined as the ability to optimize value across the different levels and stakeholders. This perspective on implementation success aligns with recent definitions of sustainability, which highlight the ability to continuously deliver benefits as a key part of the construct [74], with adaptations being a strategy to promote it [75]. The Value Equation proposes that effects on certain clinical outcomes are necessary but not sufficient. An EBI also needs to maximize value for individual providers, for the organization, and for the system. This shifts the focus of implementation from getting an EBI in place to thinking about its value more broadly, and to being more egalitarian in considering the needs of multiple stakeholders, including recently identified “bridging factors” that optimize implementation across context levels [76].
The equation, with its focus on value, also has implications for intervention developers. It implies moving from designing interventions that maximize efficacy to designing interventions that maximize value for multiple stakeholders. According to the Value Equation, the most efficacious intervention may not be the one that provides the most value. A less complex intervention that can be delivered by less skilled staff and that requires fewer implementation resources (e.g., supervision, re-organization of care) may result in higher value than an intervention that stands little chance of being used in practice [26]. This is consistent with approaches to maximizing public health impact: a given EBI may have a smaller effect size, but if it is sustained and reaches more patients, even a small effect size can have a significant public health impact [77].
It is in relation to this multidimensional value configuration that fidelity and adaptation should be considered. Sometimes fidelity is the way to optimize value, sometimes adaptation is, and often it is a combination. This also means that fidelity might optimize one outcome and adaptation another. Furthermore, different types of outcomes may be valued differently by different stakeholders. In this, we acknowledge that different stakeholders’ definitions of value may differ. In fact, they may often be misaligned, such as when an organization is required by the system to provide a service to a sub-population that does not request it. We suggest that the better implementers are at acknowledging and addressing these value conflicts, the higher the likelihood of successful and sustained implementation. Community–academic partnerships may be one bridging factor that can facilitate this process [78] by engaging stakeholders in jointly considering system, organization, and patient needs, increasing their understanding of others’ agendas, and encouraging a transparent negotiation of how best to address different needs. Techniques such as concept mapping and co-created program logic (COP) may be useful for promoting an understanding of divergent viewpoints and an effective dialogue [79, 80].
Similarly, by moving from a focus only on treatment effect to a value configuration, we can reconcile arguments for fidelity and adaptation in relation to equity. We simply propose focusing on equity of the value achieved by the equation as a whole (i.e., for all stakeholders across levels) rather than only equity in relation to the intervention.

Transparency over all the value equation terms

One of the main arguments for fidelity relates to transparency: fidelity to an EBI is needed for comparisons, accumulation of knowledge, and accountability. Our third proposition is that what is essential is transparent use. Replication and accumulation of knowledge are thus still possible, but redefined to focus on transparency in relation to all terms in the Value Equation. In this respect, the Value Equation is consistent with recent calls for redefining replicability in clinical science (e.g., [81]). Requirements from funders to provide information on all equation terms would help push development in this direction.
This proposition is consistent with calls for rigorous strategies to monitor, guide, and evaluate fidelity [82–84] as well as adaptation, as is increasingly acknowledged (e.g., [17, 26, 29, 31, 85, 86]). The Value Equation adds to this by proposing transparent reporting of all equation terms, and justification of fidelity and adaptation based on how they promote fit between the EBI and the context and how they affect value. In this way, users will be supported in assessing INfi and INfc in subsequent implementations. Otherwise, the risk is what can be called “adaptation neglect,” a syndrome in which adaptations pass unnoticed or undocumented regardless of how obvious they are.

Toward personalized value equations

One of the main arguments for fidelity is to enable accumulation of knowledge through replication, which places the focus on only one of the terms of the equation. The Value Equation and the transparency proposition instead focus on all terms, thereby facilitating a gradual increase in the precision of knowledge about what works for whom and when (i.e., specificity) [87]. This requires sophisticated processes and infrastructure. One way to achieve it may be to create databases of the different ways in which an intervention has been used, in what context, and to what effect [26, 86, 88]. Such data can form the basis for a gradually increasing understanding of what creates value for whom, and they show how the logic of the Value Equation can look in practice. For example, in the Paediatric Oncology Department of Karolinska University Hospital in Sweden, when a child does not respond as expected to a treatment protocol, adaptations are made, and both the adaptations and their effects are documented. Data from similar cases are accumulated, creating additional arms in the ongoing comparative trial. In this way, data on intervention*context configurations are collected.
Nevertheless, building databases that reflect the whole Value Equation may increase the administrative burden on clinical staff and organizations as a whole. A way to circumvent this risk may be to build a data infrastructure in which all stakeholders involved in the healthcare process (patients, providers, organizations, and the system) are invited to share and use data for their specific needs, so that those entering the data also benefit from it in their daily operations [89]. Although such a development may seem utopian in many fragmented systems, there are examples of these learning healthcare systems, for instance at Cincinnati Children’s Hospital Medical Center [89] and in rheumatology care in Sweden [90]. Researchers may, for example, use the data for comparative effectiveness studies, and healthcare system representatives for benchmarking. However, the most transformative aspect may be when patients and providers can use the system at the point of care to track how an EBI is used (fidelity and adaptation) and what value it creates for the specific patient. This is consistent with recent applications of measurement-based care, where data related to intervention, context, and implementation are assessed in real time along with clinical data to guide clinical decision making [91].
Used in this way, the learning healthcare system [92] will provide the most precise version of “what works for whom, when” we can think of: personalized value equations in patient- or provider-driven n = 1 studies [93]. Aggregation of all n = 1 studies will then provide the basis for accumulation of knowledge of “what works for whom, when,” thereby bridging personalized medicine and the ideas for systemizing knowledge about adaptations as outlined in the Adaptome [26].
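As a purely hypothetical sketch of what a single record in such a data infrastructure could capture, the example below mirrors the Value Equation terms (IN, C, IS, V) in one data structure per intervention episode. All field names and example values are invented for illustration under the assumptions above and do not describe any existing system mentioned in this paper.

from dataclasses import dataclass, field

@dataclass
class ValueEquationRecord:
    """Hypothetical record for one intervention episode, mirroring V = IN * C * IS."""
    ebi_name: str                                                   # IN: which EBI was used
    fidelity_rating: float                                          # INf: extent delivered as described (0-1)
    fidelity_consistent_mods: list = field(default_factory=list)    # INfc: documented fidelity-consistent adaptations
    fidelity_inconsistent_mods: list = field(default_factory=list)  # INfi: documented fidelity-inconsistent adaptations
    context: dict = field(default_factory=dict)                     # Cs, Co, Cpr, Cpt descriptors
    strategies_context: list = field(default_factory=list)          # ISc: strategies changing the context
    strategies_intervention: list = field(default_factory=list)     # ISi: strategies changing the intervention
    value: dict = field(default_factory=dict)                       # Vs, Vo, Vpr, Vpt outcomes

# Example with invented values: one episode with a documented, fidelity-consistent adaptation
record = ValueEquationRecord(
    ebi_name="parent training programme",
    fidelity_rating=0.8,
    fidelity_consistent_mods=["added culturally relevant examples"],
    context={"Cs": "publicly funded system", "Co": "residential clinic",
             "Cpr": "nurse-led team", "Cpt": "families with comorbid needs"},
    strategies_context=["staff training", "audit and feedback"],
    strategies_intervention=["shortened sessions to fit the clinic schedule"],
    value={"Vpt": "improved daily functioning", "Vo": "reduced dropout"},
)
print(record.ebi_name, record.fidelity_rating)

Aggregating such records across patients and sites is one conceivable way to accumulate the intervention*context*strategy*value configurations described above, although the practical design of any real system would of course look different.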

Conclusions

In mathematics and statistics, we are used to thinking about how the different terms of an equation together determine the outcome. Implementation scientists can use the same approach to understand the product of an EBI, considering the context in which it is used and the implementation strategies applied. The Value Equation is a theoretical proposition that reconciles the roles of adaptation and fidelity in the research-to-practice pathway. It states that the optimal value configuration that can be obtained (V) is a product of the intervention (IN), the nature of the context (C) in which the intervention is being implemented, and how well the implementation strategy (IS) optimizes the intervention and the context. Fidelity and adaptation determine how these terms are mixed and, in turn, the end product: the value configuration produced for multiple stakeholders.
The Value Equation contains three central propositions: 1) it positions implementation strategies as a way to create fit between EBIs and context, 2) it explicates that the product of implementation effort should move from emphasizing effects to emphasizing optimization of a multilevel value configuration, and 3) it shifts focus from fidelity to transparency over all terms of the equation. While there are many complexities in each of these propositions and in each of the terms in the equation, we suggest that the Value Equation be used to develop and test hypotheses that ultimately can help implementation science move toward a more granular understanding of how methods to promote the uptake of research findings can be optimized.

Competing interests

The authors declare that they have no competing interests.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
References
1. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1:1.
2. Carroll C, Patterson M, Wood S, Booth A, Rick J, Balain S. A conceptual framework for implementation fidelity. Implement Sci. 2007;2:1–9.
3. Bellg AJ, Borrelli B, Resnick B, Hecht J, Minicucci DS, Ory M, et al. Enhancing treatment fidelity in health behavior change studies: best practices and recommendations from the NIH Behavior Change Consortium. Health Psychol. 2004;23:443–51.
4. Dusenbury L, Brannigan R, Falco M, Hansen WB. A review of research on fidelity of implementation: implications for drug abuse prevention in school settings. Health Educ Res. 2003;18:237–56.
5.
6. Sechrest L, West SG, Phillips MA, Redner R, Yeaton W. Some neglected problems in evaluation research: strength and integrity of treatments. Evaluation Studies Review Annual. 1979;4:15–35.
7. Gearing RE, El-Bassel N, Ghesquiere A, Baldwin S, Gillies J, Ngeow E. Major ingredients of fidelity: a review and scientific guide to improving quality of intervention research implementation. Clin Psychol Rev. 2011;31:79–88.
8. Perepletchikova F, Treat TA, Kazdin AE. Treatment integrity in psychotherapy research: analysis of the studies and examination of the associated factors. J Consult Clin Psychol. 2007;75(6):829.
9. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76.
10. Chambers D, Simpson L, Neta G, von Thiele Schwarz U, Percy-Laurry A, Aarons GA, et al., editors. Proceedings from the 9th Annual Conference on the Science of Dissemination and Implementation. Implement Sci. 2017;12(Suppl 1).
11. Lee SJ, Altschul I, Mowbray CT. Using planned adaptation to implement evidence-based programs with new populations. Am J Community Psychol. 2008;41:290–303.
12. Moore J, Bumbarger B, Cooper B. Examining adaptations of evidence-based programs in natural contexts. J Prim Prev. 2013;34:147–61.
13. Stirman S, Miller C, Toder K, Calloway A. Development of a framework and coding system for modifications and adaptations of evidence-based interventions. Implement Sci. 2013;8:65.
14. Aarons GA, Green AE, Palinkas LA, Self-Brown S, Whitaker DJ, Lutzker JR, et al. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012;7:32.
15. Card JJ, Solomon J, Cunningham SD. How to adapt effective programs for use in new contexts. Health Promot Pract. 2011;12:25–35.
16.
17. Bumbarger BK, Kerns SEU. Introduction to the special issue: measurement and monitoring systems and frameworks for assessing implementation and adaptation of prevention programs. J Prim Prev. 2019;40(1):1–4.
18. von Thiele Schwarz U, Hasson H, Aarons GA, Sundell K. Usefulness of evidence - adaptation and adherence of evidence-based methods. Nordic Implementation Conference; 2018, May 29; Copenhagen, Denmark.
19. Aarons GA, Miller EA, Green AE, Perrott JA, Bradway R. Adaptation happens: a qualitative case study of implementation of The Incredible Years evidence-based parent training programme in a residential substance abuse treatment programme. Journal of Children's Services. 2012;7(4):233–45.
20. Wiltsey Stirman S, Gamarra JM, Bartlett BA, Calloway A, Gutner CA. Empirical examinations of modifications and adaptations to evidence-based psychotherapies: methodologies, impact, and future directions. Clin Psychol Sci Pract. 2017;24(4):396–420.
21. Aarons GA, Sklar M, Mustanski B, Benbow N, Brown CH. “Scaling-out” evidence-based interventions to new populations or new health care delivery systems. Implement Sci. 2017;12:111.
22. Powell BJ, Waltz TJ, Chinman MJ, Damschroder LJ, Smith JL, Matthieu MM, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10:21.
23. Rogers EM. Diffusion of innovations. New York: Simon and Schuster; 2010.
24. Castro FG, Barrera M Jr, Martinez CR Jr. The cultural adaptation of prevention interventions: resolving tensions between fidelity and fit. Prev Sci. 2004;5:41–5.
25. Cabassa L, Baumann A. A two-way street: bridging implementation science and cultural adaptations of mental health treatments. Implement Sci. 2013;8:90.
27. von Thiele Schwarz U, Förberg U, Sundell K, Hasson H. Colliding ideals – an interview study of how intervention researchers address adherence and adaptations in replication studies. BMC Med Res Methodol. 2018;18:36.
28. Chambers D, Glasgow R, Stange K. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117.
29. Rabin BA, McCreight M, Battaglia C, Ayele R, Burke RE, Hess PL, et al. Systematic, multimethod assessment of adaptations across four diverse health systems interventions. Front Public Health. 2018;6:102.
30. Pérez D, Van der Stuyft P, del Carmen ZM, Castro M, Lefèvre P. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions. Implement Sci. 2015;11:91.
31. Stirman SW, Baumann AA, Miller CJ. The FRAME: an expanded framework for reporting adaptations and modifications to evidence-based interventions. Implement Sci. 2019;14:58.
32. Roscoe JN, Shapiro VB, Whitaker K, Kim BE. Classifying changes to preventive interventions: applying adaptation taxonomies. J Prim Prev. 2019;40:89–109.
33. Hoffmann TC, Erueti C, Glasziou PP. Poor description of non-pharmacological interventions: analysis of consecutive sample of randomised trials. BMJ. 2013;347:f3755.
34. Leichsenring F, Steinert C, Ioannidis JP. Toward a paradigm shift in treatment and research of mental disorders. Psychol Med. 2019:1–7.
35. Cox JR, Martinez RG, Southam-Gerow MA. Treatment integrity in psychotherapy research and implications for the delivery of quality mental health services. J Consult Clin Psychol. 2019;87:221.
36.
37. Dane AV, Schneider BH. Program integrity in primary and early secondary prevention: are implementation effects out of control? Clin Psychol Rev. 1998;18:23–45.
38. Yeaton WH, Sechrest L. Critical dimensions in the choice and maintenance of successful treatments: strength, integrity, and effectiveness. J Consult Clin Psychol. 1981;49:156.
39. Stirman SW, Gutner C, Crits-Christoph P, Edmunds J, Evans AC, Beidas RS. Relationships between clinician-level attributes and fidelity-consistent and fidelity-inconsistent modifications to an evidence-based psychotherapy. Implement Sci. 2015;10(1):115.
40. Resnicow K, Soler R, Braithwaite RL, Ahluwalia JS, Butler J. Cultural sensitivity in substance use prevention. J Community Psychol. 2000;28:271–90.
41. Hawe P. Lessons from complex interventions to improve health. Annu Rev Public Health. 2015;36:307–23.
42. Steckler AB, Linnan L, Israel B. Process evaluation for public health interventions and research. San Francisco, California: Jossey-Bass; 2002.
43. von Thiele Schwarz U, Hasson H, Lindfors P. Applying a fidelity framework to understand adaptations in an occupational health intervention. Work. 2015;51:195–203.
44. Aarons GA, Ehrhart MG, Farahnak LR, Sklar M. Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annu Rev Public Health. 2014;35:255–74.
45. Castro FG, Yasui M. Advances in EBI development for diverse populations: towards a science of intervention adaptation. Prev Sci. 2017;18:623–9.
46. Cook TD, Campbell DT, Shadish W. Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin; 2002.
47. Schoenwald SK, Garland AF, Chapman JE, Frazier SL, Sheidow AJ, Southam-Gerow MA. Toward the effective and efficient measurement of implementation fidelity. Adm Policy Ment Health. 2011;38:32–43.
48. Scanlon JW, Horst P, Nay JN, Schmidt RE, Waller A. Evaluability assessment: avoiding type III and IV errors. Evaluation Management. 1977:71–90.
49. Green LW, Glasgow RE. Evaluating the relevance, generalization, and applicability of research: issues in external validation and translation methodology. Eval Health Prof. 2006;29:126–53.
50. Aarons G, Hurlburt M, Horwitz S. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.
51. Schmidt S. Shall we really do it again? The powerful concept of replication is neglected in the social sciences. Rev Gen Psychol. 2009;13:90.
52. Escoffery C, Lebow-Skelley E, Haardoerfer R, Boing E, Udelson H, Wood R, et al. A systematic review of adaptations of evidence-based public health interventions globally. Implement Sci. 2018;13:125.
53. von Thiele Schwarz U, Lundmark R, Hasson H. The dynamic integrated evaluation model (DIEM): achieving sustainability in organizational intervention through a participatory evaluation approach. Stress Health. 2016;32(4):285–93.
54. Shediac-Rizkallah M, Bone L. Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Educ Res. 1998;13:87–108.
55. Blakely CH, Mayer JP, Gottschalk RG, Schmitt N, Davidson WS, Roitman DB, et al. The fidelity-adaptation debate: implications for the implementation of public sector social programs. Am J Community Psychol. 1987;15:253–68.
56. Hansen WB, Graham JW, Wolkenstein BH, Rohrbach LA. Program integrity as a moderator of prevention program effectiveness: results for fifth-grade students in the Adolescent Alcohol Prevention Trial. J Stud Alcohol. 1991;52:568–79.
57. Becker SJ, Tanzman B, Drake RE, Tremblay T. Fidelity of supported employment programs and employment outcomes. Psychiatr Serv. 2001;52:834.
58. Fauskanger Bjaastad J, Henningsen Wergeland GJ, Mowatt Haugland BS, Gjestad R, Havik OE, Heiervang ER, et al. Do clinical experience, formal cognitive behavioural therapy training, adherence, and competence predict outcome in cognitive behavioural therapy for anxiety disorders in youth? Clin Psychol Psychother. 2018;25:865–77.
59. Sundell K, Beelmann A, Hasson H, von Thiele Schwarz U. Novel programs, international adoptions, or contextual adaptations? Meta-analytical results from German and Swedish intervention research. J Clin Child Adolesc Psychol. 2015:1–13.
60. Bond GR, Becker DR, Drake RE. Measurement of fidelity of implementation of evidence-based practices: case example of the IPS Fidelity Scale. Clin Psychol Sci Pract. 2011;18:126–41.
61. Tinetti ME, Fried TR, Boyd CM. Designing health care for the most common chronic condition—multimorbidity. JAMA. 2012;307:2493–4.
62.
63.
64.
Zurück zum Zitat Anyon Y, Roscoe J, Bender K, Kennedy H, Dechants J, Begun S, et al. Reconciling Adaptation and Fidelity: Implications for Scaling Up High Quality Youth Programs. The journal of primary prevention. 2019;40(1):35–49.PubMedCrossRef Anyon Y, Roscoe J, Bender K, Kennedy H, Dechants J, Begun S, et al. Reconciling Adaptation and Fidelity: Implications for Scaling Up High Quality Youth Programs. The journal of primary prevention. 2019;40(1):35–49.PubMedCrossRef
65.
Zurück zum Zitat Marques L, Valentine SE, Kaysen D, Mackintosh M-A, De Silva D, Louise E, et al. Provider fidelity and modifications to cognitive processing therapy in a diverse community health clinic: Associations with clinical change. J Consult Clin Psych. 2019;87(4):357.CrossRef Marques L, Valentine SE, Kaysen D, Mackintosh M-A, De Silva D, Louise E, et al. Provider fidelity and modifications to cognitive processing therapy in a diverse community health clinic: Associations with clinical change. J Consult Clin Psych. 2019;87(4):357.CrossRef
66.
Zurück zum Zitat Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Beh Health Ser & Research. 2017;44:177–94.CrossRef Powell BJ, Beidas RS, Lewis CC, Aarons GA, McMillen JC, Proctor EK, et al. Methods to improve the selection and tailoring of implementation strategies. J Beh Health Ser & Research. 2017;44:177–94.CrossRef
67.
Zurück zum Zitat Michie S, Atkins L, West R. The behaviour change wheel. A guide to designing interventions. Great Britain: Silverback Publishing; 2014. Michie S, Atkins L, West R. The behaviour change wheel. A guide to designing interventions. Great Britain: Silverback Publishing; 2014.
68.
Zurück zum Zitat Kakeeto M, Lundmark R, Hasson H, von Thiele Schwarz U. Meeting patient needs trumps adherence. A cross-sectional study of adherence and adaptations when national guidelines are used in practice. J Eval Clinic Practice. 2017;23:830–8.CrossRef Kakeeto M, Lundmark R, Hasson H, von Thiele Schwarz U. Meeting patient needs trumps adherence. A cross-sectional study of adherence and adaptations when national guidelines are used in practice. J Eval Clinic Practice. 2017;23:830–8.CrossRef
69.
Zurück zum Zitat Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prev Sci. 2004;5:47–53.PubMedCrossRef Elliott DS, Mihalic S. Issues in disseminating and replicating effective prevention programs. Prev Sci. 2004;5:47–53.PubMedCrossRef
70.
Zurück zum Zitat Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14:42.PubMedPubMedCentralCrossRef Waltz TJ, Powell BJ, Fernández ME, Abadie B, Damschroder LJ. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14:42.PubMedPubMedCentralCrossRef
71.
Zurück zum Zitat Lyon AR, Bruns EJ. User-Centered Redesign of Evidence-Based Psychosocial Interventions to Enhance Implementation—Hospitable Soil or Better Seeds? JAMA Psych. 2019;76:3–4.CrossRef Lyon AR, Bruns EJ. User-Centered Redesign of Evidence-Based Psychosocial Interventions to Enhance Implementation—Hospitable Soil or Better Seeds? JAMA Psych. 2019;76:3–4.CrossRef
72.
Zurück zum Zitat Drahota A, Meza RD, Brikho B, Naaf M, Estabillo JA, Gomez ED, et al. Community-academic partnerships: A systematic review of the state of the literature and recommendations for future research. The Milbank Quarterly. 2016;94(1):163–214.PubMedPubMedCentralCrossRef Drahota A, Meza RD, Brikho B, Naaf M, Estabillo JA, Gomez ED, et al. Community-academic partnerships: A systematic review of the state of the literature and recommendations for future research. The Milbank Quarterly. 2016;94(1):163–214.PubMedPubMedCentralCrossRef
73.
Zurück zum Zitat Hasson H, Gröndal H, Hedberg Rundgren Å, Avby G, Uvhagen H, Von Thiele Schwarz U. How can evidence-based interventions give the best value for users in social services? Balance between adherence and adaptations: A study protocol. Implement Sci Communications. In press. Hasson H, Gröndal H, Hedberg Rundgren Å, Avby G, Uvhagen H, Von Thiele Schwarz U. How can evidence-based interventions give the best value for users in social services? Balance between adherence and adaptations: A study protocol. Implement Sci Communications. In press.
75.
Zurück zum Zitat Stirman SW, Finley EP, Shields N, Cook J, Haine-Schlagel R, Burgess JF, et al. Improving and sustaining delivery of CPT for PTSD in mental health systems: a cluster randomized trial. Implement Sci. 2017;12(1):32.CrossRef Stirman SW, Finley EP, Shields N, Cook J, Haine-Schlagel R, Burgess JF, et al. Improving and sustaining delivery of CPT for PTSD in mental health systems: a cluster randomized trial. Implement Sci. 2017;12(1):32.CrossRef
76.
Zurück zum Zitat Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14:1.PubMedPubMedCentralCrossRef Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14:1.PubMedPubMedCentralCrossRef
77.
Zurück zum Zitat Rutledge T, Loh C. Effect sizes and statistical testing in the determination of clinical significance in behavioral medicine research. Ann Behav Med. 2004;27:138–45.PubMedCrossRef Rutledge T, Loh C. Effect sizes and statistical testing in the determination of clinical significance in behavioral medicine research. Ann Behav Med. 2004;27:138–45.PubMedCrossRef
78.
Zurück zum Zitat Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, et al. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adoles Psychol. 2014;43:915–28.CrossRef Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, et al. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adoles Psychol. 2014;43:915–28.CrossRef
79.
Zurück zum Zitat Green AE, Fettes DL, Aarons GA. A concept mapping approach to guide and understand dissemination and implementation. J Behav Health Ser and Research. 2012;362–73.CrossRef Green AE, Fettes DL, Aarons GA. A concept mapping approach to guide and understand dissemination and implementation. J Behav Health Ser and Research. 2012;362–73.CrossRef
80.
Zurück zum Zitat von Thiele Schwarz U, Richter A, Hasson H. Getting everyone on the same page: Cocreated program logic (COP). In: Nielsen K, Noblet A, editors. Organizational Interventions for Health and Well-being: Taylor and Francis; 2018. p. 58–83.CrossRef von Thiele Schwarz U, Richter A, Hasson H. Getting everyone on the same page: Cocreated program logic (COP). In: Nielsen K, Noblet A, editors. Organizational Interventions for Health and Well-being: Taylor and Francis; 2018. p. 58–83.CrossRef
81.
Zurück zum Zitat Tackett JL, Lilienfeld SO, Patrick CJ, Johnson SL, Krueger RF, Miller JD, et al. It’s time to broaden the replicability conversation: Thoughts for and from clinical psychological science. Persp Psychol Sci. 2017;12:742–56.CrossRef Tackett JL, Lilienfeld SO, Patrick CJ, Johnson SL, Krueger RF, Miller JD, et al. It’s time to broaden the replicability conversation: Thoughts for and from clinical psychological science. Persp Psychol Sci. 2017;12:742–56.CrossRef
82.
Zurück zum Zitat Des Jarlais D, Lyles C, Crepaz N, Group T. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health. 2004;94:361–6.PubMedPubMedCentralCrossRef Des Jarlais D, Lyles C, Crepaz N, Group T. Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: the TREND statement. Am J Public Health. 2004;94:361–6.PubMedPubMedCentralCrossRef
83.
84.
Zurück zum Zitat Von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med. 2007;4(e296).PubMedPubMedCentralCrossRef Von Elm E, Altman DG, Egger M, Pocock SJ, Gøtzsche PC, Vandenbroucke JP, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) statement: guidelines for reporting observational studies. PLoS Med. 2007;4(e296).PubMedPubMedCentralCrossRef
85.
Zurück zum Zitat Lewis CC, Lyon AR, McBain SA, Landes SJ. Testing and Exploring the Limits of Traditional Notions of Fidelity and Adaptation in Implementation of Preventive Interventions. J Prim Prev. 2019;40:137–41.PubMedCrossRef Lewis CC, Lyon AR, McBain SA, Landes SJ. Testing and Exploring the Limits of Traditional Notions of Fidelity and Adaptation in Implementation of Preventive Interventions. J Prim Prev. 2019;40:137–41.PubMedCrossRef
86.
Zurück zum Zitat DeRosier ME. Three Critical Elements for Real-Time Monitoring of Implementation and Adaptation of Prevention Programs. J Prim Prev. 2019;40:129–35.PubMedCrossRefPubMedCentral DeRosier ME. Three Critical Elements for Real-Time Monitoring of Implementation and Adaptation of Prevention Programs. J Prim Prev. 2019;40:129–35.PubMedCrossRefPubMedCentral
87.
Zurück zum Zitat Pawson R. The science of evaluation: A realist manifesto: Sage; 2013.CrossRef Pawson R. The science of evaluation: A realist manifesto: Sage; 2013.CrossRef
88.
Zurück zum Zitat Berkel C, Gallo CG, Sandler IN, Mauricio AM, Smith JD, Brown CH. Redesigning Implementation Measurement for Monitoring and Quality Improvement in Community Delivery Settings. J Prim Prev. 2019;40:111–27.PubMedCrossRefPubMedCentral Berkel C, Gallo CG, Sandler IN, Mauricio AM, Smith JD, Brown CH. Redesigning Implementation Measurement for Monitoring and Quality Improvement in Community Delivery Settings. J Prim Prev. 2019;40:111–27.PubMedCrossRefPubMedCentral
89.
Zurück zum Zitat Lindblad S, Ernestam S, Van Citters A, Lind C, Morgan T, Nelson E. Creating a culture of health: evolving healthcare systems and patient engagement. QJM: Int J Med. 2017;110:125–9. Lindblad S, Ernestam S, Van Citters A, Lind C, Morgan T, Nelson E. Creating a culture of health: evolving healthcare systems and patient engagement. QJM: Int J Med. 2017;110:125–9.
90.
Zurück zum Zitat Ovretveit J, Keller C, Forsberg HH, Essén A, Lindblad S, Brommels M. Continuous innovation: developing and using a clinical database with new technology for patient-centred care—the case of the Swedish quality register for arthritis. Int J Qual Health Care. 2013;25:118–24.PubMedCrossRef Ovretveit J, Keller C, Forsberg HH, Essén A, Lindblad S, Brommels M. Continuous innovation: developing and using a clinical database with new technology for patient-centred care—the case of the Swedish quality register for arthritis. Int J Qual Health Care. 2013;25:118–24.PubMedCrossRef
92.
Zurück zum Zitat Atkins D, Kilbourne AM, Shulkin D. Moving from discovery to system-wide change: the role of research in a learning health care system: experience from three decades of health systems research in the Veterans Health Administration. Ann Review Publ Health. 2017;38:467–87.CrossRef Atkins D, Kilbourne AM, Shulkin D. Moving from discovery to system-wide change: the role of research in a learning health care system: experience from three decades of health systems research in the Veterans Health Administration. Ann Review Publ Health. 2017;38:467–87.CrossRef
93.
Zurück zum Zitat Riggare S, Unruh KT, Sturr J, Domingos J, Stamford JA, Svenningsson P, et al. Patient-driven N-of-1 in Parkinson’s Disease. Methods information med. 2017;56:e123–e8.CrossRef Riggare S, Unruh KT, Sturr J, Domingos J, Stamford JA, Svenningsson P, et al. Patient-driven N-of-1 in Parkinson’s Disease. Methods information med. 2017;56:e123–e8.CrossRef