Background
Implementation strategies constitute the “how to” of getting evidence-based interventions (EBIs) into practice [1‐4]. Powell et al. [5] define implementation strategies as “methods or techniques used to enhance the adoption, implementation, sustainment and scale-up” of interventions. Strategies can vary in complexity from single to multifaceted [3, 5]. Selecting appropriate strategies to target barriers to implementing interventions represents a considerable challenge in implementation research and practice [4‐9], e.g., choosing appropriate strategies to influence poor motivation or negative attitudes concerning the use of a new intervention. Systematic approaches are needed to design and tailor implementation strategies that integrate evidence, theory, and stakeholder perspectives [7, 10, 11]. Theories, models, and frameworks have been developed to facilitate the matching of relevant strategies to specific barriers [4, 6, 8‐10, 12]. However, inconsistent terminology and inadequate descriptions of strategies [3, 13] make it difficult to identify optimal strategies and to advance our understanding of how and when different implementation strategies are effective [14‐16]. Implementation strategies tend to be entangled in the context, which can affect their effectiveness [17]. Thus, if the identified barrier is insufficient skills, training can have more impact on health care practitioners’ performance than a “default” strategy that has always been selected in the past regardless of what barriers might have existed.
Developing and selecting implementation strategies can be achieved in different ways, e.g., through expert panels [18] that involve both implementation researchers and clinical experts or through stakeholder involvement in co-design processes to achieve more contextually adapted strategies and increase the likelihood of successful implementation of EBIs [19‐21]. Co-design has been defined as “the creativity of designers and people not trained in design working together in the design development process” [22]. Co-design is used to develop solutions to complex problems [23] and is intended to be a social and democratic process [24]. Several advantages of accounting for stakeholder priorities in co-design processes have been described, including improved credibility of the results and optimization of the implementation of EBIs through a better understanding of the intervention-context fit [7, 11, 25‐27]. Some studies have investigated the use of co-design processes involving practitioners (i.e., non-researchers) together with researchers to develop and select implementation strategies for implementing EBIs in health care [28‐30].
This study was part of the WALK-Copenhagen (WALK-Cph) project [31], which was initiated to implement a multifaceted intervention to increase mobility in older patients acutely admitted to two medical departments at two university hospitals in Denmark (Hospital X and Hospital Y). The project used a Hybrid II design, with a dual focus on the development and the implementation of the intervention [32]. This study reports results from studies 1b, 2d, and 2h in relation to the overall WALK-Cph project (Additional file 1: Appendix S1). The intervention was co-designed in a collaborative process between researchers, one design architect employed at the hospital, health care practitioners, patients, and their relatives. The co-design of the strategies to implement the intervention involved only researchers and health care practitioners, who were to be responsible for implementing the intervention. The content of the intervention is shown in Additional file 2: Appendix S2. In response to the lack of empirical knowledge on the involvement of health care practitioners in co-designing implementation strategies, the aim of this study was to investigate which implementation strategies were selected by health care practitioners and their managers and how they justified these strategies aimed at facilitating the implementation of the WALK-Cph intervention.
Discussion
This qualitative study investigated the strategies selected by health care practitioners and their managers for the implementation of the WALK-Cph intervention. We found that implementation champions and managers primarily selected implementation strategies classified as educating and planning in the taxonomy by Powell et al. [13]. There were also a few managing quality, restructuring, and financing strategies. The selection of these strategies was supported by pragmatic justifications (experience, intentions, habits, resource limitations, a sense of shared responsibility, and unfamiliarity with implementation) and theoretical justifications (QI knowledge, organizational theory knowledge, and professional knowledge). These results are consistent with the taxonomy developed by Proctor et al. [3] concerning different types of justifications. We did not identify any empirical justifications for the selection of strategies.
The use of pragmatic justifications in our study highlights a tension between using generalizable research-based knowledge and paying attention to the importance of the local context [50]. Research has shown that clinical experts largely make choices based on tacit practice-based knowledge built up from practitioners’ experience, which is manifested in their craft expertise and skills; the source is often specific problems that require solutions [51]. In contrast, some authors have suggested that implementation experts should make decisions based on findings from implementation research, i.e., research-based knowledge [3, 13]. However, there is a risk that research-based knowledge is difficult to apply in the local setting because it is not adapted to the specific context where implementation occurs, e.g., the type of leadership or culture in a particular hospital or department. On the other hand, a co-design process can produce more contextualized strategies and interventions, but the risk is that too much emphasis is put on pragmatic justifications of what clinical experts “believe” would work well because it may have worked in the past. It has been argued that unsuccessful implementation is more likely when strategies are chosen routinely or by habit rather than being based on purposefully addressing specific barriers [52].
Our study findings suggest the importance of finding a balance between research- and practice-based knowledge in co-design processes. Practice-based knowledge predominantly serves to solve the problems that occur in everyday life and work, but the subjective and context-bound nature of this knowledge limits its generalizability. On the other hand, research-based knowledge typically has ambitions for applicability beyond the immediate boundaries of the specific study, but this type of knowledge can rarely provide quick solutions to problems [52]. In practice, however, aspects of the two types of knowledge are intertwined, which means that it is rarely an either/or choice for health care practitioners, but more often a question of making sense of many sources of knowledge. Fitzgerald et al. [53] view the relationship as “circular”, with the two knowledge types reinforcing each other as they become woven together. An important point is that implementation is not only a science but also a practice, since many implementers, as in this study, are not researchers. When implementation occurs in a real-world setting (and is not done by researchers), it requires judgment and skills to ensure adaptation to real-time changes and variations, as well as drawing on evidence from the field of implementation science. Thus, both types of knowledge are needed for practitioners to develop a high level of competence, i.e., the ability to act knowledgeably, effectively, deliberately, strategically, and reflectively in a situation.
Many of the implementation strategies mapped onto the categories of planning and educating [13], providing examples of selections based on practice-based evidence from past experiences. However, strategies like these tend to have a limited duration and may not support the implementation of an intervention throughout an entire project. The reason for choosing these short-term strategies may be the participants’ belief that implementation is an “introduction” of something new, i.e., that implementation has a clear beginning and an end. However, if implementation is viewed as a longer, more complex change process with no clear ending, there is a need for strategies that can provide long-term support for implementation to be successful. Hence, the selection of strategies depends on the perspective on implementation. This finding highlights the relevance of selecting implementation strategies that can support the implementation process over a longer time span. One way to support the selection of strategies is for researchers to make a greater effort to relate these strategies to the participants and settings, so the participants become more familiar with the different types of implementation strategies and the underlying evidence. Evidence did not appear to be significant to the participants in their justifications, as no empirical justifications were used.
Education was a commonly selected category of implementation strategy among the stakeholders in our study. Despite the preference for this type of strategy, there is rather weak evidence of effectiveness for educational strategies that focus on cognitive participation at the expense of collective action in changing healthcare professionals’ behaviours [54]; the literature usually describes such strategies as ineffective for changing practice or achieving optimal care [44]. These results highlight the dilemma of research-based versus practice-based knowledge, raising the question of whether a more research-led process would have ensured a stronger emphasis on strategies with proven effectiveness. Participatory design has seen a development from users as subjects to users as partners, with a continuum of user-involvement methods in which the power to determine the outcome is, to a greater or lesser degree, placed with the researchers [22]. In the current study, the participants had full authority to influence the process and the outcome. Based on some of our results, including the choice of strategies being based only on pragmatic and theoretical justifications and the paucity of knowledge about implementation based on implementation research, a more equal distribution of power could possibly have been more appropriate. On the other hand, a more researcher-led process utilizing more research-based knowledge could challenge the basic premise of co-design, i.e., to ensure a high degree of user involvement and move decisional authority from the research team to the participants, also described as “power from us to them” [55], where power is defined as the ability to influence an outcome [56].
The theoretical justifications in our study were not based on knowledge from implementation science [57]. This finding is not problematic per se, but the participants expressed a need to learn more about implementation research. Several of the participants were selected as implementation champions because they had previously dealt with QI tasks and implementation issues in their daily work. Our finding is consistent with Mosson et al. [58], who found that health care practitioners and their managers often lack skills for implementing evidence-based methods and that implementation often occurs without a structured approach [59].
These results suggest a need for expanding and/or improving training in implementation science for health care practitioners to facilitate the practical use of this knowledge. Westerlund et al. [60] and Lyon et al. [61] have highlighted a form of “implementation paradox,” wherein there is a risk of knowledge produced in implementation science not being used in real-world health care practice, despite the fact that the field was born out of ambitions to bridge the knowing-doing gap. Although implementation science is an applied science, the extent to which knowledge produced in this field is actually used by practitioners is not known [60]. There are few empirical studies concerning whether or how knowledge on implementation is being applied in health care practice [62, 63]. Meissner et al. [38] and Ramaswamy et al. [64] argue that more courses on training in implementation science are needed nationally and internationally to expand implementation capacity [65]. Our experiences working with the implementation of the WALK-Cph intervention suggest that top managers need to acknowledge implementation research as a scientific field with relevant knowledge to support real-life implementation if they are to prioritize staff resources for learning about implementation science in the form of skills, knowledge, and practice-based learning.
A pragmatic justification that occurred in our study was a shared sense of responsibility, which can be understood as a relational concept whereby a person has responsibility for causing something to happen, with possible implications of praise or blame [66]. Achieving collaboration across health care sectors in Denmark is a complex matter [67‐69], and the participants in our study had previously tried to coordinate communication across health care sectors without much success. From earlier experiences, the participants had learned that there was no one-size-fits-all implementation strategy that could ensure efficient collaboration across sectors and a shared responsibility when implementing interventions. These previous experiences led to the development of a new strategy, a visit across sectors, which none of the participants had tried before. This was justified by the idea that it could create a shared responsibility, which would have consequences for the trust or distrust between the participants depending on the outcome of the implementation of the WALK-Cph intervention.
It has become increasingly important for managers to gain the trust of followers, which is needed to achieve effective leadership [75]. Balkrishnan et al. [70] have defined trust as the willingness to be vulnerable to the actions of another party, irrespective of the ability to monitor or control the other party. Proposing an implementation strategy that was not tried or known by some of the participants could be a sign of trust between the participants across sectors, but also between the implementation champions and the managers. By creating a relationally “safe” place built on trust in their interactions with the implementation champions, the managers made the participants more open to trying something new even if it meant failing [71]. Building trust thus became an important factor underpinning pragmatic justifications.
Strengths and limitations
A strength of the study was the design and methodological choices, which enabled us to follow the strategies from their selection to their implementation. This increased our knowledge about the “birth” and the “life” of implementation strategies in clinical practice. It was also a strength that we were present at the workshops to observe the participants (gestures, facial expressions, etc.), which gave us a better understanding of the situational and contextual circumstances in which these strategies were selected. Another strength was that we described implementation strategies based on Proctor et al.’s [3] standards for characterizing implementation strategies, because this strengthened the labeling, increased comparability, and increased the knowledge of whether, which, who, and why when working with implementation strategies and health care practitioners.
A limitation of the study was that we could not comment on the effectiveness of the selected strategies. Further research is necessary to study whether there are correlations between the type of justification, management styles, and effectiveness.
The way the co-design process was carried out may be considered a limitation. The researchers chose to take on a productive role only when contributing implementation science knowledge and, apart from this, a facilitating role to ensure that ownership of the implementation plan would lie with the stakeholders. Much emphasis was put on pragmatic justifications despite the fact that the research team had ensured that all relevant knowledge was present in the co-design team. Further, knowledge about implementation strategies was presented and discussed. Throughout the process, the facilitator asked the participants challenging questions concerning their justifications of strategies.
The challenges faced in the co-design process may be difficult to generalize too broadly since many decisions taken were likely specific to the process and context of the studied case. Variability in co-design processes restricts generalizability and the ability to draw definitive or far-reaching conclusions.
A further limitation is the sole focus on health care practitioners’ training in and learning of implementation science, instead of letting implementation researchers and health care practitioners train together; they could learn from each other, potentially making practice more research-informed and the science of implementation more practical and applicable.
A learning point for future co-design processes is to ensure that expert knowledge and experience from the researchers, who in this study were implementation researchers, are afforded an equally central place in the co-design process. The goal would be to orchestrate the co-design process in a way that enables a synergistic combination of the different stakeholders’ expertise and experiences.
Transferability of the findings to other settings is possible, as thick descriptions were developed concerning the selection of implementation strategies and the justifications, by selecting exemplary citations and describing the contexts of the data collection, including referring to the article on reflections on the use of co-design methods from a researcher perspective [36].
In this study, the implementation champions and managers came from medical departments and the municipalities. Further research should explore other settings and departments where implementation of evidence-based practice is more of a strategic imperative, which could yield a stronger focus on EBI. The study has shown that situational and relational factors are important for justification of implementation strategies, underscoring that justifications are highly sensitive to contextual factors such as culture, climate, and management.