Results
Twenty-three of the 29 eligible participants completed the preworkshop questionnaire, 24 completed the postworkshop questionnaire, and 13 completed the follow-up questionnaire. Participants were asked to check all titles that applied to their role. Of the 23 who responded, 12 identified as clinicians, 11 as clinical leaders, 13 as administrators/managers, 9 as researchers, and 3 as “other.” Participants were also asked to describe their knowledge of the research evidence related to OSG before the workshop: 2/23 self-identified as expert, 8/23 as knowledgeable, 7/23 as somewhat knowledgeable, and 7/23 as having limited knowledge.
Results are discussed according to participants’ responses along the following dimensions:
- Methods of engagement (emails, slide show, journal articles, small group work, the workshop as a whole)
- Evidence (scientific, experiential, systems) and the delivery of this evidence
- Factors at the institutional level (e.g., administrative support, workplace culture, budgetary issues, new programming policies) that differentially influence decision-making abilities and strategies
Methods of stakeholder engagement
Prior to the workshop, the OSG advisory committee sent invitees a series of emails containing a slideshow and two manuscripts. In the postworkshop questionnaire, all participants (24/24) said that receiving the supplemental material by email before the workshop prepared them, and the majority (20/24) stated that no other information would have better prepared them for the meeting. The large majority of participants read the two manuscripts (19/23 and 17/23) and found them most useful for providing background information about OSG and disseminating research evidence. Only half of the participants (12/23) who completed the preworkshop questionnaire watched the slideshow.
All respondents who completed the postworkshop questionnaire (n = 24) stated that the workshop was relevant to their work in supportive care, that the material was appropriate for those invited, that the focus was what they expected, and that their questions and concerns were addressed. The workshop was most effective in translating the current activity of the National Steering Group, with most respondents (18/24) reporting it to be very effective in this regard. Just under half of participants (10/24) reported the workshop to be very effective in increasing their knowledge of the research evidence related to OSG. When asked to identify everything they found beneficial about the workshop, most respondents indicated the opportunity to discuss matters of strategy and practical implementation issues with others in supportive care (21/24) and to create an action plan for how to implement OSG in their area (17/24).
The value of evidence: scientific and other
In the preworkshop questionnaire, participants were asked to identify all of the methods they used to stay abreast of current research in relation to supportive cancer care. Publications were the most highly reported method (17/23), followed by conferences (12/23), colleagues/word of mouth (12/23), and continuing education sessions (11/23).
In the 5-month follow-up survey, all respondents (n = 13) said that they had the opportunity to discuss OSG with their colleagues and that they had a good grasp of what OSG are about. All but one (12/13) said they felt that they could convey this knowledge to others they work with and that they returned to their workplace after the workshop with the intent of finding a way to implement OSG in their region. Participants were asked which strategies would be useful in making the most of existing supports for OSG in their region/institution. Most participants reported the usefulness of a detailed outline of budgetary requirements to implement OSG (11/13), followed by published peer-reviewed journal articles (7/13). For tackling barriers to implementing OSG, participants identified a detailed outline of budgetary requirements as most helpful (8/13), followed by a short research brief summarizing findings into policy- and program-relevant messages (6/13).
Institutional-level factors that influence decision making
In the preworkshop questionnaire (n = 23), participants were asked about the types of supports they were offered to access research evidence in their everyday practice. Nearly all participants (22/23) said that they received support. Time off work to attend conferences, presentations, rounds, and the like was the most commonly reported form of support (21/23), followed by research assistance from an individual such as a librarian or educator (20/23) and encouragement to participate on research teams (18/23).
In the postworkshop survey (n = 24), participants were asked whether OSG are an intervention that would be supported by their institution. Most respondents (20/24) said yes, and 5/24 were unsure (two people checked both yes and unsure). The majority of participants identified budgetary restrictions as an issue (14/24), followed by limited professional capacity to follow through (8/24) and uncertainty regarding the effectiveness of OSG (3/24). Comments included “budgetary restrictions could be problematic”; “limited resources, program planning”; and “cost/benefit analysis need to be conducted to build business case for long term clinical secure viability.”
In the follow-up survey 5 months after the workshop (n = 13), participants were asked about current supports in their region or at their institution that would assist in implementing OSG. A commitment to support cancer care programming was the most commonly cited (12/13), followed by a need for services in rural areas (11/13) and existing moral support from colleagues and clinicians (10/13). Participants were also asked to identify all the reasons why OSG are not, or may not be, right for their jurisdiction. Again, budgetary issues were cited most frequently: 4/13 said that the costing implications of operating OSG are not clear, and another 3/13 cited uncertainty about funding a service that crosses provincial jurisdictions.
Discussion
The goal of the OSG advisory committee was to educate workshop participants about the OSG program (dissemination), with the specific objectives of presenting information relevant to previously identified barriers, specifically the clinical/relational aspects of the therapeutic intervention and organizational challenges and logistics. The information presented and the facilitated discussion and brainstorming sessions were designed to advance decision makers from contemplation to commitment to deliver and fund an OSG program in their regions (implementation). Of the 14 decision makers who attended the workshop, 13 committed to partnering with the OSG national advisory committee to offer OSG in their regions and to dedicate some form of resources.
According to questionnaire results, the dissemination of information at the workshop (including the emails sent beforehand) was successful. The opportunity to network and converse with others face to face was identified as the most beneficial aspect of this KE strategy, a point substantiated in the KE literature [24–26]. Interestingly, learning about OSG was the least identified benefit of the workshop, and fewer than half of respondents classified it as “very effective” in increasing their knowledge of the research evidence related to OSG. This does not suggest that the workshop failed to educate participants about OSG; rather, this high-level group of research users regards scientific evidence as but one type of information relevant to their decision-making roles. These end users appear to rely on traditional methods of KE, such as peer-reviewed articles, to stay current with scientific evidence, and they are well supported at the institutional level to seek out that evidence, a key factor also identified in the KE literature [27, 28]. Face-to-face meetings, therefore, may be regarded as an activity better suited to communicating with others about the logistical and strategic aspects of implementation than to learning about and understanding the scientific evidence. Although conversing and networking was not a primary goal of the workshop, the workshop accomplished its two other goals of dissemination and garnering commitment for implementation. When asked to identify the most significant topic covered, respondents cited information to move forward, the sustainability and feasibility of OSG, and a variety of region- and province-specific discussions:
- “Moving forward to my region, continuing conversations and building support and partnerships”
- “Sustainability of OSG”
- “Necessary training to provide the type of care, next steps for sustainability”
- “Discussion in small groups within provinces”
- “Clear demonstration of feasibility and safety of OSG”
- “Practical issues; how to move on”
As these individuals come from a diversity of institutions, the fact that nearly all of them are well supported in accessing and consulting the research evidence could be due to selection bias, or it may suggest a cultural shift in which the importance of evidence-informed decision making is not only recognized at the organizational level but also facilitated. While much of the early literature supports the idea that face-to-face opportunities are often more effective than printed material in facilitating knowledge exchange [9, 29, 30], for individuals who have research support, published material remains an optimal method of disseminating scientific evidence and, in so doing, frees face-to-face meetings to focus on systems-level or implementation information, such as pertinent policies, budgetary requirements, and technical issues.
A review of the KE literature noted that “acceptable evidence for decision makers can be less rigorous than that for researchers and includes gray literature (i.e., government publications, consultants’ reports, monographs and conference proceedings)” and that, in one study, “decision makers persistently valued experience more than they did research” [20]. The evidence exchanged at the workshop included scientific (research evidence), experiential (that of OSG facilitators), and systems-level evidence (information related to uptake, implementation, and maintenance, as defined by Best, 2009) [26]. Scientific evidence is clearly valued by this group: research briefs and peer-reviewed papers were identified as helpful in conveying knowledge about OSG to others, in making the most of existing supports, and in tackling barriers to implementing OSG. At the same time, however, discussion of the scientific evidence did not dominate the workshop. The OSG advisory committee’s initial concern that clinicians did not clearly understand the risks and effectiveness of OSG did not appear to be an issue for those who completed the follow-up questionnaire. Only one person identified “uncertainty regarding patient privacy and data security” and “uncertainty regarding effectiveness of OSG” as reasons why OSG may not be right for their region. The main concerns of this group related to the absence of precedent for a national approach and to practical details such as cost-sharing formulas and professional accreditation. Lasting concerns, then, appear to relate to logistical and administrative issues rather than to OSG as a method of psychosocial counseling.
Analysis indicates that the implementation of the information exchanged at the workshop was successful. Since the workshop, 13 of the 14 decision makers have committed resources from their cancer centers to the OSG initiative, whether their own time in the leadership group and/or clinical staff time for facilitation. Among those who stated that OSG are not right for their institution, more attributed this to uncertainty around budgetary implications than to budgetary restrictions per se. This decision appears to be influenced by a lack of funding information rather than by a lack of available funds or disinterest in the program.
A desire for systems-level information reflects the types of knowledge needed by decision makers in different jurisdictions. While decision makers are responsible for identifying a program or intervention that is evidence based, they must also illustrate how this evidence will translate to their local institutional setting [26]. Systems-level information is not likely to be found in the scientific literature, nor is it common in health services research and policy reports, despite calls by many for this to change [25, 31]. While this workshop was successful in disseminating information related to OSG, some participants expressed a need for more systems-level information.
This small-scale study has two limitations. First, given the number of participants, the generalizability of the findings is limited; replication through several more case studies would address this limitation. Second, selection bias may affect the measures of KE success: because participants were selected by the OSG advisory committee, they may already have been informed of the evidence related to OSG and predisposed to commit to funding and offering OSG through their institutions regardless of the workshop. However, since there was still appreciable variation in the level of commitment, selection bias alone cannot explain these results.
Conclusions
KE is a complex and multifaceted process that cannot easily be replicated from one context to the next [32]. Evaluation of specific strategies, however, could lead to a better understanding of both the types of information that clinicians and decision makers need and the most effective communication approaches through which to engage them [33]. The KE-DS Model offers a comprehensive approach to mapping knowledge exchange, making transparent both the potential opportunities for knowledge transfer and the evaluation of specific dissemination and implementation strategies.
OSG are a relatively new intervention that early scientific research has shown to be effective. As there is little precedent for this type of professionally led online service, implementation information about policies, budgets, and professional accreditation is limited. Through the evaluation of a workshop with invited decision makers and clinical leads, the advisory committee learned that this group of end users desired systems-level information related to implementation. A face-to-face meeting is best suited to facilitating networking and disseminating implementation- and policy-relevant information, while peer-reviewed journal articles and conferences remain optimal methods for disseminating research evidence. This group of end users is well supported at the organizational level to stay current with scientific evidence, marking a shift in health care culture in which the importance of KE is recognized and encouraged. This workshop was effective both in disseminating information through emails with published, peer-reviewed journal articles attached and in facilitating the implementation of this evidence by securing 13 participants’ commitment to dedicate resources toward OSG. Evaluation of KE activities is an important component of facilitating evidence-informed decision making and will lead to an improved understanding of these complex processes.