Background
In the 21st century, clinical practice guidelines (CPGs) continue to be promoted as a means of improving the quality of patient care and patient health outcomes, reducing practice variation, and promoting more efficient use of health resources. Their potential benefits, however, are contingent on both rigorous guideline development processes that incorporate the best available evidence and successful implementation of guidelines into practice [1-4].
Canadian developers of 1446 guidelines released between 1994 and 1999 and included in the Canadian Medical Association (CMA) InfoBase were surveyed to determine how guidelines were developed, disseminated, and evaluated; the analysis of these data was reported in an earlier publication[5]. At that time, the guideline development process was characterized by computerized searches of the literature, grading of the evidence in about half the guidelines, and consensus about recommendations using expert opinion and/or open discussion. Guideline developers largely disseminated their guidelines via direct mailings to healthcare professionals or publications in professional newsletters or journals. Few evaluated their dissemination strategies or the impact of the guideline on health outcomes (6% and 5%, respectively).
Reviews of systematic reviews about the effectiveness of various strategies to increase health care professionals' use of research and practice guidelines, published in the late 1990s, indicated that dissemination of educational materials and didactic educational sessions had little or no effect on changing professional practice[6,7]. In the intervening years, considerable work has been done investigating the effectiveness of various dissemination and implementation strategies aimed at changing practitioners' practice. Grimshaw and colleagues' 2004 review of the effectiveness of guideline dissemination and implementation strategies supported the earlier findings that strategies such as reminders were potentially effective and resulted in moderate improvements in the process of care[8]. Educational outreach had modest effects, although it was considered resource intensive and potentially costly[8]. Educational materials, audit and feedback, and patient-directed interventions were less commonly evaluated but appeared to have "limited effect"[8]. A review of cluster randomized controlled trials revealed that passive strategies (e.g., mailings of printed educational material), contrary to conventional wisdom, may actually be useful on their own for promoting the uptake of guidelines, producing improvements of about 8%[3]. The same review showed that the median absolute improvement in performance across interventions was 14.1% for reminders, 7% for audit and feedback strategies, and 6% for strategies that included educational outreach[3].
Curious about whether there had been any changes in guideline development, dissemination, and evaluation activities in Canada over the past decade, the CMA continued to survey the developers of guidelines released during the period 2000–2005 and included in the CMA Infobase. This paper reports on how guidelines were developed, disseminated, and evaluated in Canada between 1994 and 2005. We examined changes in these activities between the six-year periods 1994–1999 (earlier period) and 2000–2005 (recent period).
Discussion
Guideline development in Canada, as elsewhere, is undertaken by many different organizations, so it is challenging to know "who is doing what." Surveying guideline developers about their processes of guideline development and implementation, and doing so over time, offers a unique glimpse into the guideline industry in Canada and how it is evolving. We are unaware of any other national longitudinal data revealing trends over time in the practices of major guideline developers. Our findings are also unique in that we report on guideline developers' efforts to increase the use of guidelines and to evaluate their impact. These are areas for which there are very few data in the literature, despite the critical role of implementation strategies in facilitating the uptake of guidelines. Furthermore, without the adoption of guidelines by health providers, there will certainly be no impact on health status or health system outcomes, the ultimate purpose of developing guidelines in the first place.
Comparing guidelines released and included in the CMA Infobase from 1994–1999 and 2000–2005 revealed that 100 fewer guidelines were deposited in the CMA Infobase in the recent six-year period and that 19 fewer guideline developers submitted their guidelines to the Infobase. While this may suggest that guideline development in Canada is slowing, there is no way to know whether it reflects the development of fewer guidelines or guideline developers depositing their guidelines in the CMA Infobase less often. For both time periods, national professional, para-government, and government associations and agencies were the dominant producers of guidelines.
Over the 12-year period, guideline developers in Canada increasingly submitted guidelines in English only. More recent guidelines were 6% more likely to have conducted a computerized search of the literature, 8% more likely to have stated the search strategy in the guideline document, and 17% more likely to have reached consensus via open discussion than guidelines in the earlier period (an 11% reduction in the use of structured processes to reach consensus). The greater reliance on computerized searching for the evidence and the greater transparency about the search strategy in the more recent period are positive, but it is not known whether this translated into higher-quality guidelines.
Of concern is that fewer than half the guidelines in the recent period graded the quality of the evidence and that 6% did not review the scientific literature at all, both at lower rates than for guidelines produced in the earlier period. While we did not assess the quality of the guidelines in the CMA Infobase in this study, previous work has revealed that the quality of drug guidelines in this database was less than optimal[9], and given the limited changes in guideline development reported between the two periods, there is little reason to expect that the quality of Canadian guidelines has vastly improved over the 12-year study period.
Given international efforts such as the AGREE Collaboration[1] to improve the quality of reporting of practice guidelines and the GRADE working group[10] to encourage consensus on approaches to grading the evidence, the timing may be right to encourage and support Canadian guideline developers in improving the rigor of both the methods used to develop their guidelines and their reporting.
In terms of guideline developers' knowledge translation activities over the two time periods, there has been a small but significant decrease in the total number of dissemination and implementation strategies employed per guideline. This was largely due to the use of fewer passive dissemination strategies in the more recent period. One hypothesis for the decline might be growing awareness of early evidence that passive dissemination was ineffective at changing professional behaviour[6,7]. More recent evidence[3] suggests that there may actually be value in passive dissemination, since it is inexpensive and may be as effective as more costly and labour-intensive approaches such as audit and feedback[3,8]. While the evidence continues to indicate that interactive education approaches and more active implementation strategies can be effective in changing professional behaviour[8,11-13], there have also been small, non-significant declines in the use of these activities in the more recent period, which may suggest developers are unaware of, or choosing to ignore, this evidence.
If the benefits of the guidelines produced are to be achieved, guideline developers and their stakeholders should reconsider their dissemination and implementation activities and how to work together to better encourage the adoption of their guidelines into routine practice. Based on the findings of Grimshaw and colleagues, it is reasonable to continue using passive dissemination, but it is also important to use more targeted implementation strategies aimed at overcoming contextual barriers to implementation and at embedding guidelines within organizational structures such as documentation and ordering systems[8]. As these activities have resource implications, it will be important for guideline developers and KT researchers to consider the cost-effectiveness of dissemination and implementation strategies in the future.
It is encouraging that in the more recent period the effectiveness of dissemination and implementation activities was evaluated for twice as many guidelines (12.2% vs. 6.1%). Since there remains considerable room for research on KT strategies, it is unfortunate that fewer developers reported intending to undertake such evaluations in the future. The proportion of guidelines whose developers reported formally evaluating the impact of their guideline on health outcomes increased substantially over the 12-year period, from 5% to 24% of guidelines. Data on the positive health outcomes of guidelines may be useful for encouraging others to incorporate guidelines into their healthcare decision making. The opportunity to evaluate the impact of a guideline on health outcomes may also provide a safe forum for potential adopters to try the guideline, to contextualize its recommendations for their clinical setting, and to support implementation under temporary research conditions. Evaluation research can itself be considered an active implementation strategy, especially where the changes in clinical practice recommended in the guideline are sustained.
Our comparison of developers who submitted three or fewer, or four or more, guidelines to the CMA Infobase over the 12-year period revealed some interesting findings that will need to be confirmed by future research. Guidelines produced by more experienced guideline developers were more likely to have involved a computerized search of the literature, graded the quality of the evidence, included plans to formally evaluate the dissemination/implementation strategies used, been formally evaluated for their impact on health outcomes, and had a companion document for consumers. There were no differences in the dissemination and implementation activities undertaken by the two groups. One interpretation of these findings is that the volume of guidelines produced by a developer may be important and related to higher guideline quality and evaluation, but not to KT activities. Graham and colleagues previously found lower quality in guidelines developed by government, para-government, or professional organizations compared with those developed by other types of guideline developers[9]. Consequently, more research is needed to understand the relationship between guideline quality and the characteristics of guideline developers.
Limitations
These findings should be considered within the limitations of the study. First, the survey data were self-reported by the guideline developers and were not objectively verified. However, the information provided in the survey (about the guideline development process) is available on the CMA Infobase alongside the guideline, making verification of survey responses possible. Second, the survey was sent only after the guideline had been accepted for inclusion in the CMA Infobase, rather than as part of the acceptance process. Both of these factors may have encouraged guideline developers to report their responses accurately. Furthermore, since more quality indicators remain unmet in the recent time period, the change in responses between the two study periods is likely an accurate reflection of developers' activities.
Another limitation relates to the questions used to assess the quality of guideline development. In the years since the survey was designed, the AGREE Collaboration[1] has developed criteria for assessing guideline quality. Although the CMA Infobase survey items address many of the same concepts as the AGREE Instrument, the time may be right for the CMA to adopt or adapt the AGREE Instrument to survey guideline developers about the quality of their development processes.
A third limitation is that conclusions can only be drawn about the guidelines, and their developers, that were deposited in the CMA Infobase during the study period. We have no way of knowing what proportion of Canadian guidelines is deposited in the Infobase. It is possible that developers producing guidelines in French are not submitting them to the CMA Infobase, since only about one-third of Quebec physicians are members of the CMA. There are also other repositories of guidelines for non-physician health care providers in Canada (for example, the RNAO's Best Practice Guidelines at http://www.rnao.org/). However, the CMA believes that the database represents the majority of CPGs published in English Canada, since it has built a comprehensive searching and screening strategy that adds to what developers submit: staff regularly search various databases (notably Medline) and websites, and hand search major medical journals for guidelines of interest to physicians. The CMA also believes that the guidelines not identified through this process are most likely published by very small specialty groups. Finally, developers are likely to deposit their guidelines in the Infobase because it is the primary repository for guidelines targeting Canadian physicians, who can access it at no cost.
Competing interests
The authors declare that they have no competing interests. Nan Bai is an information specialist at the CMA in charge of the Infobase.
Authors' contributions
JK participated in the design of the study, completed the statistical analysis, and drafted the manuscript. IDG conceived of the study, provided input into the design of the survey, guided the analysis and helped draft the manuscript. DS contributed to the interpretation of the results and drafting of the manuscript. NB provided the data and contributed to interpreting the results. All authors read and approved the final manuscript.