Editorial | Open Access | Published: 01 December 2015

Education and training for implementation science: our interest in manuscripts describing education and training materials

Authors: Sharon E. Straus, Anne Sales, Michel Wensing, Susan Michie, Bridie Kent, Robbie Foy

Published in: Implementation Science | Issue 1/2015

Abstract

Alongside the growth in interest in implementation science, there has been a marked increase in training programs, educational courses, degrees, and other offerings in implementation research and practice to meet the demand for this expertise. We believe that the science of capacity building has matured but that we can advance it further by shining light on excellent work in this area and by highlighting gaps for future research. At Implementation Science, we regularly receive manuscripts that describe or evaluate training materials, competencies, and competency development in implementation curricula. We are announcing a renewed interest in manuscripts in this area, with specifications described below.

Editorial

Alongside the growth in interest in implementation science, there has been a marked increase in training programs, educational courses, degrees, and other offerings in implementation research and practice to meet the demand for this expertise [1–5]. At Implementation Science, we regularly receive manuscripts that describe or evaluate training materials, competencies, and competency development in implementation curricula. We have previously accepted a limited number of these. We are announcing a renewed interest in manuscripts in this area, subject to some specifications, which we describe below. We encourage interested authors to review our recent editorial describing the journal mission and scope, which provides additional details on the types of manuscripts we are seeking [6].
As noted in our earlier editorial, Implementation Science is focused on promoting the uptake of research findings into health care practice and health policy [6]. Articles that address building capacity in this area will be considered. Overall, we are most interested in manuscripts that describe the rigorous (i.e. using systematic, replicable, and valid methods) development and/or evaluation of educational and training interventions and resources to build capacity in the science or practice of evidence implementation in health care.
There are various target audiences for capacity building initiatives, including those interested in becoming implementation researchers, researchers from other disciplines who want to gain the skills needed to facilitate appropriate dissemination and implementation of their own research, and decision makers, including clinicians, undergraduate health care professional students, funders, policy makers, health care managers, and members of the public interested in the principles of evidence dissemination and implementation [5]. Given this wide range of learners, different types of educational and training activities are necessary to meet their needs. For example, comprehensive national training initiatives and graduate and postgraduate training opportunities (for individuals from a variety of backgrounds, including those with clinical, business, and research training) have been developed to provide skills for those interested in becoming implementation researchers [2, 5]. Opportunities have also been developed for researchers interested in building skills to disseminate or implement their own research findings, including those offered by the NIH [1]. Training for various decision makers is likewise available to develop their skills in evidence implementation [4, 7] and to support others in its practice. Across these different target audiences, educators have taken various approaches, from comprehensive training curricula [5] through to targeting specific competencies to focus training efforts [8]. Manuscripts that address any of these audiences will be considered.
Efforts have been made to identify the core competencies for various learners [5, 9, 10], highlighting how the implementation science field has advanced and how training must reflect these advances. In particular, as research methods develop and the foundational science of implementation is built, training programs need to be flexible and create educational initiatives that meet these evolving needs. For example, given the challenges in readily identifying relevant theory to inform the design of behaviour change interventions, workshops such as those by Michie and colleagues were created [11].
Finally, training initiatives have been provided in various formulations, including in-person and online (with both synchronous and asynchronous learning). Similarly, different dosages of training have been provided, from brief one-off sessions to multiple sessions. This variability in offerings helps meet the needs of different learners, and we anticipate receiving manuscripts addressing these different offerings.

Systematic reviews

We are interested in systematic reviews of capacity building in the science or practice of evidence implementation (Table 1). We will consider various review methodologies including scoping reviews, rapid reviews, and those that integrate qualitative and quantitative data.
Table 1 Scope of education and training manuscripts

Field of interest
  Likely to be accepted: health care and population health
  Likely to be rejected: anything else

Effectiveness studies
  Likely to be accepted: evaluates the effectiveness of an intervention targeted to build capacity in the science or practice of implementation (examples include studies evaluating implementation coaching, knowledge brokers, graduate curricula in implementation science, or continuing professional development courses in implementation); we are interested in studies that use quantitative and/or qualitative methods to evaluate impact
  Likely to be rejected: evaluates the effectiveness of a patient education intervention

Process evaluations
  Likely to be accepted: submitted with or following the report of intervention effectiveness as described above
  Likely to be rejected: submitted without the main intervention effectiveness paper

Training and educational intervention development reports
  Likely to be accepted: prepared and submitted prior to the reporting of the effectiveness of the capacity building intervention; developed using rigorous empirical and/or theoretical approaches
  Likely to be rejected: description of a graduate course in theory without description of its development; description of a graduate course that is not going to be rigorously evaluated

Reports of measurement tools
  Likely to be accepted: describes the development and validation of a measurement tool to assess the impact of a capacity building initiative in the science and/or practice of implementation
  Likely to be rejected: development of a measurement tool without validation or description of the empirical methods used to develop it

Protocols
  Likely to be accepted: describes the evaluation of a capacity building initiative in the science and/or practice of implementation; peer reviewed by a nationally recognised research agency; received ethics review board approval; submitted prior to data cleaning or analysis
  Likely to be rejected: protocol that has not been peer reviewed by a nationally recognised funding agency

Reports of development/evaluation of competencies
  Likely to be accepted: describes the rigorous development of core competencies for implementation coaches or scientists
  Likely to be rejected: no explicit methods used to develop core competencies; no plan to evaluate the competency-based intervention

Evaluations of capacity building interventions

We welcome studies that evaluate the effectiveness of an intervention targeted to build capacity in the science or practice of implementation. As outlined in our previous editorial, we expect studies that evaluate effectiveness to use rigorous and appropriate experimental or quasi-experimental designs [6]. Examples of potential studies include those evaluating implementation coaching, graduate curricula in implementation science, or continuing professional development courses in implementation practice. We are generally not interested in studies that evaluate the effectiveness of a patient education intervention as their goals are not aligned with capacity building in implementation. However, if the study evaluated a training program for patients to develop skills in the science or practice of implementation, it would be eligible. Similarly, we are less interested in programs that train health care professionals in the use of evidence-based practice as these initiatives are typically focused on enhancing research use at the individual patient-clinician level. We are also interested in studies that use qualitative or mixed methods for evaluating the capacity building intervention, although evaluations that focus purely on the experience of participants are of less interest to us. We are particularly interested in studies that describe capacity building initiatives that occur across more than one setting or country.

Process evaluations

We are keen to consider process evaluations of capacity building initiatives in the science or practice of implementation. In particular, we welcome studies that advance our understanding of the outcomes of effectiveness studies of capacity building initiatives. For example, the impact of context and type of learners on outcomes and the ‘dose’ and ‘formulation’ of the capacity building strategy are critical to advance knowledge in this area. We encourage authors to consider qualitative, quantitative, or mixed methods when developing their process evaluations, and we are interested in process evaluations that are submitted with or following the report of the intervention effectiveness. We are not interested in process evaluations that do not refer to the effectiveness of the capacity building initiative or that are submitted without the main intervention effectiveness paper.

Intervention development reports

We welcome manuscripts that describe the development of a capacity building initiative using novel methods and that provide an empirical or theoretical rationale for its content. These manuscripts should be submitted before the report of the effectiveness of the capacity building intervention is submitted, to ensure the intervention was not modified after consideration of study outcomes. Similarly, we are not interested in descriptions of capacity building interventions that are not going to be rigorously evaluated. We require that authors include the course content as an appendix, which we will make available on the journal’s website.

Methods reports

We are interested in articles that advance methods for the study of capacity building. In particular, we welcome reports that describe the development and validation of measurement instruments to assess the impact of capacity building initiatives and that describe the development of competencies in implementation research or practice. We will typically reject articles that do not use explicit and rigorous methods for developing competencies or that do not plan to evaluate these competencies.

Protocols

We welcome protocols that describe the testing of a capacity building initiative in the research or practice of implementation. As noted in our recent editorial, we aim to publish protocols that have been peer reviewed by a nationally or internationally recognised research agency, that have received ethics approval, and that are submitted prior to data cleaning or analysis [6]. We encourage authors to refer to appropriate reporting guidelines to enhance transparency [12].
We welcome replications of research if they are accompanied by an appropriate rationale. We believe in treating effectiveness studies equally whether they report a positive, negative, or no effect on relevant outcomes. We are interested in studies that report outcomes relevant to capacity building in implementation research or practice; these could include outcomes relevant to individuals, organisations, or the health system. At the individual level, these could include changes in attitudes, knowledge, skills, and behaviours. At the organisation level, they could include changes in processes of care or changes in culture, climate, or policy. At the health system level, changes in attitudes towards using research, or actual research use in policy, could be considered, among others. We are interested in outcomes beyond the number of trainees who participated in educational events and their satisfaction with those events.

Next steps

We are excited to witness the growth in interest in implementation science and in capacity building efforts to meet the demand. We believe that the science of capacity building has matured but that we can advance it further by shining light on excellent work in this area and by highlighting gaps for future research. We look forward to receiving manuscripts that reflect innovative work in this field and invite authors to provide feedback on our approach. We will continue to revisit our scope, drawing on reflection on the field and input from our readers.

Acknowledgements

The authors would like to thank Dr. Margaret Hanley for the comments on a previous draft.
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Competing interests

The authors all hold editorial positions with Implementation Science. SES is funded by a Tier 1 Canada Research Chair in Knowledge Translation.

Authors’ contributions

All authors have been involved in discussions about journal scope. SES drafted the initial manuscript and revised it in the light of comments from all other authors. All authors read and approved the final manuscript.
References
1. Meissner H, Glasgow RE, Vinson CA, Chambers D, Brownson RC, Green LW, et al. The US training institute for dissemination and implementation research in health. Implement Sci. 2013;8:12.
2. Proctor EK, Landsverk J, Baumann AA, Mittman BS, Aarons GA, Brownson RC, et al. The implementation research institute: training mental health implementation researchers in the United States. Implement Sci. 2013;8:105.
4. Leeman J, Calancie L, Hartman M, Escoffery CT, Herrmann AK, Tague LE, et al. What strategies are used to build practitioners’ capacity to implement community-based interventions and are they effective? A systematic review. Implement Sci. 2015;10:80.
5. Straus SE, Brouwers M, Johnson D, Lavis JN, Légaré F, Majumdar SR, et al. Core competencies in the science and practice of knowledge translation: description of a Canadian strategic training initiative. Implement Sci. 2011;6:127.
6. Foy R, Sales A, Wensing M, Flottorp S, Kent B, Michie S, et al. Implementation science: a reappraisal of our journal mission and scope. Implement Sci. 2015;10:51.
7. Jacobs JA, Duggan K, Erwin P, Smith C, Borawski E, Compton J, et al. Capacity building for evidence-based decision making in local health departments: scaling up an effective training approach. Implement Sci. 2014;9:124.
8. Gagliardi AR, Webster F, Perrier L, Bell M, Straus S. Exploring mentorship as a strategy to build capacity for knowledge translation research and practice. Implement Sci. 2014;9:122.
9. Newman K, Van Eerd D, Powell B, Urquhart R, Cornelissen E, Chan V, et al. Identifying priorities in knowledge translation from the perspective of trainees: results from an online survey. Implement Sci. 2015;10:92.
10. Holmes BJ, Schellenberg M, Schell K, Scarrow G. How funding agencies can support research use in health care: an online province-wide survey to determine knowledge translation training needs. Implement Sci. 2014;9:71.
Metadata
Title: Education and training for implementation science: our interest in manuscripts describing education and training materials
Authors: Sharon E. Straus, Anne Sales, Michel Wensing, Susan Michie, Bridie Kent, Robbie Foy
Publication date: 01 December 2015
Publisher: BioMed Central
Published in: Implementation Science, Issue 1/2015
Electronic ISSN: 1748-5908
DOI: https://doi.org/10.1186/s13012-015-0326-x
