Background
Methods/design
Stage 1: Identification of the research questions
Stage 2: Identification of studies
Implementation interventions
Component | Definition |
---|---|
Staff selection | Includes the academic qualifications or level of experience of staff or peers selected to carry out the program, and methods for recruiting and selecting practitioners to deliver it. |
Pre-service and in-service training | Strategies used to enhance understanding of background information and the rationale for key practices (may include communication channels), training on the program components, and opportunities to practice new skills and receive feedback. |
Ongoing coaching and consultation | A coach provides information, advice, encouragement, and opportunities to practice and use skills specific to the program or innovation; coaching for behavior change may target the practitioner, supervisory, or administrative support levels. |
Staff performance assessment | Evaluation to assess the use and outcomes of skills taught in the training, and reinforced and expanded in coaching processes. Includes performance feedback for skill development and other measures for gauging implementation fidelity, or other assessments to ensure competent delivery of core intervention components. |
Decision-support data systems | Process and outcome data, organizational fidelity measures, or collection of any data to support decision-making and progress in the implementation of the core intervention components over time. |
Facilitative administration | Provision of leadership to support the overall processes of implementation and keep staff focused on achieving intervention outcomes. May include new policies and procedures, changes in care structures (model of care or care processes), and promoting the climate and cultural shift required to support the change. |
Systems intervention | Strategies to work with external systems to ensure the availability of the financial, organizational, and human resources required to support the work of the practitioners. |
Strategy | Description |
---|---|
Audit and feedback | A summary of health workers’ performance over a specified period of time, given to them in written, electronic, or verbal format. May include recommendations for clinical action. |
Local opinion leaders/champions | The identification and use of identifiable local opinion leaders to promote or champion good clinical practice. |
Patient-mediated interventions | The use of patients to change professional practice; could include the provision of patient-reported outcomes to practitioners. |
Public release of performance data | Informing the public about healthcare providers by the release of performance data in written or electronic form. |
Reminders | Manual or computerized interventions that prompt health care workers to perform an action during a consultation with a patient, e.g., computer decision support systems. |
Educational games | The use of games as an education strategy to improve standards of care. |
Educational materials | Distribution of educational materials to individuals or groups to support clinical care, i.e., any intervention in which knowledge is distributed. |
Educational meetings | Courses, workshops, conferences, or other educational meetings. |
Educational outreach visits or academic detailing | Personal visits by a trained person to health care workers in their own settings, to provide information with the aim of changing practice. |
Routine patient-reported outcome measures | Routine administration and reporting of patient-reported outcome measures to providers and/or patients. |
Managerial supervision | Routine supervision visits by health staff. |
Decision support tools such as guidelines | Evidence-based guidance on appropriate health care for specific clinical circumstances, e.g., symptom triage protocols. |
Local consensus processes | Formal or informal local consensus processes, for example agreeing to a clinical protocol to manage a patient group, or for adapting a guideline. |
Continuous quality improvement | An iterative process to review and improve care that includes involvement of healthcare teams, analysis of a process or system, a structured process improvement method or problem-solving approach, and use of data to analyze changes, e.g., Plan-Do-Study-Act cycles. |
Inter-professional education | Continuing education for healthcare professionals that involves more than one profession in joint, interactive learning. |
Tailored interventions | Interventions to change practice that are selected based on a systematic assessment of barriers to change. |
SMS intervention and/or program
Stage 3: Study selection
Search criteria
Databases
Study screening and selection
Term | Definition |
---|---|
Inclusion criteria | |
P—population | Implementation studies of SMS programs targeted to adult cancer populations (age 18 and over) at any stage of the cancer trajectory (treatment, post-treatment survivorship, palliative or end of life care). |
I—intervention | Any implementation intervention study that focused on, or incorporated, strategies to support self-management and was delivered as part of routine clinical service or in a community agency or organization; SMS programs targeting patients and/or providers and/or changes in the delivery system in the context of cancer care. Study design: implementation studies using a range of methodologies: population-level randomized controlled trials or cluster trials, quasi-experimental prospective studies, retrospective controlled studies, interrupted time series, controlled before-and-after studies, case-control studies, uncontrolled before-and-after studies, observational studies, and qualitative studies of implementation processes, strategies, or factors enabling or hindering implementation. |
C—comparator | Any comparator such as usual care or other intervention if relevant. |
O—outcomes | Outcomes of interest are not restricted but could include self-management/self-care behaviors, healthy lifestyle behaviors, symptoms, emotional distress or adjustment (depression/anxiety), quality of life, patient experience, self-efficacy/mastery, survival, empowerment, health care use and/or costs, biological markers of disease, and process or implementation outcomes (clinicians’ knowledge and skills, attendance at education sessions, change in care delivery processes) as per EPOC. |
H—healthcare context | Any health care setting that provides care to cancer populations: hospital, ambulatory or outpatient care, community services or organizations, primary care practices, and remote care (telehealth or other web-based designs). |
Exclusion criteria | Non-empirical sources, i.e., opinion papers, book chapters, guidelines, or editorials. Efficacy trials of self-management interventions in cancer that are not focused on implementation. Self-management interventions and/or programs that do not include, at a minimum, guided support to patients in the development of core skills and/or strategies to support change in behaviors to manage problems or adjust to cancer, or that focus only on management of comorbidities. Patient education programs or interventions that do not emphasize patient acquisition of skills for self-management. Papers or studies not published in English. |
Stage 4: Data extraction and charting the data
Stage 5: Collating, summarizing, and reporting the results
Data analysis and narrative synthesis approach
NPT construct | Definition | Questions to consider |
---|---|---|
Coherence | Meaning and sense-making by participants. Refers to the extent to which a technology or health care practice makes sense to stakeholders for successful adoption. | 1. Is the SMS intervention easy to describe? 2. Is it clearly distinct from other interventions? 3. Does it have a clear purpose that end-users understand? 4. What are the expected benefits, and are they valued? 5. Will it fit with the overall goals of the organization? |
Cognitive participation | Concerns the commitment and collective engagement of stakeholders. | 1. Do end users think SMS is a good idea? 2. Are end users willing to invest time in SMS? |
Collective action | Refers to the relationships and the work required for a new intervention to be taken up in practice, and to identifying the factors that serve as barriers to implementation and embedding. | 1. What is the perceived impact on workload? 2. Does it promote or impede their work? 3. Is it compatible with existing work practices? 4. What is the impact on the division of labor, resources, or responsibility? 5. Does it fit with the overall goals and activities of the organization? |
Reflexive monitoring | Participants reflect on or appraise the intervention. Successful embedding of resources and technologies in everyday practice relies upon a continuous process of evaluation that can feed back into refining the object of implementation to ensure it is fit for purpose. | 1. How do end-users perceive the intervention in use? 2. Is it perceived as advantageous to patients? 3. Is there ongoing monitoring of intervention uptake or adaptation to the local context? |