Background
Methods
Formation of the academic-community partnership
Participants
Procedures
Step 1: needs assessment data collection
Measure | Description |
---|---|
Employee Demographics Questionnaire (developed in-house) | This survey examined demographic information such as age, gender, and ethnicity, as well as theoretical orientation, training background, and clinical experience information (e.g., years of experience in counseling, previous training/experiences with CBT). |
Evidence-Based Practice Attitudes Scale (EBPAS; Aarons, 2004) | The EBPAS is a 15-item measure that evaluates attitudes towards adopting evidence-based practices (EBPs), rated on a 5-point Likert scale from “Not at all” to “A very great extent.” The measure yields a total score and four subscale scores, with higher scores indicating more positive attitudes towards the adoption of EBPs (except for the divergence subscale, on which a higher score indicates stronger endorsement of the view that EBPs are not clinically useful). The four subscales are as follows: appeal (i.e., attitudes towards adopting an EBP if it is intuitively appealing), requirements (i.e., attitudes towards adopting an EBP if it is required), openness (i.e., openness towards trying and using new interventions), and divergence (i.e., views of EBPs as unimportant and irrelevant to clinical experience) [21]. The EBPAS has displayed good internal consistency [21, 34] and construct and convergent validity [21, 34‐36]. Internal consistency in the current sample was good for the EBPAS total score (Cronbach’s α = .88) as well as for the four subscales: requirements (Cronbach’s α = .91), appeal (Cronbach’s α = .94), openness (Cronbach’s α = .87), and divergence (Cronbach’s α = .76). Norm references include [37]. |
Attitudes Towards Standardized Assessment Scale (ASA; Jensen-Doss and Hawley 2010) | The ASA is a 22-item measure that assesses attitudes towards using standardized assessment, rated on a 5-point Likert scale from “Strongly Disagree” to “Strongly Agree,” with higher scores indicating more positive attitudes towards using standardized assessment in practice. The ASA comprises three subscales: Benefit over Clinical Judgment, Psychometric Quality, and Practicality. The ASA subscales have demonstrated good internal consistency and have been found to predict actual standardized assessment tool use [38]. Internal consistency in our sample was acceptable for the Benefit over Clinical Judgment scale (Cronbach’s α = .74) and the Psychometric Quality scale (Cronbach’s α = .68), but poor for the Practicality scale (Cronbach’s α = .58). Norm references include [38]. |
Organizational Culture Survey (OCS; Glaser 1983; Glaser et al. 1987) | The OCS measures the culture, or the normative beliefs and shared expectations of behavior [39, 40], of an organization. It is 31 items and uses a five-point Likert scale ranging from “To a Very Little Extent” to “To a Very Great Extent.” Higher scores demonstrate a more positive perception of the different aspects of the organization’s culture. The OCS is reliable, has high internal consistency, and contains six subscales: Teamwork, Morale, Supervision, Involvement, Information Flow, and Meetings [39]. The current sample’s internal consistency was good to excellent (Teamwork (Cronbach’s α = .93), Morale (Cronbach’s α = .94), Information Flow (Cronbach’s α = .85), Involvement (Cronbach’s α = .89), Supervision (Cronbach’s α = .94), and Meetings (Cronbach’s α = .93)). Norm references include [41]. |
Survey of Organizational Functioning (TCU SOF; Broome et al. 2007) | The SOF is a 162-item measure that assesses the motivation factors, resources, staff attributes, climate, job attitudes, and workplace practices of an organization. The SOF includes the 23 scales of the Organizational Readiness for Change [42] plus nine additional scales examining workplace practices and job attitudes. All of the scales use a 5-point Likert scale from “Disagree Strongly” to “Agree Strongly,” except for the Training Exposure scale, which uses a 5-point Likert scale from “None” to “4 or More,” and the Training Utilization-Individual Level and Training Utilization-Program Level scales, which use a 5-point Likert scale ranging from “Never” to “Almost Always.” Each scale score ranges from 10 to 50. All of the scales have shown acceptable psychometrics [43]. Internal consistency for most of the scales in the current study ranged from acceptable to excellent; however, the Pressures for Change (Cronbach’s α = .58), Computer Access (Cronbach’s α = .33), Efficacy (Cronbach’s α = .55), Adaptability (Cronbach’s α = .50), Autonomy (Cronbach’s α = − .07), Stress (Cronbach’s α = .54), Change (Cronbach’s α = .48), Peer Collaboration (Cronbach’s α = .47), and Counselor Socialization (Cronbach’s α = .56) scales demonstrated poor to unacceptable internal consistency. Norm references include [42, 44‐46]. |
Infrastructure Survey (Keough, Comtois, Lewis, and Landes 2013) | The Infrastructure Survey measures the impact of a clinical setting’s infrastructure (i.e., the organization’s documentation, team format, performance evaluations/job descriptions, funding structure, staffing, and implementation outcomes) on the implementation and sustainment of empirically based treatments. The survey comprises 30 items rated on a 5-point Likert scale ranging from “Strongly Disagree” to “Strongly Agree,” and a recent exploratory factor analysis of these data [47] revealed four factors: Facilitative Staff Role, Flexibility, Adaptability, and Compatibility. This study is the first use of the survey since its creation, so its psychometrics have not yet been validated. The survey had good to excellent internal consistency for each scale (Facilitative Staff Role (Cronbach’s α = .92), Flexibility (Cronbach’s α = .89), Adaptability (Cronbach’s α = .93), and Compatibility (Cronbach’s α = .79)). No norm references were available for the measure. |
Opinion Leader Survey (Valente et al. 2007) | This survey identifies the opinion leaders within an organization by asking the respondent to name individuals “for whom you have relied on for advice on whether and how to use evidence-based practices for meeting the mental health needs of youth served by our agency” [48]. After listing these individuals, the respondent indicates which of six additional criteria (i.e., sought advice from with respect to treatment, works for the same agency as you, works for another agency or employer, is a subordinate or employee of yours, is your superior or supervisor, is your friend) apply to each person named. |
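The Cronbach’s α values reported for each measure above follow the standard internal-consistency formula, α = k/(k − 1) · (1 − Σ item variances / variance of total scores). A minimal sketch in pure Python (hypothetical item-level data; not the study’s analysis code) is:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a set of questionnaire items.

    items: list of equal-length lists, one list of scores per item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)          # number of items
    n = len(items[0])       # number of respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    totals = [sum(col[i] for col in items) for i in range(n)]  # per-respondent total score
    return k / (k - 1) * (1 - sum(variance(col) for col in items) / variance(totals))
```

For example, three perfectly parallel items yield α = 1.0, while weakly related items drive α toward zero.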
Step 2: mixed methods analysis
Conjoint analysis
Step 3: conjoint analysis
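Conjoint analysis estimates how much each attribute level contributes to stakeholders’ overall preference ratings for candidate implementation packages. As an illustration only, with hypothetical two-level attributes and ratings: under a full-factorial (orthogonal) design, the part-worth utility of an attribute level reduces to the mean rating at that level minus the grand mean.

```python
from itertools import product
from statistics import mean

# Hypothetical two-level training attributes rated 1-5 by staff.
attributes = ["group_format", "onsite", "monthly_consultation"]
profiles = list(product([0, 1], repeat=3))            # all 8 attribute combinations
ratings = dict(zip(profiles, [2, 3, 3, 4, 3, 4, 4, 5]))

grand = mean(ratings.values())
# Part-worth of level 1 of each attribute: mean rating at that level minus grand mean.
partworths = {
    a: mean(r for p, r in ratings.items() if p[i] == 1) - grand
    for i, a in enumerate(attributes)
}
```

When the design is not orthogonal, the part-worths are instead estimated by regressing ratings on dummy-coded attribute levels; the attributes and profiles here are placeholders, not the study’s actual conjoint design.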
Step 4: implementation team formation
Position | Role responsibility |
---|---|
Chair | Set the agenda and run the meetings |
Secretary | Take minutes and ensure that meetings build on one another |
Program Evaluator | Help collect and interpret data to track whether team activities have the intended influence |
Incentives Officer | Ensure the team is appropriately incentivized for this demanding, volunteer role (e.g., lunches during meetings, an additional PTO day every 3 months) |
Communications Officer | Strategically communicate out about the team’s activities to the rest of the organization |
Step 5: implementation blueprint creation
Results
Importance | Goal | Responsible | Feasibility | Impact | Implementation category | Action step |
---|---|---|---|---|---|---|
H | 1, 2, 3 | IT | H | 3 | Develop stakeholder interrelationships | Implementation team––reserve biweekly meetings |
H | 1, 3 | IT | L | 1.5 | Support clinicians | Restructure clinical teams |
H | 3 | B | H | 2 | Train and educate stakeholders | Select training methods that fit preferences of staff |
H | 1, 2, 3 | IT | L | 3 | Develop stakeholder interrelationships | Recruit, designate, and train for leadership (pick chair/lead) |
H | 3 | B/IT | L | 3 | Adapt and tailor to context | Fit intervention to clinical practice (link points and levels with CBT and outcome monitoring) |
H | 1, 3 | B/IT | | | Use evaluative and iterative strategies | Develop and implement tools for quality monitoring (identify program level measures)
M | 3 | B | H | 1 | Develop stakeholder interrelationships | Develop implementation glossary |
M | 3 | B | H | 1 | Develop stakeholder interrelationships | Develop structured referral sheets |
L | 3 | B | L | 2 | Train and educate stakeholders | Prepare client-facing psychoeducational materials regarding mental health problems |
M | 1 | IT | L | 3 | Utilize financial strategies | Shift resources for incentives, support and to reduce turnover |
M | 1, 2 | B/IT | H | 2 | Develop stakeholder interrelationships | Conduct local consensus discussions––mix with educational meetings |
L | 1, 2 | IT | H | 1 | Train and educate stakeholders | Conduct educational meetings |
L | 3 | IT | L | 2 | Change infrastructure | Modify context to prompt new behaviors––change note template |
M | 3 | B/IT | L | 3 | Utilize financial strategies | Access new funding |
Importance | Goal | Responsible | Feasibility | Impact | Implementation category | Action step |
---|---|---|---|---|---|---|
H | 1, 2, 3 | B | H | 3 | Train and educate stakeholders/provide interactive assistance | Beck/IU training/supervision |
H | 1, 2, 3 | IT | L | 2 | Develop stakeholder interrelationships | Hold cross-staff clinical meetings |
H | 1, 3 | B/IT | H | 2 | Adapt and tailor to context | Facilitate, structure, and promote adaptability (Beck to work with IT to modify CBT to fit the sites) |
H | 2 | B | L | 3 | Train and educate stakeholders | Conduct educational outreach visits |
H | 3 | IT | L | 3 | Utilize financial strategies | Shift resources (ensure strategy for monitoring outcomes) |
H | 2 | IT | H | 1 | Develop stakeholder interrelationships | Identify early adopters (have person shadowed, talk in clinical meetings about overcoming barriers) |
H | 2 | B | L | 3 | Provide interactive assistance | Provide clinical supervision––include IT on calls |
H | 1, 2 | B/IT | L | 3 | Train and educate stakeholders | Use train-the-trainers strategies |
H | 2, 3 | IT | L | 3 | Change infrastructure | Increase demand––present data to courts and state level |
H | 2 | IT | H | 2 | Support clinicians | Change performance evaluations, change professional roles |
M | 2 | B/IT | H | 1 | Use evaluative and iterative strategies | Develop and institute self-assessment of competency |
M | 2, 3 | IT | H | 2 | Develop stakeholder interrelationships | Capture and share local knowledge |
M | 2 | IT | H | 1 | Support clinicians | Remind clinicians |
L | 3 | B/IT | L | 2 | Train and educate stakeholders | Prep CBT client handouts (Beck to provide examples) |
L | 1, 2 | B/IT | L | 2 | Utilize financial strategies | Alter incentives (certification, vacation, salary) |
L | 1, 3 | B/IT | L | 2 | Support clinicians | Facilitate relay of clinical data to providers (data parties) |
L | 1, 2 | IT | L | 2 | Support clinicians | Modify context to prompt new behaviors |
L | 1, 2, 3 | IT | L | 2 | Train and educate stakeholders | Shadow other experts |
L | 1, 2, 3 | IT | L | 2 | Use evaluative and iterative strategies | Obtain and use consumer and family feedback (exit interviews and surveys) |
Importance | Goal | Responsible | Feasibility | Impact | Implementation category | Action step |
---|---|---|---|---|---|---|
H | 1, 2, 3 | IT | H | 3 | Develop stakeholder interrelationships | Engage implementation team |
H | 1, 3 | IT | L | 2 | Develop stakeholder interrelationships | Hold cross-staff clinical meetings |
H | 3 | IT | L | 3 | Use evaluative and iterative strategies | Develop and implement tools for quality monitoring––must monitor fidelity through observation regularly and randomly
H | 1, 3 | IT | H | 1 | Train and educate stakeholders | Conduct educational meetings––hold regularly for new staff and as refreshers |
H | 1, 3 | IT | L | 3 | Train and educate stakeholders | Use train-the-trainer strategies––only those certified in CBT |
H | 1, 2, 3 | IT | L | 2 | Provide interactive assistance | Centralize technical assistance––create standard operating procedure for training and use of CBT at each staff level |
L | 1, 2 | IT | L | 2 | Utilize financial strategies | Alter incentives––provide raise earlier based on competency |
L | 1, 3 | IT | L | 2 | Use evaluative and iterative strategies | Obtain and use consumer feedback w/PQI data collection |
L | 1, 3 | IT | L | 2 | Train and educate stakeholders | Shadow other experts––elongate period for new staff |
L | 1, 2, 3 | IT | L | 2 | Train and educate stakeholders | Develop learning collaborative |
L | 3 | B/IT | L | 2 | Use evaluative and iterative strategies | Stage implementation scale-up to generate plan across site |
L | 3 | B/IT | L | 2 | Engage consumers | Use mass media––get press release out with data from implementation |
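The blueprint rows above are ordered by the team’s ratings of importance (H/M/L), feasibility (H/L), and expected impact (1–3). A small sketch of that triage logic, with hypothetical rows and an assumed sort order (higher importance, then feasibility, then impact first), is:

```python
# Assumed ordinal coding for the H/M/L ratings used in the blueprint tables.
RANK = {"H": 2, "M": 1, "L": 0}

def triage(rows):
    """Order blueprint rows by importance, then feasibility, then impact.

    rows: list of (action_step, importance, feasibility, impact) tuples.
    """
    return sorted(
        rows,
        key=lambda r: (RANK[r[1]], RANK[r[2]], r[3]),
        reverse=True,  # highest-priority action steps first
    )

rows = [
    ("Conduct educational meetings", "L", "H", 1),
    ("Reserve biweekly implementation-team meetings", "H", "H", 3),
    ("Develop implementation glossary", "M", "H", 1),
]
```

The tuples here are illustrative stand-ins for blueprint rows; the actual ratings were assigned by the implementation team, not computed.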