A substantial body of implementation research has underscored the importance of organizational contexts in the successful adoption and sustainment of evidence-based practices (EBP) across various healthcare settings [1‐5]. Findings indicate that even when high-quality implementation strategies––such as interactive training and post-training supports (e.g., observation and performance feedback)––are in place to facilitate professional behavior change, implementation outcomes are highly variable [6‐8]. Additional research suggests that characteristics of the inner organizational setting, or the immediate context in which implementation occurs, have a substantial influence on the use of evidence-based practices in routine service delivery [9‐11]. As a result, most leading implementation frameworks provide comprehensive coverage of “inner context” organizational factors [1, 3]. Conversely, inadequate attention to system influences is likely to cripple even the most well-resourced and thoughtful implementation efforts, leading some to observe that “bad systems trump good programs” [12]. However, there is a need for measures that capture key organizational context factors likely to set the stage for effective implementation. The current study examined the factor structure and psychometric properties of an adapted set of measures oriented to the inner organizational context for use in the education sector, an important setting for the delivery of behavioral health interventions and supports with considerable promise to positively impact the public health of children and adolescents.
Organizational implementation context
A wide variety of organizational characteristics have been identified as relevant to implementation, ranging from local policy to leadership and infrastructure [13]. These constructs vary, however, in the extent to which they are proximal and specific to the successful adoption and sustainment of EBPs. The organizational implementation context (OIC) reflects a subset of characteristics of the inner setting that are particularly relevant to the objective of EBP implementation. The OIC captures the factors within the immediate environment likely to influence front-line professionals’ EBP use. Conceptualized using the Exploration, Preparation, Implementation, and Sustainment (EPIS) [1] framework for implementation in public service systems, key OIC constructs include strategic implementation leadership, strategic implementation climate, and implementation citizenship behavior. These organizational constructs are considered focused or “strategic” in that they refer to specific organizational goals. This is in contrast to more general or “molar” versions of these constructs (e.g., global organizational climate and culture, school climate) that, while important, provide a more comprehensive picture of the way an organization is functioning (e.g., general behavioral expectations at work, overall work stress) and are less directly linked to the strategic objective of EBP implementation [14].
Strategic implementation leadership
Strategic implementation leadership is a subcomponent of general leadership that involves specific behaviors that support or inhibit implementation in service organizations [15]. These include leaders being knowledgeable and able to articulate the importance of implementation and being supportive of staff, proactive in problem solving, and perseverant in the implementation process [15]. Importantly, strategic leadership exerts its strongest impact at an interactional level. Leaders who accomplish their strategic goals communicate regularly with staff, protect time during meetings to discuss strategic content, hold staff accountable, and provide ongoing feedback based on performance [16, 17]. In this way, strategic implementation leadership enhances the use of a number of “embedding mechanisms,” such as role modeling or setting clear criteria for rewards [18], that communicate the importance of a strategic initiative and directly support staff use of new programs. Meta-analyses find that strategic leadership helps promote organizational change [19]. This finding is consistent with recent research on implementation strategies, which supports a link between enhanced implementation leadership and an organizational climate that is conducive to EBP implementation [20].
Strategic implementation climate
Defined as staff’s shared perception of the importance of EBP implementation [21], strategic implementation climate encompasses employee perceptions of the organizational supports and practices that help to define norms and expectations with regard to the implementation of new EBPs. A positive implementation climate signals what is expected, supported, and rewarded in relation to the use of programs or practices [22]. Strategic implementation climate is supported by specific leadership behaviors that communicate those norms and expectations [15]. Similar to implementation leadership, strategic implementation climate reflects a subset of the more general or molar organizational climate, which is intended to reflect the entirety of the organizational setting. Existing research suggests that focused or strategic climates (e.g., safety climate) are most related to specific outcomes [23].
Implementation citizenship behavior
Citizenship behaviors are exhibited when employees go “above and beyond” their core job responsibilities or standard “call of duty” to further the mission of the organization [24]. Applying the concept to the goal of EBP adoption and sustainment, implementation citizenship behaviors are those that demonstrate a commitment to EBP by keeping informed about the EBP being implemented and supporting colleagues to meet EBP standards [25]. Because they represent actual changes in the behaviors of front-line service providers, implementation citizenship behaviors mediate the influence of implementation leadership and implementation climate on implementation success [20].
OIC assessment instruments
Instruments assessing the OIC constructs detailed above have been developed as part of a larger program of research focused on understanding and enhancing organizational factors that influence the implementation of mental and behavioral health programs and practices in public sector settings (e.g., community mental health, child welfare). Development of these measures has been largely consistent with the basic tenets of pragmatic measurement [26] in that they are low-burden (i.e., brief), sensitive to change, actionable (i.e., flowing into the selection of implementation strategies [27]), and consistent with a larger framework or model (i.e., EPIS), among other criteria.
The Implementation Leadership Scale (ILS) [15, 28] was developed to capture strategic leadership behaviors likely to drive successful EBP implementation. The Implementation Climate Scale (ICS) [21] captures specific aspects of the inner organizational climate that are likely supportive of EBP implementation. Research has shown that the ICS correlates moderately with, but is distinct from, a conceptually similar strategic climate measure and correlates weakly with molar climate in mental health and child welfare settings. Last, the Implementation Citizenship Behavior Scale (ICBS) [25] assesses the degree to which providers within an organization go above and beyond their typical job roles and expectations to support EBP implementation. Together, these measures capture three critical factors associated with the OIC hypothesized to impact successful EBP implementation. Each instrument is described in more detail in the “Method” section.
Although the strategic constructs assessed by the ILS, ICS, and ICBS are expected to be generalizable across settings, these measures are likely to require adaptation if they are to fit novel contexts of use (e.g., schools). Adaptation is often critical to improve the contextual appropriateness of instruments or practices [29]. Adaptation of measurement tools can include changes to existing items, terminology, and definitions to ensure that they are relevant and comprehensible to end users [30, 31]. Subsequent to those adaptations, studies should evaluate the extent to which the original factor structure is maintained, in order to establish the validity of the tools in a new setting and provide information about their cross-setting utility. The current project was designed to conduct such an evaluation, following the adaptation of the ILS, ICS, and ICBS to support the implementation of mental and behavioral health programming in the education sector.
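As a simple illustration of one psychometric check that commonly accompanies such an evaluation of adapted scales, the sketch below computes Cronbach’s alpha (internal consistency) for a set of scale items. This is a generic, minimal example: the function, the 0–4 response data, and the three-item scale are all hypothetical and are not drawn from the study’s measures or results.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = items.shape[1]                             # number of items in the scale
    item_var = items.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)      # variance of total scale scores
    return (k / (k - 1)) * (1.0 - item_var / total_var)

# Hypothetical responses: 5 respondents x 3 items on a 0-4 Likert scale
scores = np.array([
    [4, 4, 3],
    [3, 3, 3],
    [2, 2, 1],
    [1, 1, 2],
    [0, 1, 0],
], dtype=float)

print(round(cronbach_alpha(scores), 2))  # -> 0.95
```

Internal consistency is only one piece of the evaluation described above; confirming that the original factor structure holds in the new setting would additionally require confirmatory factor analysis on the adapted items.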
OIC assessment in the education sector
Schools are the most common site for the delivery of behavioral health services to children and adolescents in the USA, a setting where upwards of 70–80% of service-connected youth receive care [32‐36]. Consistent with the literature in the USA, we use “behavioral health” as an overarching term encompassing mental health and substance abuse services [37]. In schools, behavioral health includes a spectrum of services ranging from universal prevention to selected and indicated interventions [38]. A diverse school-based behavioral health workforce, including educators (e.g., teachers) and dedicated healthcare personnel, supports this continuum of care [39].
School-based behavioral health consultants, who support systems and personnel to deliver evidence-based interventions across multiple levels of care, are frequently present in the education sector [38]. While they sometimes deliver direct services, these consultants often act as EBP champions (within school buildings) or intermediaries (across school buildings) to support the implementation of behavioral health programs. Individuals functioning in this role are critical, given consistent evidence that school-based behavioral health services, while accessible, are unlikely to be evidence-based [39‐41]. For instance, research suggests that, even when adopted, only 25–50% of school-based programs are implemented with acceptable fidelity, thus limiting their effects on student and school functioning [42].
In the education sector, the OIC reflects characteristics of inner organizational settings that impact implementation efforts, such as administrator and teacher norms and behaviors. Although prior research has focused on individual-level factors for reasons of convenience and feasibility, multilevel assessment is needed to successfully address implementation issues and install new programs [43]. Existing research in education has tended to focus narrowly on measuring implementation outcomes such as fidelity [44], with minimal attention to capturing the organizational factors that specifically impact delivery of EBPs. Careful assessment of the OIC in schools should consider multiple system levels and include the perspectives of individual teachers and administrators, as well as organizational processes at the school and district levels [38].
Educational researchers have previously proposed strong principal leadership as a requirement for the adoption and use of social-emotional learning (SEL) programs [45] and examined leadership qualities as important predictors of school climate and school improvement [46, 47], but no studies have investigated strategic implementation leadership, climate, and citizenship. Although there are a number of general principal leadership measures with good psychometric properties [48], such as the Vanderbilt Assessment of Leadership in Education [49] and the Principal Instructional Management Rating Scale (PIMRS) [50, 51], existing measures in the education sector remain too broad to identify the specific leadership behaviors directed at EBP adoption, delivery, and sustainment in schools because they assess a diffuse range of general leadership qualities (e.g., transformational leadership). Similarly, educational researchers have long examined the role of school climate––defined as people’s perceptions of social norms, goals, values, interpersonal relationships, teaching and learning practices, and organizational structures [52]––and its connection to wellbeing and positive outcomes among educators and students. This has led to the development and use of a range of broad school climate measures (e.g., [53]) that, while useful in assessing organizational health, do not capture specific barriers impeding implementation or strategic implementation climate. As a result, these instruments lack utility for selecting tailored implementation strategies to facilitate improved EBP use. Further, some educational research has investigated broad (i.e., non-strategic) citizenship behavior among educators and found it to be related to their participation in decision-making, sense of self-efficacy, and perceived status in the organization [54, 55]. However, the concept of citizenship has not yet been specifically applied to the extent to which educators go beyond the “call of duty” to keep informed about EBPs and support their colleagues to deliver EBP with fidelity [56].
Despite their strong theoretical and empirical links to EBP implementation, none of the EPIS constructs have been studied systematically in schools, where successful implementation of universal EBPs can facilitate improvements in teacher behavior (e.g., instructional practices, interactions with students, use of reinforcement) that result in positive student outcomes [57]. As touched on above, in the education sector, existing measures of organizational processes have one or more of the following limitations: (1) they are most often either molar in nature (rather than specific to implementation) or intended for use with specific EBPs and not generalizable; (2) they lack an underlying theoretical framework; or (3) they do not translate to specific, practical actions that strategically improve implementation. Nevertheless, it is possible that the broad educational literature has conceptualized implementation in ways that are not readily identifiable using contemporary implementation science constructs. Finally, although the EPIS framework suggests that leadership, climate, and citizenship are inter-related aspects of the organizational context, education sector behavioral health research has typically explored them independently. This reflects a missed opportunity to understand the associations among these factors and the extent to which they may be complementary in promoting EBP use.