Evaluating school capacity to implement new programs

https://doi.org/10.1016/j.evalprogplan.2007.04.002

Abstract

An eight-factor survey-based Bayesian model (Bridge-It) for assessing school capacity to implement health and education programs was tested in secondary analyses of data from 47 schools in the Texas Tobacco Prevention Initiative (TTPI). Bridge-It was used during the pre-implementation phase and again at mid-course of the TTPI 2 years later. Achieved implementation status was evaluated in a follow-up almost 4 years after the start of the TTPI. The Bridge-It score aggregated across all eight of the capacity factors predicted both quality of adherence to the Guidelines for School Health Programs to Prevent Tobacco Use and Addiction and quantity of implementing activity. The school-based leadership factor was an independent predictor of quality of adherence, whereas the facilitation processes factor predicted quantity of implementing activity. Integration of Bridge-It, or comparable multi-attribute tools, into the planning and evaluation of school-centered programs can increase understanding of factors that influence implementation and provide guidance for capacity building.

Introduction

Implementation is the “bridge” between a school program and its impact on students and their families (Berman & McLaughlin, 1976; also cited in Dusenbury, Brannigan, Falco, & Hansen, 2003). No matter how effective it proved in the research lab or field trial, a school-centered health or education program cannot produce its intended benefits until it is effectively integrated into actual procedures and practices at the campus level.

Effective implementation, however, is elusive. Too often, promising programs fall into an implementation gap: adopted but not used, or only partially integrated into actual practice. The research literature suggests that as many as half of the schools into which innovative, evidence-based programs are introduced fail to implement them with sufficient scope or fidelity for the promised benefits to accrue to students, families, and communities. This challenge becomes more acute when evidence-based practices and guidelines include program components that span multiple aspects of school functioning, such as policy change and enforcement, classroom instruction, counseling support, and family involvement (for examples, see Berman & McLaughlin, 1978; Buston, Wight, Hart, & Scott, 2002; Elliott & Mihalic, 2004; Eslea & Smith, 1998; Gingiss, Roberts-Gray, & Boerm, 2006; Gottfredson & Gottfredson, 2002; Hahn, Noland, Rayens, & Christie, 2002; Hallfors & Godette, 2002; Kumpfer, 2002; Pentz, 2003; Ringwalt et al., 2003; Taggart, Bush, Zuckerman, & Theiss, 1990).

Part of the explanation for failed or low levels of implementation is lack of capacity at the campus level to implement the new program or technology (Fullan, 2005; Johnson, Hays, Center, & Daley, 2004; Kaftarian, Robinson, Compton, Davis, & Volkow, 2004). Elements of school capacity to implement new programs and policies include support by the principal; the teachers’ sense of efficacy; implementers’ ability to communicate with program participants; staff morale; the general school culture; quality of leadership; availability of time, money, or other resources; presence or amount of turbulence in the implementing environment; and other organizational characteristics (Bosworth, Gingiss, Pothoff, & Roberts-Gray, 1999; Dusenbury et al., 2003; Kallestad & Olweus, 2003). Program characteristics such as complexity and relative advantage; facilitation processes such as adaptive planning and implementer training; and factors external to the school such as state- and district-level policies and mandates are other elements to be considered in understanding and building school capacity to implement a new health or education program (Berman & McLaughlin, 1976; Blake et al., 2005; Boerm, Gingiss, & Roberts-Gray, 2007; Bosworth et al., 1999; Greenberg & Walls, 2003; Han & Weiss, 2005; Payne, Gottfredson, & Gottfredson, 2006; Ringwalt et al., 2003).

To enable and encourage successful implementation of school programs for prevention of tobacco use and addiction, evaluation and planning for the Texas Tobacco Prevention Initiative (TTPI) included a focus on schools’ capacity to implement best practices identified in the Guidelines for School Health Programs to Prevent Tobacco Use and Addiction (hereafter, the school guidelines; Centers for Disease Control and Prevention, 1994, 2004). Tobacco use is one of the six health behaviors that contribute most to the leading causes of mortality in the United States (Kann, Brener, & Allensworth, 2001), and these behaviors frequently are established during youth. Nationwide, 22.3% of high school students and 8.1% of middle school students are current cigarette smokers (CDC, 2005). School health education programs play an important role in reducing adolescent tobacco use by increasing students’ knowledge, positive attitudes, and peer-resistance skills, thereby lowering levels of youth smoking (Engquist et al., 1994). When properly implemented, school programs can lower smoking prevalence by 25–60% (Meshack et al., 2004; National Cancer Policy Board, 2000; Rhode et al., 2001). The school guidelines were designed as a set of recommendations for ensuring a quality school program to prevent tobacco use. These recommendations constitute a “bundle” of program components that necessitate school adaptation and accommodation on multiple dimensions and at multiple levels.

To build schools’ capacity to implement programs consistent with the school guidelines, the TTPI, which is administered through the Texas Department of State Health Services (TDSHS) using funds from the Texas Tobacco Settlement, awarded small ($2000) competitive grants to reimburse program expenses and also provided initial training, guidance, and materials to selected schools in East Texas (TDSHS, 2001). Evaluation of the TTPI included studies to monitor schools’ capacity for achieving and sustaining successful implementation of tobacco prevention and control programs and studies to track implementation status (Gingiss, Boerm et al., 2006; Gingiss, Roberts-Gray et al., 2006; Boerm et al., 2007).

The Bridge-It system (Bosworth et al., 1999) was adopted to assess capacity to implement school programs in the TTPI. Bridge-It originally was developed using an expert panel and an empirical group process technique to integrate the wide range of influential variables into a model to help schools plan for and monitor key elements of the implementation process for school programs. The system includes an 8-factor, 36-item survey to analyze capacity for program implementation and a companion Bayesian model that uses the survey data to estimate the likelihood of implementation success.
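The published snippets do not describe how the Bayesian model turns the 36 survey items into a forecast. As a rough illustration of the kind of calculation involved, the Python sketch below aggregates eight factor ratings (low, medium, high) into a probability of implementation success using a naive-Bayes-style combination and also reports the mean factor score; the prior, the likelihood values, and every factor label except the two named in the abstract (school-based leadership and facilitation processes) are hypothetical placeholders rather than Bridge-It's published parameters.

```python
# Illustrative sketch only: Bosworth et al. (1999) define the actual Bridge-It
# factors, items, and Bayesian parameters. All values below are placeholders.

# Hypothetical conditional probabilities of each factor rating given eventual
# implementation success or failure, plus an uninformative prior.
P_RATING_GIVEN_SUCCESS = {"low": 0.15, "medium": 0.35, "high": 0.50}
P_RATING_GIVEN_FAILURE = {"low": 0.50, "medium": 0.35, "high": 0.15}
PRIOR_SUCCESS = 0.5

RATING_VALUE = {"low": 1, "medium": 2, "high": 3}


def forecast(ratings: dict[str, str]) -> tuple[float, float]:
    """Return (likelihood of success on a 0-100 scale, mean factor score 1-3),
    assuming the eight factor ratings are conditionally independent."""
    joint_success = PRIOR_SUCCESS
    joint_failure = 1.0 - PRIOR_SUCCESS
    for rating in ratings.values():
        joint_success *= P_RATING_GIVEN_SUCCESS[rating]
        joint_failure *= P_RATING_GIVEN_FAILURE[rating]
    probability = 100.0 * joint_success / (joint_success + joint_failure)
    mean_factor_score = sum(RATING_VALUE[r] for r in ratings.values()) / len(ratings)
    return probability, mean_factor_score


if __name__ == "__main__":
    # Only "school-based leadership" and "facilitation processes" are named in
    # the article; the other six factor labels are placeholders.
    ratings = {
        "school-based leadership": "high",
        "facilitation processes": "medium",
        "program characteristics": "medium",
        "implementers": "medium",
        "participants": "low",
        "school culture": "medium",
        "resources": "low",
        "external environment": "medium",
    }
    prob, mean_score = forecast(ratings)
    print(f"Forecast likelihood of success: {prob:.1f}/100; mean factor score: {mean_score:.2f}")
```

In the TTPI evaluation the forecast was reported both on a 100-point probability scale and as the average of the eight factor scores (1 = low, 2 = medium, 3 = high), as described in the results below.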

The current study uses results from TTPI evaluation studies conducted at three points in time. The baseline or pre-implementation study was conducted to document the then-current amount of implementing activity for components of the school guidelines and to assess schools’ capacity to achieve and maintain proper implementation of the recommended best practices. Two years later, mid-course assessments were conducted to assess capacity for continued implementation success. Almost 4 years after the pre-implementation assessment, a more comprehensive follow-up assessment of implementation status was conducted using survey tools adapted from the School Health Education Profile—Tobacco Module (SHEP-TM, Centers for Disease Control and Prevention, 2001). Data from the sequence of three evaluation studies were submitted to secondary analyses to examine the utility and predictive validity of the Bridge-It system. Our hypotheses were that (1) the Bridge-It model for assessing school capacity to implement new health and education programs provides valid predictors of implementation status measured nearly 4 years after program start-up; and (2) Bridge-It's multi-factor approach for measuring capacity has utility because different factors are predictive of qualitative versus quantitative aspects of achieved implementation status.
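The snippets do not specify the statistical procedures used in these secondary analyses. As a minimal sketch of the predictive-validity logic only, the code below regresses two follow-up implementation outcomes on a baseline capacity score; the data are fabricated placeholders, not TTPI data, and the procedure is illustrative rather than the authors' actual analysis.

```python
# Hypothetical sketch of predictive validity: regress follow-up implementation
# outcomes on baseline Bridge-It capacity scores. Data are fabricated
# placeholders (not TTPI data).
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(seed=1)
n_schools = 47  # number of schools meeting the study's eligibility criteria

# Placeholder baseline capacity: mean of eight factor ratings on a 1-3 scale.
capacity = rng.uniform(1.0, 3.0, n_schools)

# Placeholder follow-up outcomes, loosely tied to capacity for illustration only.
quality_of_adherence = 0.8 * capacity + rng.normal(0.0, 0.5, n_schools)
quantity_of_activity = 0.6 * capacity + rng.normal(0.0, 0.5, n_schools)

for label, outcome in [("quality of adherence", quality_of_adherence),
                       ("quantity of implementing activity", quantity_of_activity)]:
    fit = linregress(capacity, outcome)
    print(f"{label}: slope = {fit.slope:.2f}, r = {fit.rvalue:.2f}, p = {fit.pvalue:.4f}")
```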

Section snippets

Participants

Schools were selected for the current study if they (a) were among the initial 111 schools that began their participation in the TTPI during fall 2000 or spring 2001; (b) participated in the baseline capacity assessment conducted August through October 2000 and/or the mid-course capacity assessment in August 2002; and (c) participated in the survey of implementation status in April 2004. Forty-seven schools met these eligibility criteria. The survey forms for capacity assessments were distributed

Capacity to implement the school guidelines

Forty-six percent of schools were identified as having at least a medium likelihood of achieving the quality and quantity of implementation needed for success. The average overall forecast on Bridge-It's 100-point scale, which represents the Bayesian probability of achieving implementation success, was low (mean = 23.07, standard deviation = 28.15). Expressed as the average across the factor scores (1 = low, 2 = medium, 3 = high), the overall score was 1.64 with a standard deviation

Discussion

Secondary analyses of evaluations conducted for school programs in the TTPI supported hypotheses about the validity and utility of the Bridge-It system for assessing school capacity to implement new health and education programs. In a prospective design, the Bridge-It survey tool and Bayesian model produced forecasts that reliably predicted scores obtained at the 4-year follow-up for overall amount of implementing activity and quality of adherence to the Guidelines for School Health Programs to Prevent Tobacco Use and Addiction.

Limitations

Conclusions drawn from the current study are limited by its relatively small sample size and uneven participation by the schools in the series of evaluation studies from which the datasets were drawn. Although we found no significant differences in characteristics (e.g., school size and type), pre-implementation capacity, or follow-up implementation status between schools included and those that did not meet eligibility criteria for the current analyses, the small sample size relative to the

Lessons learned

Implementation is a bridge between a school program and its impact on students and their families (e.g., Kalafat, Illback, & Sanders, 2007). But implementation success often is elusive. The current study, like many others (e.g., Hahn et al., 2002; Elliott & Mihalic, 2004; Payne et al., 2006), found that only about half of the schools achieved high levels of quality and quantity of implementation. The lesson reinforced here is that ample time and money for local capacity building should be included in any

Acknowledgment

This study was conducted as part of research sponsored by the Texas Department of State Health Services (TDSHS) under contract 74660013992.

References (51)

  • M. Boerm et al. Association of the presence of state and district health education policies with school tobacco prevention program practices. Journal of School Health (2007)
  • N.D. Brener et al. Reliability and validity of the School Health Policies and Programs Study 2000 questionnaire. Journal of School Health (2003)
  • K. Buston et al. Implementation of a teacher-delivered sex education programme: Obstacles and facilitating factors. Health Education Research, Theory, and Practice (2002)
  • Centers for Disease Control and Prevention (1994). Guidelines for school health programs to prevent tobacco use and...
  • Centers for Disease Control and Prevention (2001). State-level school health policies and practices: A state-by-state...
  • Centers for Disease Control and Prevention (2004). Guidelines for School Health Programs to Prevent Tobacco Use:...
  • Centers for Disease Control and Prevention (2005). Tobacco use, access, and exposure to tobacco in media among middle...
  • L. Dusenbury et al. A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research (2003)
  • L. Dusenbury et al. Quality of implementation: Developing measures crucial to understanding the diffusion of preventive interventions. Health Education Research (2005)
  • M. Elias et al. Implementation, sustainability, and scaling up of social-emotional and academic innovations in public schools. School Psychology (2003)
  • D. Elliott et al. Issues in disseminating and replicating effective prevention programs. Prevention Science (2004)
  • K. Engquist et al. The effect of two types of teacher training on implementation of Smart Choices: A tobacco prevention curriculum. Journal of School Health (1994)
  • M. Eslea et al. The long-term effectiveness of anti-bullying work in primary schools. Educational Research (1998)
  • M. Fullan. Leadership and sustainability: System thinkers in action (2005)
  • P. Gingiss et al. Follow-up comparisons of intervention and comparison schools in a state tobacco prevention and control initiative. Journal of School Health (2006)