
Open Access 01.12.2017 | Study protocol

Organizational supports used by private child and family serving agencies to facilitate evidence use: a mixed methods study protocol

Authors: Emmeline Chuang, Crystal Collins-Camargo, Bowen McBeath

Published in: Implementation Science | Issue 1/2017

Abstract

Background

Challenges to evidence use are well documented. Less well understood are the formal supports—e.g., technical infrastructure, inter-organizational relationships—organizations may put in place to help overcome these challenges. This study will identify supports for evidence use currently used by private child and family serving agencies delivering publicly funded behavioral health and/or human services; examine contextual, organizational, and managerial factors associated with use of such supports; and determine how identified supports affect evidence use by staff at multiple levels of the organization.

Methods

We will use a sequential explanatory mixed methods design, with study activities occurring in two sequential phases: In phase 1, quantitative survey data collected from managers of private child and family serving agencies in six states (CA, IN, KY, MO, PA, and WI) and analyzed using both regression and qualitative comparative analysis (QCA) will identify organizational supports currently being used to facilitate evidence use and examine the contextual, organizational, and managerial factors associated with the use of such supports. In phase 2, data from phase 1 will be used to select a purposive sample of 12 agencies for in-depth case studies. In those 12 agencies, semi-structured interviews with key informants and managers, focus groups with frontline staff, and document analysis will provide further insight into agencies’ motivation for investing in organizational supports for evidence use and the facilitators and barriers encountered in doing so. Semi-structured interviews with managers and focus groups with frontline staff will also assess whether and how identified supports affect evidence use at different levels of the organization (senior executives, middle managers, frontline supervisors, and frontline staff). Within- and between-case analyses supplemented by QCA will identify combinations of factors associated with the highest and lowest levels of staff evidence use.

Discussion

This study will inform efforts to improve sustainment, scale-up, and spread of evidence by providing insight into organizational and managerial strategies that facilitate evidence use, the contexts in which these strategies are most effective, and their effect on evidence use by staff at different levels of the organization.
Abbreviations
Alliance: Alliance for Strong Families and Communities
EBTs: Evidence-based treatments
QCA: Qualitative comparative analysis

Background

Effective use of research evidence, or knowledge that has been subjected to testing and found to be credible [1, 2], by health and human service practitioners can reduce disparities in the costs and quality of care, enhance service effectiveness and worker satisfaction, and improve outcomes experienced by vulnerable children and families [3–6]. Recognizing the importance of research evidence for improving organizational performance and service outcomes, policymakers in many countries have made evidence-informed health and human services a priority [7, 8]. In the USA, an increasing number of state and local governments now link the use of research evidence regarding “effective” programs and practices, including evidence-based treatments (EBTs), to funding decisions or service reimbursement [9].
Despite the importance of research evidence use for achieving positive outcomes for vulnerable children and families, research-to-practice gaps persist across settings, conditions, and population groups [10]. In health care, for example, only about half of recommended care practices are implemented [11, 12]. Uptake of research recommendations is even lower for prevention and behavior change programs and for managerial practices such as goal setting and performance feedback [13, 14]. In the human services sector, national survey evidence suggests that only 25% of child welfare agency programs and practices are evidence based [15]. In all states, an increasingly large proportion of publicly funded services for child welfare-involved children and families are delivered via purchase of service contracts with private agencies [16, 17]. Although these private agencies are heterogeneous as a population [18], the majority provide both behavioral health and human services to vulnerable children and families [19]. However, only a small proportion of agency administrators and practitioners report regularly using research to inform their daily work [20, 21].
Challenges to the use of research evidence, and particularly to the adoption, implementation, and sustainment of EBTs, are well documented [8, 22, 23]. Less studied are the formal supports—i.e., staff positions, infrastructural supports, and special initiatives—organizations may put in place to help overcome these challenges. For example, in some clinical settings, knowledge broker positions (i.e., intermediaries accountable for encouraging knowledge use) have successfully been used to support evidence-informed decision-making [24]. Ties to opinion leaders and researchers outside of the organization [25, 26], a supportive technical infrastructure [27, 28] and an organizational climate that rewards evidence use [24, 29] have also been identified as affecting evidence use.
When present, these supports may signify an organizational commitment to improving service quality and program outcomes. However, these supports typically require significant managerial and workforce investment and can be costly to develop, implement, and sustain. Agencies may vary in their capacity and/or willingness to utilize such supports and in their approach to systematizing the use of research evidence by managers and other staff. Currently, we know little about the extent to which health and human service agencies use such supports to facilitate evidence use by staff. We also know little about the contextual and organizational factors associated with agency use of such supports or the extent to which these supports affect evidence use by staff at different levels of the organization.
The current study will address this gap by collecting qualitative and quantitative data from private child and family serving agency managers and staff to address the following aims:
  • Aim 1: Identify formal organizational supports used by private child and family serving agencies to facilitate evidence use
  • Aim 2: Examine contextual, organizational, and managerial factors affecting agency use of formal supports
  • Aim 3: Determine how formal organizational supports affect use of research evidence at multiple levels of the organization

Conceptual framework

Study activities will be guided by the conceptual framework in Fig. 1. This model was developed following a review of the literature on evidence use in the human services, public health, and associated scholarly domains and is informed by concepts from resource dependence theory [30], institutional theory [31, 32], theories of leadership [33, 34], and the strategic management literature [35, 36].

Organizational supports that facilitate evidence use

We propose to examine four types of formal supports that agencies may use to facilitate evidence use. These supports include linkage and exchange efforts, technical infrastructure, other knowledge management infrastructure, and strategic alignment.
Linkage and exchange efforts refer to formal ties to knowledge brokers [37] outside of the agency who can assist in acquiring, assessing, adapting, or applying evidence in decision-making or practice [38–41]. These knowledge brokers can include researchers, professional associations, consultants, or research use networks. The use of knowledge brokers to assist with distilling and disseminating research to practitioners is increasingly common in Canada and the UK [42–44] and has also seen increased uptake in the US health care sector [45] but has not been systematically examined in the US behavioral health or human service sectors.
Technical infrastructure includes internal data systems and/or other tools designed to facilitate access to and use of evidence by agency staff [46–48]. Research suggests that investment in technical infrastructure may be necessary to facilitate evidence use [49, 50]. At a basic level, agencies may provide staff with computers that permit use of free online resources or develop virtual libraries to promote staff access to research evidence. Agencies may also invest in performance measurement systems designed to collect and report data on program- or agency-level performance indicators or in client management information systems that provide real-time data on client service utilization and outcomes.
Other knowledge management infrastructure includes other agency resources allocated for the purpose of building capacity for, or promoting, evidence use. These supports include having formal positions (full- or part-time) responsible for supporting evidence use, e.g., internal knowledge brokers or other staff formally assigned to retrieve, translate, and disseminate research knowledge within the agency, or to research agency programs and initiatives. Agencies may also promote staff training and continuing education on specific research topics.
Finally, strategic alignment [51] refers to other formal efforts intended to establish an organizational culture and climate, i.e., workers’ perceptions of norms and expectations in their work environment [52, 53], that prioritizes evidence use. Examples include emphasizing the importance of evidence use in the agency’s mission or strategic plan and establishing policies and practices that promote accountability for evidence use (e.g., incorporating a requirement for research evidence use into staff performance reviews or compensation plans).

Factors hypothesized to affect agency use of formal organizational supports

As shown in Fig. 1, we conceptualize agency use of formal organizational supports to facilitate evidence use as affected by factors at the environmental, agency, and managerial levels.
Contextual and organizational factors hypothesized to affect agency use of such supports are informed by two complementary macro-theoretical perspectives on organizational behavior, resource dependence theory and institutional theory [30, 31, 54]. Consistent with resource dependence theory and institutional theory, we posit that agencies will invest in formal organizational supports for evidence use if leaders perceive this investment as enhancing agencies’ ability to secure resources vital to organizational maintenance and survival (e.g., providing a competitive edge with clients or funders) [55, 56] or if they face institutional pressure to do so from funders, policy-makers, competitors, and accrediting and educational bodies whose support agencies need in order to be perceived as “legitimate” [57–59]. Specific factors hypothesized to affect agency investment in organizational supports for evidence use include inter-organizational competition [60], accreditation requirements promoting evidence-informed practice, state or local government policies requiring contractor use of evidence-based practice, and agency size [61, 62].
Managerial characteristics hypothesized to affect agency use of such supports are identified from the strategic management literature [35, 36], theories of leadership [33, 34], and prior research on evidence use in health care [38, 63, 64]. Specifically, we propose to examine three types of managerial characteristics hypothesized to affect agency use of formal organizational supports to facilitate evidence use: boundary spanning activities [65, 66] (i.e., management of relations with external entities), leadership behaviors [67] (e.g., the actions leaders take to motivate staff and implement plans), and managers’ education, training, and attitudes towards evidence use [38, 63, 64]. These characteristics, particularly leadership behaviors and attitudes towards evidence use, may also directly affect evidence use by agency staff.

Methods/design

We will use a sequential explanatory mixed methods design [68], with study activities occurring in two sequential phases: In phase 1, quantitative survey data collected from managers of private child and family serving agencies in six states will examine organizational supports currently being used by agencies to facilitate evidence use (aim 1) and identify contextual, organizational, and managerial factors associated with agency use of such supports (aim 2). In phase 2, data from phase 1 will be used to select a purposive sample of 12 agencies for in-depth case studies. In those 12 agencies, key informant interviews, focus groups, and document analysis will provide further insight into agencies’ motivation for investing in organizational supports for evidence use and the facilitators and barriers encountered in doing so (aim 2). Semi-structured interviews with managers (executives, middle managers, and frontline supervisors) and focus groups with frontline staff will examine whether and how identified supports affect evidence use at different levels of the organization (aim 3). An overview of our proposed study design is provided in Fig. 2.

Phase 1: quantitative survey of private agency executives

In phase 1, we will administer a web-based survey to private child and family serving agency executives in six states that vary in size and sociopolitical context (CA, IN, KY, MO, PA, and WI). Including agencies from six states will enhance the generalizability of study findings and allow us to better control for differences in local institutional context. It will also ensure that we have a sufficient sample size to conduct the proposed analyses.

Sampling strategy and recruitment procedures

Eligible agencies in each state include all private agencies serving the child welfare population that are members of a state or national association of private providers. Prior evidence suggests that public agencies contract with such agencies for a broad spectrum of services, ranging from core child welfare services such as foster care and case management to behavioral health and health care services [18, 69]. In the absence of a comprehensive database identifying our study population, we will partner with the major membership associations representing private child and family serving agencies in state and federal policymaking to distribute survey invitations to their members, as well as private non-member agencies on their association listservs. Our partners include six state membership associations of private child and family serving agencies as well as the Alliance for Strong Families and Communities (the Alliance). Based on 2015 membership, we anticipate a sample size of approximately 448 agencies.
A single executive in each agency will receive an electronic letter of invitation that provides additional information on the study’s purpose and a hyperlink to the survey. Typically, this executive will be the individual identified by our study partners as their primary contact for that agency. Respondents will be asked to complete the survey only once per agency. To facilitate tracking, each agency will receive a unique survey link. The survey portal will remain open for 12 weeks. Upon completion of the survey, responding agencies will be entered in a raffle to win one of 45 $100 electronic gift cards (or equivalent donation to their agencies). Respondents will also be asked if their agency is interested in participating in phase 2 of the study.
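To illustrate the tracking mechanism described above, the following minimal sketch shows one way unique, per-agency survey links could be generated. The roster file, column names, and base URL are hypothetical; in practice, link generation may be handled directly by the survey platform.

```python
# Minimal sketch (assumptions: an agency roster CSV with an "agency_id" column;
# the base URL and file names are illustrative, not part of the protocol).
import csv
import secrets

BASE_URL = "https://example-survey-host.org/agency-survey"  # hypothetical

def generate_links(roster_path: str, out_path: str) -> None:
    """Assign each agency a unique, hard-to-guess token so that responses can be
    tracked back to the invited agency without identifying the respondent."""
    with open(roster_path, newline="") as f:
        agencies = list(csv.DictReader(f))
    for agency in agencies:
        agency["token"] = secrets.token_urlsafe(8)
        agency["survey_link"] = f"{BASE_URL}?t={agency['token']}"
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=agencies[0].keys())
        writer.writeheader()
        writer.writerows(agencies)

# Example usage:
# generate_links("agency_roster.csv", "agency_links.csv")
```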

Survey instrument development

An overview of survey domains and content is provided in Table 1, reflecting measures previously validated in child welfare or other health and human service contexts as well as prior research conducted by the study investigators [18, 70–75]. To ensure questions are of high salience to agency executives, the survey instrument will be piloted with an expert panel of 5–6 private agency administrators and other key stakeholders from states not targeted for survey participation and refined to ensure it takes no more than 20 min to complete.
Table 1 Overview of phase 1 director survey measures (constructs and corresponding measures)

Evidence use
  Use of research evidence: Four types of research use: Instrumental use refers to the direct use of research evidence in decision-making or in identifying a solution to a specific problem, e.g., in deciding to adopt or implement a specific program or practice or choosing a specific course of action with a client [63]. Tactical use, also known as persuasive or symbolic use, refers to the use of evidence to legitimize, justify, or otherwise mobilize support for actions or decisions [100]. Process use encompasses direct involvement in the design or conduct of research. Finally, overall use refers to the use of any kind of research in any way and encompasses all of the previous types of evidence use [101].

Formal organizational supports used to facilitate evidence use
  Linkage and exchange efforts: Ties to knowledge brokers outside the organization who can assist in acquiring, assessing, adapting, or applying evidence in decision-making or practice. Specific types of knowledge brokers to be examined include researchers, professional associations, advocacy groups, and consultants or other technical assistance providers.
  Technical infrastructure: Data systems and/or other technical infrastructure designed to facilitate access to and use of evidence by agency staff, e.g., information systems that disseminate research evidence and/or provide timely data and feedback to staff on client utilization, experiences, or outcomes.
  Other knowledge management infrastructure: Other infrastructure designed to promote evidence use within the agency. Specifically, we will examine whether there are (1) formal positions accountable for supporting evidence use, (2) formal organizational policies or practices designed to develop agency capacity for evidence use, and/or (3) staff directly involved in research and/or quality improvement activities.
  Strategic alignment: Emphasizing the importance of evidence use in the agency’s mission, vision, values, and/or strategic plan and/or any other formal efforts to establish an organizational climate that prioritizes evidence use.

Factors hypothesized to affect agency use of formal organizational supports
  Competition: Local competition for funding, staff, and clientele.
  Funding sources: Major sources of revenue, the percentage of revenues received directly from each source during the most recent fiscal year (e.g., Medicaid, state or county contracts with public child welfare agencies), and whether payment is linked to a performance-based accountability mechanism.
  Accreditation: Whether the agency is accredited by COA, CARF, JCAHO, or another accrediting body.
  Government policies: State and/or local requirements for the use of EBTs in services and/or for research evidence use.
  Agency size: Number of full-time staff or full-time staff equivalents.
  Organizational auspices: Whether the agency is not-for-profit or for-profit and whether the agency is part of a larger network or system.
  Boundary spanning activities: Number of hours per week spent engaging in each of the following activities with external stakeholders: (1) liaison activities with other monitoring or licensing organizations; (2) activities with professional associations; (3) consulting with and/or participating in task groups with other service providers; (4) activities with researchers or technical assistance providers; (5) public presentations and appearances in the community; and (6) contributing to federal, state, and/or local policymaking.
  Leadership behaviors: Respondents’ leadership style.
  Director education and training: Directors’ highest educational degree, length of time in current role, whether he/she ever completed a research class (e.g., on research design or statistics) and/or a class on quality improvement techniques, attitudes towards evidence use, and personal use of research evidence.

Analyses

The phase 1 unit of analysis is the private child and family serving agency. Factor analysis will examine the underlying factor structure of any quantitative measures not previously used in this context [76]. Univariate and bivariate analyses will descriptively examine the prevalence of different formal organizational supports to facilitate evidence use across the sample of private child welfare agencies (aim 1). Subsequent analyses will explore the extent to which contextual, agency, and managerial factors identified in Fig. 1 are associated with the presence and use of different organizational supports for evidence use (aim 2). These analyses will occur in three stages: First, multiple regression will identify contextual, organizational, and managerial factors significantly associated with agency use of organizational supports to facilitate evidence use.
Next, qualitative comparative analysis (QCA) will be used to identify specific combinations of contextual, agency, and managerial factors associated with high vs. low levels of organizational support for evidence use [77–79]. QCA is a set-theoretic method based on Boolean algebra that is increasingly used in sociological, management, and health services research to explore complex social phenomena [80, 81]. The primary benefit of QCA for the proposed research is the technique’s ability to identify conditions or combinations of conditions that are necessary vs. sufficient for agency investment in organizational supports for evidence use. A condition is considered necessary if it must be present in order for an outcome to occur; however, the presence of a necessary condition does not ensure the outcome will occur. For example, agencies may need to be a certain size in order to invest in management information systems or other technical infrastructure. However, even if agency size is necessary, it may not be sufficient for agency investment in technical infrastructure in the absence of other conditions such as institutional pressures or managerial support for evidence use. Conditions or combinations of conditions are considered sufficient if they consistently produce an outcome of interest when present. In our study, there may be multiple combinations of contextual, organizational, and managerial factors that are sufficient but not necessary for agency investment in organizational supports for evidence use. For example, having either an executive with a positive attitude towards research or funders that link evidence use to payments could each be sufficient to ensure agency investment in supports for evidence use. This assumption of causal heterogeneity, i.e., different combinations of conditions can lead to the same outcome [77], is a strength of QCA and will allow for development of models that better reflect the reality of complex organizational phenomena than if we relied solely on regression-based techniques.
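To make the necessity/sufficiency logic concrete, the sketch below computes simple consistency scores from crisp-set (0/1) data. The condition names and case values are invented for illustration; the actual analyses will rely on established QCA software rather than hand-rolled code.

```python
# Minimal sketch of QCA necessity/sufficiency logic with crisp (0/1) memberships.
# Conditions ("large", "ebp_policy", "pro_research_exec") and the outcome
# ("invests" in supports for evidence use) are hypothetical.
from itertools import combinations

cases = [
    {"large": 1, "ebp_policy": 1, "pro_research_exec": 0, "invests": 1},
    {"large": 1, "ebp_policy": 0, "pro_research_exec": 1, "invests": 1},
    {"large": 0, "ebp_policy": 1, "pro_research_exec": 1, "invests": 1},
    {"large": 0, "ebp_policy": 0, "pro_research_exec": 0, "invests": 0},
]

def sufficiency(cases, conditions, outcome="invests"):
    """Of cases showing the condition combination, the share that also show the outcome."""
    members = [c for c in cases if all(c[k] for k in conditions)]
    if not members:
        return None
    return sum(c[outcome] for c in members) / len(members)

def necessity(cases, condition, outcome="invests"):
    """Of cases showing the outcome, the share that also show the condition."""
    with_outcome = [c for c in cases if c[outcome]]
    return sum(c[condition] for c in with_outcome) / len(with_outcome)

for combo in combinations(["large", "ebp_policy", "pro_research_exec"], 2):
    print("sufficiency of", combo, "=", sufficiency(cases, combo))
print("necessity of 'large' =", round(necessity(cases, "large"), 2))
```

In this toy data, several two-condition combinations are sufficient for investment (consistency of 1.0), while agency size alone is not necessary (consistency of about 0.67), mirroring the causal heterogeneity described above.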
Configurational solutions identified through QCA will be entered into regression models to identify configurations of factors that are significantly associated with agency use of formal organizational supports even after controlling for other agency characteristics. Regression analyses will be conducted using Stata 13.0 [82]. QCA will be conducted using either the fsQCA software [78] or the FUZZY module in Stata [83], depending on the specific constructs being tested.
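As an illustration of how a configurational solution might be carried into the regression stage, the sketch below codes membership in one hypothetical solution path as a binary covariate. The data file and variable names are assumptions, and the protocol specifies Stata 13.0 for the actual regressions; this Python version only mirrors the logic.

```python
# Minimal sketch, assuming a pandas DataFrame of agency-level survey responses.
# Column names (n_fte, state_ebt_policy, tech_infrastructure, etc.) are illustrative.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("phase1_agency_survey.csv")  # hypothetical file

# Indicator for membership in one configurational solution identified by QCA,
# e.g., "large agency AND state EBT policy".
df["solution_1"] = ((df["n_fte"] >= 100) & (df["state_ebt_policy"] == 1)).astype(int)

# Logistic regression: does solution membership predict use of a formal support
# (here, technical infrastructure) net of other agency characteristics?
model = smf.logit(
    "tech_infrastructure ~ solution_1 + competition + accredited + nonprofit",
    data=df,
).fit()
print(model.summary())
```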

Phase 2: mixed methods case studies of 12 agencies

In phase 2, our primary aim is to examine the ways in which identified organizational supports affect the use of evidence at multiple levels of the agency (aim 3). We will also explore in more detail the contextual, organizational, and managerial factors that may affect agency use of such supports and use of research at different levels of the agency (aim 2). To achieve this objective, we will employ a multiple case study design, with the agency as the unit of analysis.

Sampling strategy

In phase 2, eligible agencies include all private child and family serving agencies from phase 1 that expressed an interest in participating in phase 2. Eligible agencies will be stratified by prevalence of formal organizational supports for evidence use, and a maximum variation sampling procedure [84] will be used to identify a diverse sample of 12 private agencies for in-depth case study analysis. Within each of these 12 agencies, qualitative data will be collected from respondents at multiple levels of the organization (e.g., agency executives, middle managers, frontline supervisors, and frontline staff).
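The sketch below illustrates one way the stratified, maximum variation selection of 12 case-study agencies could be operationalized. The data file, column names, and stratum cut points are assumptions rather than specifications from the protocol.

```python
# Minimal sketch, assuming a DataFrame of phase 1 respondents who volunteered
# for phase 2; columns (agency_id, state, n_supports) are illustrative.
import pandas as pd

df = pd.read_csv("phase1_volunteers.csv")  # hypothetical file

# Stratify by prevalence of formal supports reported in the phase 1 survey.
df["support_stratum"] = pd.cut(
    df["n_supports"], bins=[-1, 1, 2, 4], labels=["low", "medium", "high"]
)

# Maximum variation: draw at most one agency per stratum-state cell, then trim
# to the 12-agency target.
candidates = (
    df.sort_values("n_supports", ascending=False)
      .groupby(["support_stratum", "state"], observed=True)
      .head(1)
)
selected = candidates.sample(n=min(12, len(candidates)), random_state=42)
print(selected[["agency_id", "state", "support_stratum"]])
```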

Semi-structured interviews with key informants and managers

We propose to conduct semi-structured qualitative interviews with key informants and managers at multiple levels (e.g., executive team, middle management, frontline supervisors) within each agency (~8–10 interviews per agency). Qualitative interviews with 2–3 key informants in each agency will provide insight into agency strategic priorities, motivation for investing in identified organizational supports, and facilitators and barriers to putting these supports in place. Qualitative interviews with managers (additional ~6–8 total) will provide insight into how respondents at different levels of the agency use research evidence and the ways in which available organizational supports affect evidence use. Interviews will also further explore contextual, organizational, and managerial factors—particularly leadership behaviors and staff training and resources—that may affect perceived utility of these organizational supports and overall evidence use (see Table 2).
Table 2 Overview of phase 2 qualitative domains

Respondent role (open-ended)
  • Both: Can you tell me a little bit about your role in [agency]? What does a typical day or week look like?

Organizational context (open-ended)
  • KS: Can you tell me a little bit about the [agency]’s strategic priorities?
  • FLW: How much discretion do you have in your work with clients? What factors external or internal to your agency have the greatest influence on your daily work?

Evidence use (closed-ended)
  • Measures of research evidence use will be drawn from one of three validated instruments: the Output scale of the Structured Interview of Evidence Use (SIEU) [102], the Evidence-Informed Practice Survey (EIPS) [103], or the Research Utilization Questionnaire (RUQ) [100, 104]. Specific measures administered will be based in part on expert panel feedback regarding topic salience and respondent burden.

Evidence use (open-ended)
  • KS: What types of information do you use when making decisions about programs or policies? Do you view certain types of evidence as more useful to this process than others?
  • FLW: What types of information or ‘evidence’ do you use when making decisions about clients? Do you view certain types of evidence as more useful to this process than others? Why or why not? What about in other aspects of your daily work?

Organizational supports for evidence use (open-ended)
  • Both: To what extent does your agency value the use of evidence (particularly research evidence) in your day-to-day work? What gives you this impression?
  • Both: What types of supports does your agency have in place to facilitate evidence-informed decision making? How useful do you find these supports? Why or why not? How often do you take advantage of these supports? Are there other supports or resources you wish you had access to? Why or why not?
  • KS: How long have these supports been in place? What motivated your agency to invest in these supports? How effective have you found these supports at helping your agency fulfill its mission and/or accomplish its strategic priorities?

Leader facilitation of evidence use (closed-ended)
  • Leadership style and effectiveness using the Multifactor Leadership Questionnaire Short Form (MLQ-5X) [33]
  • Boundary spanning activities adapted from the National Drug Abuse Treatment System Survey [105, 106]

Leader facilitation of evidence use (open-ended)
  • KS: Can you provide an example of any actions you have taken to promote evidence use by staff?
  • Both: To what extent do leaders at your agency value the use of evidence in your daily work? What gives you this impression?

Respondent characteristics (closed-ended)
  • Respondents’ training, knowledge, and skills in using research; specific measures will be drawn from previous studies of research use in the health care sector [64, 107]
  • Basic demographics (age, gender, race/ethnicity)

KS key stakeholder, FLW frontline worker
All interviews will last no longer than 45 min and will be conducted using a semi-structured interview guide tailored to the respondent’s role within the agency. With respondents’ permission, all interviews will be recorded and transcribed verbatim.

Focus groups with frontline staff

Focus groups rely on group interaction to generate insights and can be effective at encouraging participation from individuals reluctant to be interviewed on their own [85]. In phase 2, focus groups with frontline staff will examine the extent to which staff currently use research evidence and the ways in which identified organizational supports and leader behaviors do or do not affect evidence use (see Table 2). Depending on agency size, we anticipate conducting 2–3 focus groups per agency, with 5–8 participants per focus group. An experienced moderator will introduce the topics to be addressed, monitor dynamics of group discussion to ensure all views are represented, and ensure the discussion stays on track. Focus groups will be conducted in the absence of other agency personnel, and the script will be tailored to create an atmosphere in which staff feel safe and comfortable sharing their opinions (see Table 2). A second researcher will be present to take detailed notes on the discussion. Focus groups will last approximately 90–100 min and, with respondents’ permission, will be recorded and transcribed verbatim.

Document analysis

As appropriate during the semi-structured interview process, key informants will be asked to provide access to documents that clarify organizational policies, priorities, or supports for evidence use. These documents will provide a low-cost method of augmenting respondents’ descriptions of their organizational contexts and available supports for evidence use [84, 86]. These documents may include but are not limited to program manuals, quality improvement plans and practice protocols, the agency’s response to a request for proposals that seeks funding for training or other infrastructure development activities, or publicly available information on the agency’s structure and services. With respondents’ and agencies’ permission, these documents will be uploaded into NVivo for analysis.

Analyses

Phase 2 analyses will occur in multiple stages. First, all qualitative data will be uploaded into the qualitative software QSR NVivo 10.0 for analysis [87]. Case study analyses of these qualitative data will be guided by pattern-matching logic [88]. Procedurally, this analysis will involve three steps: coding, within-case analysis, and cross-case analysis. In the first step, all qualitative data will be coded in NVivo. Our initial codebook will be informed by constructs identified in Fig. 1 and the results of phase 1 data collection but may subsequently be refined to include emergent constructs identified from the data.
In the next step, we will conduct within- and cross-case analyses. Specifically, coded data will be analyzed to identify themes for each construct, the degree to which these themes emerge in the data (“strength”), and the degree to which each construct positively or negatively affects evidence use by staff (“valence”). We will also assess the degree to which observed relationships within and across cases are consistent with the conceptual model outlined in Fig. 1 [89].
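As one way to organize the cross-case comparison, the sketch below builds a case-by-construct matrix of theme “strength” and “valence” from a hypothetical export of coded segments. The export format, column names, and coding scheme are assumptions; the actual coding and matrix building will be done in NVivo.

```python
# Minimal sketch, assuming a CSV export with one row per coded segment:
# the agency, the construct the segment was coded to, and an analyst rating
# of whether the segment supports (+1) or undermines (-1) evidence use.
import pandas as pd

codes = pd.read_csv("coded_segments.csv")  # hypothetical export

matrix = (
    codes.groupby(["agency", "construct"])
         .agg(strength=("direction", "size"),   # how often the theme appears
              valence=("direction", "mean"))    # net positive vs. negative
         .reset_index()
         .pivot(index="agency", columns="construct",
                values=["strength", "valence"])
)
print(matrix)
```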
Finally, phase 2 qualitative data will be calibrated and analyzed using QCA. Our objective in applying QCA will be to identify combinations of factors associated with high levels of staff research use and low levels of staff research use. Given the limited number of cases in our sample, limited diversity may prevent us from identifying meaningful combinations of factors associated with the desired outcomes. If that is the case, we will forego QCA and rely purely on the within- and cross-case analysis results. Results will be shared with participating agencies and used to refine the conceptual model in Fig. 1.
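For readers unfamiliar with QCA calibration, the sketch below shows the direct method of transforming raw scores into fuzzy-set membership using three substantive anchors (log-odds of +3 and -3 at the full-membership and full-non-membership thresholds, 0 at the crossover). The raw scores and anchor values are illustrative only and would be set substantively during analysis.

```python
# Minimal sketch of direct-method fuzzy-set calibration, assuming raw ratings
# of staff research use on a 1-7 scale; anchor values are hypothetical.
import math

def calibrate(x: float, full_out: float, crossover: float, full_in: float) -> float:
    """Map a raw score onto 0-1 set membership via three substantive anchors."""
    if x >= crossover:
        log_odds = 3.0 * (x - crossover) / (full_in - crossover)
    else:
        log_odds = 3.0 * (x - crossover) / (crossover - full_out)
    return 1.0 / (1.0 + math.exp(-log_odds))

raw_scores = {"Agency A": 6.2, "Agency B": 3.9, "Agency C": 2.1}
for agency, score in raw_scores.items():
    membership = calibrate(score, full_out=2.0, crossover=4.0, full_in=6.0)
    print(agency, round(membership, 2))
```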

Discussion

Prior research has identified numerous barriers to evidence use by practitioners. Research evidence can be costly for practitioners to access. The quality of available evidence and its perceived relevance to local organizational and practitioner needs can also vary considerably [40, 43, 90]. In particular, many EBTs are developed and tested with specific client populations in relatively resource-rich settings; however, the contexts in which practitioners are expected to translate these EBTs are often significantly more heterogeneous in terms of available resources, client characteristics, and supportive infrastructure [91–93]. Strategies for beginning to address such concerns include the use of hybrid research designs that simultaneously assess program effects and implementation and the development of processes that allow for structured (rather than ad hoc) adaptation of EBTs to local contexts [94, 95]. However, equally critical for sustainment, scale-up, and spread of evidence to lower-resource settings is a better understanding of the contexts in which evidence is being used and of the extent to which effective strategies for facilitating evidence use may vary across these contexts.
This study identifies organizational supports (e.g., technical infrastructure, knowledge management infrastructure, linkage and exchange efforts) being used by private child and family serving agencies to promote evidence use. The study also examines the contexts in which these supports are more prevalent, identifies multilevel factors driving agency investment in these supports, and explores the conditions under which these supports may affect evidence use at different levels of the organization. Key strengths include the study’s focus on private child and family serving agencies, which play a critical role in the delivery of publicly funded behavioral health and/or human services but are under-examined in the literature; the use of a multi-state sample that allows for systematic examination of how different institutional and market factors influence agencies’ support for evidence use; and the use of a configurational comparative approach to inform analyses, which will allow for identification of solutions that better reflect the reality of complex organizational phenomena.
In all states, public and private agencies are challenged to use evidence to improve organizational performance, including child and family outcomes [96–99]. Results will provide insight into strategies that may be effective for scaling evidence use at multiple levels of the organization and across different institutional environments and, ultimately, improving outcomes for vulnerable children and families.

Acknowledgements

We would like to thank the directors and staff of the associations for private child and family serving agencies for their partnership and support on this project: Bernadette Bianchi, Mary Chant, Cathleen Graham, Linda Hall, Polina Makievsky, Phyllis Richards, Michelle Sanborn, Carroll Schroeder, and Lynn Thull. We would also like to thank our expert panel members for input provided on survey instruments and interview protocols: Valerie Hicks, Mary Hollie, Gregory Kurth, Mike Shaver, and Kara Teeple. Finally, we appreciate our partnership with Kristen Lwin and her team at Practice and Research Together related to dissemination efforts.

Funding

This work is supported by the William T Grant Foundation (ID #186134). The views presented here are solely the responsibility of the authors and do not necessarily represent the official views of the William T Grant Foundation.

Availability of data and materials

Not applicable.

Authors’ contributions

EC, CC, and BM are responsible for study conception and design and for drafting and critical review of the manuscript. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.
Consent for publication

Not applicable.

Ethics approval and consent to participate

This study, including procedures for obtaining informed consent, was approved by the Institutional Review Board at the University of California Los Angeles (IRB #16-000579). Informed consent will be obtained from study participants prior to data collection.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
References
1. Higgs J, Jones M. Will evidence-based practice take the reasoning out of practice? In: Higgs J, Jones M, editors. Clinical reasoning in the health professionals. 2nd ed. Oxford: Butterworth Heineman; 2000. p. 307–15.
2. Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B. What counts as evidence in evidence-based practice? J Adv Nurs. 2004;47:81–90.
3. Cabassa LJ, Baumann A. A two-way street: bridging implementation science and cultural adaptations of mental health treatments. Implement Sci. 2013;8:90.
4. Napoles AM, Santoyo-Olsson J, Stewart AL. Methods for translating evidence-based behavioral interventions for health-disparity communities. Prev Chronic Dis. 2013;10:E193. doi:10.5888/pcd10.130133.
5. Grol R. Successes and failures in the implementation of evidence-based guidelines for clinical practice. Med Care. 2001;39:1146–54.
6. Aarons GA, Sommerfeld D, Hecht D, Silvosky J, Chaffin M. The impact of evidence-based practice implementation and fidelity monitoring on staff turnover: evidence for a protective effect. J Consult Clin Psychol. 2009;77:270–80.
7. Dill K, Shera W. Implementing evidence-informed practice: international perspectives. Toronto: Canadian Scholars’ Press Inc; 2012.
8. Gray M, Joy E, Plath D, Webb S. Implementing evidence-based practice: a review of the empirical research literature. Res Soc Work Pract. 2013;23:157–66.
9. Littell J, Shlonsky A. Toward evidence-informed policy and practice in child welfare. Res Soc Work Pract. 2010;20:723–5.
10. Carrilio T, Packard T, Clapp J. Nothing in—nothing out: barriers to the use of performance data in social service programs. Adm Soc Work. 2003;27:61–76.
11. McGlynn E, Asch S, Adams J, Keesey J, Hicks J, DeCristofaro A, Kerr E. The quality of health care delivered to adults in the United States. N Engl J Med. 2003;348:2635–45.
12. Mangione-Smith R, DeCristofaro A, Setodji C, Keesey J, Klein D, Adams J, Schuster MA, McGlynn E. The quality of ambulatory care delivered to children in the United States. N Engl J Med. 2007;357:1515–23.
13. Locke E, Latham G. Goal setting: a motivational technique that works. Englewood Cliffs: Prentice-Hall; 1984.
14. Rousseau D. Is there such a thing as evidence-based management? Acad Manage Rev. 2006;31:256–68.
15. Horwitz SM, Hurlburt MS, Goldhaber-Fiebert J, Palinkas L, Rolls-Reutz J, Zhang J, Fisher E, Landsverk J. Exploration and adoption of evidence-based practice by U.S. child welfare agencies. Child Youth Serv Rev. 2014;39:147–52.
16. Gronbjerg KA, Salamon LM. Devolution, marketization, and the changing shape of government-nonprofit relations. In: Salamon LM, editor. The state of nonprofit America. 2nd ed. Washington: Brookings Institution; 2012.
17. Smith S. The political economy of contracting and competition. In: Hasenfeld Y, editor. Human services as complex organizations. Los Angeles: Sage Publications; 2009. p. 139–60.
18. Chuang E, Collins-Camargo C, McBeath B, Wells R, Bunger A. An empirical typology of private child and family serving agencies. Child Youth Serv Rev. 2014;38:101–12.
19. Smith S. Cross-sector nonprofit-government financing. In: Steuerle EBCE, editor. Nonprofits and government: collaboration and conflict. 3rd ed. Washington: Urban Institute Press; 2016.
20. Collins-Camargo C, Sullivan D, Murphy A. Use of data to assess performance and promote outcome achievement by public and private child welfare agency staff. Child Youth Serv Rev. 2011;33:330–9.
21. Chagnon F, Pouliot L, Malo C, Gervais M, Pigeon M. Comparison of determinants of research knowledge utilization by practitioners and administrators in the field of child and family social services. Implement Sci. 2010;5:41.
22. Barwick M, Boydell K, Stasiulis E, Ferguson H, Blase K, Fixsen D. Research utilization among children’s mental health providers. Implement Sci. 2008;3:19.
23. Aarons GA, Hurlburt MS, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38:4–23.
24. Ellen M, Leon G, Bouchard G, Lavis J, Ouimet M, Grimshaw J. What supports do health systems organizations have in place to facilitate evidence-informed decision-making? A qualitative study. Implement Sci. 2013;8:84.
25. Flodgren G, Parmelli E, Doumit G, Gattellari M, O’Brien M, Grimshaw J, Eccles M. Local opinion leaders: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2011;(8):CD000125. doi:10.1002/14651858.CD000125.pub4.
26. Oliver K, Innvar S, Lorenc T, Woodman J, Thomas J. A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res. 2014;14:2.
27. Ivers N, Jamtvedt G, Flottorp S, Young J, Odgaard-Jensen J, French S, O’Brien M, Johansen M, Grimshaw J, Oxman A. Audit and feedback: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2012;6:CD000259. doi:10.1002/14651858.CD000259.pub3.
28. Quinn E, Huckel-Schneider C, Campbell D, Seale H, Milat A. How can knowledge exchange portals assist in knowledge management for evidence-informed decision making in public health? BMC Public Health. 2014;14:443.
30. Pfeffer J, Salancik G. The external control of organizations: a resource dependence perspective. Stanford: Stanford University Press; 2003.
31. DiMaggio P, Powell W. The iron cage revisited: institutional isomorphism and collective rationality in organizational fields. Am Sociol Rev. 1983;48:147–60.
32. Meyer J, Rowan B. Institutionalized organizations: formal structure as myth and ceremony. Am J Sociol. 1977;83:340–63.
33. Bass B, Avolio B. Full range leadership development: manual for the multifactor leadership questionnaire. Redwood City: Mind Garden Inc; 1997.
34. Fiedler F. A contingency model of leadership effectiveness. Adv Exp Soc Psychol. 1964;1:149–90.
35. Bourgeois L. Strategy and environment: a conceptual integration. Acad Manage Rev. 1980;5:25–39.
36. Jemison D. The importance of boundary spanning roles in strategic decision making. J Manag Stud. 1984;21:131–52.
37. Hargadon A. Brokering knowledge: linking learning and innovation. Res Organ Behav. 2002;24:41–85.
38. Innvaer S, Vist G, Trommald M, Oxman A. Health policy-makers’ perceptions of their use of evidence: a systematic review. J Health Serv Res Policy. 2002;7:239–44.
39. Ward V, House A, Hamer S. Knowledge brokering: the missing link in the evidence to action chain. Evid Policy. 2011;5:267–79.
40. Buckley H, Tonmyr L, Lewig K, Jack S. Factors influencing the uptake of research evidence in child welfare: a synthesis of findings from Australia, Canada, and Ireland. Child Abuse Rev. 2014;23:5–16.
41. Provan KG, Beagles J, Mercken L, Leischow S. Awareness of evidence-based practices by organizations in a publicly funded smoking cessation network. J Public Adm Res Theory. 2012;23:133–53.
42. Shera W, Dill K. Promoting evidence-informed practice in child welfare in Ontario: progress, challenges, and future directions. Res Soc Work Pract. 2012;22:204–13.
43. Stevens M, Liabo K, Frost S, Roberts H. Using research in practice: a research information service for social care practitioners. Child Fam Soc Work. 2005;10:67–75.
44. Lavis J, Robertson D, Woodside J, McLeod C, Abelson J. How can research organizations more effectively transfer research knowledge to decision makers? Milbank Q. 2003;81:221–48.
45. Mays GP, Hogg R, Castellanos-Cruz D, Hoover A, Fowler L. Public health research implementation and translation: evidence from practice-based research networks. Am J Prev Med. 2013;45:752–62.
46. LeRoux K, Wright N. Does performance measurement improve strategic decision making? Findings from a national survey of nonprofit social service agencies. Nonprofit Volunt Sect Q. 2010;39:571–87.
48. Bakken S, Currie L, John R. Information and communication technology infrastructure for evidence-based practice. In: Holzemer W, editor. Improving health through nursing research. West Sussex: Wiley-Blackwell; 2010.
49. Syed-Ikhsan S, Rowland F. Knowledge management in a public organization: a study on the relationship between organizational elements and the performance of knowledge transfer. J Knowl Manag. 2004;8:95–111.
50. Mithas S, Ramasubbu N, Sambamurthy V. How information management capability influences firm performance. MIS Q. 2011;35:237–56.
51. White K, Griffith JH. The well-managed health care organization. 7th ed. Chicago: Health Administration Press; 2010.
52. Glisson C, James LR. The cross-level effects of culture and climate in human service teams. J Organ Behav. 2002;23:767–94.
53. Schneider B, Ehrhart M, Macey W. Organizational culture and climate. Annu Rev Psychol. 2013;64:361–88.
54. Tolbert P. Institutional environments and resource dependence: sources of administrative structure in institutions of higher education. Adm Sci Q. 1985;30:1–13.
55. Hillman A, Withers M, Collins B. Resource dependence theory: a review. J Manag. 2009;35:1404–27.
56. Oliver C. Determinants of interorganizational relationships: integration and future directions. Acad Manage Rev. 1990;15:241–65.
57. Lee M. Children and family service agencies’ experience with the initial Council on Accreditation process: an exploratory study. Families in Society: The Journal of Contemporary Social Services. 2013;94:23–30.
58. Thyer B. What is evidence-based practice? In: Roberts A, Yeager K, editors. Foundations of evidence-based social work practice. New York: Oxford University Press; 2006.
59. Saufl N, Fieldus M. Accreditation: a ‘voluntary’ regulatory requirement. J Perianesth Nurs. 2003;18:152–9.
60. Ulrich D, Barney J. Perspectives in organizations: resource dependence, efficiency, and population. Acad Manage Rev. 1984;9:471–81.
61. Sharfman M, Wolf G, Chase R, Tansik D. Antecedents of organizational slack. Acad Manage Rev. 1988;13:601–14.
62. Kimberly JR. Organizational size and the structuralist perspective: a review, critique, and proposal. Adm Sci Q. 1976;21:571–97.
63. Nutley S, Walter I, Davies H. Using evidence: how research can inform public services. Bristol: Policy Press; 2007.
64. Squires J, Estabrooks C, Gustavsson P, Wallin L. Individual determinants of research utilization by nurses: a systematic review update. Implement Sci. 2011;6:1.
65. Thompson J. Organizations in action. New York: McGraw-Hill; 1967.
66. Aldrich H, Herker D. Boundary spanning roles and organization structure. Acad Manage Rev. 1977;2:217–30.
67.
Zurück zum Zitat Banaszak-Holl J, Nembhard I, Taylor L, Bradley EH. Leadership and management: a framework for action. In: Burns L, Bradley E, Weiner B, editors. Shortell & Kaluzny’s health care management: organization design and behavior. 6th ed. Clifton Park: Delmar; 2012. Banaszak-Holl J, Nembhard I, Taylor L, Bradley EH. Leadership and management: a framework for action. In: Burns L, Bradley E, Weiner B, editors. Shortell & Kaluzny’s health care management: organization design and behavior. 6th ed. Clifton Park: Delmar; 2012.
68.
Zurück zum Zitat Creswell J, Plano Clark V. Designing and conducting mixed methods research. 2nd ed. Thousand Oaks: Sage; 2011. Creswell J, Plano Clark V. Designing and conducting mixed methods research. 2nd ed. Thousand Oaks: Sage; 2011.
69.
Zurück zum Zitat McBeath B, Collins-Camargo C, Chuang E. Portrait of private agencies in the child welfare system: principal results from the National Survey of Private Child and Family Serving Agencies, Report by the National Quality Improvement Center on the Privatization of Child Welfare Services. Funded by the U.S. Health and Human Services Administration for Children and Families, Children’s Bureau. 2011. McBeath B, Collins-Camargo C, Chuang E. Portrait of private agencies in the child welfare system: principal results from the National Survey of Private Child and Family Serving Agencies, Report by the National Quality Improvement Center on the Privatization of Child Welfare Services. Funded by the U.S. Health and Human Services Administration for Children and Families, Children’s Bureau. 2011.
70.
Zurück zum Zitat Collins-Camargo C, Armstrong M, McBeath B, Chuang E. Promoting cross-sector partnerships in child welfare: qualitative results from a five-state strategic planning process. Child Welfare. 2013;92:33–63.PubMed Collins-Camargo C, Armstrong M, McBeath B, Chuang E. Promoting cross-sector partnerships in child welfare: qualitative results from a five-state strategic planning process. Child Welfare. 2013;92:33–63.PubMed
71.
Zurück zum Zitat Collins-Camargo C, Chuang E, McBeath B, Bunger A. Private child welfare agency managers’ perceptions of the effectiveness of different performance management strategies. Child Youth Serv Rev. 2014;38:133–41.CrossRef Collins-Camargo C, Chuang E, McBeath B, Bunger A. Private child welfare agency managers’ perceptions of the effectiveness of different performance management strategies. Child Youth Serv Rev. 2014;38:133–41.CrossRef
72.
Zurück zum Zitat Chuang E, Dill J, Morgan J, Konrad TR. A configurational approach to the relationship between high performance work practices and frontline health care worker outcomes. Health Serv Res. 2012;47:1460–81.PubMedPubMedCentralCrossRef Chuang E, Dill J, Morgan J, Konrad TR. A configurational approach to the relationship between high performance work practices and frontline health care worker outcomes. Health Serv Res. 2012;47:1460–81.PubMedPubMedCentralCrossRef
73.
Zurück zum Zitat Chuang E, Wells R. The role of inter-agency collaboration in facilitating receipt of behavioral health services for youth involved with child welfare and juvenile justice. Child Youth Serv Rev. 2010;32:1814–22.PubMedPubMedCentralCrossRef Chuang E, Wells R. The role of inter-agency collaboration in facilitating receipt of behavioral health services for youth involved with child welfare and juvenile justice. Child Youth Serv Rev. 2010;32:1814–22.PubMedPubMedCentralCrossRef
74.
Zurück zum Zitat McBeath B, Briggs HE, Aisenberg E. The role of child welfare managers in promoting agency performance through experimentation. Child Youth Serv Rev. 2009;31:112–8.CrossRef McBeath B, Briggs HE, Aisenberg E. The role of child welfare managers in promoting agency performance through experimentation. Child Youth Serv Rev. 2009;31:112–8.CrossRef
75.
Zurück zum Zitat Chuang E, Wells R, Alexander J, Green S. How outpatient substance abuse treatment unit director activities may affect provision of community outreach services. Drugs: Education, Prevention, and Policy. 2013;20:149–59. Chuang E, Wells R, Alexander J, Green S. How outpatient substance abuse treatment unit director activities may affect provision of community outreach services. Drugs: Education, Prevention, and Policy. 2013;20:149–59.
76.
Zurück zum Zitat Litwin M. How to assess and interpret survey psychometrics. 2nd ed. Thousand Oaks: Sage Publications; 2002. Litwin M. How to assess and interpret survey psychometrics. 2nd ed. Thousand Oaks: Sage Publications; 2002.
77.
Zurück zum Zitat Rihoux B, Ragin C, editors. Configurational comparative methods: qualitative comparative analysis (QCA) and related techniques. Thousand Oaks: Sage Publications; 2009. Rihoux B, Ragin C, editors. Configurational comparative methods: qualitative comparative analysis (QCA) and related techniques. Thousand Oaks: Sage Publications; 2009.
78.
Zurück zum Zitat Ragin C. Redesigning social inquiry: fuzzy sets and beyond. Chicago: University of Chicago Press; 2008.CrossRef Ragin C. Redesigning social inquiry: fuzzy sets and beyond. Chicago: University of Chicago Press; 2008.CrossRef
79.
Zurück zum Zitat Schneider C, Wagemann C. Set-theoretic methods for the social sciences: a guide to qualitative comparative analysis. Cambridge: Cambridge University Press; 2012.CrossRef Schneider C, Wagemann C. Set-theoretic methods for the social sciences: a guide to qualitative comparative analysis. Cambridge: Cambridge University Press; 2012.CrossRef
80.
Zurück zum Zitat Thygeson N, Solberg L, Asche S, Fontaine P, Pawlson L, Scholle S. Using fuzzy set qualitative comparative analysis (fs/QCA) to explore the relationship between medical ‘homeness’ and quality. Health Serv Res. 2012;47:22–45.CrossRef Thygeson N, Solberg L, Asche S, Fontaine P, Pawlson L, Scholle S. Using fuzzy set qualitative comparative analysis (fs/QCA) to explore the relationship between medical ‘homeness’ and quality. Health Serv Res. 2012;47:22–45.CrossRef
81.
Zurück zum Zitat Kane H, Lewis MA, Williams P, Kahwati L. Using qualitative comparative analysis to understand and quantify translation and implementation. Transl Behav Med. 2014;4:201–8.PubMedPubMedCentralCrossRef Kane H, Lewis MA, Williams P, Kahwati L. Using qualitative comparative analysis to understand and quantify translation and implementation. Transl Behav Med. 2014;4:201–8.PubMedPubMedCentralCrossRef
82.
Zurück zum Zitat StataCorp. Stata statistical software: release 13. College Station: StataCorp LP; 2013. StataCorp. Stata statistical software: release 13. College Station: StataCorp LP; 2013.
83.
Zurück zum Zitat Longest KC, Vaisey S. Fuzzy: a program for performing qualitative comparative analysis (QCA) in Stata. Stata J. 2008;8:79–104. Longest KC, Vaisey S. Fuzzy: a program for performing qualitative comparative analysis (QCA) in Stata. Stata J. 2008;8:79–104.
84.
Zurück zum Zitat Patton M. Qualitative research and evaluation methods. Thousand Oaks: Sage Publications; 2002. Patton M. Qualitative research and evaluation methods. Thousand Oaks: Sage Publications; 2002.
85.
Zurück zum Zitat Krueger R, Casey M. Focus groups: a practical guide for applied research. 4th ed. Thousand Oaks: Sage Publications; 2009. Krueger R, Casey M. Focus groups: a practical guide for applied research. 4th ed. Thousand Oaks: Sage Publications; 2009.
86.
Zurück zum Zitat Bowen G. Document analysis as a qualitative research method. Qual Res J. 2009;9:27–40.CrossRef Bowen G. Document analysis as a qualitative research method. Qual Res J. 2009;9:27–40.CrossRef
87.
Zurück zum Zitat Bazeley P, Jackson K. Qualitative data analysis with NVivo. 2nd ed. London: Sage; 2013. Bazeley P, Jackson K. Qualitative data analysis with NVivo. 2nd ed. London: Sage; 2013.
88.
Zurück zum Zitat Yin R. Case study research. 3rd ed. Thousand Oaks: Sage Publications; 2003. Yin R. Case study research. 3rd ed. Thousand Oaks: Sage Publications; 2003.
89.
Zurück zum Zitat Miles MB, Huberman AM. Qualitative data analysis: an expanded sourcebook. Thousand Oaks: Sage Pubns; 1994. Miles MB, Huberman AM. Qualitative data analysis: an expanded sourcebook. Thousand Oaks: Sage Pubns; 1994.
90.
Zurück zum Zitat Gough D. Research for practice in child neglect. In: Taylor J, Daniel B, editors. Neglect: practice issues for health and social care. London: Jessica Kingsley; 2004. p. 43–56. Gough D. Research for practice in child neglect. In: Taylor J, Daniel B, editors. Neglect: practice issues for health and social care. London: Jessica Kingsley; 2004. p. 43–56.
91.
Zurück zum Zitat Kerner J. Knowledge translation versus knowledge integration: a funder’s perspective. J Contin Educ Heal Prof. 2006;26:72–80.CrossRef Kerner J. Knowledge translation versus knowledge integration: a funder’s perspective. J Contin Educ Heal Prof. 2006;26:72–80.CrossRef
92. Kitson A, Rycroft-Malone J, Harvey G, McCormack B, Seers K, Titchen A. Evaluating the successful implementation of evidence into practice using the PARIHS framework: theoretical and practical challenges. Implement Sci. 2008;3:1.
93. Glasgow R, Emmons K. How can we increase translation of research into practice? Types of evidence needed. Annu Rev Public Health. 2007;28:413–33.
94. Curran G, Bauer M, Mittman B, Pyne J, Stetler C. Effectiveness-implementation hybrid designs: combining elements of clinical effectiveness and implementation research to enhance public health impact. Med Care. 2012;50:217–26.
95. Aarons GA, Green A, Palinkas L, Self-Brown S, Whitaker D, Lutzker J, Silovsky J, Hecht D, Chaffin M. Dynamic adaptation process to implement an evidence-based child maltreatment intervention. Implement Sci. 2012;7:32.
96. Collins-Camargo C, McBeath B, Ensign K. Privatization and performance-based contracting in child welfare: recent trends and implications for social service administrators. Adm Soc Work. 2011;35:494–516.
97. McBeath B, Collins-Camargo C, Chuang E. The role of the private sector in child welfare: historical reflections and a contemporary snapshot based on the National Survey of Private Child and Family Serving Agencies. J Publ Child Welfare. 2012;6:459–81.
98. Milward HB, Provan KG. Managing the hollow state: collaboration and contracting. Public Manag Rev. 2003;5:1–18.
99. Smith S, Lipsky M. Nonprofits for hire: the welfare state in the age of contracting. Cambridge: Harvard University Press; 1993.
100. Estabrooks C, Squires J, Strandberg E, Nilsson-Kajermo K, Scott S, Profetto-McGrath J, Harley D, Wallin L. Towards better measures of research utilization: a collaborative study in Canada and Sweden. J Adv Nurs. 2011;67:1705–18.
101. Weiss C, Murphy-Graham E, Birkeland S. An alternate route to policy influence: how evaluations affect D.A.R.E. Am J Eval. 2005;26:12–30.
102. Palinkas L, Garcia A, Aarons GA, Finno-Velasquez M, Holloway I, Mackie T, Leslie LK, Chamberlain P. Measuring use of research evidence: the Structured Interview for Evidence Use (SIEU). Res Soc Work Pract. 2014;26:550–64.
103. McBeath B, Austin M. The organizational context of research-minded practitioners: challenges and opportunities. Res Soc Work Pract. 2015;25:446–59.
104. Squires J, Estabrooks C, Newburn-Cook C, Gierl M. Validation of the conceptual research utilization scale: an application of the standards for educational and psychological testing in healthcare. BMC Health Serv Res. 2011;11:107.
105. Adams T, Heeringa SG. Outpatient substance abuse treatment survey: technical documentation for OSATSS-5, 1999–2000. Ann Arbor: Institute for Social Research; 2001.
106. Heeringa SG. Outpatient drug abuse treatment studies: technical documentation. Ann Arbor: Institute for Social Research, University of Michigan; 1996.
107. Lizarondo L, Grimmer-Somers K, Kumar S. A systematic review of the individual determinants of research evidence use in allied health. J Multidiscip Healthc. 2011;4:261–72.