Advancing Implementation Research and Practice in Behavioral Health Systems

  • Byron J. Powell
  • Rinad S. Beidas
Introduction

Concerns about the quality of health and mental health services have led to the development and prioritization of implementation science within the portfolio of health services research (Institute of Medicine 2009a; National Institutes of Health 2013). The science of implementation has advanced rapidly over the past decade (Chambers 2012). However, much of the empirical literature has focused on clinician and organizational-level factors and strategies that influence the adoption, implementation, and sustainment of evidence-based practices (EBPs; Aarons et al. 2014; Aarons et al. 2012; Glisson and Williams 2015). Despite conceptual literature that points to the importance of system-level influences on adoption, implementation, and sustainment (e.g., Aarons et al. 2011; Damschroder et al. 2009; Flottorp et al. 2013), there is a void in empirical work that attends to this matter. This may be due to the inherent difficulty of studying contexts and strategies within large systems. However, as service systems face increasing pressure to promote the adoption, implementation, and sustainment of EBPs, particularly through the passage of the Patient Protection and Affordable Care Act, there is a critical need for empirical research that can inform system-level implementation efforts. The purpose of this special issue is to present a set of articles that can inform system-level implementation research and practice by suggesting how contexts and strategies can be leveraged to promote implementation and quality improvement in behavioral health, and to encourage dialogue and further empirical research in this area.

Exploration, Preparation, Implementation and Sustainment (EPIS) Framework

This special issue was informed by conceptual models that emphasize the contextual factors surrounding the implementation of innovations (e.g., Aarons et al. 2011; Damschroder et al. 2009; Raghavan et al. 2008). Given its focus on implementation in public service sectors, we used the Exploration, Preparation, Implementation, and Sustainment (EPIS) framework (see Fig. 1) developed by Aarons et al. (2011) to organize the contributions made by each manuscript. The EPIS framework highlights both processes and determinants of implementation (Nilsen 2015) by outlining four phases of the implementation process (EPIS) and identifying domains that are important to implementation including the inner context (e.g., organizational and therapist characteristics), outer context (e.g., the service environment, inter-organizational environment, and consumer support), and fit between the EBP being implemented and the inner and outer contexts. Within each phase, contextual variables relevant to the inner or outer context are posited. For example, during implementation, inner context variables hypothesized to be important to the implementation process include organizational culture (i.e., “the behavioral expectations and norms that characterize the way work is done in an organization;” Glisson et al. 2006, p. 858), organizational climate (i.e., the shared employee perception of the psychological impact of the work environment on their well-being; Glisson et al. 2008), and clinician attitudes toward EBPs. Outer context variables include funding, engagement with treatment developers, and leadership. A growing body of empirical work supports the importance of the contextual factors specified in the EPIS framework (e.g., Aarons et al. 2015; Beidas et al. 2015, 2016; Cook et al. 2015; Glisson and Williams 2015; Isett et al. 2007).
We will organize our discussion of the manuscripts in this special issue according to the phase(s) of the EPIS they best map onto, though some manuscripts span multiple phases.
Fig. 1

The Exploration, Preparation, Implementation, and Sustainment (EPIS) framework.

Note. Reproduced from Aarons et al. (2011)

Brief Summary of Articles

This special issue includes 12 articles, many of which draw upon system-level implementation efforts that serve as natural laboratories for the study of implementation processes and outcomes. We discuss the contributions of each of these articles relative to broad methodological issues and each of the four phases of the EPIS framework. The special issue concludes with a commentary written by leaders of a large public behavioral health system that highlights their perspectives on system-level implementation and needed areas for research inquiry.

Methodological Contributions

The special issue includes a number of manuscripts that focus on emergent methodologies that are relevant to studying system-level implementation across the four EPIS phases. Zimmerman and colleagues (this issue) note that implementation is too often driven by a trial-and-error approach, and demonstrate that participatory system dynamics modeling can be a powerful tool to improve the implementation planning process. Participatory system dynamics (Hovmand 2014) is a way of triangulating stakeholder expertise, data, and model simulations. It offers the opportunity to compare different implementation plans before starting a change effort, potentially saving the time and money associated with ill-conceived implementation plans. While this method has been suggested as a potentially useful means of selecting and tailoring implementation strategies (Powell et al. 2015), Zimmerman and colleagues (this issue) break new ground as they apply participatory system dynamics to address the challenge of implementing better care for Posttraumatic Stress Disorder in the U.S. Department of Veterans Affairs mental health system. Walker and colleagues (this issue) take on the task of determining where systems should invest in EBP implementation by applying a Geographic Information Systems approach in Washington State. They identify need, service availability, and “service deserts,” in which the demand for services far outweighs delivery capacity. This compelling example illustrates how systems might allocate resources and implementation supports in a more targeted way.

Exploration Phase

During the exploration phase, organizations and individuals are aware of an area for growth in their organization and that there may be a more advantageous approach to service delivery (Aarons et al. 2011). Inner context factors such as an organization’s absorptive capacity (i.e., “organizational routines and processes by which firms acquire, assimilate, transform, and exploit knowledge to produce a dynamic organizational capability;” Zahra and George 2002), readiness to change (Weiner 2009), and the organizational context (e.g., organizational culture and climate; Glisson et al. 2008) are especially pertinent during the exploration phase (Aarons et al. 2011). Particularly relevant to this special issue, outer context factors driving awareness of alternative service delivery approaches include system-level legislation or mandates around EBPs (e.g., Trupin and Kerns 2015), funding supporting EBPs (e.g., Hoagwood et al. 2014), client advocacy groups calling for the use of EBPs (Birkel et al. 2003), and interorganizational networks (e.g., direct networking with other organizations implementing EBPs; Bunger et al. 2014; Hurlburt et al. 2014). These inner and outer context factors may propel an organization to consider alternative service delivery approaches.

A number of articles in this special issue can inform the exploration phase. The aforementioned article by Walker and colleagues (this issue) addresses this phase most explicitly by showing how Geographic Information Systems mapping can be useful in exploring high priority community targets for implementation investments. Kotte and colleagues (this issue) discuss the importance of engaging partners that span inner and outer contexts, and provide a useful example of how a longstanding partnership between the State of Hawaii and the University of Hawaii led to the exploration of more effective ways of implementing a measurement feedback system (Bickman 2008) in child and adolescent mental health services. Saldana and colleagues (this issue) discuss a partnership in which the New York City child welfare system engaged treatment developers to develop and pilot an innovative approach to improving the interactions between supervisors, caseworkers, and caregivers. A careful consideration of existing EBPs led to the conclusion that a new approach would need to be developed that could more feasibly be taken to scale across an entire child welfare system. Thus, Saldana and colleagues (this issue) developed the R3 model, which combines elements of EBPs with a supervision approach so that the elements could be more seamlessly integrated into the child welfare system. The partnership essentially ensured that the characteristics of the EBP would be carefully considered and that the intervention would be “designed for dissemination” (Brownson et al. 2013). Beidas and colleagues (this issue) reported several concerns related to the exploration process from the perspective of system leaders, treatment developers, and organizational leaders in Philadelphia. One shared concern was that there was not always a thorough assessment of the fit between particular EBPs and the organizations in which they were implemented prior to initiating implementation. Powell et al. (this issue) detail how Philadelphia’s system leaders have responded to this concern and attempted to explore the fit between EBPs and organizations prior to implementation by piloting a new method of contracting with agencies based upon the Getting to Outcomes framework (Chinman et al. 2004; Community Behavioral Health 2015). This systematic approach has promise for improving communication between system and organizational stakeholders, clarifying expectations up-front, and increasing the chances that the EBP(s) implemented will be acceptable, appropriate, and feasible. Collectively, these articles speak to the importance of partnership in implementation research (Chambers and Azrin 2013), as well as deliberately considering the fit between the EBP(s) and the inner and outer contexts, particularly in the exploration phase.

Preparation Phase

The adoption and preparation phase is characterized by the initial decision to implement an EBP (Aarons et al. 2011; Rogers 2003). This phase can be influenced by inner context factors such as organizational characteristics, organizational structures, and leadership (Aarons et al. 2011). For example, larger organizations are more likely to adopt innovations (Damanpour 1991), and transformational leadership (i.e., leadership that is characterized by charisma, inspiration, intellectual stimulation, and consideration of individual staff members’ interests; Avolio et al. 1999) has been shown to be associated with more positive innovation climates (Aarons and Sommerfeld 2012). Outer context factors that may influence the adoption phase include legislation (e.g., definitions of evidence), funding (e.g., systems providing support for the adoption of a particular EBP), client advocacy (e.g., lawsuits such as the Felix Consent Decree; Chorpita and Donkervoet 2005), and interorganizational networks (e.g., networking with other agencies within a system implementing EBPs).

Although not explicitly studied in any of the manuscripts, several of the studies shed light on the adoption decision and preparation phase. Beidas and colleagues (this issue) provide insights from system leaders, treatment developers, and organizational leaders about the factors that motivate and facilitate the adoption of EBPs; a number of stakeholders reported that they adopted EBPs because doing so was consistent with their mission to provide the best and most effective care. Kotte and colleagues (this issue) prepared for scaling up a measurement feedback system by piloting it with a smaller sample of care coordinators and conducting focus groups to assess barriers and facilitators to implementation. This helped them to optimize the measurement feedback system in preparation for adopting it statewide. Finally, Zimmerman et al. (this issue) show how participatory system dynamics can be useful in the preparation phase by simulating different implementation plans prior to actual implementation. Similar approaches could also be used to simulate the impact of other assessment or intervention approaches (e.g., Lyon et al. 2015).

Implementation Phase

During the implementation phase, organizations and individuals are actively engaging in the steps necessary to implement an EBP in their setting. Aarons et al. (2011) suggest several salient inner context factors at this phase, including organizational readiness for change (Weiner 2009), organizational culture and climate (Glisson et al. 2008), and innovation-values fit (Klein and Sorra 1996). A host of outer context factors are also relevant, including funding (e.g., contracts and payment for training in EBPs), interorganizational networks (e.g., information sharing across organizations within a system), the role of intervention developers (e.g., engaging with organizations implementing EBPs), and leadership (e.g., effective leadership practices for system leaders; Aarons et al. 2011).

A number of the manuscripts focus on describing implementation through naturalistic observations of ongoing system-level efforts to implement EBPs. Beidas and colleagues (this issue) present barriers and facilitators to the implementation of four EBPs in the City of Philadelphia. The three stakeholder groups converged on the importance of inner (e.g., agency competing demands) and outer context factors (e.g., funding) as barriers and facilitators to implementation. Kotte and colleagues (this issue) and Ross and colleagues (this issue) examine facilitators and barriers to the implementation of a measurement feedback system in the state of Hawaii (Kotte) and Veterans Affairs Canada (Ross). Both of these articles identify characteristics of the innovation as an important factor in successful implementation (Aarons et al. 2011; Rogers 2003). Olin and colleagues (this issue) investigate outer context and inner context factors that predicted clinician dropout from a statewide training on an EBP in the State of New York, finding that younger clinicians and those who practiced in upstate-rural areas were less likely to drop out from training in an EBP. Finally, Rosen and colleagues (this issue) present a review of research focusing on the implementation of two EBPs for PTSD in the U.S. Department of Veterans Affairs. The article serves as one model of how we can learn across multiple implementation efforts that focus on specific EBPs and settings.

Several articles go beyond naturalistic descriptions of the implementation process and describe the implementation strategies used in system-level implementation. Powell and colleagues (this issue) describe the approach taken by the City of Philadelphia to implement four EBPs, using the policy ecology framework (Raghavan et al. 2008) to emphasize the multi-level implementation strategies used. Saldana and colleagues (this issue) leverage a system-initiated effort to develop a supervisor-focused implementation approach and evaluate its feasibility in child welfare in the State of New York. Similarly, Nadeem and colleagues (this issue) present findings from an effort to develop and test a theory-based learning collaborative model as an implementation strategy in children’s behavioral health in the State of New York. Although preliminary, these pilot studies provide exemplars for how to partner with communities to develop interventions and implementation strategies that are acceptable, appropriate, and feasible in large behavioral health systems.

Sustainment Phase

The sustainment phase involves maintaining the use of EBPs so that they come to represent “treatment as usual” (Aarons et al. 2011). Little is known about the sustainment of EBPs (Wiltsey Stirman et al. 2012), despite the fact that many implementation science models include sustainment as a key consideration in the implementation process (Tabak et al. 2012). Aarons et al. (2011) suggest several inner context factors hypothesized to be important in the sustainment phase, including leadership, organizational culture, a critical mass of therapists using the EBP (Powell et al. 2013), ongoing fidelity monitoring and support, and adequate staffing. Outer context factors potentially impacting sustainment include leadership at the service system level, legislation that supports sustainment of EBPs, continued funding following the initial implementation investment, and public-academic collaborations that can support the continued process of maintaining EBPs in public settings (Aarons et al. 2011). Aarons and colleagues (this issue) use mixed methods to investigate the impact of leadership on the sustainment of EBPs, providing empirical support for their conceptual model. Their study suggests that sustainment was associated with leadership as hypothesized by the EPIS model. Brookman-Frazee and colleagues (this issue) use administrative data to characterize the sustainment of EBPs implemented following a fiscal mandate in Los Angeles County over 6 years. This study makes an important contribution by focusing on sustainment of multiple EBPs, whereas most studies focus on the implementation and/or sustainment of a single EBP (Chambers 2012).

Emerging Themes

Several themes are highlighted in the articles in this special issue. First and foremost, we note the practical relevance of these articles for informing system leaders and policy makers. The included studies demonstrate a balance of rigor and relevance, and answer questions about where investments in EBPs should be made (Walker et al. this issue), which approaches most improve EBP timing and reach (Zimmerman et al. this issue), and how administrative data can be used to study the penetration and sustainment of multiple EBPs in a large behavioral health system (Brookman-Frazee et al. this issue). The studies also demonstrate laudable attempts to develop and test implementation strategies that are feasible, sustainable, and scalable in large systems (Nadeem et al. this issue; Saldana et al. this issue). The rich and actionable research questions posed in these studies are made possible by highly partnered efforts between system and organizational leaders, direct care providers, consumers of behavioral health services, treatment developers, and academics. Each of these stakeholder groups has important perspectives that need to be brought to bear in system-level implementation (Chambers and Azrin 2013). These studies testify to both the power of partnerships and the challenges and opportunities associated with aligning the visions and efforts of diverse stakeholders (Beidas et al. this issue; Powell et al. this issue).

Many of the included studies leveraged ongoing system-level implementation efforts as natural laboratories to study implementation determinants, processes, and outcomes. This is a pragmatic necessity given the cost of large-scale efforts; implementation researchers often need to take advantage of opportunities to study the implementation of services delivered outside of the context of well-controlled studies. Observational studies are both expected and encouraged (Wensing et al. 2005). However, developing, refining, and testing implementation strategies is a federal priority (Institute of Medicine 2009a, b; National Institutes of Health 2013), and we call for more studies that build the evidence base for multifaceted, multilevel, and tailored implementation strategies in behavioral health systems (Powell et al. 2015a, b; Weiner et al. 2012). Two articles described the prospective development and testing of implementation strategies (Nadeem et al. this issue; Saldana et al. this issue), and appropriate for their developmental stage, both were pilot studies. Future studies should test innovative implementation strategies using the most rigorous and pragmatic designs possible (Brown et al. In Press).

A number of articles demonstrate the relevance of methods from other fields (e.g., participatory system dynamics modeling and geographic information systems) that can be leveraged to understand large systems and potentially make implementation more efficient (e.g., Walker et al. this issue; Zimmerman et al. this issue). Systems science methods appear to be particularly relevant to the study of implementation in large systems (Burke et al. 2015), though we encourage researchers to continue to draw from diverse theoretical and empirical traditions as appropriate.

The use of implementation frameworks such as the EPIS framework (Aarons et al. 2011) and the policy ecology framework (Raghavan et al. 2008) is tremendously helpful when considering system-level implementation. The EPIS framework provides a manageable scope for the special issue, grounds the work in the extant literature, and provides a common language to use across studies. Frameworks were also used effectively within the context of individual studies to provide insight into various implementation phases (e.g., Kotte et al. this issue) and provide conceptual framing for the use of multi-level implementation strategies (Powell et al. this issue). Implementation theories and frameworks serve multiple purposes (see Nilsen 2015); however, they have been underutilized in implementation research generally (Colquhoun et al. 2013; Davies et al. 2010) and in behavioral health (Powell et al. 2014). We urge researchers to use theories and frameworks to guide the planning, conduct, and reporting of their research in order to work toward more generalizable knowledge in the field of implementation. Utilizing theories and frameworks from other fields such as organizational behavior (Denis and Lehoux 2009; Weiner 2009) and policy implementation research (Nilsen et al. 2013) may generate new insights and supplement the growing number of theories (Grol et al. 2007) and frameworks (Tabak et al. 2012) in implementation science.

The power of mixed methods and multiple types of research participants was also evidenced in a number of articles. For example, Zimmerman and colleagues (this issue) used a participatory system dynamics method that is inherently mixed methods (Hovmand 2014), and exemplified how the approach can generate both buy-in and a nuanced understanding of implementation challenges. Aarons and colleagues (this issue) show how mixed methods can be used to determine how quantitative and qualitative results converge as well as how qualitative findings can expand upon quantitative findings (Palinkas et al. 2011). The use of multiple types of respondents is also critical, as perceptions of stakeholders often differ. Beidas and colleagues’ (this issue) study is innovative in its integration of system leader, treatment developer, and agency director perceptions of barriers and facilitators to implementing multiple EBPs across a large behavioral health system. Their perceptions converged and diverged in various ways, indicating that failing to account for different types of stakeholders may be a recipe for failure. This special issue concludes with an important commentary from Rubin and colleagues (this issue). The authors are system leaders affiliated with Philadelphia’s Department of Behavioral Health and Intellectual disAbility Services, and their perspective is important for implementation researchers to consider as they embark upon system-level implementation research.

Finally, while the articles in this special issue spanned all phases of the EPIS framework, the majority most explicitly focused on the implementation phase, with less emphasis on exploration, preparation, and sustainment. This is consistent with the state of the literature in 2011 when the EPIS framework was published, and indicates a need for more research on those phases. Specifically, there is a need to better understand the types of implementation determinants that are most salient at each stage, how these determinants can be measured and assessed in a pragmatic way, and whether some implementation strategies are more appropriate or have differential effects depending upon the phase of implementation. Answering these questions will require clear reporting in the published literature, as implementation determinants and strategies will need to be described in sufficient detail so that other systems and researchers can replicate implementation approaches (Albrecht et al. 2013; Neta et al. 2015; Proctor et al. 2013). It will also require more consistent tracking of implementation and clinical outcomes, which remains a significant stumbling block for the field, particularly when taken to scale in large behavioral health systems.

Conclusion

This special issue highlights some of the exciting work being done to improve the quality of behavioral health services in large systems, and suggests several areas for improvement. We hope that it sparks dialogue and ongoing conceptual and empirical work that will contribute to a better understanding of: (1) the determinants of implementation effectiveness in large systems, (2) the processes and strategies that need to be applied to ensure that stakeholders have the support they need to deliver effective services, and (3) the best ways of capturing the implementation, service system, and clinical outcomes that are most meaningful to individuals with behavioral health disorders.

Notes

Acknowledgments

This work was partially supported by the National Institute of Mental Health (NIMH) through L30 MH108060 to BJP and K23 MH099179 to RSB. We also acknowledge support from the NIMH-funded Implementation Research Institute (R25 MH080916; BJP is a current fellow, and RSB is a former fellow) and the Dissemination and Implementation Methods Core at the North Carolina Translational & Clinical Sciences Institute (UL1 TR001111).

Compliance with Ethical Standards

Conflict of Interest

BJP declares that he has no conflict of interest. RSB declares that she has no conflict of interest.

Ethical Approval

This article does not contain any studies with human participants performed by any of the authors.

References

  1. Aarons, G. A., Ehrhart, M. G., & Farahnak, L. R. (2014). Aligning leadership across systems and organizations to develop a strategic climate for evidence-based practice implementation. Annual Review of Public Health, 35, 255–274. doi: 10.1146/annurev-publhealth-032013-182447.CrossRefPubMedPubMedCentralGoogle Scholar
  2. Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Hurlburt, M. S. (2015). Leadership and organizational change for implementation (LOCI): A randomized mixed method pilot study of a leadership and organization development intervention for evidence-based practice implementation. Implementation Science, 10(11), 1–12. doi: 10.1186/s13012-014-0192-y.Google Scholar
  3. Aarons, G. A., Green, A. E., Trott, E., Willging, C. E., Ehrhart, M. G., & Roesch, S. C. (this issue). The role of leadership at multiple levels in sustaining system-wide implementation: A mixed-method study. Administration and Policy in Mental Health and Mental Health Services Research.  10.1007/s10488-016-0751-4
  4. Aarons, G. A., Horowitz, J. D., Dlugosz, L. R., & Ehrhart, M. G. (2012). The role of organizational processes in dissemination and implementation research. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. 128–153). New York: Oxford University Press.CrossRefGoogle Scholar
  5. Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38, 4–23. doi: 10.1007/s10488-010-0327-7.CrossRefPubMedGoogle Scholar
  6. Aarons, G. A., & Sommerfeld, D. H. (2012). Leadership, innovation climate, and attitudes toward evidence-based practice during a statewide implementation. Journal of the American Academy of Child and Adolescent Psychiatry, 51(4), 423–431. doi: 10.1016/j.jaac.2012.01.018.CrossRefPubMedGoogle Scholar
  7. Albrecht, L., Archibald, M., Arseneau, D., & Scott, S. D. (2013). Development of a checklist to assess the quality of reporting of knowledge translation interventions using the workgroup for intervention development and evaluation research (WIDER) recommendations. Implementation Science, 8(52), 1–5. doi: 10.1186/1748-5908-8-52.Google Scholar
  8. Avolio, B. J., Bass, B. M., & Jung, D. I. (1999). Re-examining the components of transformational and transactional leadership using the multifactor leadership questionnaire. Journal of Occupational and Organizational Psychology, 72, 441–462.CrossRefGoogle Scholar
  9. Beidas, R. S., Marcus, S., Aarons, G. A., Hoagwood, K. E., Schoenwald, S., Evans, A. C., et al. (2015). Predictors of community therapists’ use of therapy techniques in a large public mental health system. JAMA Pediatrics, 169(4), 374–382. doi: 10.1001/jamapediatrics.2014.3736.CrossRefPubMedPubMedCentralGoogle Scholar
  10. Beidas, R. S., Marcus, S., Benjamin Wolk, C., Powell, B. J., Aarons, G. A., Evans, A. C., et al. (2016). A prospective examination of clinician and supervisor turnover within the context of implementation of evidence-based practices in a publicly-funded mental health system. Administration and Policy in Mental Health and Mental Health Services Research, 43, 640–649. doi: 10.1007/s10488-015-0673-6.CrossRefPubMedGoogle Scholar
  11. Beidas, R. S., Stewart, R. E., Adams, D. R., Fernandez, T., Lustbader, S., Powell, B. J., Aarons, G. A., Hoagwood, K. E., Evans, A. C., Hurford, M. O., Rubin, R., Hadley, T., Mandell, D. S., & Barg, F. K. (this issue). A multi-level examination of stakeholder perspectives of implementation of evidence-based practices in a large urban publicly-funded mental health system. Administration and Policy in Mental Health and Mental Health Services Research.  10.1007/s10488-015-0705-2
  12. Bickman, L. (2008). A measurement feedback system (MFS) is necessary to improve mental health outcomes. Journal of the American Academy of Child and Adolescent Psychiatry, 47(10), 1114–1119. doi: 10.1097/CHI.0b013e3181825af8.
  13. Birkel, R. C., Hall, L. L., Lane, T., Cohan, K., & Miller, J. (2003). Consumers and families as partners in implementing evidence-based practice. Psychiatric Clinics of North America, 26, 867–881.
  14. Brookman-Frazee, L., Stadnick, N., Roesch, S., Regan, J., Barnett, M., Bando, L., & Lau, A. (this issue). Measuring sustainment of multiple practices fiscally mandated in children’s mental health services. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0731-8.
  15. Brown, C. H., Curran, G., Palinkas, L. A., Aarons, G. A., Wells, K. B., Jones, L., & Cruden, G. (In Press). An overview of research and evaluation designs for dissemination and implementation. Annual Review of Public Health. doi: 10.18131/G35W23.
  16. Brownson, R. C., Jacobs, J. A., Tabak, R. G., Hoehner, C. M., & Stamatakis, K. A. (2013). Designing for dissemination among public health researchers: Findings from a national survey in the United States. American Journal of Public Health, 103(9), 1693–1699. doi: 10.2105/AJPH.2012.301165.
  17. Bunger, A. C., Collins-Camargo, C., McBeath, B., Chuang, E., Perez-Jolles, M., & Wells, R. (2014). Collaboration, competition, and co-opetition: Interorganizational dynamics between private child welfare agencies and child serving sectors. Children and Youth Services Review, 38, 113–122. doi: 10.1016/j.childyouth.2014.01.017.
  18. Burke, J. G., Lich, K. H., Neal, J. W., Meissner, H. I., Yonas, M., & Mabry, P. L. (2015). Enhancing dissemination and implementation research using systems science methods. International Journal of Behavioral Medicine, 22(3), 283–291. doi: 10.1007/s12529-014-9417-3.
  19. Chambers, D. A. (2012). Foreword. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (pp. vii–x). New York: Oxford University Press.
  20. Chambers, D. A., & Azrin, S. T. (2013). Partnership: A fundamental component of dissemination and implementation research. Psychiatric Services, 64(16), 509–511. doi: 10.1176/appi.ps.201300032.
  21. Chinman, M., Imm, P., & Wandersman, A. (2004). Getting to outcomes 2004: Promoting accountability through methods and tools for planning, implementation, and evaluation. Santa Monica: Rand Health.
  22. Chorpita, B. F., & Donkervoet, C. (2005). Implementation of the Felix consent decree in Hawaii: The impact of policy and practice development efforts on service delivery. In R. G. Steele & M. Roberts (Eds.), Handbook of mental health services for children, adolescents, and families (pp. 317–332). New York: Kluwer Academic/Plenum Publishers.
  23. Colquhoun, H. L., Brehaut, J. C., Sales, A., Ivers, N., Grimshaw, J., Michie, S., et al. (2013). A systematic review of the use of theory in randomized controlled trials of audit and feedback. Implementation Science, 8(66), 1–8. doi: 10.1186/1748-5908-8-66.
  24. Community Behavioral Health. (2015). Request for proposals for substance use adult partial hospitalization services. Retrieved from http://www.dbhids.us/assets/Forms–Documents/CBH/RFP-Substance-Use-Adult-Partial-Hospitalization.pdf
  25. Cook, J. M., Dinnen, S., Thompson, R., Ruzek, J., Coyne, J. C., & Schnurr, P. P. (2015). A quantitative test of an implementation framework in 38 VA residential PTSD programs. Administration and Policy in Mental Health and Mental Health Services Research, 42(4), 462–473. doi: 10.1007/s10488-014-0590-0.
  26. Damanpour, F. (1991). Organizational innovation: A meta-analysis of effects of determinants and moderators. The Academy of Management Journal, 34(3), 555–590. doi: 10.2307/256406.
  27. Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(50), 1–15.
  28. Davies, P., Walker, A. E., & Grimshaw, J. M. (2010). A systematic review of the use of theory in the design of guideline dissemination and implementation strategies and interpretation of the results of rigorous evaluations. Implementation Science, 5(14), 1–6. doi: 10.1186/1748-5908-5-14.
  29. Denis, J.-L., & Lehoux, P. (2009). Organizational theory. In S. Straus, J. Tetroe, & I. D. Graham (Eds.), Knowledge translation in health care: Moving from evidence to practice (pp. 215–225). Hoboken: Wiley-Blackwell.
  30. Flottorp, S. A., Oxman, A. D., Krause, J., Musila, N. R., Wensing, M., Godycki-Cwirko, M., et al. (2013). A checklist for identifying determinants of practice: A systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable improvements in healthcare professional practice. Implementation Science, 8(35), 1–11. doi: 10.1186/1748-5908-8-35.
  31. Glisson, C., Dukes, D., & Green, P. (2006). The effects of the ARC organizational intervention on caseworker turnover, climate, and culture in children’s service systems. Child Abuse & Neglect, 30(8), 855–880. doi: 10.1016/j.chiabu.2005.12.010.
  32. Glisson, C., Landsverk, J., Schoenwald, S., Kelleher, K., Hoagwood, K. E., Mayberg, S., et al. (2008). Assessing the organizational social context (OSC) of mental health services: Implications for research and practice. Administration and Policy in Mental Health and Mental Health Services Research, 35(1-2), 98–113. doi: 10.1007/s10488-007-0148-5.
  33. Glisson, C., & Williams, N. J. (2015). Assessing and changing organizational social contexts for effective mental health services. Annual Review of Public Health, 36, 507–523. doi: 10.1146/annurev-publhealth-031914-122435.
  34. Grol, R., Bosch, M. C., Hulscher, M. E. J., Eccles, M. P., & Wensing, M. (2007). Planning and studying improvement in patient care: The use of theoretical perspectives. The Milbank Quarterly, 85(1), 93–138. doi: 10.1111/j.1468-0009.2007.00478.x.
  35. Hoagwood, K. E., Olin, S. S., Horwitz, S., McKay, M., Cleek, A., Gleacher, A., et al. (2014). Scaling up evidence-based practices for children and families in New York state: Toward evidence-based policies on implementation for state mental health systems. Journal of Clinical Child and Adolescent Psychology, 43(2), 145–157. doi: 10.1080/15374416.2013.869749.
  36. Hovmand, P. S. (2014). Community based system dynamics. New York: Springer.
  37. Hurlburt, M., Aarons, G. A., Fettes, D., Willging, C., Gunderson, L., & Chaffin, M. J. (2014). Interagency collaborative team model for capacity building to scale-up evidence-based practice. Children and Youth Services Review, 39, 160–168. doi: 10.1016/j.childyouth.2013.10.005.
  38. Institute of Medicine. (2009a). Initial national priorities for comparative effectiveness research. Washington, DC: The National Academies Press.
  39. Institute of Medicine. (2009b). Preventing mental, emotional, and behavioral disorders among young people: Progress and possibilities. Washington, DC: National Academies Press.
  40. Isett, K. R., Burnam, M. A., Coleman-Beattie, B., Hyde, P. S., Morrissey, J. P., Magnabosco, J., et al. (2007). The state policy context of implementation issues for evidence-based practices in mental health. Psychiatric Services, 58(7), 914–921. doi: 10.1176/appi.ps.58.7.914.
  41. Klein, K. J., & Sorra, J. S. (1996). The challenge of innovation implementation. Academy of Management Review, 21(4), 1055–1080. doi: 10.5465/AMR.1996.9704071863.
  42. Kotte, A., Hill, K. A., Mah, A. C., Korathu-Larson, P. A., Au, J. R., Izmirian, S., Keir, S. S., Nakamura, B. J., & Higa-McMillan, C. K. (this issue). Facilitators and barriers of implementing a measurement feedback system in public youth mental health. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0729-2.
  43. Lyon, A. R., Maras, M. A., Pate, C. M., Igusa, T., & Vander Stoep, A. (2015). Modeling the impact of school-based universal depression screening on additional service capacity needs: A system dynamics approach. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-015-0628-y.
  44. Nadeem, E., Weiss, D., Olin, S. S., Hoagwood, K. E., & Horwitz, S. M. (this issue). Using a theory-guided learning collaborative model to improve implementation of EBPs in a state children’s mental health system: A pilot study. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0735-4.
  45. National Institutes of Health. (2013). Dissemination and implementation research in health (R01). Retrieved January 30, 2013, from http://grants.nih.gov/grants/guide/pa-files/PAR-13-055.html
  46. Neta, G., Glasgow, R. E., Carpenter, C. R., Grimshaw, J. M., Rabin, B. A., Fernandez, M. E., et al. (2015). A framework for enhancing the value of research for dissemination and implementation. American Journal of Public Health, 105(1), 49–57. doi: 10.2105/AJPH.2014.302206.
  47. Nilsen, P. (2015). Making sense of implementation theories, models and frameworks. Implementation Science, 10(53), 1–13. doi: 10.1186/s13012-015-0242-0.
  48. Nilsen, P., Stahl, C., Roback, K., & Cairney, P. (2013). Never the twain shall meet?: A comparison of implementation science and policy implementation research. Implementation Science, 8(63), 1–12. doi: 10.1186/1748-5908-8-63.
  49. Olin, S. S., Nadeem, E., Gleacher, A., Weaver, J., Weiss, D., Hoagwood, K. E., & Horwitz, S. M. (this issue). What predicts clinician dropout from state-sponsored Managing and Adapting Practice training. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-015-0709-y.
  50. Palinkas, L. A., Aarons, G. A., Horwitz, S., Chamberlain, P., Hurlburt, M., & Landsverk, J. (2011). Mixed methods designs in implementation research. Administration and Policy in Mental Health and Mental Health Services Research, 38, 44–53. doi: 10.1007/s10488-010-0314-z.
  51. Powell, B. J., Beidas, R. S., Rubin, R. M., Stewart, R. E., Benjamin Wolk, C., Matlin, S. L., Weaver, S., Hurford, M. O., Evans, A. C., Hadley, T. R., & Mandell, D. S. (this issue). Applying the policy ecology framework to Philadelphia’s behavioral health transformation efforts. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0733-6.
  52. Powell, B. J., Beidas, R. S., Lewis, C. C., Aarons, G. A., McMillen, J. C., Proctor, E. K., et al. (2015a). Methods to improve the selection and tailoring of implementation strategies. Journal of Behavioral Health Services & Research. doi: 10.1007/s11414-015-9475-6.
  53. Powell, B. J., Hausmann-Stabile, C., & McMillen, J. C. (2013). Mental health clinicians’ experiences of implementing evidence-based treatment. Journal of Evidence-Based Social Work, 10(5), 396–409. doi: 10.1080/15433714.2012.664062.
  54. Powell, B. J., Proctor, E. K., & Glass, J. E. (2014). A systematic review of strategies for implementing empirically supported mental health interventions. Research on Social Work Practice, 24(2), 192–212. doi: 10.1177/1049731513505778.
  55. Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith, J. L., Matthieu, M. M., et al. (2015b). A refined compilation of implementation strategies: Results from the expert recommendations for implementing change (ERIC) project. Implementation Science, 10(21), 1–14. doi: 10.1186/s13012-015-0209-1.
  56. Proctor, E. K., Powell, B. J., & McMillen, J. C. (2013). Implementation strategies: Recommendations for specifying and reporting. Implementation Science, 8(139), 1–11.
  57. Raghavan, R., Bright, C. L., & Shadoin, A. L. (2008). Toward a policy ecology of implementation of evidence-based practices in public mental health settings. Implementation Science, 3(26), 1–9. doi: 10.1186/1748-5908-3-26.
  58. Rogers, E. M. (2003). Diffusion of Innovations (5th ed.). New York: Free Press.
  59. Rosen, C. S., Matthieu, M., Stirman, S. W., Cook, J. M., Landes, S. J., Bernardy, N. C., Chard, K. M., Crowley, J., Eftekhari, A., Finley, E. P., Hamblen, J. L., Harik, J. M., Kehle-Forbes, S. M., Meis, L. A., Osei-Bonsu, P. E., Rodriguez, A. L., Ruggiero, K. J., Ruzek, J. I., Smith, B. N., Trent, L., & Watts, B. V. (this issue). A review of studies on the system-wide implementation of evidence-based psychotherapies for posttraumatic stress disorder in the Veterans Health Administration. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0755-0.
  60. Ross, D. F., Ionita, G., & Stirman, S. W. (this issue). System-wide implementation of routine outcome monitoring and measurement feedback system in a national network of operational stress injury clinics. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0749-y.
  61. Rubin, R. M., Hurford, M. O., Hadley, T., Matlin, S., Weaver, S., & Evans, A. C. (this issue). Synchronizing watches: The challenge of aligning implementation science and public systems. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0759-9.
  62. Saldana, L., Chamberlain, P., & Chapman, J. (this issue). A supervisor-targeted implementation approach to promote system change: The R3 Model. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0730-9.
  63. Tabak, R. G., Khoong, E. C., Chambers, D. A., & Brownson, R. C. (2012). Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine, 43(3), 337–350. doi: 10.1016/j.amepre.2012.05.024.
  64. Trupin, E., & Kerns, S. (2015). Introduction to the special issue: Legislation related to children’s evidence-based practice. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-015-0666-5.
  65. Walker, S. C., Hurvitz, P., Leith, J., Rodriguez, F. I., & Endler, G. C. (this issue). Evidence-based program service deserts: A geographic information systems (GIS) approach to identifying service gaps for state-level implementation planning. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0743-4.
  66. Weiner, B. J. (2009). A theory of organizational readiness for change. Implementation Science, 4(67), 1–9. doi: 10.1186/1748-5908-4-67.
  67. Weiner, B. J., Lewis, M. A., Clauser, S. B., & Stitzenberg, K. B. (2012). In search of synergy: Strategies for combining interventions at multiple levels. JNCI Monographs, 44, 34–41. doi: 10.1093/jncimonographs/lgs001.
  68. Weiner, B. J., Lewis, M. A., & Linnan, L. A. (2009). Using organization theory to understand the determinants of effective implementation of worksite health promotion programs. Health Education Research, 24(2), 292–305. doi: 10.1093/her/cyn019.
  69. Wensing, M., Eccles, M., & Grol, R. (2005). Observational evaluations of implementation strategies. In R. Grol, M. Wensing, & M. Eccles (Eds.), Improving patient care: The implementation of change in clinical practice (pp. 248–255). Edinburgh: Elsevier.
  70. Wiltsey Stirman, S., Kimberly, J., Cook, N., Calloway, A., Castro, F., & Charns, M. (2012). The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science, 7(17), 1–19. doi: 10.1186/1748-5908-7-17.
  71. Zahra, S. A., & George, G. (2002). Absorptive capacity: A review, reconceptualization and extension. Academy of Management Review, 27(2), 185–203.
  72. Zimmerman, L., Lounsbury, D., Rosen, C., Kimerling, R., Trafton, J., & Lindley, S. (this issue). Participatory system dynamics modeling: Increasing engagement and precision to improve implementation planning in systems. Administration and Policy in Mental Health and Mental Health Services Research. doi: 10.1007/s10488-016-0754-1

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, USA
  2. Center for Mental Health Policy and Services Research, Department of Psychiatry, University of Pennsylvania Perelman School of Medicine, Philadelphia, USA