Prevention Science, Volume 20, Issue 8, pp 1200–1210

Influence of an Implementation Support Intervention on Barriers and Facilitators to Delivery of a Substance Use Prevention Program

  • Jill S. Cannon
  • Marylou Gilbert
  • Patricia Ebener
  • Patrick S. Malone
  • Caitlin M. Reardon
  • Joie Acosta
  • Matthew Chinman

Abstract

Implementation support interventions have helped organizations implement programs with quality and obtain intended outcomes. For example, a recent randomized controlled trial called Preparing to Run Effective Programs (PREP) showed that an implementation support intervention called Getting To Outcomes (GTO) improved implementation of an evidence-based substance use prevention program (CHOICE) run in community-based settings. However, more information is needed on how these interventions affect organizational barriers and facilitators of implementation. This paper aims to identify differences in implementation facilitators and barriers between sites conducting a substance use prevention program with and without GTO. PREP is a cluster-randomized controlled trial testing GTO, a two-year implementation support intervention, in Boys & Girls Clubs. The trial compares 15 Boys & Girls Club sites implementing CHOICE, a five-session evidence-based alcohol and drug prevention program (control group), with 14 Boys & Girls Club sites implementing CHOICE supported by GTO (intervention group). All sites received CHOICE training. Intervention sites also received GTO manuals, training, and onsite technical assistance to help practitioners complete implementation best practices specified by GTO (i.e., GTO steps). During the first year, technical assistance providers helped the intervention group adopt, plan, and deliver CHOICE, and then evaluate and make quality improvements to CHOICE implementation using feedback reports summarizing their data. Following the second year of CHOICE and GTO implementation, all sites participated in semi-structured interviews to identify barriers and facilitators to CHOICE implementation using the Consolidated Framework for Implementation Research (CFIR). This paper assesses the extent to which these facilitators and barriers differed between the intervention and control groups. Intervention sites had significantly higher average ratings than control sites for two constructs from the CFIR process domain: planning, and reflecting and evaluating. At the same time, intervention sites had significantly lower ratings on the culture and available resources constructs. Findings suggest that strong planning, evaluation, and reflection, likely improved with GTO support, can facilitate implementation even in the face of perceptions of a less desirable implementation climate. These findings highlight that implementation support, such as GTO, is likely to help low-resourced community-based organizations improve program delivery through a focus on implementation processes.
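
To make the group comparison described above concrete, the sketch below shows one way site-level CFIR construct ratings could be compared between intervention and control sites, with a Benjamini-Hochberg correction across constructs. This is a hypothetical illustration only, not the authors' analysis code: the ratings, the -2 to +2 scale, and the choice of the Mann-Whitney U test are assumptions for demonstration.

    # Hypothetical sketch only: compares site-level CFIR construct ratings
    # between intervention (GTO) and control sites and adjusts p values with
    # the Benjamini-Hochberg procedure. All numbers are invented placeholders;
    # the published analysis may have used different data and tests.
    from scipy import stats
    from statsmodels.stats.multitest import multipletests

    # Illustrative ratings on a -2..+2 valence scale, one value per site:
    # (intervention sites, control sites)
    ratings = {
        "planning": ([1, 2, 1, 2, 1], [0, -1, 0, 1, 0]),
        "reflecting and evaluating": ([2, 1, 1, 2, 2], [0, 0, -1, 0, 1]),
        "culture": ([-1, 0, -1, 0, -1], [1, 1, 0, 1, 0]),
        "available resources": ([-2, -1, -1, 0, -1], [0, 1, 0, 0, 1]),
    }

    raw_p = []
    for name, (gto, ctrl) in ratings.items():
        # Mann-Whitney U is one reasonable nonparametric test for small
        # samples of ordinal ratings (an assumption, not the paper's method).
        u_stat, p = stats.mannwhitneyu(gto, ctrl, alternative="two-sided")
        raw_p.append(p)
        print(f"{name}: U = {u_stat:.1f}, raw p = {p:.3f}")

    # Control the false discovery rate across the construct-level comparisons.
    reject, adj_p, _, _ = multipletests(raw_p, alpha=0.05, method="fdr_bh")
    for name, p, sig in zip(ratings, adj_p, reject):
        print(f"{name}: adjusted p = {p:.3f}, significant = {sig}")

A nonparametric test is used here because the hypothetical per-construct samples are small and ordinal; with other data structures, a regression-based comparison could be substituted without changing the multiple-testing step.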

Trial Registration

This project is registered at ClinicalTrials.gov with number NCT02135991 (URL: https://clinicaltrials.gov/show/NCT02135991). The trial was first registered May 12, 2014.

Keywords

Implementation support, Fidelity, Evidence-based program, Community-based, CFIR, GTO

Notes

Acknowledgments

We would like to acknowledge Keisha McDonald, Elizabeth D’Amico, Laura Damschroder, and Christian Lopez for their assistance.

Funding

All of the authors are funded to conduct the reported research by a grant from the National Institute on Alcohol Abuse and Alcoholism, Preparing to Run Effective Prevention (R01AA022353-01). The funder had no role in the design of the study, data collection, analysis, interpretation of data, or writing of the manuscript.

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional research committee (RAND Human Subjects Protection Committee FWA00003425) and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed Consent

Informed consent was obtained from all individual participants in the study.

Supplementary material

ESM 1 (DOCX 32 kb)
ESM 2 (PDF 423 kb)
ESM 3 (DOC 56 kb)
ESM 4 (DOC 216 kb)

Copyright information

© Society for Prevention Research 2019

Authors and Affiliations

  1. RAND Corporation, Santa Monica, USA
  2. Malone Quantitative, Durham, USA
  3. Center for Clinical Management Research, VA Ann Arbor Healthcare System, Ann Arbor, USA