A Sustainable Alternative to the Gold Standard EBP: Validating Existing Programs
Increasingly, jurisdictions are requiring the adoption of certified evidence-based programs (EBPs) for behavioral health and human services for children, youth, and their families. Such proven, prepackaged programs are often adopted without regard to existing, yet effective, locally developed program models. This study presents a replicable six-step process that identifies key research-supported elements within existing programs and creates program-specific fidelity scoring and tracking tools for routine use during clinical supervision, ensuring that these elements are implemented well. A case study is used to demonstrate that a locally developed program model, when implemented with high fidelity, can serve clients with outcomes comparable to its EBP counterpart at a much lower cost. The results underscore the importance of one common element among EBPs and effective services in general: measuring key elements of the service and client outcomes and feeding these data back to clinicians for continuous improvement.
This research was supported by a grant from the Allegheny County Department of Human Services awarded to Wesley Spectrum Services for an evaluation of the WSIH program. The authors thank all of the staff of Wesley Spectrum Services, especially Doug Muetzel (CEO) and Pam Weaver (CPO) and the staff of the WSIH program who worked closely with the evaluation consultants in developing the Model Value Management six steps. We also thank Katy Collins, PhD, who was affiliated with the University of Pittsburgh Graduate School of Public and International Affairs at the time of this research, for her work on the cost–benefit analysis of these services. We thank Michele Garrity, also formerly of the Graduate School of Public and International Affairs, for her final editing. Finally, we thank our Pittsburgh research and community advisors who encouraged this work and support the exploration of credible alternatives to prepackaged EBP models: Ed Ricci, PhD, University of Pittsburgh School of Public Health; Ed Mulvey, PhD, University of Pittsburgh School of Medicine; Marybeth Rauktis, PhD, University of Pittsburgh School of Social Work; Nancy Kukovich, CEO, Adelphoi Village; Rochelle Haimes, COA national consultant; Brandi Mauck, CEO, Allegheny Health Choices; Robert Sheen, PhD, clinical consultant; and Laura Maines, CEO, Every Child, Inc.
Compliance with Ethical Standards
Conflict of Interest Statement
The authors served as paid outside evaluation consultants for Wesley Spectrum Services, the case study site for the research reported in this manuscript. They had no role in developing or maintaining the program, serving only as outside evaluators. Although program staff were actively involved in data collection and in discussions about the meaning of the information, the authors had full access to all relevant data and were under no obligation to report findings in any particular way. None of the authors have had employment or engagement with any of the companies related to the standardized measurement tools referenced in this case study or with any of the comparison programs cited in this article (e.g., Multisystemic Therapy). The authors take responsibility for the integrity and accuracy of the data analysis.
- 2.Littell J. Evidence-based practice: evidence or orthodoxy? In: BL Duncan, SD Miller, BE Wampold, et al. (Eds). The Heart & Soul of Change: Delivering What Works in Therapy, Second Edition. Washington, D.C.: American Psychological Association, 2010, pp. 167–198.
- 3.Lipsey M, Howell J, Kelly R, et al. Improving the Effectiveness of Juvenile Justice Programs: A New Perspective On Evidence-Based Practice. Center for Juvenile Justice Reform, Georgetown University. Available online at http://njjn.org/uploads/digital-library/CJJR_Lipsey_Improving-Effectiveness-of-Juvenile-Justice_2010.pdf. Accessed on January 28, 2015.
- 4.Substance Abuse and Mental Health Services Administration. National Registry of Evidence-Based Programs and Practices. Programs & Campaigns. Available online at https://www.samhsa.gov/nrepp. Accessed on January 29, 2015.
- 5.Baron J, Haskins R. The Obama Administration’s Evidence-Based Social Policy Initiatives: An Overview. Brookings Institution. Available online at http://www.brookings.edu/research/articles/2011/04/obama-social-policy-haskins. Accessed on December 3, 2015.
- 6.Center for the Study and Prevention of Violence. Blueprints for Violence Prevention. University of Colorado. Available online at http://www.colorado.edu/cspv/blueprints/index.html. Accessed on February 9, 2015.
- 11.Greenwood PW, Welsh BC, Rocque M. Implementing Proven Programs for Juvenile Offenders: Assessing State Progress. Association for the Advancement of Evidence-Based Practice. Available online at http://youthjusticenc.org/download/juvenile-justice/prevention-interventions-and-alternatives/Implementing%20Proven%20Programs%20for%20JuvenIle%20Offenders.pdf. Accessed on January 27, 2015.
- 12.Henggeler SW, Schoenwald SK. Evidence-Based Interventions for Juvenile Offenders and Juvenile Justice Policies that Support Them. Social Policy Report. Volume 25, Issue 1. Society for Research in Child Development. Available online at https://eric.ed.gov/?id=ED519241. Accessed on January 27, 2015.
- 13.Hill I, Hogan S, Palmer L, et al. Medicaid Outreach & Enrollment for Pregnant Women: What is the State of the Art? Report for the March of Dimes Foundation. Urban Institute. Available online at http://www.urban.org/UploadedPDF/411898_pregnant_women.pdf. Accessed on January 29, 2015.
- 16.Wampold B. The research evidence for the common factors models: a historically situated perspective. In: BL Duncan, SD Miller, BE Wampold, et al. (Eds). The Heart & Soul of Change: Delivering What Works in Therapy, Second Edition. Washington, D.C.: American Psychological Association, 2010, pp. 49–81.
- 21.Fixsen D, Naoom S, Blase K, et al. Implementation Research: A Synthesis of the Literature. Tampa, FL: Louis de la Parte Florida Mental Health Institute, National Implementation Research Network, 2005.
- 23.Lipsey MW. The effects of treatment on juvenile delinquents: results from meta-analysis. In: F Lösel, D Bender, T Bliesener (Eds). Psychology and Law: International Perspectives. Berlin: Walter de Gruyter, 1992, pp. 131–143.
- 28.Blase K, Fixsen D. Core Intervention Components: Identifying and Operationalizing What Makes Programs Work. Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services. Available online at https://aspe.hhs.gov/report/core-intervention-components-identifying-and-operationalizing-what-makes-programs-work. Accessed on January 26, 2015.
- 29.Hubble MA, Duncan BL, Miller SD, et al. Introduction. In: BL Duncan, SD Miller, BE Wampold, et al. (Eds). The Heart & Soul of Change: Delivering What Works in Therapy, Second Edition. Washington, D.C.: American Psychological Association, 2010, p. 28.
- 30.Latimer J. Multisystemic Therapy as a Response to Serious Youth Delinquency. Department of Justice, Government of Canada. Available online at http://www.justice.gc.ca/eng/rp-pr/jr/jr12/p5d.html. Accessed on January 28, 2015.
- 31.Office of Mental Health and Substance Abuse Services (OMHSAS). White Paper Community Alternatives to Psychiatric Residential Treatment Facility Services. Commonwealth of Pennsylvania. Available online at http://www.paproviders.org/archives/Pages/Childrens_Archive/PRTF_White_Paper_042208.pdf. Accessed on February 11, 2015.
- 32.Hodges K. Child and adolescent functional assessment scale. In: ME Maruish (Ed). The Use of Psychological Testing for Treatment Planning and Outcomes Assessment, Third Edition, Volume 2. New Jersey: Taylor & Francis, 2011, pp. 405–442.
- 33.Kirk RS, Martens P. Development and Field Testing of the North Carolina Family Assessment Scale for General Services (NCFAS-G). National Family Preservation Network. Available online at http://nfpn.org/Portals/0/Documents/ncfasg_research_report.pdf. Accessed on January 28, 2015.
- 34.Wesley Spectrum Services. Final Report to Wesley Spectrum: Fidelity Management: A low-cost alternative to proprietary evidence-based programs. PHILLIPS. Available online at http://www.phillipsprograms.org/wp-content/uploads/2011/10/WSS-IHFT-FINAL-REPORT-7-5-11-v3.pdf. Accessed on January 29, 2015.
- 40.MST Institute. Model Fidelity and Positive Outcomes: Hallmarks of Evidence-Based Practice. MST Institute Homepage. Available online at http://www.mstinstitute.org. Accessed on February 10, 2015.
- 41.MST Institute. 2010 MST Data Report. QA Program Reports. Available online at http://www.mstinstitute.org/MST%202010%20Data%20Report%20Final.pdf. Accessed on February 10, 2015.
- 42.Allegheny County Department of Human Services. DHS Data Warehouse. Available online at http://www.alleghenycounty.us/Human-Services/News-Events/Accomplishments/DHS-Data-Warehouse.aspx. Accessed on January 29, 2015.
- 43.Aos S, Lieb R, Mayfield J, et al. Benefits and Costs of Prevention and Early Intervention Programs for Youth. Washington State Institute for Public Policy. Available online at http://www.wsipp.wa.gov/ReportFile/881/Wsipp_Benefits-and-Costs-of-Prevention-and-Early-Intervention-Programs-for-Youth_Summary-Report.pdf. Accessed on January 28, 2015.
- 44.Substance Abuse and Mental Health Services Administration. Multisystemic Therapy (MST) for Juvenile Offenders. National Registry of Evidence-Based Programs and Practices. Available online at https://nrepp.samhsa.gov/Legacy/ViewIntervention.aspx?id=254. Accessed on January 29, 2015.