
The Journal of Primary Prevention, Volume 40, Issue 1, pp 111–127

Redesigning Implementation Measurement for Monitoring and Quality Improvement in Community Delivery Settings

  • Cady Berkel
  • Carlos G. Gallo
  • Irwin N. Sandler
  • Anne M. Mauricio
  • Justin D. Smith
  • C. Hendricks Brown
Original Paper

Abstract

The field of prevention has established the potential to promote child adjustment across a wide array of outcomes. However, when evidence-based prevention programs have been delivered at scale in community settings, declines in implementation and outcomes have resulted. Maintaining high-quality implementation is a critical challenge for the field. We describe steps toward the development of a practical system to monitor and support the high-quality implementation of evidence-based prevention programs in community settings. Research on the implementation of an evidence-based parenting program for divorcing families, the New Beginnings Program, serves as an illustration of the promise of such a system. As a first step, we describe a multidimensional theoretical model of implementation that links aspects of program delivery with improvements in participant outcomes. We then describe research on the measurement of each of these implementation dimensions and test their relations to intended program outcomes. As a third step, we develop approaches to assessing these implementation constructs that are feasible for use in community settings and establish their reliability and validity. We focus on the application of machine learning algorithms and web-based data collection systems to assess implementation and provide support for high-quality delivery and positive outcomes. Examples are presented to demonstrate that valid and reliable measures can be collected using these methods. Finally, we envision how these measures can be used to develop an unobtrusive system to monitor implementation and provide feedback and support in real time to maintain high-quality implementation and program outcomes.
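The abstract's reference to machine learning for assessing implementation can be made concrete with a minimal sketch of statistical text classification over facilitator speech. Everything here is illustrative: the fidelity labels ("reflection" vs. "other"), the training utterances, and the pure-Python Naive Bayes model are assumptions for demonstration, not the measures or coding system used in the New Beginnings Program research.

```python
from collections import Counter
import math

# Toy training data: facilitator utterances tagged with a hypothetical
# fidelity code. Both the labels and the example sentences are invented
# for illustration only.
TRAIN = [
    ("it sounds like you are feeling overwhelmed", "reflection"),
    ("you seem frustrated with the schedule", "reflection"),
    ("it sounds like bedtime has been hard", "reflection"),
    ("please open your workbook to page five", "other"),
    ("next week we will cover one on one time", "other"),
    ("let's watch the video now", "other"),
]

def train(examples):
    """Fit per-label word counts for a multinomial Naive Bayes model."""
    word_counts, label_counts, vocab = {}, Counter(), set()
    for text, label in examples:
        label_counts[label] += 1
        wc = word_counts.setdefault(label, Counter())
        for w in text.split():
            wc[w] += 1
            vocab.add(w)
    return word_counts, label_counts, vocab

def classify(text, word_counts, label_counts, vocab):
    """Return the label with the highest Laplace-smoothed log posterior."""
    total = sum(label_counts.values())
    best, best_lp = None, float("-inf")
    for label, count in label_counts.items():
        lp = math.log(count / total)          # log prior
        n = sum(word_counts[label].values())  # words seen under this label
        for w in text.split():
            # Counter returns 0 for unseen words; add-one smoothing
            lp += math.log((word_counts[label][w] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = train(TRAIN)
print(classify("it sounds like you are feeling stuck", *model))  # reflection
```

In practice, systems like those cited in this literature work from transcribed session audio and far richer feature sets, but the core idea is the same: score each utterance against codes learned from human-rated examples, turning observational fidelity coding into an automated, scalable signal.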

Keywords

Implementation · Measurement · Evidence-based programs · Parenting · Pragmatic measures · Technology

Notes

Funding

Support for the development of this manuscript was provided by the National Institute on Drug Abuse: R01DA026874 (Sandler), R01DA033991 (Berkel and Mauricio), competitive funding from the Center for Prevention Implementation Methodology (Ce-PIM), P30-DA027828 (Brown/Berkel), and Diversity Supplement R01DA033991-03S1 (Berkel/Gallo).

Compliance With Ethical Standards

Conflict of Interest

Sandler is a developer of the NBP and has an LLC that trains facilitators to deliver the program. The remaining authors declare that they have no conflict of interest.

Research Involving Human Participants

All procedures performed in studies involving human participants were in accordance with the ethical standards of Arizona State University’s IRB and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. This article does not contain any studies with animals performed by any of the authors.

Informed Consent

Informed consent was obtained from all individual participants included in the study.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Tempe, USA
  2. REACH Institute, Department of Psychology, Arizona State University, Tempe, USA
  3. Center for Prevention Implementation Methodology (Ce-PIM), Northwestern University, Chicago, USA
