Prevention Science, Volume 9, Issue 3, pp 215–229

Fidelity at a Distance: Assessing Implementation Fidelity of the Early Risers Prevention Program in a Going-to-Scale Intervention Trial

  • Chih-Yuan S. Lee
  • Gerald J. August
  • George M. Realmuto
  • Jason L. Horowitz
  • Michael L. Bloomquist
  • Bonnie Klimes-Dougan

Abstract

The present study examined the feasibility of an innovative technology designed to assess implementation fidelity of the Early Risers conduct problems prevention program across 27 geographically dispersed school sites. A multidimensional construct of fidelity was used to assess the quantity of services provided (exposure), the degree to which program strategies conformed to the manual (adherence), and how well implementers delivered the program (quality of delivery). The measurement technology featured a fidelity monitoring system that required (a) weekly reporting on a web-based documentation system to assess program exposure and adherence, and (b) five annually administered telephone interviews with a technical assistant to assess quality of program implementation. The results showed that the fidelity monitoring system was feasible: all sites achieved 100% compliance in completing their required on-line reporting and, on average, completed over 80% of the required teleconference interviews. User feedback indicated satisfaction with the web-based program. The system was successful in measuring multiple indices of fidelity. The strengths and limitations of measuring fidelity at a distance with web-based and teleconferencing technologies are discussed.
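
To make the multidimensional fidelity construct concrete, the sketch below shows one way the three indices (exposure, adherence, quality of delivery) and the two compliance rates reported above could be aggregated per site from weekly web reports and telephone-interview ratings. This is an illustrative assumption written in Python, not the authors' actual monitoring system; all field names, data structures, and scales are hypothetical.

# Illustrative sketch only (hypothetical fields and scales), showing how
# exposure, adherence, quality of delivery, and reporting compliance might
# be summarized per site from weekly reports and periodic phone interviews.
from dataclasses import dataclass, field
from statistics import mean


@dataclass
class WeeklyReport:
    sessions_delivered: int      # exposure: quantity of services provided
    sessions_planned: int
    components_followed: int     # adherence: strategies matching the manual
    components_in_manual: int


@dataclass
class SiteRecord:
    site_id: str
    weekly_reports: list = field(default_factory=list)     # web-based documentation
    interview_quality: list = field(default_factory=list)  # hypothetical 1-5 ratings
    interviews_required: int = 5                            # five annual teleconferences


def fidelity_summary(site: SiteRecord, weeks_required: int) -> dict:
    """Aggregate the three fidelity indices plus compliance rates for one site."""
    exposure = sum(r.sessions_delivered for r in site.weekly_reports) / max(
        1, sum(r.sessions_planned for r in site.weekly_reports))
    adherence = mean(
        r.components_followed / r.components_in_manual
        for r in site.weekly_reports) if site.weekly_reports else 0.0
    quality = mean(site.interview_quality) if site.interview_quality else None
    return {
        "site": site.site_id,
        "exposure": round(exposure, 2),
        "adherence": round(adherence, 2),
        "quality_of_delivery": quality,
        "web_report_compliance": len(site.weekly_reports) / weeks_required,
        "interview_compliance": len(site.interview_quality) / site.interviews_required,
    }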

Keywords

Prevention · Implementation fidelity · Exposure · Adherence · Quality of delivery


Copyright information

© Society for Prevention Research 2008

Authors and Affiliations

  • Chih-Yuan S. Lee (1)
  • Gerald J. August (1)
  • George M. Realmuto (1)
  • Jason L. Horowitz (1)
  • Michael L. Bloomquist (1)
  • Bonnie Klimes-Dougan (1)

  1. Division of Child and Adolescent Psychiatry, University of Minnesota Medical School, Minneapolis, USA
