The Journal of Primary Prevention, Volume 34, Issue 3, pp 147–161

Examining Adaptations of Evidence-Based Programs in Natural Contexts

  • Julia E. Moore
  • Brian K. Bumbarger
  • Brittany Rhoades Cooper
Original Paper

Abstract

When evidence-based programs (EBPs) are scaled up in natural (non-research) settings, adaptations are commonly made. In the fidelity-versus-adaptation debate, theoretical rationales have been offered for the pros and cons of adaptation, but the basis of this debate remains largely theoretical; empirical evidence is needed to understand the types of adaptations actually made in natural settings. In the present study, we introduce a taxonomy for understanding adaptations. The taxonomy addresses several aspects of adaptations made to programs, including fit (philosophical or logistical), timing (proactive or reactive), and valence, the degree to which an adaptation aligns with the program’s goals and theory (positive, negative, or neutral). Self-reported qualitative data from communities delivering one of ten state-funded EBPs were coded on the taxonomy constructs; quantitative data were also used to examine the types of, and reasons for, adaptations made under natural conditions. Forty-four percent of respondents reported making adaptations. Adaptations to procedures, dosage, and content were cited most often. Lack of time, limited resources, and difficulty retaining participants were the most common reasons for making adaptations. Most adaptations were made reactively, in response to issues of logistical fit, and were not aligned with, or deviated from, the program’s goals and theory.
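
To make the taxonomy's dimensions concrete, the minimal sketch below (in Python) models a single coded adaptation as a small data structure. The dimension labels (fit, timing, valence) come from the abstract; the class and field names, the example record, and the reason text shown are illustrative assumptions, not the authors' actual coding instrument.

from dataclasses import dataclass
from enum import Enum

# Taxonomy dimensions as described in the abstract; names are illustrative.
class Fit(Enum):
    PHILOSOPHICAL = "philosophical"
    LOGISTICAL = "logistical"

class Timing(Enum):
    PROACTIVE = "proactive"
    REACTIVE = "reactive"

class Valence(Enum):
    POSITIVE = "positive"   # aligned with the program's goals and theory
    NEUTRAL = "neutral"
    NEGATIVE = "negative"   # deviates from the program's goals and theory

@dataclass
class AdaptationCode:
    """One coded adaptation report from an implementing community (hypothetical record)."""
    description: str   # self-reported adaptation (e.g., to procedures, dosage, or content)
    reason: str        # stated reason (e.g., lack of time, limited resources)
    fit: Fit
    timing: Timing
    valence: Valence

# Hypothetical record reflecting the modal pattern reported in the abstract:
# a reactive adaptation made for logistical reasons that deviates from program theory.
example = AdaptationCode(
    description="Reduced the number of sessions delivered",
    reason="Lack of time",
    fit=Fit.LOGISTICAL,
    timing=Timing.REACTIVE,
    valence=Valence.NEGATIVE,
)
print(example)

In this framing, the study's central finding corresponds to records coded as reactive in timing, logistical in fit, and negative in valence.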

Keywords

Adaptation · Fidelity · Implementation quality

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Julia E. Moore (1)
  • Brian K. Bumbarger (2)
  • Brittany Rhoades Cooper (3)

  1. Li Ka Shing Knowledge Institute, St. Michael’s Hospital, Toronto, Canada
  2. Prevention Research Center, Pennsylvania State University, State College, USA
  3. Department of Human Development, Washington State University, Pullman, USA