The Journal of Primary Prevention, Volume 34, Issue 3, pp 193–207

The Tug-of-War: Fidelity Versus Adaptation Throughout the Health Promotion Program Life Cycle

  • Melissa Bopp
  • Ruth P. Saunders
  • Diana Lattimore
Research Methods and Practice


Researchers across multiple fields have described the iterative, nonlinear phases of the translational research process, from program development to dissemination. This process can be conceptualized within a “program life cycle” framework comprising overlapping and nonlinear phases: development, adoption, implementation, maintenance, sustainability or termination, and dissemination or diffusion. Each phase is characterized by tension between fidelity to the original plan and adaptation for the setting and population. In this article, we describe the life cycle (phases) of research-based health promotion programs, the key influences at each phase, and the tug-of-war between fidelity and adaptation throughout the process, using a fictionalized case study based on our previous research. We highlight the importance of reconceptualizing intervention design, involving stakeholders, and monitoring fidelity and adaptation throughout all phases to maintain implementation fidelity and completeness. Intervention fidelity should be grounded in causal mechanisms to ensure effectiveness, while allowing appropriate adaptation to maximize implementation and sustainability. Recommendations for future interventions include considering the determinants of implementation, including contextual factors, at each phase; the roles of stakeholders; and the importance of developing a rigorous yet adaptive and flexible definition of implementation fidelity and completeness.


Keywords: Intervention · Implementation · Fidelity · Efficacy · Effectiveness · Adaptations



Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Melissa Bopp (1)
  • Ruth P. Saunders (2)
  • Diana Lattimore (3)
  1. Department of Kinesiology, College of Health and Human Development, The Pennsylvania State University, University Park, USA
  2. Department of Health Promotion, Education, and Behavior, Arnold School of Public Health, University of South Carolina, Columbia, USA
  3. Department of Exercise and Sport Science, College of Arts and Sciences, University of San Francisco, San Francisco, USA