Prevention Science, Volume 17, Issue 4, pp 429–438

Adaptation and Fidelity: a Recipe Analogy for Achieving Both in Population Scale Implementation

  • Lynn Kemp


Abstract

Balancing adherence to the fidelity of evidence-based programs with adaptation to local context is one of the key debates in the adoption and implementation of effective programs. Concern about maintaining fidelity to achieve outcomes can result in replication of research-based models that are a poor fit with the real world. Equally, unplanned adaptation can result in program drift away from the core elements needed to achieve outcomes. To support implementation of the Maternal Early Childhood Sustained Home-visiting (MECSH) program in multiple sites in three countries, an analogy was developed to identify how both fidelity and adaptation can be managed and successfully achieved. This article presents the Commonsense Cookery Book Basic Plain Cake with Variations recipe analogy to articulate the dual requirements of fidelity and adaptation in achieving quality implementation of the MECSH program. Components classified by the analogy include identification of core ingredients, methods, and equipment that contribute to fundamental outcomes and fidelity to the evidence-based program, and a planned, collaborative approach to identifying the variations needed to suit locally sourced capacity, needs, and tastes. Quality is achieved by identifying and measuring both the core ingredients and the variations. Sourcing local ingredients and honoring context support the sustainability of quality practice. Using this analogy has helped adopters of the MECSH program understand that effective implementation requires uncompromised commitment to expectations of fidelity to the core components and methods; planned, proactive adaptation; systematic monitoring of both the core program and agreed variations; and local ownership and sustainability.


Keywords: Adaptation · Fidelity · Evidence-based practice · Implementation · Sustainability · Analogy



Acknowledgments

I would like to thank the MECSH program investigators and research teams, program funders, and all the members of the home visiting teams implementing MECSH and delivering the intervention to families. I also thank Fiona Byrne for her assistance in preparing this paper for publication.

Author’s Contribution

LK conceived the idea and prepared the manuscript.

Compliance with Ethical Standards

This article does not contain any studies with human participants performed by any of the authors. For this type of study formal consent is not required.

Conflict of Interest

The author declares that there is no conflict of interest. The author has no personal financial interest in the MECSH program. Lynn Kemp was the MECSH Trial Coordinator. The MECSH® program is a registered trademark of UNSW Australia and, from 2016, is sublicensed to Western Sydney University for a period of 5 years.


Funding

There was no funding for the preparation of this manuscript. The original MECSH trial was funded by the Australian Research Council (LP0560285), Sydney South West Area Health Service (now known as South Western Sydney Local Health District), the NSW Department of Community Services (now known as the NSW Department of Family and Community Services), and the NSW Department of Health.



Copyright information

© Society for Prevention Research 2016

Authors and Affiliations

  1. Centre for Health Equity Training Research and Evaluation (CHETRE), part of the Centre for Primary Health Care and Equity, Faculty of Medicine, UNSW Australia, Liverpool BC, Australia
  2. Translational Research and Social Innovation (TReSI), School of Nursing and Midwifery, Western Sydney University, Penrith, Australia
  3. Ingham Institute for Applied Medical Research, Liverpool, Australia
