Measuring Enactment of Innovations and the Factors that Affect Implementation and Sustainability: Moving Toward Common Language and Shared Conceptual Understanding

  • Jeanne Century
  • Amy Cassata
  • Mollie Rudnick
  • Cassie Freeman

Abstract

This article addresses the concern that researchers cannot fully realize the potential value of their collective efforts because they lack shared conceptual and operational tools for communicating assumptions, ideas, research strategies, and findings with others outside, or even within, their disciplines. Through the lens of measuring implementation of educational programs, this research takes steps toward bringing researchers’ varied pictures of understanding into a coherent landscape. The article describes a conceptual framework for characterizing aspects of implementation, a conceptual framework for characterizing the factors that affect implementation, and tools for measuring each. It discusses the challenges addressed in developing these approaches and their application to current studies in education and other fields in the social sciences. In doing so, it demonstrates that meaningful communication among researchers and accumulation of knowledge across fields are both possible and necessary.

Copyright information

© National Council for Community Behavioral Healthcare 2012

Authors and Affiliations

  • Jeanne Century 1
  • Amy Cassata 1
  • Mollie Rudnick 2
  • Cassie Freeman 3

  1. Center for Elementary Mathematics and Science Education (CEMSE), University of Chicago, Chicago, USA
  2. RAND Corporation, Santa Monica, USA
  3. Department of Comparative Human Development, University of Chicago, Chicago, USA
