CUTOS: A Framework for Contextualizing Evidence

Chapter in Implementation Science 3.0

Abstract

Researchers often emphasize the need for evidence from previous research to understand what is already known about a specific program or intervention. Yet even when high-quality evidence exists, consumers of research often find it difficult to interpret the findings and apply them to their own context. Many fields are incorporating evidence-based practices and systematic reviews to establish what the evidence supports. However, although the literature stresses the importance of interpreting findings in light of the intended application, it offers little guidance on how to do so. This chapter introduces a framework that provides a set of tools for disentangling what is known and a structured space for considering whether, and which, findings are applicable to a specific context. This implementation framework starts by identifying the known evidence about what is to be implemented (i.e., what the evidence supports) and then examines how the components of that evidence apply to a specific context (population, organization, resources, etc.). In addition, the framework emphasizes the importance of collaborative decision-making between researchers and stakeholders.
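
The chapter presents the framework qualitatively rather than computationally. Purely as an illustrative aid, the Python sketch below shows one way the contextualization step could be represented: record the components behind an evidence claim (population, setting, and so on) and compare them against a local context to surface mismatches for stakeholder discussion. All names and values here (EvidenceClaim, ContextProfile, the example effect size) are hypothetical assumptions, not part of the CUTOS framework itself.

```python
from dataclasses import dataclass

@dataclass
class EvidenceClaim:
    """One finding from prior research (hypothetical structure)."""
    intervention: str
    outcome: str
    population: str      # population the evidence was generated with
    setting: str         # setting/organization type of the original studies
    effect_size: float   # summary effect reported by a synthesis (illustrative)

@dataclass
class ContextProfile:
    """The local context an implementer is considering (hypothetical)."""
    population: str
    setting: str

def applicability_gaps(claim: EvidenceClaim, context: ContextProfile) -> list[str]:
    """List evidence components that do not match the local context.

    An empty list does not prove transferability; it only means no
    mismatch was recorded on the dimensions compared here.
    """
    gaps = []
    if claim.population != context.population:
        gaps.append(f"population: evidence={claim.population!r}, local={context.population!r}")
    if claim.setting != context.setting:
        gaps.append(f"setting: evidence={claim.setting!r}, local={context.setting!r}")
    return gaps

# Hypothetical example: a reading intervention considered for a new district.
claim = EvidenceClaim("phonemic awareness instruction", "early reading",
                      population="K-1 students", setting="urban schools",
                      effect_size=0.50)  # illustrative value only
local = ContextProfile(population="K-1 students", setting="rural schools")
for gap in applicability_gaps(claim, local):
    print("discuss with stakeholders:", gap)
```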

Notes

  1. As mentioned in Ehri et al. (2001), some of the 96 effect sizes came from cases within studies that used the same treatment or control group more than once, thus creating multiple group dependency (a minimal sketch of one common remedy appears after these notes).

  2. All Iowa state data were publicly available from http://reports.educateiowa.gov/
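
Note 1 flags a standard meta-analytic concern: when several effect sizes share a treatment or control group, they are statistically dependent and should not be pooled as if they were independent. Purely as an illustration, with assumed numbers rather than values from Ehri et al. (2001), the Python sketch below applies one common, conservative remedy: averaging each study's dependent effect sizes into a single within-study estimate before pooling.

```python
from collections import defaultdict

# Hypothetical effect sizes keyed by study. Studies "B" and "C" each
# reused a control group across comparisons, so their multiple effect
# sizes are dependent and should not be pooled as independent values.
effects = [
    ("A", 0.40),
    ("B", 0.55), ("B", 0.61),
    ("C", 0.30), ("C", 0.35), ("C", 0.25),
]

# Collapse each study's dependent effects into a single within-study
# mean, so every study contributes exactly one (now independent)
# estimate to the pooled result.
by_study = defaultdict(list)
for study, es in effects:
    by_study[study].append(es)

independent = {s: round(sum(v) / len(v), 3) for s, v in by_study.items()}
pooled = round(sum(independent.values()) / len(independent), 3)

print(independent)  # {'A': 0.4, 'B': 0.58, 'C': 0.3}
print(pooled)       # 0.427
```

In practice the pooling step would use inverse-variance weights (see Cooper, Hedges, & Valentine, 2009); the unweighted mean is used here only to keep the dependency point visible.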

References

  • Becker, B. J. (1996). The generalizability of empirical research results. In C. P. Benbow & D. Lubinski (Eds.), Intellectual talent: Psychometric and social issues (pp. 362–383). Baltimore, MD: Johns Hopkins Press.

  • Burchett, H., Umoquit, M., & Dobrow, M. (2011). How do we know when research from one setting can be useful in another? A review of external validity, applicability and transferability frameworks. Journal of Health Services Research and Policy, 16, 238–244. https://doi.org/10.1258/jhsrp.2011.010124

  • Campbell, D., & Stanley, J. (1963). Experimental and quasi-experimental designs for research. Chicago, IL: Rand-McNally.

  • Campbell, D. T. (1986). Relabeling internal and external validity for applied social scientists. New Directions for Program Evaluation, 31, 67–77. https://doi.org/10.1002/ev.1434

  • Chandler, J., Churchill, R., Higgins, J. P. T., Lasserson, T., & Tovey, D. (2013). Methodological Expectations of Cochrane Intervention Reviews (MECIR). Methodological standards for the conduct of new Cochrane Intervention Reviews.

  • Cook, T. D. (1991). Meta-analysis: Its potential for causal description and causal explanation within program evaluation. In G. Albrecht, H. U. Otto, S. Karstedt-Henke, & K. Bollert (Eds.), Social prevention and the social sciences: Theoretical controversies, research problems and evaluation strategies.

  • Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Boston, MA: Houghton Mifflin Company.

  • Cooper, H. M., Hedges, L. V., & Valentine, J. (2009). The handbook of research synthesis and meta-analysis (2nd ed.). New York, NY: The Russell Sage Foundation.

  • Cronbach, L. J. (1982). Designing evaluations of educational and social programs. San Francisco, CA: Jossey-Bass.

  • Des Jarlais, D. C., Lyles, C., Crepaz, N., & the TREND Group. (2004). Improving the reporting quality of nonrandomized evaluations of behavioral and public health interventions: The TREND statement. American Journal of Public Health, 94, 361–366. https://doi.org/10.2105/ajph.94.3.361

  • Ehri, L. C., Nunes, S. R., Willows, D. M., Schuster, B. V., Yaghoub-Zadeh, Z., & Shanahan, T. (2001). Phonemic awareness instruction helps children learn to read: Evidence from the National Reading Panel’s meta-analysis. Reading Research Quarterly, 36(3), 250–287. https://doi.org/10.1598/RRQ.36.3.2

  • Ercikan, K., & Roth, W.-M. (2014). Limits of generalizing in education research: Why criteria for research generalization should include population heterogeneity and uses of knowledge claims. Teachers College Record, 116, 1–28.

  • Glass, G. V. (1976). Primary, secondary, and meta-analysis of research. Educational Researcher, 5(10), 3–8. https://doi.org/10.3102/0013189X005010003

  • GRADE working group. (n.d.). Organizations that have endorsed or that are using GRADE. Available from: www.gradeworkinggroup.org

  • Guyatt, G., Oxman, A. D., Akl, E. A., Kunz, R., Vist, G., Brozek, J., … Schunemann, H. J. (2011). GRADE guidelines: 1. Introduction-GRADE evidence profiles and summary of findings tables. Journal of Clinical Epidemiology, 64, 383–394. https://doi.org/10.1016/j.jclinepi.2010.04.026

  • Higgins, J. P. T., & Green, S. (2011). Cochrane handbook for systematic reviews of interventions. Retrieved from www.cochrane-handbook.org

  • Innvaer, S. S., Vist, G. G., Trommald, M. M., & Oxman, A. A. (2002). Health policy-makers’ perceptions of their use of evidence: A systematic review. Journal of Health Services Research and Policy, 7, 239–244. https://doi.org/10.1258/135581902320432778

  • Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & the PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. Physical Therapy, 89, 873–880. https://doi.org/10.1093/ptj/89.9.873

  • National Institute of Child Health and Human Development (NICHD). (2000). Report of the National Reading Panel. Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Reports of the subgroups (NIH Publication No. 00-4754). Washington, DC: U.S. Government Printing Office.

  • Oliver, K., Innvar, S. S., Lorenc, T., Woodman, J., & Thomas, J. (2014). A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Services Research, 14(2), 1–12. https://doi.org/10.1186/1472-6963-14-2

  • Richardson, W. S., Wilson, M. C., Nishikawa, J., & Hayward, R. S. A. (1995). The well-built clinical question: A key to evidence-based decisions. ACP Journal Club, 123, A-12. https://doi.org/10.7326/ACPJC-1995-123-3-A12

  • Schulz, K. F., Altman, D. G., Moher, D., & the CONSORT Group. (2010). CONSORT 2010 statement: Updated guidelines for reporting parallel group randomized trials. Annals of Internal Medicine, 152, 726–732.

  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.

  • Slavin, R. (2008). Perspectives on evidence-based research in education—What works? Issues in synthesizing educational program evaluation. Educational Researcher, 37(1), 5–14. https://doi.org/10.3102/0013189x08314117

  • Stroup, D. F., Berlin, J. A., Morton, S. C., Olkin, I., Williamson, G. D., Rennie, D., … Thacker, S. B. (2000). Meta-analysis of observational studies in epidemiology: A proposal for reporting. Journal of the American Medical Association, 283, 2008–2012. https://doi.org/10.1001/jama.283.15.2008

  • Wang, S., Moss, J. R., & Hiller, J. E. (2005). Applicability and transferability of interventions in evidence-based public health. Health Promotion International, 21, 76–83. https://doi.org/10.1093/heapro/dai025

Author information

Correspondence to Ariel M. Aloe.

Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Aloe, A.M., Thompson, C.G., Reed, D.K. (2020). CUTOS: A Framework for Contextualizing Evidence. In: Albers, B., Shlonsky, A., Mildon, R. (eds) Implementation Science 3.0. Springer, Cham. https://doi.org/10.1007/978-3-030-03874-8_2

  • DOI: https://doi.org/10.1007/978-3-030-03874-8_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-03873-1

  • Online ISBN: 978-3-030-03874-8

  • eBook Packages: Social Sciences, Social Sciences (R0)
