Abstract
Major advances in population health will not occur unless we translate existing knowledge into effective multicomponent interventions, implement and maintain these in communities, and develop rigorous translational research and evaluation methods to ensure continual improvement and sustainability. We discuss challenges and offer approaches to evaluation that are key in translational research stages 3 to 5 (T3–T5) for advancing optimized adoption, implementation, and maintenance of effective and replicable multicomponent strategies. The major challenges we discuss concern (a) the multiple contexts of evaluation/research, (b) the complexity of intervention packages, and (c) the phases of evaluation/research questions. We suggest multiple alternative research designs that maintain rigor while accommodating these challenges, and we highlight the need for measurement systems. Longitudinal data collection and a standardized continuous measurement system are fundamental to the evaluation and refinement of complex multicomponent interventions. To be useful for T3–T5 translational research efforts in neighborhoods and communities, such a system would include assessments of reach, implementation, effects on immediate outcomes, and effects of the comprehensive intervention package on more distal health outcomes.
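The standardized, continuous measurement system described in the abstract can be made concrete with a small data sketch. The Python/pandas example below is a hypothetical illustration, not a system specified in the article: the community names, indicators, values, and quarterly waves are assumptions chosen only to show how assessments of reach, implementation, immediate outcomes, and distal outcomes might be stored and summarized longitudinally.

```python
# Hypothetical sketch of a longitudinal, standardized measurement record for a
# community-level multicomponent intervention. Domain names, indicators, and the
# quarterly schedule are illustrative assumptions, not taken from the article.
import pandas as pd

# Long-format table: one row per community, assessment wave, and indicator.
records = pd.DataFrame(
    [
        # community, wave (quarter), domain, indicator, value
        ("Neighborhood A", "2015Q1", "reach", "pct_eligible_families_enrolled", 0.42),
        ("Neighborhood A", "2015Q1", "implementation", "pct_components_delivered", 0.78),
        ("Neighborhood A", "2015Q1", "immediate_outcome", "parenting_practices_score", 3.1),
        ("Neighborhood A", "2015Q1", "distal_outcome", "youth_substance_use_rate", 0.18),
        ("Neighborhood A", "2015Q2", "reach", "pct_eligible_families_enrolled", 0.51),
        ("Neighborhood A", "2015Q2", "implementation", "pct_components_delivered", 0.83),
        ("Neighborhood A", "2015Q2", "immediate_outcome", "parenting_practices_score", 3.4),
        ("Neighborhood A", "2015Q2", "distal_outcome", "youth_substance_use_rate", 0.16),
    ],
    columns=["community", "wave", "domain", "indicator", "value"],
)

# Roll up to a community-by-wave summary so trends in reach, implementation,
# immediate outcomes, and distal outcomes can be monitored side by side.
summary = (
    records.pivot_table(index=["community", "wave"],
                        columns="domain", values="value", aggfunc="mean")
    .sort_index()
)
print(summary)
```

A long format of this kind keeps every assessment comparable across communities and waves, which is what continuous monitoring and refinement of a multicomponent package requires.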
Acknowledgments
A grant from the National Institute on Drug Abuse (DA028946) supported the authors’ work on this manuscript. The funders had no role in the preparation of the manuscript, and the views expressed are solely those of the authors. We thank Christine Cody and Bethany Livingston for their excellent editorial assistance.
Ethics declarations
Conflict of interest
The authors declare that they have no competing interests.
Adherence to ethical standards
No human subject research was involved.
Additional information
Implications
Practice: Develop strong partnerships with researchers to implement and rigorously evaluate innovations for the promotion of health and well-being.
Policy: Fund national longitudinal and standardized measurement systems implemented and usable at the local level to advance the evaluation and refinement of multicomponent interventions to promote health.
Research: Partner with local and state agencies to implement and evaluate complex multicomponent intervention trials using multiple design elements to optimize causal inference.
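As a rough illustration of the Research implication above, the sketch below simulates a comparative interrupted time-series (difference-in-differences) analysis, one of several design elements for strengthening causal inference when whole communities cannot be randomized. This is a hypothetical Python example: the data, variable names, number of communities, and effect size are simulated assumptions, not results or methods reported in the article.

```python
# Hypothetical sketch of one design element for optimizing causal inference
# without randomization: a comparative interrupted time series (difference-in-
# differences with a secular time trend) contrasting intervention communities
# with matched comparison communities. All values are simulated assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_communities, n_waves, policy_wave = 20, 16, 8

rows = []
for c in range(n_communities):
    treated = int(c < n_communities // 2)   # half receive the intervention package
    baseline = rng.normal(50, 5)            # community-specific baseline level
    for t in range(n_waves):
        post = int(t >= policy_wave)
        effect = -3.0 * treated * post      # assumed true intervention effect
        outcome = baseline + 0.2 * t + effect + rng.normal(0, 2)
        rows.append({"community": c, "time": t, "treated": treated,
                     "post": post, "outcome": outcome})
df = pd.DataFrame(rows)

# Difference-in-differences: the treated:post coefficient estimates the change
# in the outcome attributable to the intervention, net of secular trends, with
# standard errors clustered by community.
model = smf.ols("outcome ~ treated * post + time", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["community"]}
)
print(model.summary().tables[1])
```

In practice, such a design element would be combined with others (e.g., matched comparison communities, multiple baseline waves, and sensitivity analyses) to strengthen inference about the effects of a complex multicomponent intervention.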
About this article
Cite this article
Komro, K.A., Flay, B.R., Biglan, A. et al. Research design issues for evaluating complex multicomponent interventions in neighborhoods and communities. Behav. Med. Pract. Policy Res. 6, 153–159 (2016). https://doi.org/10.1007/s13142-015-0358-4
Keywords
- Health
- Well-being
- Evaluation design
- Complex intervention
- Translational research