Alternatives to Randomized Control Trial Designs for Community-Based Prevention Evaluation
Multiple factors may complicate the evaluation of preventive interventions, particularly when the randomized controlled trial (RCT) is impractical, culturally unacceptable, or ethically questionable, as can occur with community-based efforts focused on inner-city neighborhoods or rural American Indian/Alaska Native communities. This paper is based on the premise that all research designs, including RCTs, are constrained by the extent to which they can refute the counterfactual and meet the challenge of demonstrating the absence of effects that the intervention is meant to prevent, that is, showing what is prevented. These requirements also provide benchmarks for valuing alternatives to the RCT: designs that have shown the ability to estimate preventive effects and refute the counterfactual with limited bias while remaining congruent with community values about implementation. In this paper, we describe a number of such research designs, with accompanying examples, including regression discontinuity, interrupted time series, and roll-out randomization designs. We also set forth procedures and practices that can enhance their utility. When combined with these design strengths, alternative designs can provide valid evaluations of community-based interventions and serve as viable alternatives to the RCT.
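One of the alternative designs named above, the interrupted time series, can be illustrated with a small simulation. All data, effect sizes, and variable names below are hypothetical, used only to sketch how a segmented regression separates a level change at the intervention point from the pre-existing secular trend.

```python
import numpy as np

# Hypothetical monthly counts of an outcome (e.g., incidents) for
# 24 months before and 24 months after a community intervention begins.
rng = np.random.default_rng(0)
months = np.arange(48)
post = (months >= 24).astype(float)              # 1.0 after the intervention
time_since = np.where(post == 1.0, months - 24, 0.0)

# Simulated series: flat pre-intervention trend, level drop of 5 afterward.
y = 30 - 5 * post + rng.normal(0, 1.5, size=48)

# Segmented regression design matrix:
# intercept, secular trend, level change at intervention, slope change after.
X = np.column_stack([np.ones(48), months, post, time_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"estimated level change at intervention: {beta[2]:.2f}")
```

The coefficient on the post-intervention indicator estimates the immediate level change; comparing it against the projected pre-intervention trend is what lets this design refute the counterfactual without a randomized control group.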
Keywords: Research design; Community-based research
Grateful acknowledgement is given to the investigators and staff of the Families and Communities Research Group, the Center for Alaska Native Health Research, and the conference "Advancing Science with Culturally Distinct Samples," held at the University of Alaska Fairbanks in August 2011.
The research reported in this article was funded by the Centers for Disease Control and Prevention, the National Institute of Nursing Research, and the Robert R. McCormick Foundation.
Compliance with Ethical Standards
Conflict of Interest
The authors have no potential conflicts of interest.
All of the research reported here was conducted with the approval and under the supervision of the Institutional Review Boards of the University of Illinois at Chicago, Rush University, the University of Chicago, and/or the Illinois Department of Children and Family Services.
All of the research reported in this article was conducted with the written informed consent of participants or, in the case of research involving state wards, consent of the Illinois Department of Children and Family Services.