Replication research is essential for the advancement of any scientific field. In this paper, we argue that prevention science will be better positioned to help improve public health if (a) more replications are conducted; (b) those replications are systematic, thoughtful, and conducted with full knowledge of the trials that have preceded them; and (c) state-of-the-art techniques are used to summarize the body of evidence on the effects of the interventions. Under real-world demands it is often not feasible to wait for multiple replications to accumulate before making decisions about intervention adoption. To help individuals and agencies make better decisions about intervention utility, we outline strategies that can be used to help understand the likely direction, size, and range of intervention effects as suggested by the current knowledge base. We also suggest structural changes that could increase the amount and quality of replication research, such as the provision of incentives and a more vigorous pursuit of prospective research registers. Finally, we discuss methods for integrating replications into the roll-out of a program and suggest that strong partnerships with local decision makers are a key component of success in replication research. Our hope is that this paper can highlight the importance of replication and stimulate more discussion of the important elements of the replication process. We are confident that, armed with more and better replications and state-of-the-art review methods, prevention science will be in a better position to positively impact public health.
Keywords: Replication; Reproducibility; Systematic review; Meta-analysis; Effectiveness
This paper is the result of the deliberations of the Standards of Evidence Taskforce, convened and funded by the Society for Prevention Research (Brian R. Flay, Chair). The Board of Directors of the Society for Prevention Research is pleased to have supported the preparation of this paper in hopes that it will stimulate further discussion about the importance of replication.
The views expressed in this paper are the authors’, and do not necessarily reflect the views of the authors’ institutions or the Society for Prevention Research. With the exception of the first author, order of authorship is alphabetical.
We thank Richard Catalano, Harris Cooper, Adam Haldahl, Mark Lipsey, and Patrick Tolan for their valuable feedback on earlier versions of this paper, and Kirsten Sundell for editing the final manuscript.