Designs for Testing Group-Based Interventions with Limited Numbers of Social Units: The Dynamic Wait-Listed and Regression Point Displacement Designs
The dynamic wait-listed design (DWLD) and regression point displacement design (RPDD) address several challenges in evaluating group-based interventions when the number of groups is limited. Both designs exploit efficiencies that increase statistical power and can improve the balance between community needs and research priorities. The DWLD blocks on more time units than traditional wait-listed designs, thereby increasing the proportion of the study period during which intervention and control conditions can be compared; it can also simplify the logistics of implementing an intervention across multiple sites and strengthen implementation fidelity. We discuss DWLDs in the larger context of roll-out randomized designs and compare the DWLD with its cousin, the stepped wedge design. The RPDD uses archival data on the population of settings from which the intervention unit(s) are selected to generate expected posttest scores for the units receiving intervention, against which their actual posttest scores are compared. High pretest–posttest correlations give the RPDD statistical power for assessing intervention impact even when only one or a few settings receive the intervention. The RPDD works best when archival data are available for a number of years before and after the intervention. If intervention units were not randomly selected, propensity scores can be used to control for non-random selection factors. We provide examples of the DWLD and RPDD used to evaluate, respectively, suicide prevention training (QPR) in 32 schools and a violence prevention program (CeaseFire) in two Chicago police districts over a 10-year period. We also discuss how the DWLD and RPDD address common threats to internal and external validity, as well as their limitations.
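The core RPDD comparison described above can be sketched in a few lines: fit the pretest-to-posttest regression on the untreated settings, compute the treated unit's expected posttest from that line, and judge the displacement of its observed posttest against the regression's prediction error. The data below are simulated, and the prediction-interval t-test shown is one standard way to formalize the displacement, offered as an illustrative sketch rather than the authors' exact procedure.

```python
import numpy as np
from scipy import stats

# Simulated archival data: pretest and posttest rates for a population of
# comparison settings (e.g., police districts), plus one treated unit.
rng = np.random.default_rng(0)
n = 20
pre_control = rng.normal(50, 10, n)
post_control = 0.9 * pre_control + rng.normal(0, 3, n)  # high pre-post correlation
pre_treated, post_treated = 55.0, 40.0                  # treated unit shows a drop

# Fit the pretest -> posttest regression on the untreated settings only.
slope, intercept, r, _, _ = stats.linregress(pre_control, post_control)
expected_post = intercept + slope * pre_treated

# Displacement of the treated unit's observed posttest from its expected
# value, scaled by the prediction standard error for a new point.
resid = post_control - (intercept + slope * pre_control)
s = np.sqrt(np.sum(resid**2) / (n - 2))
se_pred = s * np.sqrt(1 + 1 / n
                      + (pre_treated - pre_control.mean())**2
                      / np.sum((pre_control - pre_control.mean())**2))
t_stat = (post_treated - expected_post) / se_pred
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)
```

The strength of the design is visible in `se_pred`: the higher the pretest–posttest correlation in the archival series, the smaller the residual standard error `s`, and the smaller a displacement needs to be to reach significance, even with a single treated unit.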
Keywords: Group-based designs · Roll-out designs · Small sample designs · Dynamic wait-listed design · Regression point displacement design
We thank the National Institute of Mental Health for support under grants R34MH071189 (P. Wyman, PI) and R01MH091452 (P. Wyman, PI) and the National Institute on Drug Abuse (NIDA) under grants P30 DA027828 (C. H. Brown, PI) and R13040610 (C. T. Fok, PI).
Conflict of Interest
The authors declare that they have no conflict of interest.
- Brown, C. H., Mason, W. A., & Brown, E. C. (2014). Translating the intervention approach into an appropriate research design: The next-generation designs for effectiveness and implementation research. In Z. Sloboda & H. Petras (Eds.), Advances in prevention science: Defining prevention science. Springer.
- Brown, C. H., Chamberlain, P., Saldana, L., Padgett, C., Wang, W., & Cruden, G. (2014). Evaluation of two implementation strategies in fifty-one child county public service systems in two states: Results of a cluster randomized head-to-head implementation trial.
- Campbell, D. T., Stanley, J. C., & Gage, N. L. (1963). Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin Company.
- Catalano, R. F., Arthur, M. W., Hawkins, J. D., Berglund, L., & Olson, J. J. (1998). Comprehensive community- and school-based interventions to prevent antisocial behavior. In R. Loeber & D. Farrington (Eds.), Serious and violent juvenile offenders: Risk factors and successful interventions. Thousand Oaks: Sage.
- Chamberlain, P., Saldana, L., Brown, C. H., & Leve, L. (2010). Implementation of multidimensional treatment foster care in California: A randomized control trial of an evidence-based practice. In M. Roberts-DeGennaro & S. Fogel (Eds.), Using evidence to inform practice for community and organizational change (pp. 218–234). Chicago: Lyceum Books.
- Dymnicki, A., Henry, D., Quintana, E., Wisnieski, E., & Kane, C. (2013). Outreach worker perceptions of positive and negative critical incidents: Characteristics associated with successful and unsuccessful violence interruption. Journal of Community Psychology, 41, 200–217. doi: 10.1002/jcop.21523.
- Henry, D., Allen, J., Fok, C. C., Rasmus, S., Charles, B., & People Awakening Team. (2012). Patterns of protective factors in an intervention for the prevention of suicide and alcohol abuse with Yup'ik Alaska Native youth. The American Journal of Drug and Alcohol Abuse (Early Online), 1–7. doi: 10.3109/00952990.2012.704460.
- Quinby, R. K., Hanson, K., Brooke-Weiss, B., Arthur, M. W., Hawkins, J. D., & Fagan, A. A. (2008). Installing the Communities That Care prevention system: Implementation progress and fidelity in a randomized controlled trial. Journal of Community Psychology, 36, 313–332. doi: 10.1002/jcop.20194.
- Quinnett, P. (1995). QPR: Ask a question, save a life. Spokane: QPR Institute and Suicide Awareness/Voices of Education.
- Skogan, W. G., Hartnett, S. M., Bump, N., & Dubois, J. (2009). Evaluation of CeaseFire-Chicago. Evanston: Northwestern University.
- Trochim, W. M. K., & Campbell, D. T. (1996). The regression point displacement design for evaluating community-based pilot programs and demonstration projects. Retrieved from http://www.socialresearchmethods.net/research/RPD/RPD.pdf
- Wyman, P. A., Brown, C. H., Inman, J., Cross, W., Schmeelk-Cone, K., Guo, J., & Pena, J. B. (2008). Randomized trial of a gatekeeper program for suicide prevention: 1-year impact on secondary school staff. Journal of Consulting and Clinical Psychology, 76, 104–115. doi: 10.1037/0022-006X.76.1.104.
- Wyman, P. A., Brown, C. H., LoMurray, M., Schmeelk-Cone, K., Petrova, M., Yu, Q., & Wang, W. (2010). An outcome evaluation of the Sources of Strength suicide prevention program delivered by adolescent peer leaders in high schools. American Journal of Public Health, 100, 1653–1661. doi: 10.2105/AJPH.2009.190025.