Abstract
This methods paper describes a fractional factorial design within the context of an evaluation of campus bystander programming. Prior evaluations include relative program comparisons, but do not reflect how campuses implement specific aspects of prevention programming. Campuses use combinations of programs, delivered across modalities, audiences, intensities, and degrees of requirement. Bystander program evaluation, in a natural experiment, must therefore consider combinations of these components. This evaluation offers a novel application of the fractional factorial framework, considering combinations of program components, as implemented, within a multi-campus quasi-experimental design. Leveraging qualitative data, cluster analysis provides an initial identification of bystander program component combinations, mapped to experimental conditions. SAS v9.4 PROC FACTEX constructs candidate fractional factorial designs to estimate main effects. From key informant interviews, four primary program components are determined: delivery method, level of skill-building, degree of requirement, and intended audience. A total of seven combinations are identified. Using presence or absence of program components, a partial factorial structure is identified, in which component clusters comprised seven of the 2⁴ = 16 possible combinations. The smallest number of experimental conditions necessary for estimating these effects is determined with PROC FACTEX. The resulting designs indicate the experimental conditions necessary for determining bystander program component effectiveness within this study. Our fractional factorial approach offers a novel strategy for planning an evaluation of several bystander program component combinations. This paper provides foundational elements needed to implement a bystander evaluation design, with the requisite emphasis on program components and relevant combinations, while optimizing the number of participating campuses.
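The design logic the abstract describes — finding the smallest set of experimental conditions that still estimates all four main effects — can be illustrated outside SAS. The sketch below is a minimal Python analogue (the study itself used SAS v9.4 PROC FACTEX, and the factor names here are illustrative stand-ins for the four interview-derived components): it builds a 2⁴⁻¹ half fraction with defining relation I = ABCD, which supports main-effect estimation in 8 rather than 16 conditions.

```python
from itertools import product

# Four two-level program components (illustrative names for the four
# components named in the abstract; levels are +1 = present, -1 = absent).
factors = ["delivery", "skill_building", "requirement", "audience"]

# Full 2^4 factorial: every present/absent combination.
full = list(product([-1, 1], repeat=4))
assert len(full) == 16

# Half fraction via the defining relation I = ABCD:
# keep only runs where the product of all four factor levels is +1.
half = [run for run in full if run[0] * run[1] * run[2] * run[3] == 1]
assert len(half) == 8

# Each main effect is balanced in the fraction: every factor sits at +1
# in exactly half of the 8 runs, so main effects remain estimable.
for j, name in enumerate(factors):
    assert sum(1 for run in half if run[j] == 1) == 4
```

In this resolution-IV fraction, main effects are aliased only with three-factor interactions (e.g., A with BCD), which is why half the runs suffice when only main effects are of interest; PROC FACTEX performs the equivalent search over candidate fractions automatically.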
Cite this article
Bush, H.M., Davidov, D., Brancato, C.J. et al. The Opportunity – VAWA 2013 Reauthorization Provides a Natural Experiment for Bystander Efficacy Evaluation. J Fam Viol 35, 563–574 (2020). https://doi.org/10.1007/s10896-020-00152-6