
Design of experiments with sequential randomizations on multiple timescales: the hybrid experimental design

Behavior Research Methods

Abstract

Psychological interventions, especially those leveraging mobile and wireless technologies, often include multiple components that are delivered and adapted on multiple timescales (e.g., coaching sessions adapted monthly based on clinical progress, combined with motivational messages from a mobile device adapted daily based on the person’s daily emotional state). The hybrid experimental design (HED) is a new experimental approach that enables researchers to answer scientific questions about the construction of psychological interventions in which components are delivered and adapted on different timescales. These designs involve sequential randomizations of study participants to intervention components, each at an appropriate timescale (e.g., monthly randomization to different intensities of coaching sessions and daily randomization to different forms of motivational messages). The goal of the current manuscript is twofold. The first is to highlight the flexibility of the HED by conceptualizing this experimental approach as a special form of a factorial design in which different factors are introduced at multiple timescales. We also discuss how the structure of the HED can vary depending on the scientific question(s) motivating the study. The second goal is to explain how data from various types of HEDs can be analyzed to answer a variety of scientific questions about the development of multicomponent psychological interventions. For illustration, we use a completed HED to inform the development of a technology-based weight loss intervention that integrates components that are delivered and adapted on multiple timescales.
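As a loose illustration of the two timescales described in the abstract, the sketch below simulates the assignment structure of a hybrid experimental design in base R. It is not drawn from the article; the number of participants, the monthly coaching-intensity factor, and the daily message-form factor are all hypothetical placeholders chosen only to show how a slower and a faster randomization can be combined in one person-day data set.

    # Hypothetical sketch of an HED assignment structure (not from the article).
    set.seed(123)

    n_persons <- 4    # participants
    n_months  <- 3    # slower timescale (e.g., coaching intensity re-randomized monthly)
    n_days    <- 30   # faster timescale (e.g., message form re-randomized daily)

    # One row per person-day.
    hed <- expand.grid(person = seq_len(n_persons),
                       month  = seq_len(n_months),
                       day    = seq_len(n_days))

    # Monthly randomization: each person-month gets a coaching intensity.
    monthly <- expand.grid(person = seq_len(n_persons),
                           month  = seq_len(n_months))
    monthly$coaching <- sample(c("low", "high"), nrow(monthly), replace = TRUE)

    # Daily randomization: each person-day gets a motivational message form.
    hed$message <- sample(c("generic", "tailored"), nrow(hed), replace = TRUE)

    # Merge the two timescales into one long data set.
    hed <- merge(hed, monthly, by = c("person", "month"))
    head(hed[order(hed$person, hed$month, hed$day), ])

In an actual HED, the randomization probabilities and factor levels would be chosen to match the scientific questions about each timescale, and outcomes observed at each timescale would be appended to this long-format structure for analysis.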



Funding

This work was funded by the National Institutes of Health, grants U01 CA229437, P50 DA054039, R01 DA039901, and R01 DK108678.

Author information

Corresponding author

Correspondence to Inbal Nahum-Shani.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Nahum-Shani, I., Dziak, J.J., Venera, H. et al. Design of experiments with sequential randomizations on multiple timescales: the hybrid experimental design. Behav Res 56, 1770–1792 (2024). https://doi.org/10.3758/s13428-023-02119-z


  • DOI: https://doi.org/10.3758/s13428-023-02119-z
