How to Be RAD: Repeated Acquisition Design Features that Enhance Internal and External Validity

  • Original Research
  • Published in Perspectives on Behavior Science

A Correction to this article was published on 19 October 2021

Abstract

The Repeated Acquisition Design (RAD) is a type of single-case research design (SCRD) that involves repeated and rapid measurement of irreversible discrete skills or behaviors through pre- and postintervention probes across different sets of stimuli. Researchers interested in the study of learning in animals and humans have used the RAD because of its sensitivity in detecting immediate changes in rate or accuracy. Despite its strengths, critics of the RAD have cautioned against its use due to reasonable threats to internal validity, such as pretest effects, history, and maturation. Furthermore, many methodologists and researchers have neglected the RAD in their SCRD standards (e.g., What Works Clearinghouse [WWC], 2020; Horner et al., 2005). Without guidance on addressing these threats to internal validity, researchers may avoid the design altogether or continue to use a weak version of the RAD. Therefore, we propose a set of 15 quality RAD indicators, comprising foundational elements that should be present in all RAD studies and additional features that enhance causal inference and external validity. We review contemporary RAD use and describe how the additional features strengthen the rigor of RAD studies. We end the article with suggested guidelines for interpreting effects and the strength of the evidence generated by RAD studies. We invite researchers to use these initial guidelines as a jumping-off point for a more RAD future.
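
To make the probe logic concrete, the following is a minimal sketch of how pre/postintervention probe data from a RAD study might be scored, paired with a simple sign-flip randomization (permutation) test of the kind discussed in the SCRD literature the authors cite (e.g., Onghena & Edgington, 1994; Weaver & Lloyd, 2019). This is not the authors' procedure: the probe scores, set names, and test are hypothetical, assuming accuracy is recorded as a proportion correct per stimulus set.

```python
# Minimal sketch: scoring hypothetical RAD probe data. For each stimulus set,
# compute the post-minus-pre accuracy gain, then run a sign-flip permutation
# test on the pre/post labels. All data and names are invented for
# illustration only.
import random

# Hypothetical probe accuracy (proportion correct) for four stimulus sets.
probes = {
    "set_1": {"pre": 0.10, "post": 0.80},
    "set_2": {"pre": 0.20, "post": 0.90},
    "set_3": {"pre": 0.00, "post": 0.70},
    "set_4": {"pre": 0.10, "post": 0.60},
}

def mean_gain(data):
    """Average post-minus-pre accuracy gain across stimulus sets."""
    return sum(d["post"] - d["pre"] for d in data.values()) / len(data)

observed = mean_gain(probes)

# Under the null hypothesis of no intervention effect, the pre and post
# labels within each set are exchangeable: flip them at random many times
# and count how often the permuted mean gain matches or exceeds the
# observed one.
random.seed(1)
n_perms = 10_000
extreme = 0
for _ in range(n_perms):
    flipped = {
        name: ({"pre": d["post"], "post": d["pre"]}
               if random.random() < 0.5 else d)
        for name, d in probes.items()
    }
    if mean_gain(flipped) >= observed:
        extreme += 1

print(f"Mean gain: {observed:.2f}, permutation p = {extreme / n_perms:.4f}")
```

One design point this toy example surfaces: with only four stimulus sets, the smallest attainable p-value from sign flips is (1/2)^4 = .0625, which illustrates why the number of probe sets matters for the power of randomization tests applied to SCRD data.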



References

  • Boren, J. J. (1963). The repeated acquisition of new behavioral chains. American Psychologist, 18(7), 421–421.

  • Boren, J. J. (1969). Some variables affecting the superstitious chaining of responses. Journal of the Experimental Analysis of Behavior, 12(6), 959–969.

  • Bouck, E. C., Flanagan, S., Joshi, G. S., Sheikh, W., & Schleppenbach, D. (2011). Speaking math: A voice input, speech output calculator for students with visual impairments. Journal of Special Education Technology, 26(4), 1–14. https://doi.org/10.1177/016264341102600401.

  • Brown, C. H., Wyman, P. A., Guo, J., & Peña, J. (2006). Dynamic wait-listed designs for randomized trials: New designs for prevention of youth suicide. Clinical Trials, 3(3), 259–271. https://doi.org/10.1191/1740774506cn152oa.

  • Butler, C., Brown, J. A., & Woods, J. J. (2014). Teaching at-risk toddlers new vocabulary using interactive digital storybooks. Contemporary Issues in Communication Science & Disorders, 41, 155–168. https://doi.org/1092-5171/14/4102-0155.

  • Cohn, J., Cox, C., & Cory-Slechta, D. A. (1993). The effects of lead exposure on learning in a multiple repeated acquisition and performance schedule. Neurotoxicology, 14(2–3), 329–346.

  • Dennis, L. R., & Whalon, K. J. (2020). Effects of teacher- versus application-delivered instruction on the expressive vocabulary of at-risk preschool children. Remedial & Special Education, 42(4), 195–206. https://doi.org/10.1177/0741932519900991.

  • Gagliardi, A. R. (2011). Tailoring interventions: Examining the evidence and identifying gaps. Journal of Continuing Education in the Health Professions, 31(4), 276–282.

  • Gersten, R., Fuchs, L. S., Compton, D., Coyne, M., Greenwood, C., & Innocenti, M. S. (2005). Quality indicators for group experimental and quasi-experimental research in special education. Exceptional Children, 71(2), 149–164. https://doi.org/10.1177/001440290507100202.

  • Goldstein, H., & Kelley, E. S. (2016). Story friends: An early literacy intervention for improving oral language. Paul H. Brookes.

  • Greenwood, C. R., Carta, J. J., Kelley, E. S., Guerrero, G., Kong, N. Y., Atwater, J., & Goldstein, H. (2016). Systematic replication of the effects of a supplementary, technology-assisted, storybook intervention for preschool children with weak vocabulary and comprehension skills. Elementary School Journal, 116(4), 574–599. https://doi.org/10.1086/686223.

  • Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use of single-subject research to identify evidence-based practice in special education. Exceptional Children, 71(2), 165–179. https://doi.org/10.1177/001440290507100203.

  • Hua, Y., Hinzman, M., & Yuan, C. (2020). Comparing the effects of two reading interventions using a randomized alternating treatment design. Exceptional Children, 86(4), 355–373. https://doi.org/10.1177/0014402919881357.

  • Johnson, A. H., & Cook, B. G. (2019). Preregistration in single-case design research. Exceptional Children, 86(1), 95–112. https://doi.org/10.1177/0014402919868529.

  • Kelley, E. S., Goldstein, H., Spencer, T. D., & Sherman, A. (2015). Effects of automated Tier 2 storybook intervention on vocabulary and comprehension learning in preschool children with limited oral language skills. Early Childhood Research Quarterly, 31, 47–61. https://doi.org/10.1016/j.ecresq.2014.12.004.

  • Kennedy, C. H. (2005). Single-case designs for educational research. Pearson.

  • Kratochwill, T. R., Hitchcock, J. H., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial & Special Education, 34(1), 26–38. https://doi.org/10.1177/0741932512452794.

  • Kratochwill, T. R., Levin, J. R., Horner, R. H., & Swoboda, C. M. (2014). Visual analysis of single-case intervention research: Conceptual and methodological issues. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case intervention research: Methodological and statistical advances (pp. 91–125). School psychology series. American Psychological Association. https://doi.org/10.1037/14376-004

  • Kratochwill, T. R., & Levin, J. R. (2010). Enhancing the scientific credibility of single-case intervention research: Randomization to the rescue. Psychological Methods, 15(2), 124–144. https://doi.org/10.1037/a0017736.

  • Ledford, J. R., Barton, E. E., Severini, K. E., Zimmerman, K. N., & Pokorski, E. A. (2019). Visual display of graphic data in single case design studies: Systematic review and expert preference analysis. Education & Training in Autism & Developmental Disabilities, 54(4), 315–327.

  • Ledford, J. R., & Gast, D. L. (2018). Combination and other designs. In J. R. Ledford & D. L. Gast (Eds.), Single case research methodology: Applications in special education and behavioral sciences (3rd ed., pp. 335–364). Routledge.

  • Levin, J. R., Ferron, J. M., & Gafurov, B. S. (2020). Investigation of single-case multiple-baseline randomization tests of trend and variability. Educational Psychology Review, 33, 713–737. https://doi.org/10.1007/s10648-020-09549-7.

  • Lin, F.-Y., & Kubina, R. M. (2015). Imitation fluency in a student with autism spectrum disorder: An experimental case study. European Journal of Behavior Analysis, 16(1), 2–20. https://doi.org/10.1080/15021149.2015.1065637.

  • Lobo, M. A., Moeyaert, M., Baraldi Cunha, A., & Babik, I. (2017). Single-case design, analysis, and quality assessment for intervention research. Journal of Neurologic Physical Therapy: JNPT, 41(3), 187–197. https://doi.org/10.1097/NPT.0000000000000187.

  • Odom, S. L., Hall, L. J., & Suhrheinrich, J. (2020). Implementation science, behavior analysis, and supporting evidence-based practices for individuals with autism. European Journal of Behavior Analysis, 21(1), 55–73. https://doi.org/10.1080/15021149.2019.1641952.

  • Onghena, P., & Edgington, E. S. (1994). Randomization tests for restricted alternating treatments designs. Behaviour Research & Therapy, 32, 783–786. https://doi.org/10.1016/0005-7967(94)90036-1.

  • Parker, R. I., & Vannest, K. J. (2014). Non-overlap analysis for single-case research. In T. R. Kratochwill & J. R. Levin (Eds.), Single-case intervention research: Methodological and statistical advances (pp. 127–151). American Psychological Association. https://doi.org/10.1037/14376-005.

  • Peters-Sanders, L. A., Kelley, E. S., Biel, C. H., Madsen, K., Soto, X., Seven, Y., Hull, K., & Goldstein, H. (2020). Moving forward four words at a time: Effects of a supplemental preschool vocabulary intervention. Language, Speech, & Hearing Services in Schools, 51, 165–175. https://doi.org/10.1044/2019_LSHSS-19-00029.

  • Porritt, M., Wagner, K. V., & Poling, A. (2009). Effects of response spacing on acquisition and retention of conditional discriminations. Journal of Applied Behavior Analysis, 42(2), 295–307. https://doi.org/10.1901/jaba.2009.42-295.

  • Powell, S. R., & Nelson, G. (2017). An investigation of the mathematics-vocabulary knowledge of first-grade students. Elementary School Journal, 117(4), 664–686. https://doi.org/10.1086/691604.

  • Pustejovsky, J. E., Swan, D. M., & English, K. W. (2019). An examination of measurement procedures and characteristics of baseline outcome data in single-case research. Behavior Modification. Advance online publication. https://doi.org/10.1177/0145445519864264.

  • Rubenstein, R. N., & Thompson, D. R. (2002). Understanding and supporting children’s mathematical vocabulary development. Teaching Children Mathematics, 9(2), 107–112. https://doi.org/10.5951/TCM.9.2.0107.

  • Shadish, W. R., & Sullivan, K. J. (2011). Characteristics of single-case designs used to assess intervention effects in 2008. Behavior Research Methods, 43(4), 971–980. https://doi.org/10.3758/s13428-011-0111-y.

  • Shepley, C., Zimmerman, K. N., & Ayres, K. M. (2020). Estimating the impact of design standards on the rigor of a subset of single-case research. Journal of Disability Policy Studies. https://doi.org/10.1177/1044207320934048.

  • Smith, J. D. (2012). Single-case experimental designs: A systematic review of published research and current standards. Psychological Methods, 17(4), 510–550. https://doi.org/10.1037/a0029312.

  • Spencer, E. J., Goldstein, H., Sherman, A., Noe, S., Tabbah, R., Ziolkowski, R., & Schneider, N. (2012). Effects of an automated vocabulary and comprehension intervention: An early efficacy study. Journal of Early Intervention, 45, 195–221. https://doi.org/10.1177/1053815112471990.

  • Sullivan, M., Konrad, M., Joseph, L. M., & Luu, K. C. T. (2013). A comparison of two sight word reading fluency drill formats. Preventing School Failure, 57(2), 102–110. https://doi.org/10.1080/1045988X.2012.674575.

  • Tanious, R., & Onghena, P. (2020). A systematic review of applied single-case research published between 2016 and 2018: Study designs, randomization, data aspects, and data analysis. Behavior Research Methods. https://doi.org/10.3758/s13428-020-01502-4.

  • Thompson, D. M., Mastropaolo, J., Winsauer, P. J., & Moerschbaecher, J. M. (1986). Repeated acquisition and delayed performance as a baseline to assess drug effects on retention in monkeys. Pharmacology, Biochemistry, & Behavior, 25, 201–207.

  • Van den Noortgate, W., & Onghena, P. (2007). The aggregation of single-case results using hierarchical linear models. Behavior Analyst Today, 8(2), 196–209. https://doi.org/10.1037/h0100613.

  • Weaver, E. S., & Lloyd, B. P. (2019). Randomization tests for single case designs with rapidly alternating conditions: An analysis of p-values from published experiments. Perspectives on Behavior Science, 42(3), 617–645. https://doi.org/10.1007/s40614-018-0165-6.

  • Whalon, K., Hanline, M. F., & Davis, J. (2016). Parent implementation of RECALL: A systematic case study. Education & Training in Autism & Developmental Disabilities, 51(2), 211–220. http://www.jstor.org/stable/24827548.

  • What Works Clearinghouse (WWC). (2020). What Works Clearinghouse standards handbook, Version 4.1. U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. https://ies.ed.gov/ncee/wwc/handbooks.

  • Wolfe, K., Barton, E. E., & Meadan, H. (2019). Systematic protocols for the visual analysis of single-case research data. Behavior Analysis in Practice, 12, 491–502. https://doi.org/10.1007/s40617-019-00336-7.

  • Zimmerman, K. N., Ledford, J. R., Severini, K. E., Pustejovsky, J. E., Barton, E. E., & Lloyd, B. P. (2018). Single-case synthesis tools I: Comparing tools to evaluate SCD quality and rigor. Research in Developmental Disabilities, 79, 19–32. https://doi.org/10.1016/j.ridd.2018.02.003.

Author information

Corresponding author

Correspondence to Megan S. Kirby.

Ethics declarations

Conflict of Interest

The authors report no conflicts of interest.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article was updated to correct the spelling of Micheal Sandbank in the quote attribution on the second page of the article.


About this article

Cite this article

Kirby, M.S., Spencer, T.D. & Ferron, J. How to Be RAD: Repeated Acquisition Design Features that Enhance Internal and External Validity. Perspect Behav Sci 44, 389–416 (2021). https://doi.org/10.1007/s40614-021-00301-2
