Abstract
Researchers in applied behavior analysis and related fields, such as special education and school psychology, use single-case designs to evaluate causal relations between variables and the effectiveness of interventions. Visual analysis is the primary method by which single-case research data are analyzed; however, research suggests that it may be unreliable. In the absence of specific guidelines operationalizing the process, visual analysis is likely to be influenced by idiosyncratic factors and individual variability. To address this gap, we developed systematic, responsive protocols for the visual analysis of A-B-A-B and multiple-baseline designs. The protocols guide the analyst through the process of visual analysis and synthesize the analyst's responses into a numeric score. In this paper, we describe the content of the protocols, illustrate their application to two graphs, and report a small-scale evaluation study. We also discuss considerations and future directions for the development and evaluation of the protocols.
Ethics declarations
Conflict of Interest
Katie Wolfe, Erin E. Barton, and Hedda Meadan declare that they have no conflict of interest.
Ethical Approval
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards. The University of South Carolina Institutional Review Board approved the procedures in this study.
Informed Consent
Informed consent was obtained from all individual participants included in the study.
Cite this article
Wolfe, K., Barton, E.E. & Meadan, H. Systematic Protocols for the Visual Analysis of Single-Case Research Data. Behav Analysis Practice 12, 491–502 (2019). https://doi.org/10.1007/s40617-019-00336-7