Journal of Behavioral Education, Volume 24, Issue 4, pp 459–469

What Works Clearinghouse Standards and Generalization of Single-Case Design Evidence

  • John H. Hitchcock
  • Thomas R. Kratochwill
  • Laura C. Chezan
Commentary

Abstract

A recent review of existing rubrics designed to help researchers evaluate the internal and external validity of single-case design (SCD) studies found that the various options yield consistent results when examining causal arguments. The authors of the review, however, noted considerable differences across the rubrics when addressing the generalization of findings. One critical finding is that the What Works Clearinghouse (WWC) review process does not capture details needed for report readers to evaluate generalization. This conclusion is reasonable if one considers only the WWC’s SCD design standards. These standards, however, are not used in isolation, and thus generalization details cannot be fully understood without also considering the review protocols and a tool called the WWC SCD review guide. Our purpose in this commentary is to clarify how the WWC review procedures gather information on generalization criteria and to describe a threshold for judging how much evidence is available. Clarifying how the system works will help the SCD research community understand the standards, which in turn might facilitate the use of future WWC reports and possibly influence both the conduct and the reporting of SCD studies.

Keywords

Single-case design · Generalization · Internal validity · External validity

References

  1. Barlow, D. H., Nock, M. K., & Hersen, M. (2009). Single case experimental designs: Strategies for studying behavior change (2nd ed.). Boston, MA: Pearson.
  2. Bowman-Perrott, L., Davis, H., Vannest, K. J., Williams, L., Greenwood, C. R., & Parker, R. (2013). Academic benefits of peer tutoring: A meta-analytic review of single-case research. School Psychology Review, 42(1), 39–59.
  3. Cohen, J. (1994). The Earth is round (p < .05). American Psychologist, 49(12), 997–1003. doi:10.1037/0003-066X.49.12.997.
  4. Dart, E. H., Collins, T. A., Klingbeil, D. A., & McKinley, L. E. (2014). Peer management interventions: A meta-analytic review of single-case research. School Psychology Review, 43, 367–384.
  5. Deegear, J., & Lawson, D. M. (2003). The utility of empirically supported treatments. Professional Psychology: Research and Practice, 34(3), 271–277. doi:10.1037/0735-7028.34.3.271.
  6. Hedges, L. V. (2013). Recommendations for practice: Justifying claims of generalizability. Educational Psychology Review, 25(3), 331–337. doi:10.1007/s10648-013-9239-x.
  7. Hitchcock, J. H., Horner, R. H., Kratochwill, T. R., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2014). The What Works Clearinghouse single-case design pilot standards: Who will guard the guards? Remedial and Special Education. Advance online publication. doi:10.1177/0741932513518979. (Contributors are listed in alphabetical order.)
  8. Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use of single subject research to identify evidence-based practice in special education. Exceptional Children, 71(2), 165–179.
  9. Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings (2nd ed.). New York, NY: Oxford University Press.
  10. Kratochwill, T. R. (2002). Evidence-based interventions in school psychology: Thoughts on thoughtful commentary. School Psychology Quarterly, 17, 518–532. doi:10.1521/scpq.17.4.518.20861.
  11. Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D., & Shadish, W. R. (2010). Single case designs technical documentation. Retrieved from http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf.
  12. Kratochwill, T. R., Hitchcock, J. H., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial and Special Education, 34, 26–38. doi:10.1177/0741932512452794.
  13. Kratochwill, T. R., & Levin, J. R. (Eds.). (2014). Single-case intervention research: Methodological and statistical advances. Washington, DC: American Psychological Association.
  14. Kratochwill, T. R., & Stoiber, K. C. (2000). Diversifying theory and science: Expanding boundaries of empirically supported interventions in schools. Journal of School Psychology, 38, 349–358. doi:10.1016/S0022-4405(00)00039-X.
  15. Kratochwill, T. R., & Stoiber, K. C. (2002). Evidence-based interventions in school psychology: Conceptual foundations of the Procedural and Coding Manual of Division 16 and the Society for the Study of School Psychology Task Force. School Psychology Quarterly, 17, 341–389.
  16. Maggin, D. M., Briesch, A. M., Chafouleas, S. M., Ferguson, T. D., & Clark, C. (2013). A comparison of rubrics for identifying empirically supported practices with single-case research. Journal of Behavioral Education, 23, 287–311. doi:10.1007/s10864-013-9187-z.
  17. Schneider, B., Carnoy, M., Kilpatrick, J., Schmidt, W. H., & Shavelson, R. J. (2007). Estimating causal effects using experimental and nonexperimental designs (Report from the Governing Board of the American Educational Research Association Grants Program). Washington, DC: American Educational Research Association.
  18. Shadish, W. R. (1995). The logic of generalization: Five principles common to experiments and ethnographies. American Journal of Community Psychology, 23, 419–428. doi:10.1007/BF02506951.
  19. Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.
  20. Smith, J. D. (2012). Single-case experimental designs: A systematic review of published research and current standards. Psychological Methods, 17(4), 510–550. doi:10.1037/a0029312.
  21. Wendt, O., & Miller, B. (2012). Quality appraisal of single-subject experimental designs: An overview and comparison of different appraisal tools. Education and Treatment of Children, 35(3), 235–265.
  22. What Works Clearinghouse. (2013). Procedures and standards handbook (Version 3.0). Retrieved from http://ies.ed.gov/ncee/wwc/documentsum.aspx?sid=19.
  23. What Works Clearinghouse. (2014). WWC intervention report: Repeated reading. Retrieved from http://ies.ed.gov/ncee/wwc/pdf/intervention_reports/wwc_repeatedreading_051314.pdf.

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • John H. Hitchcock, Center for Evaluation and Education Policy, Indiana University, Bloomington, USA
  • Thomas R. Kratochwill, University of Wisconsin-Madison, Madison, USA
  • Laura C. Chezan, Old Dominion University, Norfolk, USA
