Investigating the Diagnostic Consistency and Incremental Validity Evidence of Curriculum-based Measurements of Oral Reading Rate and Comprehension
Rate and comprehension are two components of broad reading ability, which also encompasses phonemic awareness, decoding, and vocabulary. The purpose of this study was to examine the unique contribution of three curriculum-based measurement (CBM) comprehension assessments relative to a CBM oral reading rate assessment through diagnostic consistency and incremental validity analyses. An extant sample of fall screening data from a national assessment system was used for this investigation. The CBM-comprehension assessments measured oral recall, synthesis of main ideas, and free-response question answering. Scores from the CBM-comprehension measures showed weak criterion-related validity with a measure of broad reading. In addition, diagnostic consistency analyses revealed poor overlap between CBM-comprehension and CBM-oral reading rate score classifications based on at-risk benchmarks. Incremental validity evidence replicated previous findings, demonstrating that the CBM-comprehension measures explained unique variation in broad reading scores even after controlling for rate and accuracy. Implications for the use of CBM-R and CBM-comprehension measures in screening are addressed.
Keywords: Reading comprehension · CBM · Screening · Assessment · Assessment to intervention
Preparation of this manuscript was supported in part by grants from the Office of Special Education Programs, US Department of Education (H327S150004; R305A120086) and the National Institute of Mental Health, US Department of Health and Human Services (5T32MH010026).
Compliance with Ethical Standards
All procedures performed in this study involving human participants were in accordance with the ethical standards of the institutional research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Informed consent was obtained from all individual participants included in the study.
Conflict of Interest
Calvary Diggs has no conflict of interest. Dr. Theodore Christ has received royalties from Fastbridge Learning.