Reading and Writing, Volume 25, Issue 4, pp 887–915

Within-year oral reading fluency with CBM: a comparison of models

  • Joseph F. T. Nese
  • Gina Biancarosa
  • Daniel Anderson
  • Cheng-Fei Lai
  • Julie Alonzo
  • Gerald Tindal

Abstract

This study examined the type of growth model that best fit within-year growth in oral reading fluency and between-student differences in growth. Participants were 2,465 students in grades 3–5. Hierarchical linear modeling (HLM) analyses modeled curriculum-based measurement (CBM) oral reading fluency benchmark measures in fall, winter, and spring with grade level and student characteristics (including special education and Limited English Proficiency status) as covariates. Results indicated that a discontinuous growth model fit the data better than a linear growth model, with greater growth in the fall than in the spring. Oral reading fluency growth rates also differed by grade and student characteristics. Implications for school practice and research are discussed.
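The core comparison in the abstract, a single year-long linear slope versus a discontinuous (piecewise) model with separate fall and spring slopes, can be illustrated with a small sketch. This is not the authors' analysis: it uses simulated three-occasion benchmark data and a random-intercept mixed model from statsmodels, without the grade-level and student-characteristic covariates of the article's full HLM. All names and parameter values below are illustrative assumptions.

```python
# Illustrative sketch (simulated data, not the study's dataset):
# linear vs. discontinuous within-year growth in ORF benchmark scores.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_students = 200

rows = []
for sid in range(n_students):
    fall_score = rng.normal(110, 25)   # fall ORF (words correct per minute)
    fall_gain = rng.normal(15, 4)      # fall-to-winter growth (larger)
    spring_gain = rng.normal(8, 4)     # winter-to-spring growth (smaller)
    scores = [fall_score,
              fall_score + fall_gain,
              fall_score + fall_gain + spring_gain]
    for t, score in enumerate(scores):  # t = 0 fall, 1 winter, 2 spring
        rows.append({
            "student": sid,
            "time": t,                      # linear coding: one slope
            "fall_slope": min(t, 1),        # piecewise coding: 0, 1, 1
            "spring_slope": max(t - 1, 0),  # piecewise coding: 0, 0, 1
            "orf": score + rng.normal(0, 5),
        })
df = pd.DataFrame(rows)

# Linear growth model: a single slope across the school year.
linear = smf.mixedlm("orf ~ time", df, groups=df["student"]).fit()
# Discontinuous growth model: slopes allowed to differ by semester.
piecewise = smf.mixedlm("orf ~ fall_slope + spring_slope", df,
                        groups=df["student"]).fit()

print(linear.fe_params)
print(piecewise.fe_params)
```

With data generated this way, the piecewise fit recovers a steeper fall slope than spring slope, mirroring the pattern the study reports, while the linear model averages the two into one rate.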

Keywords

Oral reading fluency · Curriculum-based measurement · Growth modeling


Copyright information

© Springer Science+Business Media B.V. 2011

Authors and Affiliations

  • Joseph F. T. Nese¹
  • Gina Biancarosa²
  • Daniel Anderson¹
  • Cheng-Fei Lai¹
  • Julie Alonzo¹
  • Gerald Tindal¹

  1. Behavioral Research and Teaching, University of Oregon, 175 Education, Eugene, USA
  2. University of Oregon, 102 Education, Eugene, USA
