
Using NAEP to Analyze Eighth-Grade Students’ Ability to Reason Algebraically

  • Peter Kloosterman
  • Crystal Walcott
  • Nathaniel J. S. Brown
  • Doris Mohr
  • Arnulfo Pérez
  • Shenghi Dai
  • Michael Roach
  • Linda Dager Hall
  • Hsueh-Chen Huang
Part of the Research in Mathematics Education book series (RME)

Abstract

After describing the content and scoring of NAEP mathematics assessments, this chapter focuses on performance on individual NAEP items as a means of documenting the algebraic reasoning skills of eighth-grade students. In general, recent trends in performance on algebraic reasoning items mirror those on the grade 8 mathematics NAEP as a whole: performance was stable on the majority of items and increased modestly on the remainder. Many eighth graders can understand and explain relationships between two variables in different settings and formats. Many are also able to plot points, but few are able to use slope and intercept, or to determine which points to plot, in order to connect an equation and a graph. When it comes to solving word problems involving linear relationships, many students appear to rely on intuitive strategies such as guess and check, although some are able to use formal algebraic methods. Logistical and methodological issues that arise when using NAEP data are also addressed, including adjusting analyses to account for the sampling method used by NAEP and limitations in statistical power when working with subsets of the NAEP data. The chapter closes with examples of the statistical techniques we are using to determine whether clusters of items developed through analysis of item content are valid from a measurement perspective.
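The "sampling method used by NAEP" mentioned above refers to NAEP's complex, clustered sample design, for which standard errors are usually estimated with jackknife replicate weights rather than simple random sampling formulas. The sketch below is a minimal illustration of that general approach in Python; it is not the authors' analysis code, and the function name, variable names, and the paired-jackknife multiplier of one are assumptions.

    import numpy as np

    def jackknife_estimate(values, full_weights, replicate_weights):
        """Weighted mean (e.g., percent correct on an item) and its
        jackknife standard error under a paired-jackknife design.

        values            : per-student item scores (e.g., 0/1 correct)
        full_weights      : final student sampling weights
        replicate_weights : 2-D array, one column per replicate weight
        """
        values = np.asarray(values, dtype=float)
        full_weights = np.asarray(full_weights, dtype=float)
        replicate_weights = np.asarray(replicate_weights, dtype=float)

        # Point estimate uses the full-sample weights
        full_estimate = np.average(values, weights=full_weights)

        # Recompute the same statistic once per replicate weight column
        replicate_estimates = np.array([
            np.average(values, weights=replicate_weights[:, r])
            for r in range(replicate_weights.shape[1])
        ])

        # Jackknife variance: sum of squared deviations of the replicate
        # estimates from the full-sample estimate
        variance = np.sum((replicate_estimates - full_estimate) ** 2)
        return full_estimate, np.sqrt(variance)

In practice the replicate weights come directly from the NAEP data files, and the same recomputation is applied to whatever statistic is being reported; with small subsets of students the resulting standard errors grow quickly, which is the statistical-power limitation the abstract notes.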

Keywords

Algebra · Common Core State Standards · Item-level analysis · Item response theory · NAEP · Reasoning

Notes

Acknowledgement

This chapter is based upon work supported by the National Science Foundation under the REESE Program, grant number 1008438. Opinions, findings, conclusions, and recommendations expressed in the chapter are those of the authors and do not necessarily reflect the views of the National Science Foundation.


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Peter Kloosterman (1), corresponding author
  • Crystal Walcott (2)
  • Nathaniel J. S. Brown (3)
  • Doris Mohr (4)
  • Arnulfo Pérez (5)
  • Shenghi Dai (1)
  • Michael Roach (1)
  • Linda Dager Hall (6)
  • Hsueh-Chen Huang (1)

  1. School of Education, Indiana University, Bloomington, USA
  2. Indiana University Purdue University Columbus, Columbus, USA
  3. Boston College, Chestnut Hill, USA
  4. University of Southern Indiana, Evansville, USA
  5. The Ohio State University, Columbus, USA
  6. St. Patrick's Episcopal Day School, Washington, DC, USA
