
Reading and Writing, Volume 31, Issue 4, pp 835–864

Content and alignment of state writing standards and assessments as predictors of student writing achievement: an analysis of 2007 National Assessment of Educational Progress data

  • Gary A. Troia
  • Natalie G. Olinghouse
  • Mingcai Zhang
  • Joshua Wilson
  • Kelly A. Stewart
  • Ya Mo
  • Lisa Hawkins

Abstract

We examined the degree to which content of states’ writing standards and assessments (using measures of content range, frequency, balance, and cognitive complexity) and their alignment were related to student writing achievement on the 2007 National Assessment of Educational Progress (NAEP), while controlling for student, school, and state characteristics. We found student demographic characteristics had the largest effect on between-state differences in writing performance, followed by state policy-related variables, then state and school covariates. States with writing tests that exhibited greater alignment with the NAEP writing assessment demonstrated significantly higher writing scores. We discuss plausible implications of these findings.

Keywords

State standards · State assessments · Alignment · NAEP

Notes

Acknowledgements

This research was supported in part by Grant #R305A100040 from the U.S. Department of Education, Institute of Education Sciences, to Michigan State University. Statements do not necessarily reflect the positions or policies of this agency, and no official endorsement by it should be inferred.


Copyright information

© Springer Science+Business Media B.V., part of Springer Nature 2017

Authors and Affiliations

  1. Department of Counseling, Educational Psychology and Special Education, Michigan State University, East Lansing, USA
  2. Department of Educational Psychology, University of Connecticut, Storrs, USA
  3. College of Education and Human Development, University of Delaware, Newark, USA
  4. Department of Research, Evaluation, Assessment and Accountability, Minneapolis Public Schools, Minneapolis, USA
  5. National Institute of Statistical Sciences, Washington, USA
  6. Department of Elementary Education, Ball State University, Muncie, USA
