
Associated effects of automated essay evaluation software on growth in writing quality for students with and without disabilities

Published in: Reading and Writing

Abstract

The present study examined growth in writing quality associated with feedback provided by an automated essay evaluation system called PEG Writing. Equal numbers of students with disabilities (SWD) and typically developing (TD) students, matched on prior writing achievement, were sampled (n = 1196 total). Data from a subsample of students (n = 655) were used to investigate evidence of transfer to improved first-draft performance on a follow-up writing prompt. Three-level hierarchical linear modeling was used. Findings indicated that SWD produced first drafts of lesser quality than TD students but grew at a faster rate, closing the gap in writing quality after five revisions. However, these effects were moderated by school quality and the availability of internet-connected devices in schools. There was no evidence of transfer for either group of students. Results document a positive association between the use of PEG Writing and growth in writing quality for SWD, and underscore the importance of sufficient technology resources for maximizing this growth.
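For readers unfamiliar with the method, a three-level growth model of this kind (repeated drafts nested within students nested within schools) can be sketched in generic HLM notation. This is an illustrative specification only, not the study's exact model; the predictor names (Draft, SWD) and the choice of random effects are assumptions for the sketch:

```latex
% Level 1 (within student): quality of draft t for student i in school j
Y_{tij} = \pi_{0ij} + \pi_{1ij}\,\mathrm{Draft}_{tij} + e_{tij}

% Level 2 (between students): disability status moderates
% initial quality (intercept) and growth across drafts (slope)
\pi_{0ij} = \beta_{00j} + \beta_{01j}\,\mathrm{SWD}_{ij} + r_{0ij}
\pi_{1ij} = \beta_{10j} + \beta_{11j}\,\mathrm{SWD}_{ij} + r_{1ij}

% Level 3 (between schools): random school deviations around grand means
\beta_{00j} = \gamma_{000} + u_{00j}
\beta_{10j} = \gamma_{100} + u_{10j}
```

Under a specification like this, a positive \(\beta_{11j}\) (faster growth for SWD) combined with a negative \(\beta_{01j}\) (lower initial quality) would produce the gap-closing pattern the abstract reports; school-level moderators such as technology availability would enter as predictors of the level-3 equations.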



Acknowledgments

This research was supported in part by a Delegated Authority contract from Measurement Incorporated® to the University of Delaware (EDUC432914150001). The opinions expressed in this paper are those of the author and do not necessarily reflect the positions or policies of this agency, and no official endorsement by it should be inferred.

Author information


Corresponding author

Correspondence to Joshua Wilson.

About this article


Cite this article

Wilson, J. Associated effects of automated essay evaluation software on growth in writing quality for students with and without disabilities. Read Writ 30, 691–718 (2017). https://doi.org/10.1007/s11145-016-9695-z
