
Reading and Writing, Volume 32, Issue 6, pp 1575–1606

The case for scenario-based assessment of written argumentation

  • Paul Deane
  • Yi Song
  • Peter van Rijn
  • Tenaha O’Reilly
  • Mary Fowles
  • Randy Bennett
  • John Sabatini
  • Mo Zhang

Abstract

This paper presents a theoretical and empirical case for the value of scenario-based assessment (SBA) in measuring students’ written argumentation skills. First, we frame the problem as one of creating a reasonably efficient method for evaluating written argumentation skills, including among students at relatively low levels of competency. Second, we present a proposed solution in the form of an SBA and lay out the design for such an assessment. Third, we describe the results of prior research conducted within our group using this design. Fourth, we present the results of two new analyses of prior data that extend our earlier findings. These analyses address whether the test items behave in ways consistent with the learning progressions underlying the design, how items measuring reading and writing component skills relate to essay performance, how measures of transcription fluency and of proficiency in oral and academic language relate to writing skill, and whether the scenario-based design affects the fluency and vocabulary of the resulting essays. Results suggest that students can be differentiated by learning progression level, with variance in essay scores accounted for by a combination of performance on earlier tasks in the scenario and automated linguistic features measuring general literacy skills. The SBA structure, in which preliminary tasks lead up to the final written performance, appears to produce more fluent (and also more efficient) writing behavior than students exhibit when they write an essay in isolation.
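
The claim that variance in essay scores is accounted for by a combination of earlier-task performance and automated linguistic features is the kind of result typically established with a hierarchical regression. The following is a minimal sketch of that style of analysis, not the authors' actual model: the generated data and the names lead_up_tasks, ling_features, and essay_score are illustrative assumptions.

    # Hypothetical sketch of a hierarchical regression: regress essay scores
    # on earlier scenario-task performance, then add automated linguistic
    # features and compare R^2. All data and variable names are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200

    # Hypothetical predictors: two lead-up task scores and two automated
    # linguistic features (e.g., a fluency index and a vocabulary index).
    lead_up_tasks = rng.normal(size=(n, 2))
    ling_features = rng.normal(size=(n, 2))
    essay_score = (0.6 * lead_up_tasks[:, 0] + 0.4 * lead_up_tasks[:, 1]
                   + 0.3 * ling_features[:, 0]
                   + rng.normal(scale=0.8, size=n))

    def r_squared(X, y):
        """R^2 from an ordinary least-squares fit with an intercept."""
        X1 = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        return 1 - resid.var() / y.var()

    r2_tasks = r_squared(lead_up_tasks, essay_score)
    r2_full = r_squared(np.column_stack([lead_up_tasks, ling_features]),
                        essay_score)
    print(f"R^2, lead-up tasks only:            {r2_tasks:.3f}")
    print(f"R^2 gain from linguistic features:  {r2_full - r2_tasks:.3f}")

The incremental R^2 in the second step corresponds to the share of essay-score variance that the linguistic features explain beyond the preliminary scenario tasks.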

Keywords

Assessment · Argumentation · Scenario-based assessment · SBA · Writing · Summary · Summarization · Essay · Critique · Reading · IRT · Dimensionality · Learning progression · Keystroke · Burst · Process data · Automated essay scoring · AES

Copyright information

© Springer Science+Business Media B.V., part of Springer Nature 2018

Authors and Affiliations

  1. Educational Testing Service, Princeton, NJ, USA
