The case for scenario-based assessment of written argumentation
This paper presents a theoretical and empirical case for the value of scenario-based assessment (SBA) in measuring students’ written argumentation skills. First, we frame the problem as one of creating a reasonably efficient method of evaluating written argumentation skills, including for students at relatively low levels of competency. Second, we present a proposed solution in the form of an SBA and lay out the design for such an assessment. Third, we describe the results of prior research conducted within our group using this design. Fourth, we present the results of two new analyses of prior data that extend our previous findings. These analyses examine whether the test items behave in ways consistent with the learning progressions underlying the design, how items measuring reading and writing component skills relate to essay performance, how measures of transcription fluency and of oral and academic language proficiency relate to writing skill, and whether the scenario-based design affects the fluency and vocabulary of the resulting essays. Results suggest that students can be differentiated by learning progression level, with variance in writing scores accounted for by a combination of performance on earlier tasks in the scenario and automated linguistic features measuring general literacy skills. The SBA structure, in which preliminary tasks lead up to the final written performance, appears to produce more fluent (and also more efficient) writing behavior than when students write an essay in isolation.
Keywords: Assessment · Argumentation · Scenario-based assessment · SBA · Writing · Summary · Summarization · Essay · Critique · Reading · IRT · Dimensionality · Learning progression · Keystroke · Burst · Process data · Automated essay scoring · AES
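As a rough illustration of the variance decomposition described in the abstract, the sketch below fits a hierarchical linear regression that first predicts essay scores from earlier scenario-task performance and then adds automated linguistic features. This is a minimal sketch under assumptions, not the authors’ analysis pipeline: the data file and the column names (summary_score, critique_score, words_per_burst, vocab_rarity) are hypothetical placeholders chosen for illustration.

```python
# Hypothetical sketch of incremental variance explained in essay scores.
# Step 1: earlier scenario tasks alone; Step 2: add automated linguistic features.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("sba_scores.csv")  # hypothetical student-level score file

y = df["essay_score"]                                 # human-scored essay
tasks = df[["summary_score", "critique_score"]]       # earlier scenario tasks (assumed names)
linguistic = df[["words_per_burst", "vocab_rarity"]]  # automated features (assumed names)

# R^2 from the lead-in scenario tasks alone
r2_tasks = LinearRegression().fit(tasks, y).score(tasks, y)

# R^2 after adding the automated linguistic features
X_full = pd.concat([tasks, linguistic], axis=1)
r2_full = LinearRegression().fit(X_full, y).score(X_full, y)

print(f"R^2, scenario tasks only:       {r2_tasks:.3f}")
print(f"R^2, plus linguistic features:  {r2_full:.3f}")
print(f"Incremental variance explained: {r2_full - r2_tasks:.3f}")
```

The hierarchical ordering (scenario tasks first, linguistic features second) mirrors the abstract’s claim that essay-score variance is accounted for by a combination of the two sources; the specific regression model is only one conventional way to estimate that combination.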