Building a game-enhanced formative assessment to gather evidence about middle school students’ argumentation skills
In this paper, we describe an effort to develop and evaluate an innovative formative assessment designed to gather evidence about middle school students' argumentation skills. Specifically, this game-enhanced scenario-based assessment (Seaball: Semester at Sea) includes a series of argumentative reasoning activities set in an extended scenario in which students debate whether junk food should be sold to students. These activities were designed around argumentation learning progressions (i.e., hypotheses about the qualitative shifts that occur as students achieve higher levels of sophistication in argumentation), which serve as a framework for determining the targeted skills, levels, and activity sequences. Performance feedback is also provided in the assessment. We conducted a pilot study aimed at examining student performance and the validity of the tasks as a measure of argumentation skills. More than 100 middle school students completed this assessment along with additional external measures of argumentation in a pre/post design. We report descriptive statistics of student performance on the activities, analyses of item difficulty, and correlations. Results indicated that students' total scores were significantly correlated with external measures of argumentation skills and with students' state reading and writing test scores. In addition, students achieved higher average scores on a post-test of argumentation skills after completing the Seaball activities. Finally, explanatory feedback on students' task performance was found to benefit those who scored "Below" or "Approaching" proficient on the state reading and writing test. We conclude with implications for assessment design and instruction in argumentation.
Keywords: Scenario-based assessment · Argumentation · Game features · Game-based assessment · Feedback
We gratefully acknowledge funding from the ETS CBAL® research initiative. We thank Jennifer L. Bochenek and Lauren Phelps for research assistance, and Gerry A. Kokolis for help with data processing. We also want to thank EduWeb® for programming the Seaball SBA task. We thank Randy E. Bennett, G. Tanner Jackson, Donald E. Powers, Colleen Appel, and several anonymous reviewers who provided feedback on earlier versions of this article.
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflict of interest.
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.