Building a game-enhanced formative assessment to gather evidence about middle school students’ argumentation skills

  • Development Article
  • Educational Technology Research and Development

Abstract

In this paper, we describe an effort to develop and evaluate an innovative formative assessment to gather evidence about middle school students’ argumentation skills. Specifically, this game-enhanced scenario-based assessment (Seaball: Semester at Sea) includes a series of argumentative reasoning activities set in an extended scenario in which students debate whether junk food should be sold to students. These activities were designed around argumentation learning progressions (i.e., hypotheses about the qualitative shifts that occur as students achieve higher levels of sophistication in argumentation), which serve as a framework for determining the targeted skills, levels, and activity sequences. Performance feedback is also provided within the assessment. We conducted a pilot study aimed at examining student performance and the validity of the tasks as a measure of argumentation skills. More than 100 middle school students completed this assessment and additional external measures of argumentation in a pretest/posttest design. Descriptive statistics of student performance in the activities, analyses of item difficulty, and correlations are reported. Results indicated that students’ total scores were significantly correlated with external measures of argumentation skills and with students’ state reading and writing test scores. In addition, students achieved higher average scores on a posttest of argumentation skills after completing the Seaball activities. Finally, explanatory feedback about students’ task performance was found to be beneficial for those who were “Below” or “Approaching” proficient on the state reading and writing test. We conclude with implications for assessment design and instruction in argumentation.


Notes

  1. Seaball initial total scores were also significantly correlated with the external measures, with similar values: pretest, r = .52; posttest, r = .61; state test, r = .58; ELA grade, r = .40; all ps < .01.
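Correlations of this kind can be computed with standard statistical tooling. The sketch below is illustrative only; it is not the authors’ analysis code, and the data values and column names (seaball_total, pretest, posttest, state_test) are hypothetical. It shows one way Pearson correlations between Seaball total scores and external measures might be computed in Python:

    # Illustrative sketch: Pearson correlations between hypothetical Seaball
    # total scores and external measures (data and column names are assumptions).
    import pandas as pd
    from scipy.stats import pearsonr

    # Hypothetical student-level scores
    df = pd.DataFrame({
        "seaball_total": [42, 35, 50, 28, 47, 39],
        "pretest":       [18, 14, 22, 11, 20, 16],
        "posttest":      [21, 16, 25, 13, 24, 18],
        "state_test":    [230, 205, 250, 190, 245, 215],
    })

    for measure in ["pretest", "posttest", "state_test"]:
        r, p = pearsonr(df["seaball_total"], df[measure])
        print(f"Seaball total vs. {measure}: r = {r:.2f}, p = {p:.3f}")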

Acknowledgements

We gratefully acknowledge funding from the ETS CBAL® research initiative. We thank Jennifer L. Bochenek and Lauren Phelps for research assistance, and Gerry A. Kokolis for help with data processing. We also want to thank EduWeb® for programming the Seaball SBA task. We thank Randy E. Bennett, G. Tanner Jackson, Donald E. Powers, Colleen Appel, and several anonymous reviewers who provided feedback on earlier versions of this article.

Author information

Corresponding author

Correspondence to Yi Song.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.


About this article


Cite this article

Song, Y., Sparks, J.R. Building a game-enhanced formative assessment to gather evidence about middle school students’ argumentation skills. Education Tech Research Dev 67, 1175–1196 (2019). https://doi.org/10.1007/s11423-018-9637-3
