Assessing Science Inquiry Skills in an Immersive, Conversation-Based Scenario

  • Diego Zapata-Rivera
  • Lei Liu
  • Lei Chen
  • Jiangang Hao
  • Alina A. von Davier
Chapter

Abstract

Innovative, interactive tasks that include conversations among humans and virtual (pedagogical) agents can be used to assess relevant cognitive skills (e.g., scientific inquiry skills). These new assessment systems support the collection of additional information (e.g., timing data, conversation path sequences, and the amount of help used) that provides context for assessment and can inform assessment claims in these environments. To assess science inquiry skills, we implemented and evaluated a game-like assessment with embedded conversations called the Volcano Scenario. This chapter describes the Volcano Scenario and highlights the techniques used to collect and analyze the data generated by the system. A hybrid approach to analyzing data from interactive assessment environments, one that combines traditional psychometric analysis with several big-data processes, is described and illustrated through analyses of data from 500 participants with at least one year of college experience.
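The chapter page itself includes no code, but the hybrid approach described in the abstract implies a concrete first step: turning raw conversation logs into process features such as response latencies, the path taken through the conversation, and help usage. The following Python sketch is purely illustrative, assuming a hypothetical event-log format; the field names (pid, t, node, type) and the summarize helper are inventions for this example, not the authors' actual schema or pipeline.

    # Minimal sketch (not the authors' code) of extracting the process
    # features the abstract mentions -- response timing, conversation path
    # sequence, and amount of help used -- from a hypothetical event log.
    from collections import defaultdict

    # Hypothetical log: one timestamped event per turn in the conversation.
    events = [
        {"pid": "P001", "t": 12.4, "node": "Q1",   "type": "agent_question"},
        {"pid": "P001", "t": 31.0, "node": "A1b",  "type": "student_response"},
        {"pid": "P001", "t": 35.2, "node": "HINT", "type": "help_request"},
        {"pid": "P001", "t": 60.9, "node": "Q2",   "type": "agent_question"},
        {"pid": "P001", "t": 88.3, "node": "A2a",  "type": "student_response"},
    ]

    def summarize(events):
        """Aggregate per-participant process features from timestamped events."""
        by_pid = defaultdict(list)
        for e in events:
            by_pid[e["pid"]].append(e)
        features = {}
        for pid, evs in by_pid.items():
            evs.sort(key=lambda e: e["t"])
            # Latency from each agent question to the next student response.
            latencies = []
            last_q = None
            for e in evs:
                if e["type"] == "agent_question":
                    last_q = e["t"]
                elif e["type"] == "student_response" and last_q is not None:
                    latencies.append(e["t"] - last_q)
                    last_q = None
            features[pid] = {
                "path": "->".join(e["node"] for e in evs),  # conversation path sequence
                "help_count": sum(e["type"] == "help_request" for e in evs),
                "mean_latency": sum(latencies) / len(latencies) if latencies else None,
            }
        return features

    print(summarize(events))

Feature tables of this kind could then feed both the traditional psychometric models and the larger-scale data-processing steps that the abstract describes.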

Keywords

Hybrid approach · Conversation-based assessments · Science inquiry skills

Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Diego Zapata-Rivera (1)
  • Lei Liu (1)
  • Lei Chen (1)
  • Jiangang Hao (1)
  • Alina A. von Davier (1)

  1. Research and Development, Educational Testing Service, Princeton, NJ, USA
