
Measurement Challenges of Interactive Educational Assessment

Abstract

This chapter discusses four measurement challenges for data science in technology-enabled educational assessments: (1) dealing with change over time; (2) understanding how the relationships in a digital performance space interact with learner actions, communications, and products; (3) understanding how layers of interpretation are built up by translating atomistic data into larger, meaningful units suitable for making inferences about what someone knows and can do; and (4) representing the dynamics of interaction among learners who are assessed through their interactions with one another as well as with digital resources and agents in digital performance spaces. Given the shift from paper-based tests to online learning, and in order to make progress on these challenges, the authors advocate restructuring the training of the next generation of researchers and psychometricians in technology-enabled assessment.
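
To make the third challenge concrete, the minimal Python sketch below shows one way atomistic, click-level events might be translated into a larger, interpretable unit of evidence about what a learner can do. The event names and the scoring rule are hypothetical illustrations, not taken from the chapter or from any real assessment platform.

  from collections import Counter
  from dataclasses import dataclass

  @dataclass
  class Evidence:
      learner_id: str
      feature: str
      value: float

  # Hypothetical raw event log: (seconds_elapsed, learner_id, action).
  # Action names are illustrative only.
  EVENTS = [
      (0.0, "A", "open_tool"),
      (4.2, "A", "adjust_parameter"),
      (9.8, "A", "run_trial"),
      (15.1, "A", "run_trial"),
      (21.4, "A", "run_trial"),
      (27.0, "A", "record_observation"),
      (1.5, "B", "open_tool"),
      (6.0, "B", "run_trial"),
  ]

  def extract_evidence(events):
      """Translate atomistic events into a larger interpretable unit.

      Illustrative rule: several trials followed by a recorded
      observation are scored as evidence of systematic experimentation.
      """
      by_learner = {}
      for _, learner, action in events:
          by_learner.setdefault(learner, []).append(action)
      evidence = []
      for learner, actions in by_learner.items():
          counts = Counter(actions)
          trials = counts["run_trial"]
          observed = counts["record_observation"] > 0
          score = min(1.0, trials / 3) * (1.0 if observed else 0.5)
          evidence.append(
              Evidence(learner, "systematic_experimentation", round(score, 2))
          )
      return evidence

  for item in extract_evidence(EVENTS):
      print(item)

An operational assessment would replace this hand-written rule with a validated evidence model, but the translation step, from raw event stream to interpretable evidence, is the same in kind.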

Keywords

Learning analytics · Data science · Educational assessment · Educational measurement · New psychometrics

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Curtin University, Perth, Australia
  2. School of Education, Communication and Society, King's College London, London, UK
  3. Economic and Business Education, Learning, Design and Technology, University of Mannheim, Mannheim, Germany