Verifying the Stability and Sensitivity of Learning Analytics Based Prediction Models: An Extended Case Study

Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 583)

Abstract

In this empirical contribution, a follow-up study of previous research [1], we focus on the issues of stability and sensitivity of Learning Analytics based prediction models. Do prediction models stay intact when the instructional context is repeated with a new cohort of students, and do prediction models indeed change when relevant aspects of the instructional context are adapted? Applying Buckingham Shum and Deakin Crick's theoretical framework of dispositional learning analytics, combined with formative assessments and learning management systems, we compare two cohorts of a large module introducing mathematics and statistics. Both cohorts were based on the same principles of blended learning, combining face-to-face Problem-Based Learning sessions with e-tutorials, and shared a similar instructional design, except for an intervention in the design of the quizzes administered in the module. We analyse bivariate and multivariate relationships between module performance and track and disposition data to provide evidence of both the stability and the sensitivity of prediction models.
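
The stability and sensitivity questions translate into a simple model-comparison exercise: fit a prediction model on one cohort, score it on the other, and refit to compare coefficients. The following is a minimal sketch of that logic, assuming synthetic data and hypothetical predictor names (quiz, clicks, disposition); it is not the authors' actual analysis.

    # Minimal sketch of a cross-cohort stability/sensitivity check.
    # Synthetic data only; variable names are illustrative assumptions,
    # not the study's actual measures.
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)

    def make_cohort(n, quiz_weight):
        """Generate stand-in track and disposition predictors plus an exam score."""
        quiz = rng.normal(0, 1, n)         # formative quiz performance
        clicks = rng.normal(0, 1, n)       # e-tutorial activity (track data)
        disposition = rng.normal(0, 1, n)  # learning-disposition scale score
        exam = (quiz_weight * quiz + 0.3 * clicks + 0.2 * disposition
                + rng.normal(0, 0.5, n))
        return np.column_stack([quiz, clicks, disposition]), exam

    X1, y1 = make_cohort(800, quiz_weight=0.6)  # cohort 1
    X2, y2 = make_cohort(800, quiz_weight=0.4)  # cohort 2, after quiz redesign

    # Stability: a model fitted on cohort 1 should predict cohort 2
    # about as well as its own cohort if the context is unchanged.
    m1 = LinearRegression().fit(X1, y1)
    print("R^2 within cohort 1:", round(r2_score(y1, m1.predict(X1)), 3))
    print("R^2 on cohort 2:   ", round(r2_score(y2, m1.predict(X2)), 3))

    # Sensitivity: refitting on cohort 2 should shift the quiz coefficient
    # if the quiz intervention changed that predictor's meaning.
    m2 = LinearRegression().fit(X2, y2)
    print("cohort 1 coefficients:", np.round(m1.coef_, 2))
    print("cohort 2 coefficients:", np.round(m2.coef_, 2))

A drop in cross-cohort fit, or a shift in individual coefficients, would signal sensitivity; comparable fit and coefficients would signal stability.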

Keywords

Blended learning · Dispositional learning analytics · E-Tutorials · Formative assessment · Learning dispositions

Acknowledgements

The project reported here has been supported and co-financed by SURF-foundation as part of the Learning Analytics Stimulus program.

References

  1. Tempelaar, D.T., Rienties, B., Giesbers, B.: In search for the most informative data for feedback generation: learning analytics in a data-rich context. Comput. Hum. Behav. 47, 157–167 (2015)
  2. Bienkowski, M., Feng, M., Means, B.: Enhancing teaching and learning through educational data mining and learning analytics: an issue brief. US Department of Education, Office of Educational Technology, pp. 1–57 (2012)
  3. Siemens, G., Dawson, S., Lynch, G.: Improving the quality and productivity of the higher education sector: policy and strategy for systems-level deployment of learning analytics. Society for Learning Analytics Research (2013). http://solaresearch.org/Policy_Strategy_Analytics.pdf
  4. Tobarra, L., Robles-Gómez, A., Ros, S., Hernández, R., Caminero, A.C.: Analyzing the students' behavior and relevant topics in virtual learning communities. Comput. Hum. Behav. 31, 659–669 (2014)
  5. Greller, W., Drachsler, H.: Translating learning into numbers: a generic framework for learning analytics. J. Educ. Technol. Soc. 15(3), 42–57 (2012)
  6. Stiles, R.J.: Understanding and managing the risks of analytics in higher education: a guide. EDUCAUSE (2012)
  7. Siemens, G.: Learning analytics: the emergence of a discipline. Am. Behav. Sci. 57(10), 1380–1400 (2013)
  8. Wolff, A., Zdrahal, Z., Nikolov, A., Pantucek, M.: Improving retention: predicting at-risk students by analysing clicking behaviour in a virtual learning environment. In: Suthers, D., Verbert, K. (eds.) Proceedings of the 3rd International Conference on Learning Analytics and Knowledge, pp. 145–149. ACM, New York (2013)
  9. Agudo-Peregrina, Á.F., Iglesias-Pradas, S., Conde-González, M.Á., Hernández-García, Á.: Can we predict success from log data in VLEs? Classification of interactions for learning analytics and their relation with performance in VLE-supported F2F and online learning. Comput. Hum. Behav. 31, 542–550 (2014)
  10. Macfadyen, L.P., Dawson, S.: Mining LMS data to develop an "early warning system" for educators: a proof of concept. Comput. Educ. 54(2), 588–599 (2010)
  11. Tempelaar, D.T., Heck, A., Cuypers, H., van der Kooij, H., van de Vrie, E.: Formative assessment and learning analytics. In: Suthers, D., Verbert, K. (eds.) Proceedings of the 3rd International Conference on Learning Analytics and Knowledge, pp. 205–209. ACM, New York (2013)
  12. Tempelaar, D.T., Kuperus, B., Cuypers, H., Van der Kooij, H., Van de Vrie, E., Heck, A.: The role of digital, formative testing in e-learning for mathematics: a case study in The Netherlands. Univ. Knowl. Soc. J. (UOC) 9(1), 92–114 (2012). In: "Mathematical e-learning" [online dossier]
  13. Arbaugh, J.B.: System, scholar, or students? Which most influences online MBA course effectiveness? J. Comput. Assist. Learn. 30, 349–362 (2014)
  14. Richardson, J.T.E.: The attainment of white and ethnic minority students in distance education. Assess. Eval. High. Educ. 37(4), 393–408 (2012)
  15. Tempelaar, D.T., Niculescu, A., Rienties, B., Giesbers, B., Gijselaers, W.H.: How achievement emotions impact students' decisions for online learning, and what precedes those emotions. Internet High. Educ. 15(3), 161–169 (2012)
  16. Verbert, K., Manouselis, N., Drachsler, H., Duval, E.: Dataset-driven research to support learning and knowledge analytics. J. Educ. Technol. Soc. 15(3), 133–148 (2012)
  17. Baker, R.: Data mining for education. Int. Encycl. Educ. 7, 112–118 (2010)
  18. Thakur, G., Olama, M.M., McNair, W., Sukumar, S.R., Studham, S.: Towards adaptive educational assessments: predicting student performance using temporal stability and data analytics in learning management systems. In: Proceedings of the 20th ACM SIGKDD Conference on Knowledge Discovery and Data Mining. ACM, New York (2014)
  19. Tempelaar, D.T., Rienties, B., Giesbers, B.: Computer assisted, formative assessment and dispositional learning analytics in learning mathematics and statistics. In: Kalz, M., Ras, E. (eds.) CAA 2014. CCIS, vol. 439, pp. 67–78. Springer, Heidelberg (2014)
  20. Buckingham Shum, S., Deakin Crick, R.: Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics. In: Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK 2012), pp. 92–101. ACM Press, New York (2012)
  21. Rienties, B., Tempelaar, D.T., Giesbers, B., Segers, M., Gijselaers, W.H.: A dynamic analysis of social interaction in computer mediated communication; a preference for autonomous learning. Interact. Learn. Environ. 22(5), 631–648 (2012)
  22. Järvelä, S., Hurme, T., Järvenoja, H.: Self-regulation and motivation in computer-supported collaborative learning environments. In: Ludvigson, S., Lund, A., Rasmussen, I., Säljö, R. (eds.) Learning Across Sites: New Tools, Infrastructure and Practices, pp. 330–345. Routledge, New York (2011)
  23. Nistor, N., Baltes, B., Dascălu, M., Mihăilă, D., Smeaton, G., Trăuşan-Matu, Ş.: Participation in virtual academic communities of practice under the influence of technology acceptance and community factors. A learning analytics application. Comput. Hum. Behav. 34, 339–344 (2014)
  24. Rienties, B., Alden Rivers, B.: Measuring and understanding learner emotions: evidence and prospects. Learning Analytics Review 1, Learning Analytics Community Exchange (LACE) (2014). http://www.laceproject.eu/learning-analytics-review/measuring-and-understanding-learner-emotions/
  25. Buckingham Shum, S., Ferguson, R.: Social learning analytics. J. Educ. Technol. Soc. 15(3), 3–26 (2012)
  26. Vermunt, J.D.: Metacognitive, cognitive and affective aspects of learning styles and strategies: a phenomenographic analysis. High. Educ. 31, 25–50 (1996)
  27. Martin, A.J.: Examining a multidimensional model of student motivation and engagement using a construct validation approach. Br. J. Educ. Psychol. 77(2), 413–440 (2007)
  28. Pekrun, R., Goetz, T., Frenzel, A.C., Barchfeld, P., Perry, R.P.: Measuring emotions in students' learning and performance: the achievement emotions questionnaire (AEQ). Contemp. Educ. Psychol. 36(1), 36–48 (2011)
  29. Narciss, S.: Feedback strategies for interactive learning tasks. In: Spector, J.M., Merrill, M.D., van Merrienboer, J.J.G., Driscoll, M.P. (eds.) Handbook of Research on Educational Communications and Technology, 3rd edn, pp. 125–144. Lawrence Erlbaum Associates, Mahwah (2008)
  30. Narciss, S., Huth, K.: Fostering achievement and motivation with bug-related tutoring feedback in a computer-based training for written subtraction. Learn. Instr. 16(4), 310–322 (2006)
  31. Marks, R.B., Sibley, S.D., Arbaugh, J.B.: A structural equation model of predictors for effective online learning. J. Manag. Educ. 29(4), 531–563 (2005)
  32. Lajoie, S.P., Azevedo, R.: Teaching and learning in technology-rich environments. In: Alexander, P., Winne, P. (eds.) Handbook of Educational Psychology, 2nd edn, pp. 803–821. Erlbaum, Mahwah (2006)
  33. Rienties, B., Tempelaar, D.T., Van den Bossche, P., Gijselaers, W.H., Segers, M.: The role of academic motivation in computer-supported collaborative learning. Comput. Hum. Behav. 25(6), 1195–1206 (2009)
  34. Schmidt, H.G., Van Der Molen, H.T., Te Winkel, W.W.R., Wijnen, W.H.F.W.: Constructivist, problem-based learning does work: a meta-analysis of curricular comparisons involving a single medical school. Educ. Psychol. 44(4), 227–249 (2009)
  35. Tempelaar, D.T., Rienties, B., Giesbers, B.: Who profits most from blended learning? Industr. High. Educ. 23(4), 285–292 (2009)
  36. Clow, D.: An overview of learning analytics. Teach. High. Educ. 18(6), 683–695 (2013)
  37. Boud, D., Falchikov, N.: Aligning assessment with long-term learning. Assess. Eval. High. Educ. 31(4), 399–413 (2006)
  38. Whitelock, D., Twiner, A., Richardson, J.T.E., Field, D., Pulman, S.: OpenEssayist: a supply and demand learning analytics tool for drafting academic essays. In: Proceedings of the 4th International Conference on Learning Analytics and Knowledge, pp. 208–212. ACM, New York (2015)
  39. Hommes, J., Rienties, B., de Grave, W., Bos, G., Schuwirth, L., Scherpbier, A.: Visualising the invisible: a network approach to reveal the informal social side of student learning. Adv. Health Sci. Educ. 17(5), 743–757 (2012)
  40. Rienties, B., Cross, S., Zdrahal, Z.: Implementing a learning analytics intervention and evaluation framework: what works? In: Motidyang, B., Butson, R. (eds.) Big Data and Learning Analytics in Higher Education. Springer, Berlin (2015)
  41. Lehmann, T., Hähnlein, I., Ifenthaler, D.: Cognitive, metacognitive and motivational perspectives on preflection in self-regulated online learning. Comput. Hum. Behav. 32, 313–323 (2014)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Dirk T. Tempelaar (1)
  • Bart Rienties (2)
  • Bas Giesbers (3)

  1. School of Business and Economics, Maastricht University, Maastricht, The Netherlands
  2. Institute of Educational Technology, Open University UK, Milton Keynes, UK
  3. Rotterdam School of Management, Rotterdam, The Netherlands
