
Implementing Learning Analytics into Existing Higher Education Legacy Systems

  • Daniel Klasen
  • Dirk Ifenthaler
Chapter

Abstract

Learning analytics have become a widely considered feature of modern digital learning environments. One opportunity of learning analytics is the use of learning process data, which enables lecturers to analyse students’ learning progression and to identify obstacles and risks. With this knowledge, lecturers may wish to scaffold students’ learning activities in order to improve learning progress and to overcome obstacles or risks. Prompts are a well-documented mechanism for such scaffolding. However, implementing prompts into existing legacy systems in learning environments with high data privacy requirements is a considerable challenge. This research shows how a prompting application was implemented in an existing university environment by adding a plug-in to the local digital learning platform which injects user-centric prompts at specific objects within the digital learning environment. The prompts are loaded dynamically from a separate learning analytics application, which also collects the students’ learning trails and progress. The system is being evaluated in two units in the fall semester 2017 with more than 400 students in total and collects up to two thousand student events per day. An in-depth empirical investigation of how various prompts influence students’ learning behaviours and outcomes is currently being conducted.
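The abstract outlines a two-part architecture: a plug-in inside the digital learning platform injects prompts next to specific learning objects, while a separate learning analytics application serves those prompts and collects student events. The following is a minimal sketch of such a back-end service, written here with Spring Boot; the endpoint paths, DTOs and in-memory stores are illustrative assumptions for this sketch, not the authors' implementation.

// Minimal Spring Boot sketch of a prompt/event service.
// Endpoint names, record types and the in-memory stores are assumptions
// made for illustration; they are not taken from the chapter.
package example.analytics;

import java.time.Instant;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Queue;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.*;

@SpringBootApplication
public class PromptServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(PromptServiceApplication.class, args);
    }
}

// A prompt to be injected next to a specific learning object.
record Prompt(String objectId, String text) {}

// A single learning event reported by the platform plug-in (pseudonymous user id).
record LearningEvent(String userId, String objectId, String action, Instant timestamp) {}

@RestController
@RequestMapping("/api")
class PromptController {

    // In-memory stores stand in for the persistence layer of a real service.
    private final Map<String, List<Prompt>> promptsByUser = Map.of(
            "demo-user", List.of(new Prompt("unit-1-quiz",
                    "Reflect: which of the concepts in this unit are still unclear to you?")));
    private final Queue<LearningEvent> events = new ConcurrentLinkedQueue<>();

    // The plug-in asks which prompts to inject for a given (pseudonymous) user.
    @GetMapping("/prompts/{userId}")
    public List<Prompt> promptsFor(@PathVariable String userId) {
        return promptsByUser.getOrDefault(userId, List.of());
    }

    // The plug-in reports learning events (views, clicks, submissions).
    @PostMapping("/events")
    public void recordEvent(@RequestBody LearningEvent event) {
        events.add(event);
    }
}

Under these assumptions, the plug-in would request GET /api/prompts/{userId} when rendering a page and POST interaction events to /api/events; keeping only a pseudonymous identifier on the analytics side is one way to address the data privacy concerns the chapter raises.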

Keywords

Learning analytics · Prompting · Learner privacy · Learning progress


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. University of Mannheim, Mannheim, Germany
  2. Curtin University, Perth, Australia
