A Framework for Interactive Exploratory Learning Analytics

  • Mohammad Javad Mahzoon
  • Mary Lou Maher
  • Omar Eltayeby
  • Wenwen Dou
  • Kazjon Grace
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10925)


Many analytic tools have been developed to discover knowledge from student data. However, the knowledge discovery process requires advanced analytical modelling skills, making it the province of data scientists and impeding the ability of educational leaders, professors, and advisors to engage with the process directly. As a result, the analysis often fails to take advantage of domain expertise, and its outcomes are frequently neither interesting nor useful. Moreover, the output of such analytic tools is usually static, preventing domain experts from exploring alternative hypotheses by changing the data models or predictive models inside the tool. We have developed a framework for interactive and exploratory learning analytics that begins to address these challenges. We engaged in data exploration and hypothesis generation with domain experts at our university by conducting two focus groups, and we used the findings of these focus groups to validate our framework, arguing that it enables domain experts to explore the data, analysis, and interpretation of student data to discover useful and interesting knowledge.


Learning analytics · Exploratory data analytics · Educational data mining · Learning analytics framework



This research was supported by Charlotte Research Institute. We acknowledge Dr. Shannon Schlueter and Dr. Audrey Rorrer for their assistance in gaining access to the student data stored in the university databases. We had numerous discussions about our ideas for modelling student data with a broad range of faculty, of whom we especially thank Dr. Mohsen Dorodchi, Dr. Bojan Cukic, Dr. Xi (Sunshine) Niu, and Dr. Noseong Park.



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Mohammad Javad Mahzoon (1)
  • Mary Lou Maher (1)
  • Omar Eltayeby (1)
  • Wenwen Dou (1)
  • Kazjon Grace (2)
  1. University of North Carolina at Charlotte, Charlotte, USA
  2. The University of Sydney, Sydney, Australia
