
OpenLAIR an Open Learning Analytics Indicator Repository Dashboard

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 12315)

Abstract

In this demo paper, we present a tool that provides an overview of learning analytics indicators, metrics, and learning design activities in the field of learning analytics over the past decade. The system is based on our literature review of 123 scientific publications, from which we extracted 132 indicators and their metrics, 40 learning activities, and eight learning events (i.e., create, explore, practice, imitate, receive, debate, meta-learn, and experiment). Building on this review, we propose a system that provides indicators and metrics based on the learning activities and learning events selected by stakeholders. The aim is to support stakeholders in applying learning analytics.
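
The core interaction described above is a lookup: given stakeholder-selected learning events and activities, return matching indicators and their metrics. The following is a minimal sketch of that idea; the data model, example indicators, and matching rule are illustrative assumptions, not the actual OpenLAIR schema or implementation.

```python
# Illustrative sketch only: a hypothetical in-memory indicator repository,
# not the actual OpenLAIR data model.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str                      # e.g. "time on task"
    metrics: list[str]             # metrics used to compute the indicator
    learning_events: set[str]      # subset of the eight learning events
    learning_activities: set[str]  # learning design activities it supports

# Hypothetical example entries for illustration.
REPOSITORY = [
    Indicator("time on task", ["session duration"], {"practice"}, {"quiz"}),
    Indicator("forum participation", ["post count", "reply count"],
              {"debate"}, {"discussion"}),
]

def find_indicators(selected_events: set[str],
                    selected_activities: set[str]) -> list[Indicator]:
    """Return indicators matching any selected learning event or activity."""
    return [
        ind for ind in REPOSITORY
        if ind.learning_events & selected_events
        or ind.learning_activities & selected_activities
    ]

# Example: a stakeholder selects the "debate" learning event.
print([i.name for i in find_indicators({"debate"}, set())])  # ['forum participation']
```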

Keywords

Indicators · Metrics · Learning activities · Learning analytics · Learning design · Learning events · Dashboards


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. DIPF | Leibniz Institute for Research and Information in Education, Frankfurt, Germany
