
Supporting Indicator Personalization and Platform Extensibility in Open Learning Analytics

Technology, Knowledge and Learning


The demand for Open Learning Analytics (OLA) has grown in recent years due to increasing interest in self-organized, networked, and lifelong learning environments. However, platforms that can deliver effective and efficient OLA are still lacking. Most currently available OLA platforms do not continuously involve end-users in the indicator definition process, and they follow design patterns that make it difficult to extend the platform to meet new user requirements. These limitations restrict the usefulness of such platforms for users who regulate their own learning process according to their needs. In this paper, we discuss the Open Learning Analytics Platform (OpenLAP) as a step toward an ecosystem that addresses the indicator personalization and platform extensibility challenges of OLA. OpenLAP follows a user-centered learning analytics approach that involves end-users in defining custom indicators that meet their needs. Moreover, it provides a modular and extensible architecture that allows easy integration of new analytics methods and visualization techniques.
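The paper's own architecture is not reproduced here. As an illustration only, the extensibility idea described above (new analytics methods integrated behind a common interface) can be sketched as a minimal plugin registry in Python; all class and method names below are hypothetical and are not OpenLAP's actual API.

```python
from abc import ABC, abstractmethod


class AnalyticsMethod(ABC):
    """Contract that every pluggable analytics method implements."""

    @abstractmethod
    def apply(self, dataset: list[dict]) -> list[dict]:
        """Transform raw event records into indicator-ready results."""
        ...


class MethodRegistry:
    """Stores analytics methods by name so new ones can be added
    without modifying the platform core."""

    def __init__(self) -> None:
        self._methods: dict[str, AnalyticsMethod] = {}

    def register(self, name: str, method: AnalyticsMethod) -> None:
        self._methods[name] = method

    def run(self, name: str, dataset: list[dict]) -> list[dict]:
        return self._methods[name].apply(dataset)


class CountPerItem(AnalyticsMethod):
    """Example method: count events per item, e.g. views per resource."""

    def apply(self, dataset: list[dict]) -> list[dict]:
        counts: dict[str, int] = {}
        for event in dataset:
            counts[event["item"]] = counts.get(event["item"], 0) + 1
        return [{"item": k, "count": v} for k, v in counts.items()]


# Usage: a third party registers a new method, then an indicator runs it.
registry = MethodRegistry()
registry.register("count_per_item", CountPerItem())
events = [{"item": "video1"}, {"item": "quiz1"}, {"item": "video1"}]
result = registry.run("count_per_item", events)
```

The point of the sketch is the separation of concerns: the platform core depends only on the abstract interface, so a new analytics method or visualization technique is a drop-in class rather than a code change to the platform itself.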


Abbreviations

Apereo LAI: Apereo Learning Analytics Initiative
ATAM: Architecture Tradeoff Analysis Method
LA: Learning Analytics
LAP: Learning Analytics Processor
LCDM: Learning Context Data Model
MOOC: Massive Open Online Course
OLA: Open Learning Analytics
OLAA: Open Learning Analytics Architecture
OpenLAP: Open Learning Analytics Platform
RIDT: Rule-based Indicator Definition Tool
SoLAR: Society for Learning Analytics Research
SUS: System Usability Scale
TEL: Technology Enhanced Learning
UI: User Interface



Author information




The evaluation was designed and conducted by AM. The literature review was performed, and its results documented, by AM together with MAC. Editorial reviews and formatting of the paper were done by AM and MAC. US is the head of the department where the evaluation was performed. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Arham Muslim.

Ethics declarations

Conflict of interest

The authors declare that they have no conflicts of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Muslim, A., Chatti, M.A. & Schroeder, U. Supporting Indicator Personalization and Platform Extensibility in Open Learning Analytics. Tech Know Learn 27, 429–448 (2022).
