In this paper, we discuss the design, development, and implementation of a Learning Analytics (LA) dashboard in the area of Higher Education (HE). The dashboard meets the demands of the different stakeholders, maximizes the mainstreaming potential and transferability to other contexts, and is developed in the path of Open Source. The research concentrates on developing an appropriate concept to fulfil these objectives and finding a suitable technology stack. Therefore, we determine the capabilities and functionalities of the dashboard for the different stakeholders. This is of significant importance as it identifies which data can be collected, which feedback can be given, and which functionalities are provided. A key aspect in the development of the dashboard is modularity. This leads us to a design with three modules: data collection, search and information processing, and data presentation. Based on these modules, we present the steps of finding a fitting Open Source technology stack for our concept and discuss pros and cons throughout the process.
- Learning Analytics
- Learning dashboard
- Open Source
- Technology enhanced learning
A dashboard is a visual display of the most relevant information needed to achieve one or more objectives, consolidated and arranged on a single screen so it can be monitored at a glance. Siemens defined Learning Analytics (LA) as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs”. The main question addressed by Erik Duval, in contrast, is what exactly should be measured to gain a deeper understanding of how learning takes place.
Therefore, a Learning Analytics dashboard typically presents information on resources used, time spent, social interactions, artefacts produced, and exercise and test results. Thus, students are able to monitor their learning efforts and reach their intended learning outcomes more easily.
In this paper, we discuss the design and development of Learning Analytics dashboards in the area of Higher Education (HE). Our objectives are to:
cover the demands of the different stakeholders,
maximize the mainstreaming potential and transferability to other contexts, and
make it available for everyone by developing in the path of Open Source.
The research concentrates on developing an appropriate concept to fulfil these objectives and finding a suitable technology stack. Therefore, we determine the capabilities and functionalities of the dashboard for the different stakeholders. This is of significant importance as it identifies which data can be collected, which feedback can be given, and which functionalities are provided.
The main demand of the stakeholders is support for different data sources as well as different types of data sources. Therefore, we searched for appropriate technologies which support various kinds of sources. During this search, we focused on technologies available under Open Source licenses to cover our third objective.
Further, we wanted to combine different technologies easily, so we decided on a modular architecture. This ensures that stakeholders can choose between different software modules (e.g. they can choose to use proprietary software) to adjust the dashboard to their needs. We decided on a design with three layers: data collection, search and information processing, and data presentation. Additionally, this approach helped us with our second objective, the transferability to other contexts and the maximization of the mainstreaming potential.
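The three-layer split can be sketched as a set of swappable module interfaces. The following minimal Python illustration is our own exposition, not part of the dashboard's actual code; all class and function names are hypothetical, and each abstract interface stands in for one of the exchangeable modules (e.g. the collector for Logstash, the index for Elasticsearch, the presenter for Kibana or Grafana):

```python
from abc import ABC, abstractmethod

class Collector(ABC):
    """Data-collection layer (e.g. a log shipper such as Logstash)."""
    @abstractmethod
    def collect(self): ...

class Index(ABC):
    """Search and information-processing layer (e.g. a search engine)."""
    @abstractmethod
    def ingest(self, records): ...
    @abstractmethod
    def query(self, term): ...

class Presenter(ABC):
    """Data-presentation layer (e.g. a visualization frontend)."""
    @abstractmethod
    def render(self, results): ...

# Trivial in-memory implementations, enough to show how the layers plug together.
class ListCollector(Collector):
    def __init__(self, records):
        self.records = records
    def collect(self):
        return list(self.records)

class InMemoryIndex(Index):
    def __init__(self):
        self.docs = []
    def ingest(self, records):
        self.docs.extend(records)
    def query(self, term):
        return [d for d in self.docs if term in d.values()]

class TextPresenter(Presenter):
    def render(self, results):
        return "\n".join(str(r) for r in results)

def run_dashboard(collector, index, presenter, term):
    """Wire any three module implementations together."""
    index.ingest(collector.collect())
    return presenter.render(index.query(term))
```

Because each layer only depends on the interface, an institution could swap in a proprietary presenter or collector without touching the other two layers, which is exactly the adjustability the modular design is meant to provide.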
The next section provides background about Learning Analytics and dashboards. The third section explains the development of our Learning Analytics dashboard as well as improvements made throughout the different versions. The last section discusses the restrictions and remarks on future work.
The purpose of Learning Analytics is the understanding and optimization of learning and the environment in which it occurs through the measurement, collection, analysis, and reporting of data [6, 7]. Since Learning Analytics was first mentioned in the New Media Consortium (NMC) Horizon Report 2012, it has gained increasing relevance. Further, the Horizon Report 2013 defined Learning Analytics as one of the most important trends in technology-enhanced learning and teaching. Learning Analytics evaluates users’ behavior in the context of teaching and learning, and analyzes and interprets this behavior to gain new insights that support stakeholders with models for improving learning and teaching, organization, and decision making.
One of the main goals of Learning Analytics is returning the gained knowledge to learners and teachers, thereby optimizing their learning and teaching behavior, promoting the development of skills in the area, and building a better understanding of education as well as connected fields such as university business and marketing [10, 11].
The Learning Analytics Life Cycle, introduced in 2015 by Khalil and Ebner and shown in Fig. 1, consists of four parts: the learning environment, where stakeholders such as learners or teachers produce data; big data, which consists of massive amounts of different datasets; analytics, which includes different analytical techniques; and act, where objectives are achieved to optimize the learning environment.
In Higher Education, Learning Analytics has been proven to help universities in strategic areas such as finance, resource allocation, and student success. Therefore, universities are collecting more data than ever before to maximize their strategic outcomes. Additionally, universities use Learning Analytics methods to obtain findings on the progress of students, predict future behaviors, and recognize problems at an early stage. However, ethical and legal issues of collecting and processing student data are still seen as a barrier by the institutions [14, 15].
Learning Analytics is applied in many adaptive learning systems such as MOOC platforms [16, 17]. The results of such analyses are presented on Learning Analytics dashboards for better comprehension and recognition of ongoing activities. An example of in-depth analysis is the work on one-digit multiplication problems [18,19,20] or even beyond [21, 22]. Taraghi et al. first analyzed the most prevalent error types and the statistical correlations between them in one-digit multiplication problems. In a second step, they carried out a detailed analysis to highlight the misconceptions of students that are of higher relevance, providing hints at probable reasons behind them [24, 25]. The discovered learning paths and difficulty levels were also used to build learner profiles. Finally, they applied Bayesian models and probabilistic graphical models for modeling misconceptions. According to Verbert et al., evaluating Learning Analytics dashboards is often complex, and still little is known about their usefulness for solving the real issues and needs of students and teachers.
3 Development of the Dashboard
This section discusses the development and implementation of our Learning Analytics dashboard. The following subsection describes the foundation of our development, provides an overview, and gives a short explanation of the technologies used.
3.1 Foundation of the Development
We used the Elastic StackFootnote 1 as the foundation for our development. The Elastic Stack combines Logstash, Elasticsearch, and Kibana and provides a powerful platform for indexing, searching, and analyzing data. Additionally, each of these technologies is available under the Apache 2.0 Open Source LicenseFootnote 2.
Logstash has proven to be a powerful data collection, enrichment, and transportation pipeline with connectors to common infrastructure for easy integration. It is designed to efficiently process log, event, and unstructured data sources for distribution into a variety of outputs.
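Logstash pipelines are written in its own configuration DSL; as a language-neutral illustration of the collect-enrich-output pattern it implements, the following is a minimal Python sketch. The event fields (`course_id`, `course_name`) are hypothetical examples, not fields from our actual deployment:

```python
import json

def collect(raw_lines):
    """Input stage: parse one JSON event per line
    (stand-in for a Logstash input plugin)."""
    return [json.loads(line) for line in raw_lines if line.strip()]

def enrich(events, course_names):
    """Filter stage: add a human-readable course name to each event,
    analogous to a Logstash enrichment filter."""
    for event in events:
        event["course_name"] = course_names.get(event.get("course_id"), "unknown")
    return events

def output(events):
    """Output stage: serialize events for a downstream index
    (stand-in for an Elasticsearch output plugin)."""
    return [json.dumps(event, sort_keys=True) for event in events]

raw = ['{"user": "u1", "course_id": "c101"}']
pipeline_result = output(enrich(collect(raw), {"c101": "Algebra"}))
```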
ElasticsearchFootnote 3 provides the ability to search and process the information. It is a popular and powerful distributed search and analytics engine, based on Apache LuceneFootnote 4 and designed for horizontal scalability, reliability, and easy management. It combines the speed of search with the power of analytics via a sophisticated, developer-friendly query language covering structured, unstructured, and time-series data.
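At the heart of Lucene, and therefore of Elasticsearch, lies an inverted index mapping terms to the documents that contain them. The toy Python sketch below illustrates only that core idea, with simple AND-query semantics; it deliberately ignores real analysis steps such as tokenization rules, stemming, and relevance scoring:

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each term to the set of document ids containing it,
    the core structure behind Lucene-style full-text search."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing *all* query terms (AND semantics)."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = set(index.get(terms[0], set()))
    for term in terms[1:]:
        result &= index.get(term, set())
    return result
```

Because lookups go term-to-documents rather than scanning documents, queries stay fast as the collection grows, which is what makes this structure attractive for the large activity logs a Learning Analytics dashboard collects.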
For the data presentation and visualization, we used KibanaFootnote 5, an open source data visualization platform that allows interaction through powerful graphics. It provides various visualizations, such as histograms or geomaps, which can be combined into custom dashboards.
Unfortunately, the Elastic Stack lacks security features, which forced us to add Search GuardFootnote 6 to our concept. This Elasticsearch plugin offers encryption, authentication, and authorization. It builds on Search Guard SSL and provides pluggable authentication modules. Search Guard is an alternative to ES ShieldFootnote 7 and offers all basic security features for free. It is available under the Apache 2.0 Open Source License. An overview of the concept is shown in Fig. 2.
This initial setup of the technology stack handles multiple data sources very well, scales well, and presents a custom and dynamic dashboard with clear and simple visualizations. Unfortunately, there are problems with handling complex queries, and although Elasticsearch is able to handle minor data relationships, it does not support them well by design. As a result, executing such queries costs more resources and takes more time. Further, Kibana cannot handle data relationships at all. A workaround for this problem is to combine the linked data sources in Logstash and insert the combined data into Elasticsearch, e.g. by adding the user data to each entry of the Learning Management System (LMS) actions. This might work for small settings, but is very resource-intensive. Additionally, many-to-many relationships cannot be modelled this way.
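The denormalization workaround described above can be sketched in a few lines of Python; the record fields are hypothetical. Note that the full user record is copied into every one of that user's actions, which illustrates why the approach is resource-intensive and why many-to-many relationships cannot be represented this way without a combinatorial blow-up of duplicated documents:

```python
def denormalize(actions, users):
    """Embed the matching user record in every LMS action document so a
    relation-free index can answer user-centric queries without joins."""
    users_by_id = {u["id"]: u for u in users}
    combined = []
    for action in actions:
        doc = dict(action)                            # copy the action
        doc["user"] = users_by_id.get(action["user_id"], {})  # duplicate the user
        combined.append(doc)
    return combined
```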
3.2 The Final Concept
Because of our self-imposed requirements, it is essential for our dashboard to handle data relationships. We therefore replaced the module for data presentation, Kibana, with GrafanaFootnote 8, which is able to handle relationships. Similar to Kibana, Grafana is most commonly used for visualizing time-series data and provides a powerful and elegant way to create, explore, and share dashboards. Additionally, Grafana comes with its own authentication layer.
With this configuration, the data presentation layer was able to handle relationships between data, but the problems with complex queries remained, as did the relation limitations of Elasticsearch.
Additionally, we extended our technology stack with the powerful object-relational database management system PostgreSQLFootnote 12 (Postgres for short). Its support for relational data gives Elasticsearch the liberty to concentrate on its strength, the ability to handle Big Data. Postgres has earned a strong reputation for reliability, data integrity, and correctness. It is free and open-source software, released under the terms of the PostgreSQL LicenseFootnote 13.
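The kind of relational query this layer takes over can be illustrated with a hypothetical schema of students, courses, and a many-to-many enrolment table. The sketch below uses Python's built-in sqlite3 as a stand-in for Postgres so it stays self-contained; the SQL itself is standard and would carry over unchanged:

```python
import sqlite3

# In-memory SQLite stands in for Postgres; tables and data are invented examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE courses  (id INTEGER PRIMARY KEY, title TEXT);
    CREATE TABLE enrolments (              -- many-to-many link table
        student_id INTEGER REFERENCES students(id),
        course_id  INTEGER REFERENCES courses(id)
    );
    INSERT INTO students VALUES (1, 'Ada'), (2, 'Alan');
    INSERT INTO courses  VALUES (10, 'Algebra'), (20, 'Logic');
    INSERT INTO enrolments VALUES (1, 10), (1, 20), (2, 20);
""")

# A join-and-aggregate query of the kind the relational layer answers directly,
# without any document duplication.
rows = conn.execute("""
    SELECT c.title, COUNT(*) AS enrolled
    FROM enrolments e JOIN courses c ON c.id = e.course_id
    GROUP BY c.title ORDER BY c.title
""").fetchall()
```

This is exactly the many-to-many case that the earlier denormalization workaround could not model: here the link table expresses it directly, and the query engine resolves it at query time.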
After this step, the setup was able to handle all relationships. Additionally, the newly created layer was able to combine the queries and take over their complexity. However, we lost the ability to create custom dashboards in the frontend. This concept of the dashboard, which is currently in use, is shown in Fig. 3.
4 Conclusion, Restrictions and Outlook
In this paper, we discussed the design and development of a Learning Analytics dashboard. We defined three objectives: covering the different demands of the stakeholders; maximizing the mainstreaming potential and the transferability to other contexts; and developing in the path of Open Source.
We fulfilled the demands of our stakeholders through supporting various types of data sources. Further, we kept the mainstreaming potential and the transferability to other contexts at a maximum by using a modular architecture. Thereby, it is possible to replace any of the modules with proprietary software already in use at the different universities. Finally, we achieved our last objective by introducing fitting Open Source technologies for our concept. Additionally, we described in detail our way of finding these technologies and explained the pros and cons of the different technology stacks.
Initial results show that the data and analytics layering allows the usage of multiple data sets and analytics techniques in a single interface for both visualizing and analyzing. It should be mentioned that our final concept complies with our current requirements; future changes of those requirements might make it necessary to adapt the concept. We are currently working on a proof of concept for our implementation of the Learning Analytics dashboard in the context of a Small Private Online Course (SPOC), which we are going to publish soon.
https://www.elastic.co (last visited Feb. 25, 2017).
https://www.apache.org/licenses/LICENSE-2.0 (last visited Feb. 25, 2017).
https://www.elastic.co/products/elasticsearch (last visited Feb. 25, 2017).
http://lucene.apache.org (last visited Feb. 25, 2017).
https://www.elastic.co/products/kibana (last visited Feb. 25, 2017).
https://floragunn.com/searchguard (last visited Feb. 25, 2017).
https://www.elastic.co/products/shield (last visited Feb. 25, 2017).
http://grafana.org (last visited Feb. 25, 2017).
https://www.codeigniter.com (last visited Feb. 25, 2017).
http://www.chartjs.org (last visited Feb. 25, 2017).
https://opensource.org/licenses/mit-license.php (last visited Feb. 25, 2017).
https://www.postgresql.org (last visited Feb. 25, 2017).
https://www.postgresql.org/about/licence (last visited Feb. 25, 2017).
Few, S.: Dashboard Confusion. Intelligent Enterprise (2004)
Siemens, G., Long, P.: Penetrating the fog: analytics in learning and education. EDUCAUSE Rev. 46(5), 30 (2011)
Verbert, K., Govaerts, S., Duval, E., Santos, J.L., Van Assche, F., Parra, G., Klerkx, J.: Learning dashboards: an overview and future research opportunities. Pers. Ubiquit. Comput. 18(6), 1499–1514 (2014)
Duval, E.: Attention please! Learning analytics for visualization and recommendation. In: Proceedings of LAK11: 1st International Conference on Learning Analytics and Knowledge 2011 (2010, to appear). https://lirias.kuleuven.be/bitstream/123456789/315113/1/la2.pdf. Accessed 25 Feb 2017
Charleer, S., Klerkx, J., Duval, E., Laet, T., Verbert, K.: Creating effective learning analytics dashboards: lessons learnt. In: Verbert, K., Sharples, M., Klobučar, T. (eds.) EC-TEL 2016. LNCS, vol. 9891, pp. 42–56. Springer, Cham (2016). doi:10.1007/978-3-319-45153-4_4
Elias, T.: Learning Analytics: Definitions, Processes and Potential (2011)
Khalil, M., Ebner, M.: When learning analytics meets MOOCs - a review on iMooX case studies. In: Fahrnberger, G., Eichler, G., Erfurth, C. (eds.) I4CS 2016. CCIS, vol. 648, pp. 3–19. Springer, Cham (2016). doi:10.1007/978-3-319-49466-1_1
Johnson, L., Adams, S., Cummins, M.: The NMC Horizon Report: 2012, Higher Education edn. The New Media Consortium, Austin (2012)
Johnson, L., Adams Becker, S., Cummins, M., Freeman, A., Ifenthaler, D., Vardaxis, N.: Technology Outlook for Australian Tertiary Education 2013–2018: An NMC Horizon Project Regional Analysis. New Media Consortium, Austin (2013)
Leitner, P., Khalil, M., Ebner, M.: Learning analytics in higher education—a literature review. In: Peña-Ayala, A. (ed.) Learning Analytics: Fundaments, Applications, and Trends. Studies in Systems, Decision and Control, vol. 94, pp. 1–23. Springer International Publishing, Heidelberg (2017)
Drachsler, H., Greller, W.: The pulse of learning analytics - understandings and expectations from the stakeholders. In: Buckingham Shum, S., Gasevic, D., Fergu-Son, R. (eds.) Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK 2012), pp. 120–129. ACM, New York (2012)
Khalil, M., Ebner, M.: Learning analytics: principles and constraints. In: Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications, pp. 1326–1336 (2015)
Bichsel, J.: Analytics in higher education: benefits, barriers, progress, and recommendations. EDUCAUSE Center for Applied Research (2012)
Sclater, N.: Code of practice “essential” for learning analytics (2014). http://analytics.jiscinvolve.org/wp/2014/09/18/code-of-practice-essential-for-learning-analytics. Accessed 25 Feb 2017
Khalil, M., Ebner, M.: De-Identification in learning analytics. J. Learn. Anal. 3(1), 129–138 (2016)
Khalil, M., Taraghi, B., Ebner, M.: Engaging learning analytics in MOOCS: the good, the bad, and the ugly. In: International Conference on Education and New Developments, Ljubljana, Slovenia, pp. 3–7 (2016)
Taraghi, B., Saranti, A., Ebner, M., Großmann, A., Müller, V.: Adaptive learner profiling provides the optimal sequence of posed basic mathematical problems. In: Rensing, C., Freitas, S., Ley, T., Muñoz-Merino, Pedro J. (eds.) EC-TEL 2014. LNCS, vol. 8719, pp. 592–593. Springer, Cham (2014). doi:10.1007/978-3-319-11200-8_85
Khalil, M., Ebner, M.: What massive open online course (MOOC) stakeholders can learn from learning analytics? In: Spector, M., Lockee, B., Childress, M. (eds.) Learning, Design, and Technology: An International Compendium of Theory, Research, Practice, and Policy, pp. 1–30. Springer International Publishing, Heidelberg (2016)
Taraghi, B., Frey, M., Saranti, A., Ebner, M., Müller, V., Großmann, A.: Determining the causing factors of errors for multiplication problems. In: Ebner, M., Erenli, K., Malaka, R., Pirker, J., Walsh, Aaron E. (eds.) EiED 2014. CCIS, vol. 486, pp. 27–38. Springer, Cham (2015). doi:10.1007/978-3-319-22017-8_3
Schön, M., Ebner, M., Kothmeier, G.: It’s just about learning the multiplication table. In: Buckingham Shum, S., Gasevic, D., Ferguson, R. (eds.) Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (LAK 2012), pp. 73–81. ACM, New York (2012)
Ebner, M., Schön, M.: Why learning analytics in primary education matters! Bull. Tech. Comm. Learn. Technol. 15(2), 14–17 (2013). Karagiannidis, C., Graf, S. (ed.)
Ebner, M., Schön, M., Neuhold, B.: Learning analytics in basic math education – first results from the field. e-Learning Pap. 36, 24–27 (2014)
Ebner, M., Schön, M., Taraghi, B., Steyrer, M.: Teachers little helper: multi-math-coach. IADIS Int. J. WWW/Internet 11(3), 1–12 (2014)
Taraghi, B., Saranti, A., Ebner, M., Schön, M.: Markov chain and classification of difficulty levels enhances the learning path in one digit multiplication. In: Zaphiris, P., Ioannou, A. (eds.) LCT 2014. LNCS, vol. 8523, pp. 322–333. Springer, Cham (2014). doi:10.1007/978-3-319-07482-5_31
Taraghi, B., Saranti, A., Ebner, M., Schön, M.: On using Markov chain to evidence the learning structures and difficulty levels of one digit multiplication. In: Proceedings of the Fourth International Conference on Learning Analytics and Knowledge, pp. 68–72 (2014)
Taraghi, B., Saranti, A., Ebner, M., Müller, V., Großman, A.: Towards a learning-aware application guided by hierarchical classification of learner profiles. J. Univ. Comput. Sci. 21(1), 93–109 (2015)
Taraghi, B., Saranti, A., Legenstein, R., Ebner, M.: Bayesian modelling of student misconceptions in the one-digit multiplication with probabilistic programming. In: Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (LAK 2016). ACM, New York (2016)
Turnbull, J.: The Logstash Book. James Turnbull, USA (2013)
Gormley, C., Tong, Z.: Elasticsearch: The Definitive Guide. O’Reilly Media Inc., Sebastopol (2015)
Gupta, Y.: Kibana Essentials. Packt Publishing Ltd., Birmingham (2015)
Nabi, Z.: Pro Spark Streaming: The Zen of Real-Time Analytics Using Apache Spark. Apress, New York (2016)
Maymala, J.: PostgreSQL for Data Architects. Packt Publishing Ltd., Birmingham (2015)
This research project is co-funded by the European Commission Erasmus+ program, in the context of the project 562167-EPP-1-2015-1-BE-EPPKA3-PI-FORWARD. Please visit our website http://stela-project.eu.
© 2017 Springer International Publishing AG
Leitner, P., Ebner, M. (2017). Development of a Dashboard for Learning Analytics in Higher Education. In: Zaphiris, P., Ioannou, A. (eds) Learning and Collaboration Technologies. Technology in Education. LCT 2017. Lecture Notes in Computer Science(), vol 10296. Springer, Cham. https://doi.org/10.1007/978-3-319-58515-4_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-58514-7
Online ISBN: 978-3-319-58515-4