1 Introduction

1.1 Learning analytics is inevitable due to increased digitalisation of education

Learning is increasingly becoming digitally supported, especially in Higher Education (HE). One of the key elements in this change is the use of Learning Management Systems (LMS) such as Moodle or Blackboard, which allow students to interact with course material, discussion posts, quizzes, and a variety of other resources. When interacting with these systems, students leave digital traces or footprints that are stored in activity logs (You, 2016). These digital traces have historically been the primary source of data in the field of Learning Analytics (LA) (Schwendimann et al., 2017). LA is an emerging field interested in the “measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (Siemens, 2013, p. 1382). As digital technologies continue to play an increasingly important role in HE, there is a growing interest in using these digital traces to inform decision-making processes, improve teaching, and support student learning. However, as Ifenthaler and Schumacher (2016) put it in their review of LA for supporting student success in HE, “more educational data does not always make better educational decisions” (p. 1982). Making effective use of LA to support students involves several challenges, including building a strong connection to the learning sciences, understanding the environment in which learning occurs and the relevant data points, and focusing on the perspectives of the learners (Ferguson, 2012).

1.2 Learning analytics dashboards are bringing LA to the learners

One of the most promising sub-fields of LA when it comes to building a link to the learning sciences seems to be Learning Analytics Dashboards (LAD) (Viberg et al., 2018). Analytics dashboards provide a visual representation of the data required to achieve one or more objectives (Teasley, 2017). This marks a shift from using LA to inform adaptive systems to making data directly available to the user through visual analytics, placing the intervention in the hands of the user (Ruiperez-Valiente et al., 2021). While the terminology and boundaries around these tools have remained unclear throughout the field's emerging state, an increasingly prevalent definition is proposed by Schwendimann et al. (2017): “A learning dashboard is a single display that aggregates different indicators about learner(s), learning process(es) and/or learning context(s) into one or multiple visualizations” (p. 38). Dashboards have been designed for a variety of groups including administrators, faculty, study advisers and teachers. Most current dashboards are, however, aimed at the learners themselves (Matcha et al., 2020). This is a recent change, with dashboards previously being mostly aimed at teachers (Jivet et al., 2017). In this change, it is important to ensure that LADs are not just student-facing, presenting data to students, but student-focused, designed to support student learning through feedback, reflection, and relevant next actions. This paper presents a systematic review of student-facing dashboards in HE, exploring how learning and analytics can be connected within LAD design in order to support students’ learning - making dashboards student-focused.

1.3 When learners become the users of LADs, theory plays an important role

Considering the commonly applied research interest outlined by Siemens (2013), the LA field should be inclined to embrace pedagogical assumptions guiding the design of user-facing analytics, such as student-facing LADs. Suthers and Verbert (2013) call LA a ‘middle space’ between the fields of learning and analytics. LA as a field has, however, long been technologically driven, mostly ignoring the social and material context of the learning environment in which analytics is applied (Fawns, 2019). With pedagogical factors set aside, researchers have primarily looked at data points in isolation from the rest of the learning process (Gašević et al., 2015). There thus seems to be a lack of theory informing LA, a problem that still persists (Guzmán-Valenzuela et al., 2021). In order to put the learning back into LA, there has been a call for using LA to support students directly by creating tools that can scaffold their self-directed learning (Kilińska & Ryberg, 2019). As the primary user of the analytics changes from the institution to the student, as is the case with student-facing dashboards, pedagogy begins to play a central role, and questions arise about how we can design and use LA to support students’ learning.

1.4 LA is increasingly becoming about learning and not just analytics

Multiple existing reviews have identified the theory of self-regulated learning (SRL) as a starting point for working either directly with LADs or conducting systematic reviews (Jivet et al., 2017; Matcha et al., 2020; Valle et al., 2021a). More recently, different pedagogical assumptions have been proposed for informing dashboards, e.g. collaborative learning analytics – using analytics to support collaborative learning (Wise et al., 2021). With multiple pedagogical assumptions paving the way for dashboards, LA may be becoming increasingly about learning and not just analytics. Even recent reviews, however, keep concluding that learning is missing from LA (Guzmán-Valenzuela et al., 2021). The approaches of these reviews raise a methodological question: is the missing learning a continuing problem or a ghost from the past? In concluding these points, the reviews seem to address the entirety of a corpus spanning back to the emergence of the LA field, while LADs are often described as being in an exploratory state (Schwendimann et al., 2017). With recent concepts, reviews and studies emphasising pedagogy, it is necessary to look at the body of research through a time-sensitive lens, outlining the development over time and emerging trajectories for future dashboard design. To guide our review, the following research questions have been outlined and will be expanded upon through the methodology.

Research questions

  • RQ1) How have the following themes developed over time within the design of student-facing LADs in HE, and what trajectories, if any, appear when mapping the themes over publications per year?

    a. The technological maturity of the dashboard
    b. The frameworks informing the design of the dashboard
    c. The ways that the dashboard presents data to students
    d. The different data sources that feed into the dashboard
    e. The level of analytics used to analyse the data in the dashboard

  • RQ2) What trajectories, if any, appear when cross mapping the themes from RQ1, and how may these inform the future design of student-facing LADs in HE?

The review contributes to the existing knowledge base by providing a systematic overview of student-facing dashboards in higher education, uncovering emerging trajectories in order to guide further design of dashboards.

2 Systematic review methodology

In order to investigate the outlined research questions, a systematic review has been conducted (Gough et al., 2017). To guide the review and ensure its transparency, inspiration was taken from the PRISMA 2020 framework (Page et al., 2021), whose model for systematically building a data corpus has been adopted.

2.1 Identification

Due to the interdisciplinary nature of LAD research (Matcha et al., 2020), a variety of databases were chosen for performing the search, encompassing general databases as well as technological and educational databases (Table 1).

Table 1 Databases for conducting systematic review

In order to begin building the corpus, the following search string was developed using Boolean operators (Gough et al., 2017): (“dashboard*”) AND (“learning analytics” OR “educational data mining”). The search string was intended to be as broad as possible, as the terminology around LADs remains unaligned throughout the field, especially during the early years (Schwendimann et al., 2017). A search block limiting the results to students and learners has therefore not been included; the target user has instead been addressed through the exclusion criteria (Section 2.2). The search string was applied on title, abstract and keywords. The initial search was performed in all databases on 08-NOV-2022 and resulted in 920 papers. The search was limited to peer-reviewed material in databases where this is not the default (ProQuest).
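To illustrate the Boolean logic, the search string can also be replicated programmatically when screening exported records. The following is a minimal sketch in Python, assuming records have been exported to a CSV file with title, abstract and keywords columns; the file name and column names are hypothetical, and actual database exports will differ:

```python
import re
import pandas as pd

# Hypothetical export of database search results; column names are assumptions.
records = pd.read_csv("search_export.csv")  # columns: title, abstract, keywords

def matches_search_string(row: pd.Series) -> bool:
    """("dashboard*") AND ("learning analytics" OR "educational data mining"),
    applied across title, abstract and keywords."""
    text = " ".join(str(row[f]).lower() for f in ("title", "abstract", "keywords"))
    has_dashboard = re.search(r"\bdashboard\w*", text) is not None  # 'dashboard*' wildcard
    has_field_term = ("learning analytics" in text) or ("educational data mining" in text)
    return has_dashboard and has_field_term

hits = records[records.apply(matches_search_string, axis=1)]
print(f"{len(hits)} of {len(records)} records match the search string")
```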

2.2 Screening, eligibility, and inclusion

Before screening the results, duplicates were removed, bringing the total number of papers down to 592. In order to perform an initial screening of the results, a set of exclusion criteria was created and applied by assessing the abstracts of the initial corpus (Table 2).
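The duplicate-removal step can be sketched in a similar manner. Below is a minimal illustration assuming a single merged CSV of records from all databases; the file and column names are hypothetical, and dedicated reference managers typically also match on DOI or apply fuzzy title matching:

```python
import pandas as pd

# Hypothetical merged export of all database results.
corpus = pd.read_csv("all_databases.csv")  # columns include: title, year

# Normalise titles into a comparison key so that case and punctuation
# differences between databases do not mask duplicates.
corpus["title_key"] = (
    corpus["title"]
    .str.lower()
    .str.replace(r"[^a-z0-9]+", " ", regex=True)
    .str.strip()
)
deduped = corpus.drop_duplicates(subset="title_key").drop(columns="title_key")
print(f"{len(corpus)} -> {len(deduped)} records after duplicate removal")
```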

Table 2 Exclusion criteria

The criterion “Language” was created to ensure that papers which could not be interpreted were not included in the synthesis; therefore, only papers written in English were included. “Document type” excluded all documents other than journal articles, conference papers and PhD theses. The criterion “Dashboard” excluded studies that did not present a dashboard for education, as well as non-empirically tested frameworks for designing and evaluating dashboards. “Target user” excluded dashboards for academic staff, such as teacher-only dashboards. If a dashboard was aimed at both students and teachers, the paper was excluded, as student-facing dashboards differ from teacher-facing dashboards in their representation of data (Matcha et al., 2020). “Multiple dashboards” excluded literature reviews as well as comparisons of different dashboards from different educational settings; papers comparing different versions of the same dashboard in the same educational setting were still included. Lastly, “Educational setting” excluded studies that did not present their educational setting or presented an educational setting other than higher education. This also excluded MOOCs not explicitly limited to higher education students.

If exclusion was unclear, the paper was marked and discussed with another researcher in order to determine inclusion or exclusion. After screening abstracts, the exclusion criteria presented in Table 2 were applied to the remaining full texts. 115 items were sought for retrieval, with five items not retrieved due to limited access. After this detailed process, 39 papers were included for further analysis (Fig. 1).

Fig. 1 PRISMA flow diagram

Before attending to the synthesis of the corpus, the overall corpus characteristics will first be examined. A relevant metric is the number of publications per year in the corpus (Fig. 2).

Fig. 2 Publication year of corpus

Figure 2 shows that research on student-facing LADs in Higher Education is a growing field, a point which is consistent with other reviews (Guzmán-Valenzuela et al., 2021; Matcha et al., 2020; Valle et al., 2021a). The publication types present in the final corpus have also been analysed (Fig. 3).

Fig. 3 Publication type of corpus

Figure 3 shows that most of the publications are conference papers. This is in line with LADs still being in an exploratory state (Schwendimann et al., 2017). There is, however, also a noticeable number of journal publications, leading back to the initial question of whether the technology is maturing – a point which will be discussed throughout the paper.

2.3 Synthesis

In order to make sense of the final corpus, research question 1 was addressed by mapping and coding the corpus with the aim of outlining the technological maturity, informing frameworks, affordances, data sources and analytical levels of the studies.

Informing frameworks, affordances, and data sources were mapped by inductively coding themes, so as not to apply bias in terms of pre-conceptions of the approaches used in designing and implementing the dashboards. This meant that the codes emerged by reading the corpus and searching for the presented themes, e.g., data sources. The codes were iteratively refined by the researchers as they emerged and were noted down in a shared codebook to ensure transparency (Belur et al., 2021). Initial coding was conducted by researcher A. Unclear codes were noted down and discussed with researcher B before the final code was assigned and the codebook re-iterated. The inductively created codes will be further expanded upon as they are introduced in Section 3.
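As an illustration, one codebook entry could be modelled as the following small data structure; the fields shown are illustrative and do not reproduce the exact structure of the codebook used in this review:

```python
from dataclasses import dataclass, field

@dataclass
class CodebookEntry:
    """One entry in a shared codebook; the structure here is hypothetical."""
    theme: str                      # e.g. "data sources"
    code: str                       # e.g. "self-reported data"
    description: str                # definition, iteratively refined by the researchers
    example_excerpts: list = field(default_factory=list)  # passages supporting the code
    needs_discussion: bool = False  # flagged for discussion with researcher B

entry = CodebookEntry(
    theme="data sources",
    code="self-reported data",
    description="Students manually report data, e.g. time management or emotional response.",
)
```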

For addressing the technological maturity, technology readiness levels were adapted (Tzinis, 2015). Building upon the tech-readiness scale allows for developing a standardised approach that can be adopted by others in order to evaluate the maturity of LADs. To relate the scale to LADs, a codebook was developed translating the scale to reflect LAD maturity. For analysis, the codes were further clustered (Table 3).

Table 3 Adaptation of technology readiness level scale

In order to code the analytical levels, four types of LA were adopted from Jayashanka et al. (2022) (Table 4). This allowed for going beyond the descriptive / predictive distinction seen in other reviews such as Susnjak et al. (2022).

Table 4 Four types of analytics - adopted from Jayashanka et al. (2022)

In addition to the descriptive codes, the different themes were also mapped out over the number of publications per year in a stacked area chart. Doing so allowed for identifying current trends in the field and potential trajectories. In order to address RQ2, these categories were cross-mapped, showing how the different themes relate to each other. The cross-mapping methodology will be further unfolded later in the paper.
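As an illustration of how such a mapping can be produced, the following minimal sketch counts distinct publications per year for each code and renders them as a stacked area chart; the file and column names are hypothetical:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical codebook export: one row per (paper, code) pair within a theme.
coded = pd.read_csv("coded_corpus.csv")  # columns: paper_id, year, code

# Count distinct publications per year for each code.
counts = (
    coded.groupby(["year", "code"])["paper_id"]
    .nunique()
    .unstack(fill_value=0)
    .sort_index()
)

# Stacked area chart: one band per code, publications per year on the y-axis.
counts.plot.area(stacked=True)
plt.xlabel("Publication year")
plt.ylabel("Number of publications")
plt.show()
```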

2.4 Limitations

Despite the systematic nature of the review, and the added transparency of the codebook, there are still some limitations which may affect the conclusions that can be drawn from the results. By limiting itself to student-facing dashboards in HE, the review's trajectories represent neither student-facing dashboards in general, nor dashboards as a way of mediating learning analytics, nor the entirety of the learning analytics field. Another limiting factor is the approach used in the synthesis. The trajectories depend on the inductive codes for some categories and are therefore not objective statements about the dashboards. During the coding process it was noted that while many publications cite feedback and reflection as the purpose of the dashboard, these terms are used in very broad ways across the corpus - meaning that while authors may claim that a certain dashboard is built for feedback, it may not be coded as such if the affordance description is not matched. The inductive nature of these codes may then change the way the trajectories appear.

3 Results

In order to address the research questions, three steps will be followed for every theme identified in RQ1. First, the codes will be presented. Second, descriptive results for each category will be reported. Finally, stacked area charts will be presented based on the clusters for the categories in RQ1.

3.1 Technological maturity

This theme outlines the technological maturity of the dashboards, identifying how close they are to being implemented in a generalised setting. In order to address the technological maturity, the four clusters outlined in Section 2.3 have been applied to the final full texts (Table 5).

Table 5 Technological maturity of LADs

Over half of the included studies (23/39) are coded under the second maturity level, reporting that the dashboard has been introduced into the wild by being evaluated in one or more courses, but not repeatedly implemented over multiple semesters. Only a single paper reports a level 4 dashboard that is available for others to use (Roa Romero et al., 2021). When mapped over publications per year, it can however be seen that the field is evolving over time (Fig. 4).

Fig. 4 Technological maturity mapped out over publications over time

Figure 4 shows that while the number of publications is rising overall, most studies still fall under maturity level 1 or 2. This potentially indicates that most studies are still concerned with trying out different configurations, supporting the explorative notion. The mapping over publications per year, however, also shows an influx of level 3 and 4 papers from 2016 onwards.

3.2 Informing frameworks

This theme outlines the frameworks that have informed the design of the dashboard. This entails both design-oriented frameworks, e.g. user-centred design (Duan et al., 2022), as well as theory-oriented frameworks, e.g. Self-Regulated Learning (Lu et al., 2020) (Table 6).

Table 6 Informing frameworks of LADs

The codes were inductively constructed for this theme, meaning that codes were not pre-determined but iteratively developed through reading the corpus, extracting the frameworks from the individual papers. During this coding process a pattern appeared, with the codes ‘none’ (11/39) and ‘self-regulated learning (SRL)’ (14/39) appearing more often than other codes. The rest of the codes were therefore grouped into two clusters, as can be seen in Table 6. The first cluster comprises the theory-oriented frameworks, including educational theories such as social constructivism (3/39), evaluative frameworks such as Actionable feedback (2/39), psychological concepts such as Self-determination theory (3/39) and the sociological concept of Use diffusion (1/39). The second cluster comprises the design-oriented frameworks such as User-Centred Design (2/39). There is then a dominance of theory-oriented frameworks, primarily SRL. Despite this wide range of informing frameworks, there is still a noticeable number of publications not citing any framework informing their design. In order to see whether this relation between frameworks is stable, the frameworks have been mapped over publications per year (Fig. 5).

Fig. 5 Informing frameworks mapped out over publications per year

Figure 5 shows that LADs are increasingly informed by theory-oriented frameworks, primarily SRL. While there is still a stable number of dashboards not informed by any framework, an increase can be seen in the number of dashboards informed by theory-oriented frameworks. It can also be seen that other theory-oriented frameworks appeared before SRL emerged as a framework for informing dashboard design, with SRL taking the dominant position in recent years.

3.3 Affordances

This theme outlines the affordances that the dashboards offer when looked at through a student-focused lens (Table 7). This encompasses different ways of visualising data for students, as well as the ways students can interact with the dashboard. The coded affordances attend to individual parts of the dashboards, e.g., a visualisation showing progress over time (Sedrakyan et al., 2017), which may differ from the research purpose outlined in the papers, e.g., increasing dashboard use.

Table 7 Affordances of LADs

The codes for this theme were also inductively constructed, meaning that codes were not pre-determined, but developed by iteratively defining and coding the descriptions shown in Table 7.

The most common affordances are comparison (33/39), awareness (30/39) and monitoring (24/39). While comparison, awareness and monitoring are present in most dashboards, the rest of the coded items are more scarcely distributed. Here, it is interesting to note that the three most common affordances are all oriented towards describing practice – this will be further discussed under the fifth theme (Analytical Levels). The remaining codes entail items such as prediction (10/39), showing students a prediction of their final grades (Hellings & Haelermans, 2022); recommendation (8/39), recommending future actions to students (Sansom et al., 2020); feedback (4/39), giving students assessment or evaluation that is not just a grade or a scale, but e.g. written feedback (Tzi-Dong Ng et al., 2022); reflection (3/39), using text prompts to facilitate reflection; and goal setting (2/39), allowing students to set goals that they can follow up on (Winstone, 2019). When mapped out over publications per year, a trajectory appears (Fig. 6).

Fig. 6 Affordances mapped out over publications per year

Figure 6 shows that the three primary affordances (comparison, awareness, and monitoring) were also the first to be present in dashboard design, with other affordances appearing from 2016 onwards. It also shows how the student-focused LAD field is evolving, with non-descriptive items beginning to appear. Here, a distinction can be made between technically informed affordances, such as predictive visualisations, and theory-oriented affordances, such as feedback. Most of the new affordances emerging after 2017 are aimed at supporting students, e.g., feedback and recommendation, creating a clearer link back to the theory-oriented frameworks which are increasingly informing LADs.

3.4 Data sources

This theme outlines the data sources that feed into the dashboards. This means that the focus is on the data generated by, or about the student (Table 8).

Table 8 Data sources of LADs

This category was also inductively coded by iteratively defining the descriptions shown in the second column, resulting in four different codes. The most common approach is centred around activity logs from LMSs, e.g. Moodle (Sahin & Yurdugül, 2017), being coded 26/39 times. Secondly, 11/39 dashboards collect data from an external tool outside of the LMS. These can be divided into dashboards exclusively focusing on external tools, such as e-book readers (Chen et al., 2020) or online forums (Ullmann et al., 2019), and external tools supplementing the activity data from the LMS (Ramaswami et al., 2019). Students self-reporting data, e.g. on time management (Broos et al., 2020) or emotional response (Sedrakyan et al., 2017), was coded 9/39 times. This approach is the least technical way of implementing LA, as the technology is only present in the analysis, and not in the data collection. The least common approach is collecting students’ grades from administrative systems, coded only 6/39 times. When mapping out the data sources over time, no clear trajectory emerges (Fig. 7).

Fig. 7 Data sources mapped out over publications per year

Figure 7 shows that most data sources have been present throughout most of the time period, meaning that a trajectory cannot be outlined. This shows that while the affordances of the dashboards seem to be evolving over time, the inputs remain relatively stable.

3.5 Analytical levels

This theme outlines the analytical levels present in the dashboards (Table 9). This means that the focus is on how data is processed before being visualised to the student. Here, the distinction between four different levels of LA, and their descriptions, was adopted from Jayashanka et al. (2022).

Table 9 Analytical levels of LADs

The first level of analytics is the descriptive level, which is also the most common, being coded 38/39 times. This is in alignment with the affordances outlined in theme 3, showing what has happened by making students aware, allowing them to monitor over time and compare themselves to others. The second level is the diagnostic level, which is concerned with explaining why something happened. This level is coded 7/39 times, taking shape in the form of, e.g., elaborative text explaining the descriptive results (Broos et al., 2020). While this analytical level is coded seven times, it is also here that multiple papers on the same dashboard seem to be most prevalent, with the seven codes being spread out over just three different dashboards (Broos et al., 2020; De Quincey et al., 2019; Sansom et al., 2020).

When moving to the third level, predictive analytics, there is an increase in publications, with it being coded 10/39 times. This may in part be due to the above-mentioned historical perspective, but also due to an increased focus on creating a distinction between descriptive and predictive analytics (Valle et al., 2021c), leaving out the remaining levels. Lastly, prescriptive analytics are found in 6/39 studies, taking shape in the form of recommendations for next course material (Afzaal et al., 2021; Sansom et al., 2020) and changes in activity, e.g. increased reading or engagement (De Quincey et al., 2019; Susnjak et al., 2022). Here, three of the six papers are again reporting on the same dashboard (R. Bodily et al., 2018; R. G. Bodily, 2018; Sansom et al., 2020), showing that the work done with diagnostic and prescriptive analytics also tends to come from dashboards that are published on multiple times. When mapping the four levels out over publications per year, a trajectory appears (Fig. 8).

Fig. 8 Analytical levels mapped out over publications per year

Figure 8 shows that the descriptive level is the basis of student-facing LADs, being present from the beginning. In recent years the other levels have begun to appear, albeit with no diagnostic papers coded in the last two years. Prescriptive and diagnostic codes before 2019 are the result of multiple publications on the same dashboards, which is in line with the trajectories around the affordance of recommendation and the diagnostic analytical level.

4 Cross mapping themes

In RQ1, the five presented themes each show a certain trajectory, or lack thereof, when mapped out over publications per year. Attending to RQ2, the use of heatmaps allows for cross-mapping the identified themes, e.g., data sources over affordances. All 20 cross-mappings were examined, and the key cross-mappings from informing frameworks and data sources are presented here, as they provide insights which we deem valuable in relation to the design of future dashboards.
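As an illustration of how the cross-mappings can be computed, the following minimal sketch builds a co-occurrence matrix between two themes and renders it as a heatmap; the file, column names and code labels are illustrative, not the actual codes:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical coded corpus: one row per paper, one boolean column per code.
papers = pd.read_csv("coded_corpus_wide.csv")
data_sources = ["lms_logs", "external_tool", "self_reported", "grades"]
affordances = ["comparison", "awareness", "monitoring", "feedback"]

# Co-occurrence counts: papers coded with both a data source and an affordance.
cross = pd.DataFrame(
    [[(papers[d].astype(bool) & papers[a].astype(bool)).sum() for a in affordances]
     for d in data_sources],
    index=data_sources,
    columns=affordances,
)

# Render the co-occurrence matrix as a heatmap.
fig, ax = plt.subplots()
im = ax.imshow(cross.to_numpy(), cmap="Blues")
ax.set_xticks(range(len(affordances)))
ax.set_xticklabels(affordances, rotation=45, ha="right")
ax.set_yticks(range(len(data_sources)))
ax.set_yticklabels(data_sources)
fig.colorbar(im, ax=ax, label="Number of papers")
fig.tight_layout()
plt.show()
```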

4.1 Informing frameworks

When mapping theme 3 (Affordances) over publications per year, it became clear that affordances have moved beyond comparison, awareness, and monitoring. At the same time, theme 2 (Informing Frameworks) showed that LADs are increasingly being informed by theory-oriented frameworks. It is then deemed relevant to explore whether the change in affordances is connected to the change in informing frameworks. In an attempt to answer this, informing frameworks can be mapped over affordances (Table 10).

Table 10 Informing frameworks mapped out over affordances

Table 10 shows that predictive affordances mostly arise from dashboards not informed by theory-oriented frameworks. The heatmap also shows that theory-oriented frameworks inform reflection, goal setting, and recommended next actions. SRL is the dominant theory as an overall approach, but when the other theory-oriented frameworks are clustered, they seem to have slightly different affordances than the SRL-informed dashboards. A similar trajectory appears when mapping informing frameworks over analytical levels (Table 11).

Table 11 Informing frameworks mapped out over analytical levels

Table 11 supports the notion that prediction is mostly non-theory-based. It also shows, however, that while SRL is the dominant theory-oriented framework, it is mostly applied in conjunction with descriptive analytics. The use of other theory-oriented frameworks then seems to be what is paving the way for diagnostic and prescriptive analytics.

4.2 Data sources

As seen in theme 4 (Data Sources), the inputs of LADs are relatively stable over time, with most of the data coming from LMS activity logs, and most of the external tools also just integrating activity-related measures. Here, it is deemed relevant to map out data sources over frameworks and analytical levels (Table 12).

Table 12 Data sources mapped out over informing frameworks

Table 12 shows that the dashboards informed by SRL or no framework mostly use LMS data, while most of the dashboards informed by other theory-oriented frameworks use external tools. It is also interesting to note that the inclusion of students’ final grades mostly occurs in combination with SRL. Another relevant cross-mapping appears between data sources and analytical levels (Table 13).

Table 13 Data sources mapped out over analytical levels

Table 13 shows that while dashboards using LMS data are dominantly descriptive, they are also the primary utilisers of predictive analytics. Here, the dashboards using external tools seem to be more varied, with an increase in diagnostic analytics compared to the LMS category.

5 Discussion

Based on the insights derived from the mapping of themes over publications per year, and the presented cross-mappings, we will now discuss the implications of these results for moving student-facing dashboards towards student-focused dashboards. After this, we will present our design recommendations for student-focused dashboard design in HE.

5.1 Implications

5.1.1 LADs are increasingly becoming about learning

The results from this review show an increase in theory-oriented frameworks informing the design of student-facing LADs in HE. This challenges the notion that learning is missing from analytics (Guzmán-Valenzuela et al., 2021; Jivet et al., 2017). The results do not fully discredit this claim, as new dashboards not informed by theory-oriented frameworks are still emerging. They do, however, imply that dashboards informed by theory-oriented frameworks are emerging more rapidly than those that are not. A core implication of this trajectory is that dashboards informed by pedagogy can provide students with relevant insights and actions for their learning through relevant affordances and analytical levels.

5.1.2 Theory-oriented frameworks are pushing for student-focused affordances

The increase in theory-oriented frameworks also manifests itself in the affordances of the dashboards, with recent affordances appearing, such as recommendation and reflection, that support students more directly rather than just making them aware of their own activity and the activity of their peers. However, with most of the work still focusing only on comparison, awareness and monitoring, there remains a gap in the use of LADs to not only make students aware of their learning, but also support that learning - moving from description of practice to providing actionable insights – a point also concluded by Susnjak et al. (2022), who, in line with this review, show that most learner-facing dashboards only employ descriptive measures. This potentially hinders students’ ability to move beyond monitoring and into meaningful action, a point also concluded by Liu and Nesbit (2020).

5.1.3 Theory-oriented frameworks need to be reflected in dashboard design

The clustering of the theory-oriented frameworks shows that while other theory-oriented frameworks emerged first, SRL seems to have taken over as the primary theory-oriented framework informing student-facing dashboard design. SRL being the primary theory-oriented framework is in line with the literature (Jivet et al., 2017). The cross-mappings performed around the theory-oriented frameworks, however, raise the question of whether SRL is the appropriate theory-oriented framework for supporting students’ learning, at least in the way most dashboards currently apply it. Cross-mapping frameworks with affordances (Table 10) and analytical levels (Table 11) shows that the use of other theory-oriented frameworks (not SRL) seems to be what is moving dashboards towards supporting student learning, e.g., through the affordance of recommendation (Afzaal et al., 2021; Bodily et al., 2018; Bodily, 2018; De Quincey et al., 2019; Sansom et al., 2020; Susnjak et al., 2022), and the use of diagnostic analytics (Bodily et al., 2018; Bodily, 2018; Broos et al., 2017, 2018, 2020; De Quincey et al., 2019; Sansom et al., 2020), allowing students to identify what has gone wrong and take relevant action. This is not to invalidate self-regulated learning as a theory-oriented framework, as the SRL phases (Zimmerman, 2002) provide many student-supporting affordances that are in line with this trajectory, including affordances such as goal setting, which is also present in dashboards coded as informed by SRL (Tzi-Dong Ng et al., 2022; Winstone, 2019). These results then raise a need for dashboard designers to ensure that the pedagogical concepts embedded in the informing frameworks are also afforded by the final dashboard, the chosen data sources and the applied analytics.

It is also interesting to note that only 5/39 studies include a design framework. This is a topic which has recently received attention, with some authors calling for more user involvement in the design of learning analytics (Sarmiento & Wise, 2022). There is then a need to explore how theory-oriented and design-oriented frameworks may complement each other in the design of student-focused dashboards in order to ensure relevance and effectiveness.

5.1.4 Moving beyond LMS-data and descriptive analytics

Another core implication of the results derived from this review is the link between data sources and analytical levels. Throughout the LA field there has been a continuous calling for increased and more complex data integration (Samuelsen et al., 2019). The results of this review imply that the data sources feeding into the dashboards have been mostly stable, with the types of analytics applied on these inputs changing over time.

LMS data is the most common data source feeding into the dashboards (Table 8). In the early days of the LA field this was often the go-to approach, as the goal was to predict student success based on course activity (Hellas et al., 2018). This, however, seems to be changing now that the analytics are for the students instead of about the students, creating an increased focus on the process rather than the results. 15/39 studies include external data sources, and these are primarily informed by theory-oriented frameworks (Table 12) and are the primary drivers behind diagnostic and prescriptive analytics (Table 13). It is interesting to note that the SRL-informed dashboards mostly use LMS data applied through descriptive analytics, while most of the dashboards informed by other theory-oriented frameworks use external tools aimed at diagnostic and prescriptive analytics. This could indicate that a new standard for dashboard design is emerging. It also contrasts with the trajectory outlined in crossing frameworks with affordances and analytical levels, where dashboards with multiple publications seem to build on other theory-oriented frameworks applied through diagnostic and prescriptive analytics, although still with descriptive analytics present. There then seems to be a divide between new entries to the field based on SRL but only applying descriptive analytics, and repeated entries informed by a broader set of theories and realised through more student-oriented analytics.

Our results imply that external data sources are needed in order to support diagnostic and prescriptive analytics, the types of analytics that we argue are needed to support students’ learning through affordances such as feedback, reflection and recommendation. Dashboards limited to LMS data are by their nature restricted in what they can present to students, and in the degree to which they can understand and support students’ learning processes.

5.1.5 The role of predictive analytics

The outlined discussion of descriptive vs. diagnostic analytics is misaligned with the literature, which primarily focuses on the distinction between description and prediction rather than on diagnostic or prescriptive analytics (Valle et al., 2021c). When cross-mapping informing frameworks and analytical levels (Table 11), the dashboards not informed by theory-oriented frameworks seem to be pushing towards predictive analytics, while dashboards informed by theory-oriented frameworks are pushing towards diagnostic and prescriptive analytics. While 4/10 of the predictive dashboards are informed by a theory-oriented framework, the remaining six are informed by a design framework, or no framework at all. It is then vital that predictive analytics are grounded in relevant theory-oriented frameworks in order to ensure that a divide does not occur between analytics-driven prediction dashboards and theory-oriented dashboards.

5.2 Recommendations for student-focused dashboard design

Based on our results, we outline a series of recommendations informing the design of future dashboards aimed at students in HE. These suggestions tie in with previous work by Jivet et al. (2018) and Bodily and Verbert (2017). Our recommendations translate the different mappings into core suggestions, strengthening the link between the learning sciences and analytics that Ferguson (2012) identified as one of the core challenges for LA.

  • Dashboards should build upon existing literature in order to address the identified surge of recent papers primarily applying SRL with descriptive analytics, which our results have put into question.

  • Theory-oriented frameworks should be applied to ground affordances, data sources and analytical levels in pedagogical concepts relevant to the learning activity/environment.

  • Affordances should be in alignment with the chosen framework – For instance, if SRL is selected, the affordances should support different SRL phases/concepts.

  • Affordances should go beyond comparison, awareness, and monitoring – They should encompass tools that facilitate reflection and action, such as feedback, recommendation, and planning.

  • Relevant data sources should be identified to provide the necessary measures for the affordances derived from the chosen frameworks – This allows for linking data measures to learning constructs.

  • Dashboards should go beyond descriptive analytics – Our findings suggest the need for diagnostic analytics to support reflection, and prescriptive analytics to support action.

  • Predictive analytics should be incorporated in accordance with a theory-oriented framework, rather than being based solely on a technical justification.

While learning analytics dashboards still seem to be in an exploratory state, as supported by the technological maturity of the dashboards spreading rather than maturing, we believe that these recommendations can pave the way for supporting the emerging trajectories towards student-focused dashboard design in HE.

6 Conclusion

This review has highlighted the current themes and emerging trajectories in the design and implementation of student-facing learning analytics dashboards in higher education. The results show an emerging trajectory towards directly supporting students’ learning through dashboards that incorporate multiple data sources and are rooted in diverse theory-oriented frameworks. This trajectory has demonstrated the importance of a pedagogical approach to the design of student-facing learning analytics dashboards in higher education, as well as the need for the integration of multiple data sources and analytical levels to provide a deeper understanding and better facilitation of students’ learning processes. By attending to this trajectory, student-focused learning analytics dashboards have the potential to transform the way that students engage with digitally supported learning in higher education.