Introduction

Making productive use of the growing output of educational data systems presents a novel challenge for scientists as well as for educators and learners. In the context of education, learning analytics, which refer to the use of static and dynamic data from learners and their contexts to improve learning processes and educational decision-making (Ifenthaler 2015), offer a range of opportunities for supporting learning and teaching, formative and summative assessment, and the improvement of learning design. For instance, using new modes of analysing and modelling data, data sets can be processed in real time and presented to learners during their learning process (Ifenthaler et al. 2018a, b). Furthermore, learning analytics can be used to inform and influence decisions at different levels of the educational system (e.g. micro, meso, macro) to improve individual and organisational learning and performance. Various stakeholder groups, such as learners, teachers/educators, instructional designers, institutional leaders, scientists, and public as well as private providers, already draw on learning analytics or are contemplating ways to make use of these rapid developments (Baker and Siemens 2015).

Although research in recent years has shown how educational practice might benefit from learning analytics, these new opportunities are accompanied by a range of new challenges (Gibson and Webb 2015). Among these are, for instance, growing concerns related to privacy and usability (Heath 2014; Ifenthaler and Schumacher 2016). With large data sets available to teachers, learners, and other educational stakeholders, questions of data ownership as well as of the processing and availability of specific data and data types for different user groups have to be addressed (e.g. which data, at which level of aggregation and interpretation) (Hoel and Chen 2018; Jones 2019). In these changing contexts, it is important to ascertain the exact needs and interests of the different user groups (Ifenthaler and Schumacher 2019; West et al. 2016a, b) for two reasons. On the one hand, the interests of different users shape the objectives for the use of data (e.g. learning support, placement in a certain study context, evaluation of an instructional setting). On the other hand, these varying interests determine the value assigned to data and its consequences for learners, teachers and educators, depending not only on their personal competencies in making use of the data but also on their understanding and beliefs regarding learning, knowledge acquisition, and educational improvement in general (Howell et al. 2018).

In recent years, the incorporation of learning analytics into educational practice and research has further developed. However, while new applications and approaches have brought forth new insights, there is still a shortage of research addressing the effectiveness and consequences of these endeavours, especially with regard to the support of learning processes (Vieira et al. 2018). In what follows, we discuss recent developments and trends in learning analytics and identify important issues for educational stakeholders, including researchers. The narrative first summarises existing research on the impact of learning analytics on supporting learning and teaching and indicates key gaps. Second, we describe the results of a Delphi study involving international expert discussions about current key trends and areas of development of learning analytics. Third, we discuss these findings, outline actions for stakeholders such as policy makers, researchers, and practitioners, and propose a research agenda to close the identified research gaps.

Background

Conceptual positioning of learning analytics

Learning analytics have been defined as the use of static and dynamic information about learners and learning environments, eliciting, assessing and analysing it for the real-time modelling, prediction and optimisation of learning processes and learning environments, as well as of educational decision-making (Ifenthaler 2015). The primary aim is to better meet students’ needs by offering individual learning paths, adaptive assessments and recommendations, or adaptive and just-in-time feedback (Gašević et al. 2015; McLoughlin and Lee 2010), ideally tailored to learners' motivational states, individual characteristics, and learning goals (Schumacher and Ifenthaler 2018b). Despite the central role that learners and learning processes play in this definition, learning analytics frameworks (Chatti and Muslim 2019; Greller and Drachsler 2012; Ifenthaler 2015) often also include the utilisation of learning analytics by other educational stakeholders (e.g. teachers, educational administration). Hence, definitions of learning analytics vary in how tightly they are linked to, or embedded in, more general models of learning science, data science, or educational assessment (Marzouk et al. 2016).

Learning analytics generally rely on information about learners’ behaviour in the digital learning environment (i.e. trace data). Such data capture, for example, the timing, context, and sequence of different actions, such as the use of certain strategies, the posting of certain comments, or the retrieval of specific learning materials at given times during the learning process, and thus allow these ‘traces of learning’ to be analysed through sequence and pattern analysis or social network analysis (Baker and Siemens 2015; Berland et al. 2014; Dawson et al. 2011; Webb et al. 2018). Data might also be derived through more ‘traditional’ methods (Blikstein and Worsley 2016; Gibson et al. 2019), such as self-report measures, or obtained from open language-based formats, such as reflective thoughts in chats, blogs or essays, which can be put through digitally assisted analysis such as natural language processing (Gurevych and Kim 2013). Furthermore, these already highly complex data sets can be supplemented with information about learners' individual characteristics and might include further external data such as social interrelations or physical data (Berland et al. 2014; Ifenthaler and Widanapathirana 2014). In harnessing the potentials described above, learning analytics can yield several advantages. Among these are improved quality and validity in capturing relevant information about learning and learning products, and the immediate processing and presentation of information, which might foster interactive learning processes between learners, teachers and learning environments (Seufert et al. 2019; Shute and Rahimi 2017).
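To make the notion of trace data more concrete, the following minimal sketch (in Python, with purely illustrative event names and learner identifiers, not data from any study cited here) shows how raw log events from a digital learning environment might be reconstructed into per-learner action sequences and summarised with a simple transition count, one elementary form of the sequence and pattern analysis referred to above.

```python
from collections import Counter, defaultdict
from datetime import datetime

# Hypothetical event log: (learner_id, timestamp, action) records such as a
# digital learning environment might export; all values are illustrative.
events = [
    ("s01", "2023-03-01T09:00:00", "open_video"),
    ("s01", "2023-03-01T09:12:00", "post_comment"),
    ("s01", "2023-03-01T09:20:00", "take_quiz"),
    ("s02", "2023-03-01T10:02:00", "open_video"),
    ("s02", "2023-03-01T10:30:00", "take_quiz"),
]

# Reconstruct each learner's time-ordered action sequence ('traces of learning').
sequences = defaultdict(list)
for learner, ts, action in sorted(events, key=lambda e: (e[0], datetime.fromisoformat(e[1]))):
    sequences[learner].append(action)

# Elementary pattern analysis: count which action tends to follow which.
transitions = Counter()
for actions in sequences.values():
    transitions.update(zip(actions, actions[1:]))

print(dict(sequences))
print(transitions.most_common(3))
```

In practice such sequences would feed into richer models (e.g. process mining, Markov models, or social network analysis of interaction data), but the basic step of turning time-stamped events into ordered traces is common to most of the approaches cited above.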

Current learning analytics approaches focus on indicators based on behaviour in the digital learning environment (Yau and Ifenthaler 2020), such as time spent online, access to various types of resources, or the reading and writing of posts, and relate them to learning performance (Ifenthaler et al. 2019; Mah 2016; Martin and Whitmer 2016). Only a few approaches are enriched with learner characteristics such as demographic data or assessment results, for instance to predict study success (Costa et al. 2017; Vieira et al. 2018). In a literature review focusing on visual learning analytics, Vieira et al. (2018) found that most studies analyse resource usage in particular, with only a few taking a process perspective by trying to understand learning paths or learners’ progress. Furthermore, in some cases learning analytics might yield only limited insight into students' learning because the indicators collected are not pedagogically valid. For instance, specific indicators such as ‘time on task’ might have different meanings depending on the learning context (Goldhammer et al. 2014). Likewise, not all learning processes take place within the digital learning environment or can be captured with trace data (Wilson et al. 2017; Winne 2017). Hence, meaningful analysis of data requires sound theoretical grounding and modelling as well as verification of validity, gained for instance in complex evidence-based design processes (Marzouk et al. 2016; Shute et al. 2018; Wong et al. 2019).
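The ambiguity of such indicators is visible in how they are computed. The sketch below (hypothetical data and a hypothetical idle-time cap, not a validated measure) derives a naive ‘time on task’ value from time-stamped events; whether a long gap between events counts as focused work or as distraction is an assumption built into the analysis rather than a property of the data.

```python
from datetime import datetime

# Hypothetical session events for one learner: (timestamp, resource) pairs.
session = [
    ("2023-03-01T09:00:00", "task_intro"),
    ("2023-03-01T09:05:00", "task_editor"),
    ("2023-03-01T09:45:00", "task_submit"),  # 40-minute gap: deep work or a coffee break?
]

IDLE_CAP_MINUTES = 30  # assumed threshold; any choice encodes a pedagogical judgement

def time_on_task(events, cap=IDLE_CAP_MINUTES):
    """Sum the gaps between consecutive events, capping long idle periods."""
    times = [datetime.fromisoformat(ts) for ts, _ in events]
    return sum(min((later - earlier).total_seconds() / 60, cap)
               for earlier, later in zip(times, times[1:]))

print(f"Estimated time on task: {time_on_task(session):.0f} minutes")
```

Changing the cap from 30 to 60 minutes changes the estimate from 35 to 45 minutes for the same trace, which illustrates why such indicators need theoretical grounding and validation before being related to learning performance.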

Learning analytics might be used at different levels and for different forms of educational assessment, which has far-reaching consequences for modelling and analysing data as well as for developing criteria to evaluate their impact. From an assessment perspective, learning analytics for formative assessment refers to the generation and interpretation of evidence about learner performance by teachers, learners and/or technology to make assisted decisions about the next steps in instruction (Ifenthaler et al. 2018a, b; Spector et al. 2016). In this context, real-time or near-real-time data are extremely valuable because of their benefits in ongoing learning interactions, for example for awareness and reflection. Learning analytics for summative assessment are used to make judgements that are typically based on standards or benchmarks and can, for instance, be used to classify and compare learners, classes, or institutions (Black and Wiliam 1998). Furthermore, learning analytics can also provide predictive or prescriptive insights for decision-making (Ifenthaler 2015), which can take on different forms depending on the aims and interests of different stakeholder groups in the educational system (e.g. learning recommendations, identification of learners at risk, allocation of resources for programs and interventions) (Sclater and Mullan 2017). In practice, but also in research, these different assessment perspectives and consequences are often intertwined, and it would be beneficial to explicitly locate certain learning analytics applications within this educational assessment space (Webb and Ifenthaler 2018).

Empirical evidence from learning analytics research

Research focussing on learning analytics is still rapidly evolving, with most of the respective implementations located in the UK, the USA and Australia (Ifenthaler et al. 2019; Sclater et al. 2016). Although there has been an increase in related research efforts in the last five years (Gašević et al. 2015, 2017, 2019; Rienties and Toetenel 2016; Tempelaar et al. 2015), large-scale studies regarding the effectiveness of learning analytics are still lacking (Mah et al. 2019; Wong et al. 2019). One of the main aims of learning analytics research is to improve awareness, reflection and regulation during learning processes as well as learning performance and the design of learning environments (Tsai et al. 2019). This section reports key empirical findings which are closely related to putting learning back into learning analytics.

For instance, Perry and Winne (2006), building on the concept of self-regulated learning (SRL), developed the tool ‘gStudy’, which uses log-file analysis to give learners feedback for regulating their learning behaviours. In a more recent study, Goda et al. (2015) investigated the relationship between learning patterns and learning performance in computer-assisted language learning with respect to learners’ self-regulation strategies and procrastination behaviours. An analysis of the learning logs of university students revealed seven different learning patterns: 70% of learners were classified as ‘procrastinators’ and only about 7% belonged to the ‘learning habit’ group, with the former showing significantly lower learning performance than the latter. Using a technology-driven research approach to learning analytics, Shimada et al. (2018a, b) developed an automated summarisation tool for lecture slides and investigated the effects of using that tool on the learning performance of university students. The findings indicated that students who used the summarisation tool learned more efficiently and gained higher scores than non-users.

Real-time analytics in learning systems are thought to be especially effective for improving learning, for instance in areas like reading and writing, where learners can profit from immediate feedback (Whitelock and Bektik 2018). Shimada et al. (2018a, b), for example, developed a system that tracks and analyses online reading in real time. In an experimental study, two groups of classes were compared in which the teacher either used or did not use the real-time analytics system. The findings revealed that teachers found the system made it easier to adapt their teaching to students’ needs, for instance by adjusting lecture speed or prompting students to use additional learning strategies.

In order to demonstrate the potential of learning analytics design for improving learning, Ifenthaler et al. (2020) used navigational sequences and network graph analyses in a case study with N = 3550 learners as well as in linked follow-up studies, identifying the most-used paths and characterising the simplicity-to-complexity of paths and learning affordances as well as the topological structure of the learning environment. Even with open-ended freedom of choice in the initial study, learners evidenced only 608 sequences out of hundreds of millions of possible sequences. Another recent network analytics study led to advanced metrics of team collaboration by situating generic network measures in the specific context of collaborative teamwork in structured problem spaces (Ifenthaler et al. 2018a, b; Lin et al. 2016).
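The kind of network graph analysis described above can be sketched with standard tooling. The following illustration (not the pipeline of the cited studies; the resource names and sequences are hypothetical) uses the networkx library to turn navigational sequences into a weighted directed graph, from which most-used transitions and simple topological measures can be read off.

```python
from collections import Counter
import networkx as nx

# Hypothetical navigational sequences: ordered lists of resources visited by learners.
sequences = [
    ["intro", "video_1", "quiz_1", "forum"],
    ["intro", "video_1", "forum"],
    ["intro", "quiz_1", "video_1", "quiz_1"],
]

# Count how often each transition between resources occurs across learners.
edge_counts = Counter()
for seq in sequences:
    edge_counts.update(zip(seq, seq[1:]))

# Build a weighted directed graph describing the navigational structure.
G = nx.DiGraph()
for (src, dst), weight in edge_counts.items():
    G.add_edge(src, dst, weight=weight)

# Simple topological descriptors: most-used transitions and central resources.
print("Most-used transitions:", edge_counts.most_common(3))
print("Degree centrality:", {n: round(c, 2) for n, c in nx.degree_centrality(G).items()})
```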

Teasley (2017) investigated the effects of a learning dashboard in terms of feedback interventions. The findings indicate that learning dashboards can be especially effective in enhancing self-awareness and reflection when learners receive continuous feedback and intervention from learning experts. Another study examining the effects of dashboards is by Bodily et al. (2018), who investigated learning dashboards including content and skill recommender systems for a blended chemistry course. The findings showed that 79% of the university students who used the dashboard perceived positive effects, but 25% of all students never used it. Bodily et al. (2018) concluded that meaningful feedback in digital learning environments is of crucial importance since it improves the acceptance and use of learning dashboards (Roberts et al. 2017; Schumacher and Ifenthaler 2018a).

Recent studies also explore how instructors and students feel about various analytics opportunities and how this influences their use for learning. For instance, Howell et al. (2018) found that stakeholders hold expectations concerning learning (e.g. one must remember and perform on one’s own without scaffolds) as well as teaching (e.g. too much scaffolding coddles learners) and base their judgements on these expectations.

Ifenthaler (2017a) investigated how the benefits of learning analytics are perceived at academic institutions. The findings revealed that learning data such as learning time or learners’ prior knowledge are considered important by stakeholders, and that learning facilitators, students, and learning designers are seen as the main beneficiaries of learning analytics. LMS managers and learning designers are also common users of learning analytics data, but many institutions had not assigned specific roles such as a learning analytics specialist. These findings from a higher education context provide useful perspectives for the implementation of learning analytics in various forms of educational organisation. Hamada et al. (2020) suggested a set of important aspects forming part of a learning analytics cycle for learning improvement in education, including regular learning analytics events with stakeholders for reflection, establishing learning design supporters, and changing teachers’ traditional teaching beliefs.

To sum up the literature, positive evidence has been found for the use of learning analytics to support learning (Ifenthaler and Yau 2020). However, for learning analytics in practice, further evidence is required not only about the effects of learning analytics systems (e.g. learning dashboards) but also about the communication of data to decision-makers who will use learning analytics, such as teaching assistants and department heads. System development research remains as important as before, ideally with an emphasis on user-centred design in development processes and methods (Gibson and Ifenthaler 2020). There is still a need for more evidence concerning the link between learning analytics, intervention measures and indicators that facilitate learning (Ifenthaler et al. 2019; Wong et al. 2019; Yau and Ifenthaler 2020). Therefore, this study used a Delphi design with a panel of international experts to investigate current trends in research, practice and policy focussing on analytics for learning, as described next.

Delphi study

The purpose of this study was to identify global trends and/or institutional developments and practices in the educational arena which are related to learning analytics. The study employed a Delphi method (Brown 1968; Scheibe et al. 1975) to arrive at a consensus on a set of important trends among a panel of international experts from higher education institutions and industry.

Methodology

The Delphi method is a robust approach for determining forecasts or policy positions considered to be the most essential (Scheibe et al. 1975). A Delphi study can be conducted using paper-and-pencil instruments, computer- or web-based approaches as well as face-to-face communication processes. For this study, the researchers applied a mixed Delphi design including (a) computer-based and (b) face-to-face communication methods.

In the first phase, using the computer-based method, a panel of international experts in learning analytics was invited to submit five trends and/or institutional practices in the educational arena related to learning analytics. The initial list of trends was then aggregated through expert agreement, resulting in a final list of five important areas of development in learning analytics.

In preparation for the second, face-to-face discussion phase, participants were asked to provide three relevant scientific literature resources related to the five areas of development identified in the first phase and to explain their contribution to the respective area of development. Next, participants met face-to-face for a two-day workshop. During the face-to-face session, the experts discussed and came to a consensus on several trends, challenges and conclusions with respect to research gaps and important implications for educational stakeholders, including policy makers and practitioners.

A total of N = 12 participants from higher education institutions (nhe = 10) and education organisations (neo = 2) took part in the study. The international experts had at least five years of experience in research and development in learning analytics and were based in Australia (n = 2), Canada (n = 1), Finland (n = 1), France (n = 2), Germany (n = 1), Japan (n = 2), Switzerland (n = 1), UK (n = 1), and USA (n = 1).

Results

The first phase (i.e., computer-based method) resulted in a preliminary list of N = 40 trends and/or institutional practices from the educational arena which are related to learning analytics. The results of the computer-based method are shown in Table 1. The trends and/or institutional practices identified can be categorised as either learning sciences related (e.g. assessment and feedback, adaptive learning), technical challenges (e.g. dashboards, data models), or policy issues (e.g. privacy and ethics regulations).

Table 1 Trends related to learning analytics identified in the computer-based method

The following final list of five important areas of development of learning analytics was determined through expert agreement:

  • Dashboards and visualisation methods and tools for learning and teaching, adaptive and real-time systems, assessment analytics;

  • Evidence-based practices of learners, teachers and other educational stakeholders using learning analytics including educational data literacy;

  • Conceptualisation and technological implementation of the relationship between instructional design and learning analytics, course and curriculum analytics;

  • Combining different data types, data models, data resources and analytics methods, standardised variables, AI and methodology;

  • Role of vendors in analytics solutions, adoption and implementation of analytics systems.

In the second phase (i.e. the face-to-face communication method), the experts reflected on trends, themes and challenges in these areas of development. They identified and discussed various missed opportunities for the effective use of learning analytics systems to drive improvements in student learning and success at scale, with corresponding impacts on society as a whole, attributing these missed opportunities to a number of problems, tensions and barriers. During the face-to-face discussion, the experts agreed that (a) there is a widespread lack of knowledge and understanding regarding learning analytics and the need to select and use learning analytics systems to support learning, teaching and assessment, track progress and inform decision-making. Further, the experts suggested that (b) guiding principles and policies need to be updated to help institutions make use of learning analytics. In addition, the experts pointed out that (c) standards are needed for the ethical design and use of learning analytics systems by educational data services providers and users, ensuring quality (e.g. auditing, transparency, reporting), sustainability and scalability. The experts also recommended (d) flexible, user-centred tools for different learning levels, ages and stakeholder groups in their unique educational contexts. Last, the experts emphasised (e) the need to apply and advance educationally relevant research-based knowledge to:

  • Engage key stakeholders of learning (e.g., students, parents, teachers, school leaders);

  • Create and make ethical use of rich data models and methodologies to advance learning;

  • Integrate instructional theory, design and delivery with analytics data and insights;

  • Safeguard security, privacy and control of data;

  • Understand the impacts of combining data types from all sectors (i.e., health, socio-emotional, SES, etc.) on interactions with the individual.

Discussion

Initial learning analytics approaches were limited to analysing trace data or web statistics in order to describe learner behaviour in online learning environments (Veenman 2013). With increased investigation of educational data, potentials for a broader educational context have been recognised, such as the identification of potential dropouts from study programmes (Sclater et al. 2016). More recently, research on learning analytics has seen an extensive diversification of the initial approaches (Prieto et al. 2019). However, an emphasis on supporting learning processes whenever the learner needs it appears to be underrepresented (Ifenthaler and Yau 2020). Therefore, this project critically reflected on how to put learning at the centre of learning analytics by (a) undertaking a literature review focussing on the impact of learning analytics on supporting learning and teaching, and (b) conducting a Delphi study involving international expert discussion of current opportunities and challenges of learning analytics. In this discussion section we summarise the findings by (1) reporting the identified alignment issues and challenges, (2) suggesting six strategies and actions for the stakeholders involved, and (3) outlining a possible research agenda for closing the identified research gaps.

The Delphi study resulted in a list of key trends/areas of development that provide a frame of reference for the development of adoption strategies (Gibson and Ifenthaler 2020) and change management processes (Ifenthaler 2020) for educational organisations. Although the field of learning analytics receives a lot of attention for its capacity to provide lead indicators of potential student failure and support learning processes (Dawson et al. 2017; Gašević et al. 2015; Joksimović et al. 2018), it has to date primarily focused on individual courses in isolation (Gašević et al. 2017), rather than the capabilities of higher education institutions as learning organisations as a whole (Ifenthaler 2020; Knobbout and van der Stappen 2020). The implementation of learning analytics at higher education institutions may have broad implications for the organisation (e.g. technological infrastructure, policies and regulations) and its stakeholders (e.g. students, academic staff, administrators) including changes in learning culture and educational decision-making (Hilliger et al. 2020). However, an international study on the readiness of higher education institutions for adopting and implementing learning analytics identified various deficits on organisational, technical, and staff level (Ifenthaler 2017a). Only a small number of higher education institutions meet the high staff and technology requirements for implementing actionable learning analytics frameworks (Kevan and Ryan 2016).

Reflecting on the key trends and areas of development of learning analytics, the expert panel identified emerging learning analytics alignment issues and challenges, which need to be addressed and resolved in order to develop and implement learning analytics that effectively improve learning. These are described in the next section.

Alignment issues and challenges

In the initial phase of learning analytics adoption (Prieto et al. 2019), several alignment issues and challenges emerge from a variety of problems, tensions, barriers and missed opportunities for the effective use of learning analytics systems. These consequently impede improvements in student learning and success at scale, along with their corresponding educational impacts on society as a whole (see Table 2).

Table 2 Issues, challenges and recommended actions

(1) There is a widespread lack of knowledge and understanding regarding learning analytics and the concomitant need to select and use learning analytics systems for supporting learning, teaching and assessment; tracking progress; and informing decision-making.

How should data inform practice? Who has the capacity to analyse big data, and for whom is the data analysed? Ethicists have pointed out that the aims, actions and actors in an educational setting form a complex context of overlapping and sometimes competing interests (West et al. 2016a, b). This implies that a certain level of literacy needs to be achieved by all stakeholders in the system in order to support informed decision-making. What knowledge and skills are needed to understand the role of new data science methods and to fit these with conventional qualitative and quantitative traditions of research? Some writers have called for a re-examination of the foundations of educational research in order to introduce data science methods into an open space that can potentially integrate qualitative and quantitative methods with AI-driven computational assistance and assistants (Nouri et al. 2019). Others have pointed out the current status of, and existing gaps in, the readiness of higher education to leverage learning analytics (West et al. 2016a, b). In particular, what do students need to know to understand and be critical consumers of their own data and that of others? What can teachers do with all these data in their teaching activities, and what feedback and monitoring of learning might students expect from learning analytics?

(2) Guiding principles and policies need to be updated to help institutions make use of learning analytics.

Learning analytics can provide three kinds of information to students and teachers: summative, real-time or formative, and predictive or prescriptive insights from information prepared for decision-making and action (Ifenthaler 2015). Today, with the emerging potential to map sequences of the tools, communications and information utilised to solve a problem, the capability to build dynamic networks of the relationships of collaborating team members, and the computational resources to automatically classify and adapt curriculum materials in response to user interactions, the fields of learning design and analytics can be brought together as a new field of ‘learning analytics design’ (Ifenthaler 2017b; Lockyer et al. 2013). The new field integrates learning or instructional design informed by data analytics and the design of interactive learning analytics dashboards guided by learning design. Advancements in learning analytics design have the potential to map the cognitive, social and physical states of the learner and to optimise learning environments on the fly (Ifenthaler et al. 2018a, b). Three analytics layers have been proposed for data-informed learning design (Hernández-Leo et al. 2019): (a) analytics with a focus on learning decisions to be made by the learner (e.g. has the design helped someone to learn), (b) analytics for decision-making by designers and teachers-as-designers (e.g. what aspects of the learning design were effective), and (c) analytics of the impact of community-based pedagogy for teachers (e.g. co-design of learning, peer learning).

(3) Standards are needed for the ethical design and use of learning analytics systems by educational data services providers and users for ensuring quality (e.g. auditing, transparency, reporting, security, privacy, compliance, sustainability, and scalability).

One of the main concerns of learning analytics applications is the handling of data privacy issues (Prinsloo and Slade 2014). As almost every learning analytics feature collects and processes user data by default, learning analytics designers need to consider each country’s data privacy legislation, such as the European General Data Protection Regulation (EU-GDPR). A principle of learning analytics advanced by several authors is that a person cannot be fully understood through their data trail, no matter how much that data improves and broadens; that is, a digital data trail will never encompass all individual characteristics of a person and may also neglect the context of data collection (Prinsloo and Slade 2014). Such issues have been documented in recent research studies regarding privacy issues and ethical dilemmas in learning analytics (Ifenthaler and Schumacher 2016, 2019; Slade and Prinsloo 2013; West et al. 2016a, b). However, it is also well understood that the improvement of automated decision-making, personalisation of learning and adaptation of the curriculum requires a complex, multifaceted and distributed data model of the learner (Behrens et al. 2012). Many questions remain regarding the features and constraints of such a model, how to deploy relevant features as needed in different contexts, and how to re-integrate features into more complex and dynamic pictures of learning progress and achievement.

(4) Flexible and user-centred tools are needed for different learning levels, ages and stakeholder groups in their unique educational contexts.

Real-time analytics are increasingly feasible, for example as support systems for teaching. Research has reported on systems that track and analyse online reading as lecture support services (Shimada et al. 2018a, b), student response systems for attention and engagement (Heaslip et al. 2014), and dashboards that visualise student progress and achievement (Kokoç and Altun 2019; Roberts et al. 2017; Schumacher and Ifenthaler 2018a). Dashboards can be powerful learning tools for both teachers and learners if developed with user-centred design, for example offering functions that help teachers interpret learning data before making decisions (Roberts et al. 2017).

The expert panel identified further learning analytics alignment issues which deserve additional investigation: (a) literacy, fluency and control over data are linked; (b) global differences in learning analytics impact uses, meanings, and methods; (c) advancing educational research is needed for analytics theory and methodology; (d) bridging data science and learning science requires multidisciplinary collaborations and integrated frameworks from these fields of research.

Proposed strategies and actions

Considering the identified issues of alignment and the challenges facing education worldwide with the advent of learning analytics, six strategies and actions are hereby proposed for three key stakeholder groups: policy makers, researchers and practitioners.

Evidence-based practice led by analytics

According to the expert panel data, in order to make the most of analytics for learning, researchers need to deepen the knowledge base so as to shape new practices that lead to positive impacts on learning. Policymakers can then develop, with added confidence, learning analytics policies that focus on leadership, professional learning, enabling mechanisms, and data governance. Practitioners need these two measures, the deepening of research-based knowledge and the building of professional practice policies on that knowledge, to take effect before they can develop sufficient skills and confidence to utilise practices led by evidence-based insights from learning analytics. The focus on analytics for learning is a critical commitment that must be maintained in this strategy. All stakeholders need to work in concert to ensure open access to the required resources and best practices so that everyone can benefit educationally. These three measures within the strategy of informing evidence-based practice with learning analytics insights directly address the widespread lack of knowledge regarding the support of learning, teaching and assessment.

Promote the adoption of learning analytics

Two actions in particular promote the adoption of learning analytics, according to the expert panel. Practitioners can take the lead within their schools in enabling local organisational change, which can, in turn, support teachers, school leaders, students and the parent community in appreciating and advocating for learning analytics in learning. Local action and readiness for cultural change should precede the development of local policy, because they set the stage for acceptance, support the stages of adoption, and help guide the later development of standards, principles and procedures by policymakers. These actions also address the challenge of updating principles and policies by engaging the impacted communities in the continual process of adapting and improving the organisational response to change.

Inform and guide data services providers and users

Trustworthy, ethical learning analytics practices are supported by policy mechanisms such as standards, accreditation processes, audits and evidence-based recommendations informed by practice. Researchers play a critical role here in promoting the sustainability and scalability of policy and practice, for example by producing the knowledge needed to effectively embed analytics and provide just-in-time data services that support good decision-making focused on learning. This strategy of wisely balancing investment in both data services and users supports both the supply and demand sides of the flow of information, which accelerates adoption and positive change.

Impact learning via analytics tools

A core implication of the expert-panel data is the consensus that learning analytics should focus first on its use ‘for learning,’ which contrasts with other potential foci such as ‘accountability,’ ‘testing,’ ‘organisational change’ or ‘financial efficiency.’ These alternative objectives have powerful forces aligned to advocate for them; but only the use of ‘analytics for learning’ will help achieve the most equitable and effective educational system. All stakeholders, including practitioners, researchers and policymakers, need new levels of data literacy to use the new tools of educational decision-making that leverage the knowledge, understanding and capabilities of dynamic learning analytics information flowing through the complex systems of education. A second measure under the ‘for learning’ strategy is to provide specific user-centred analytics tools for different stakeholders (e.g., age groups, learning levels), using evidence-informed context and impact insights, again with a focus on enhancing learning, as opposed to other goals. This strategy acknowledges that flexible user-centred tools are needed for all audiences, but the expert group specifically advocates to prioritise the learner-as-audience.

Leverage the relationship between instructional design and learning analytics, and then extend to course and curriculum analytics

The expert-group data and discussion have highlighted some of the benefits of using learning analytics to inform the advancement of instructional design for quality learning, teaching and assessment. These include transparency of information via new near-real-time dashboards and other flexible, user-centred tools that bring actionable insights directly to the learner and the instructor. For designers of learning experiences, materials, media and activity organisation, this new era of data makes it possible to see not only the impacts of a design on outcomes but also its impacts on the processes of learning and teaching. New, flexible re-design approaches include offering multiple paths for progress and a stronger role for learner and instructor choices during the processes of learning. Maintaining the focus on enhancing learning (as opposed to maximising other goals of education and schooling) requires actions by all stakeholders that enable multidisciplinary and participatory research for quality assurance as well as for keeping pace with the technology lifecycle of enabled learning environments.

Combine data types from several sectors (e.g. health, socio-emotional, SES, etc.) to improve interactions with individuals; improve data models and leverage AI and related technologies

A Chinese symbol, ‘challenge and opportunity’, could aptly denote the coming era of learning analytics. Big data and artificial intelligence bring into focus the inherent conflicts of goals across the complex system of education. For example, the goals of school accountability and efficiency are not in sync with the goal of high standards of learning for all, nor with the individual needs of particular people. Strong and focused actions are needed that provide data privacy and security in the context of interoperability; for example, to ensure that the use of health data, socio-economic data, behavioural, social-emotional and academic data actually advances learning goals rather than other goals of education. This strategy has to guarantee that the control and ownership of data are clear, transparent and in the hands of the person who is the subject of the data.

Research agenda

The presented Delphi study and expert panel discussion, together with recent research on learning analytics, outline a broad, high-level research agenda with four themes and six strategy and action areas. These indicate the knowledge requirements underpinning the organisational and educational system changes needed to put learning back into learning analytics. Within each research strand there are numerous possibilities for more detailed exploration, for the discovery of basic facts and theories in the learning sciences, and for areas where major development efforts are needed, with many currently underway.

The four challenges suggest areas for future research that combine what is known about the adoption of innovations and organisational behaviour with some of the unique new challenges of big data and complexity in the science of learning, where data collection and affordances for search, exploration and expression are enhanced by linked technologies. The expert panel saw implications for unique insights, new ways to intervene in a timely manner during the process of learning, and new powers to provide formative information to the key actors: learners and teachers. Research questions arising from this discussion can be organised by the four primary issues and challenges:

  1. Need for knowledge to select and use analytics for learning-focused decision-making.

  2. Need for guiding principles and policies for institutional practices that enhance learning.

  3. Need for standards for ethical use of learning analytics.

  4. Need for flexible, user-focused analyses focused on enhancing learning.

Conclusions

As outlined above and discussed in the context of the literature and an international panel of experts, the four primary issues and challenges are best addressed by an overlapping system of action recommendations. The mapping is many-to-many: one action can impact more than one challenge and, vice versa, one challenge area can interact with and be impacted by more than one action. In addition, there are cross-cutting impacts within and across the actions. Impacting all stakeholders’ knowledge and awareness, for example, requires content that is surfaced by the actions supporting the development and use of the guiding principles and their related actions.

In conclusion, as new methods and models for data analysis and data representation (e.g. advanced statistics, dashboards, graphical visualisations of learner paths, semantic graphs or social networks, timing diagrams, etc.) continue to be elaborated and put to use, it is critical that stakeholders are supported in understanding these methods and models in order to know what is involved and to act accordingly. To enable effective action, related literacies, in particular graphicacy and educational data literacy, should be promoted. Graphicacy refers to the way in which spatial information is communicated other than by words or numbers alone (Boardman 1990). Educational data literacy is defined as the ethically responsible collection, management, analysis, comprehension, interpretation, and application of data from educational contexts (Ifenthaler and Yau 2020). With more and more learning analytics systems becoming available, some teachers may start using data to inform their practice while their learners may have access to analytics performance information that may help them set their own pace and objectives. However, education institutions have different practices and regulations regarding data sharing and the use of processed data. Some institutions, for example, allow commercial providers to access data, but the level of trust in sharing data between institutions and providers currently varies widely (Ifenthaler and Schumacher 2016; Klein et al. 2019).

Hence, putting learning back into learning analytics requires a complex set of actions and strategies for policy makers, researchers, and practitioners. We hope to have provided the reader with information and ideas for starting conversations, engaging with others, and reflecting within a framework, knowing that evidence for each aspect of this system is mounting in the research literature and in the professional practice of using analytics for learning.