This paper is based on (a) a literature review focussing on the impact of learning analytics on supporting learning and teaching, (b) a Delphi study involving international expert discussion on current opportunities and challenges of learning analytics, as well as (c) the outline of a research agenda for closing identified research gaps. Issues and challenges facing educators linked to learning analytics, together with current research gaps, were organised into four themes, which the expert panel developed further into six strategy and action areas. The four themes are 1. development of data literacy in all stakeholders, 2. updating of guiding principles and policies for educational data, 3. standards needed for ethical practices with data quality assurance, and 4. flexible user-centred design for a variety of users of analytics, starting with learners and ensuring that learners and learning are not harmed. The strategies and actions are outcomes of the expert panel discussion and are offered as provocations to organise and focus the researcher, policymaker and practitioner dialogues needed to make progress in the field.
Making productive use of the growing output of educational data systems presents a novel challenge for scientists as well as educators and learners. In the context of education, learning analytics, which refers to the use of static and dynamic data from learners and their contexts for the improvement of learning processes and educational decision-making (Ifenthaler 2015), offer a range of opportunities for supporting learning and teaching, formative and summative assessment, as well as improving learning design. For instance, making use of new modes of analysing and modelling data, data sets can be processed in real-time and presented to learners during their learning process (Ifenthaler et al. 2018a, b). Furthermore, learning analytics can be used to inform and influence decisions on different levels of the educational system (e.g. micro, meso, macro) to improve individual and organisational learning and performance. Various stakeholder groups, such as learners, teachers/educators, instructional designers, institutional leaders, scientists, and public as well as private providers already draw on learning analytics or are contemplating ways to make use of these rapid developments (Baker and Siemens 2015).
Although research in recent years has shown how educational practice might benefit from learning analytics, these new opportunities are accompanied by a range of new challenges (Gibson and Webb 2015). Among these are, for instance, growing concerns related to privacy and usability issues (Heath 2014; Ifenthaler and Schumacher 2016). With large data sets available to teachers, learners, and other educational stakeholders, questions of data ownership as well as the processing and availability of specific data and data types to different user groups have to be addressed (e.g. which data at which level of aggregation and interpretation) (Hoel and Chen 2018; Jones 2019). In these changing contexts, it is important to ascertain the exact needs and interests of the different user groups (Ifenthaler and Schumacher 2019; West et al. 2016a, b) for two reasons. First, the interests of different users shape the objectives for the use of data (e.g. learning support, placement in a certain study context, evaluation of an instructional setting). Second, these varying interests drive the value assigned to data and its consequences for learners, teachers and educators, depending not only on their personal competencies to make use of the data but also on their understanding and beliefs regarding learning, the acquisition of knowledge, and educational improvement in general (Howell et al. 2018).
In recent years, the incorporation of learning analytics into educational practices and research has further developed. However, while new applications and approaches have brought forth new insights, there is still a shortage of research addressing the effectiveness and consequences of these endeavours—especially with regard to the support of learning processes (Vieira et al. 2018). In what follows, we discuss recent developments and trends of learning analytics and identify important issues for educational stakeholders including researchers. The narrative first summarizes existing research on the impact of learning analytics on supporting learning and teaching and indicates key gaps. Second, we describe the results of a Delphi study involving international expert discussions about current key trends and areas of development of learning analytics. Third, we discuss these findings and outline actions for stakeholders such as policy makers, researchers, and practitioners and propose a research agenda to close the identified research gaps.
Conceptual positioning of learning analytics
Learning analytics have been defined as the use of static and dynamic information about learners and learning environments, assessing, eliciting and analysing it, for real-time modelling, prediction and optimisation of learning processes, learning environments, as well as educational decision-making (Ifenthaler 2015). The primary aim is to better meet students’ needs by offering individual learning paths, adaptive assessments and recommendations, or adaptive and just-in-time feedback (Gašević et al. 2015; McLoughlin and Lee 2010), ideally tailored to learners' motivational states, individual characteristics, and learning goals (Schumacher and Ifenthaler 2018b). Despite the central role learners and learning processes play in this definition, learning analytics frameworks (Chatti and Muslim 2019; Greller and Drachsler 2012; Ifenthaler 2015) often also include the utilisation of learning analytics by other educational stakeholders (e.g. teachers, educational administration). Hence, definitions of learning analytics vary in how tightly they are linked to or embedded in more general models of learning science, data science, or educational assessment (Marzouk et al. 2016).
Learning analytics generally rely on information such as learners’ behaviour in the digital learning environment (i.e. trace data). Such behaviours encompass, for example, the timing, context, and sequence of different actions, such as the use of certain strategies, the posting of comments, or the retrieval of specific learning materials at given times during the learning process. These ‘traces of learning’ can then be analysed through sequence and pattern analysis or social network analysis (Baker and Siemens 2015; Berland et al. 2014; Dawson et al. 2011; Webb et al. 2018). Data might also be derived through more ‘traditional methods’ (Blikstein and Worsley 2016; Gibson et al. 2019), such as self-report measures, or obtained from open language-based formats, such as reflective thoughts in chats, blogs or essays, which can be put through digitally assisted analysis, i.e. natural language processing (Gurevych and Kim 2013). Furthermore, these already highly complex data sets can be supplemented with information about learners' individual characteristics and might include further external data such as social interrelations or physical data (Berland et al. 2014; Ifenthaler and Widanapathirana 2014). By harnessing these potentials, learning analytics can yield several advantages, among them improved quality and validity in capturing relevant information about learning and learning products, and the immediate processing and presentation of information, which might foster interactive learning processes between learners, teachers and learning environments (Seufert et al. 2019; Shute and Rahimi 2017).
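To make the notion of analysing ‘traces of learning’ concrete, the following minimal sketch orders trace events per learner and counts recurring action patterns (n-grams). The event fields and action names are hypothetical illustrations, not drawn from any of the cited studies:

```python
from collections import Counter

# Hypothetical trace data: (learner_id, timestamp, action) tuples,
# as might be exported from a digital learning environment.
events = [
    ("s1", 100, "open_material"), ("s1", 160, "post_comment"),
    ("s1", 300, "take_quiz"),     ("s2", 110, "open_material"),
    ("s2", 150, "post_comment"),  ("s2", 280, "open_material"),
]

def action_ngrams(events, n=2):
    """Count length-n action sequences per learner, ordered by time."""
    by_learner = {}
    for learner, ts, action in sorted(events, key=lambda e: (e[0], e[1])):
        by_learner.setdefault(learner, []).append(action)
    counts = Counter()
    for actions in by_learner.values():
        for i in range(len(actions) - n + 1):
            counts[tuple(actions[i : i + n])] += 1
    return counts

patterns = action_ngrams(events)
# Both learners open material and then post a comment, so that
# bigram appears twice across the two trace sequences.
```

Real sequence and pattern mining operates on far larger logs and richer event schemas, but the core step is the same: ordering events into per-learner sequences before aggregating.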
Current learning analytics approaches focus on indicators based on behaviour in the digital learning environment (Yau and Ifenthaler 2020), such as time spent online, access to various types of resources, or reading and writing posts, and relate them to learning performance (Ifenthaler et al. 2019; Mah 2016; Martin and Whitmer 2016). Only a few approaches are enriched with learner characteristics, such as demographic data or results of assessments, for instance to predict study success (Costa et al. 2017; Vieira et al. 2018). In a literature review focusing on visual learning analytics, Vieira et al. (2018) found that most studies analyse the usage of resources, with only a few approaches taking a processual perspective by trying to understand learning paths or learners’ learning progress. Furthermore, in some cases learning analytics might yield only limited insight into students' learning, because the indicators collected are not pedagogically valid. For instance, a specific indicator such as ‘time on task’ might have different meanings depending on the learning context (Goldhammer et al. 2014). Likewise, not all learning processes take place within the digital learning environment or can be captured with trace data (Wilson et al. 2017; Winne 2017). Hence, meaningful analysis of data requires sound theoretical grounding and modelling as well as verification of validity, gained for instance in complex evidence-based design processes (Marzouk et al. 2016; Shute et al. 2018; Wong et al. 2019).
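The caveat about the pedagogical validity of indicators can be illustrated concretely: two plausible operationalisations of ‘time on task’ over the same hypothetical event log yield very different values, which is one reason such indicators require theoretical grounding before interpretation. The timestamps and cutoff below are invented for illustration:

```python
# Hypothetical timestamped events (in seconds) for one learner's session.
# The long gap could mean deep engagement, a break, or a tab left open.
timestamps = [0, 30, 70, 1900, 1930]

def time_on_task_naive(ts):
    """Raw session span: last event minus first event."""
    return ts[-1] - ts[0]

def time_on_task_capped(ts, idle_cutoff=600):
    """Sum of inter-event gaps, skipping gaps longer than idle_cutoff,
    on the assumption that very long gaps are idle time."""
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    return sum(g for g in gaps if g <= idle_cutoff)

# The two operationalisations disagree by an order of magnitude,
# so 'time on task' is only meaningful relative to a stated model.
```

Neither value is ‘correct’ in itself; which one is pedagogically valid depends on the learning context, exactly as Goldhammer et al. (2014) argue.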
Learning analytics might be used at different levels and for different forms of educational assessment, which has far-reaching consequences for modelling and analysing data as well as the development of criteria to evaluate its impact. From an assessment perspective, learning analytics for formative assessment refers to the generation and interpretation of evidence about learner performance by teachers, learners and/or technology to make assisted decisions about the next steps in instruction (Ifenthaler et al. 2018a, b; Spector et al. 2016). In this context, real- or near-time data are extremely valuable because of their benefits in ongoing learning interactions, for example for awareness and reflection. Learning analytics for summative assessments are used to make judgements that are typically based on standards or benchmarks and can for instance be used to classify and compare learners, classes, or institutions (Black and Wiliam 1998). Furthermore, learning analytics can also provide predictive or prescriptive insights for decision-making (Ifenthaler 2015), which can take on different forms depending on the aims and interests of different stakeholder groups in the educational system (e.g. learning recommendations, identification of learners at risk, allocation of resources for programs and interventions) (Sclater and Mullan 2017). In practice, but also in research, these different assessment perspectives and consequences are often intertwined, and it would be beneficial to explicitly locate certain learning analytic applications within this educational assessment space (Webb and Ifenthaler 2018).
Empirical evidence from learning analytics research
Research focussing on learning analytics is still rapidly evolving, with most implementations located in the UK, USA and Australia (Ifenthaler et al. 2019; Sclater et al. 2016). Although there has been an increase in the number of related research efforts in the last five years (Gašević et al. 2015, 2017, 2019; Rienties and Toetenel 2016; Tempelaar et al. 2015), large-scale studies regarding the effectiveness of learning analytics are still lacking (Mah et al. 2019; Wong et al. 2019). One of the main aims of learning analytics research is to improve awareness, reflection and regulation during learning processes as well as learning performance and the design of learning environments (Tsai et al. 2019). This section reports key empirical findings which are closely related to putting learning back into learning analytics.
For instance, Perry and Winne (2006)—building on the concept of self-regulated learning (SRL)—developed the tool ‘gStudy’, which uses log-file analysis to give learners feedback for regulating their learning behaviours. In a more recent study, Goda et al. (2015) investigated the relationship between learning patterns and learning performance in computer-assisted language learning with respect to learners' self-regulation strategies and procrastination behaviours. An analysis of the learning logs of university students revealed seven different learning patterns: 70% of learners were classified as ‘procrastinators’ and only about 7% belonged to the ‘learning habit’ group, with the former showing significantly lower learning performance than the latter. Using a technology-driven research approach to learning analytics, Shimada et al. (2018a, b) developed an automatic summarisation tool for lecture slides and investigated the effects of using it on the learning performance of university students. The findings indicated that students who used the tool learned more efficiently and gained higher scores than non-users.
Real-time analytics in learning systems are thought to be especially effective for learning improvement, for instance in areas like reading and writing, where learners can profit from immediate feedback (Whitelock and Bektik 2018). For instance, Shimada et al. (2018a, b) developed a system that tracks and analyses online readings in real-time. In an experimental study, two groups of classes, where the teacher either used or did not use the real-time analytic system, were compared. The findings revealed that teachers found the analytic system made it easier to adapt their teaching to students’ needs, such as to adjust lecture speed or ask students to use more learning strategies.
In order to demonstrate the potential of learning analytics design for improving learning, Ifenthaler et al. (2020) used navigational sequences and network graph analyses in a case study with N = 3550 learners as well as in linked follow-up studies, showing the most-used paths, characterising path and learning affordance simplicity-to-complexity and the topological structure of the learning environment. Even with open-ended freedom of choice by learners in the initial study, only 608 sequences out of hundreds of millions of possible sequences were evidenced by learners. Another recent network analytics study led to advanced metrics of team collaboration by situating generic network measures in the specific context of collaborative teamwork in structured problem spaces (Ifenthaler et al. 2018a, b; Lin et al. 2016).
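The flavour of such navigational sequence and network graph analysis can be sketched in miniature; this is an illustrative simplification with hypothetical page names, not the method of the cited studies. Consecutive page visits become weighted edges of a directed graph, from which the most-used transition and the number of distinct paths learners actually took can be read off:

```python
from collections import Counter

# Hypothetical navigational sequences: ordered pages visited by learners.
sequences = [
    ["intro", "video", "quiz"],
    ["intro", "video", "forum", "quiz"],
    ["intro", "video", "quiz"],
]

# Build a weighted directed graph as counts of consecutive-page transitions.
edges = Counter()
for seq in sequences:
    for a, b in zip(seq, seq[1:]):
        edges[(a, b)] += 1

# Heaviest edge, i.e. the most-used transition in the navigation graph.
most_used = edges.most_common(1)[0]

# Distinct full sequences actually taken, out of all possible orderings;
# in practice this number is tiny relative to the combinatorial space,
# mirroring the 608-out-of-millions observation above.
distinct_paths = len({tuple(s) for s in sequences})
```

At scale, the same edge-count representation supports the network measures (degree, centrality, path structure) used to characterise learning environments and collaboration.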
Teasley (2017) investigated the effects of a learning dashboard in terms of feedback interventions. The findings indicate that learning dashboards can be especially effective in enhancing self-awareness and reflection when learners receive continuous feedback and intervention from learning experts. Another study examining the effects of dashboards is by Bodily et al. (2018), who investigated a learning dashboard, including content and skill recommender systems, for a blended chemistry course. The findings showed that 79% of the university students who used the dashboard perceived positive effects, but 25% of all students never used it. Bodily et al. (2018) concluded that meaningful feedback in digital learning environments is of crucial importance since it improves the acceptance and use of learning dashboards (Roberts et al. 2017; Schumacher and Ifenthaler 2018a).
Recent studies also explore how instructors and students feel about various analytics opportunities and how this influences their use of analytics for learning. For instance, Howell et al. (2018) found that stakeholders hold, and base judgements on, expectations concerning learning (e.g. one must remember and perform on one’s own without scaffolds) as well as teaching (e.g. too much scaffolding coddles learners).
Ifenthaler (2017a) investigated how academic institutions perceive the benefits of learning analytics. The findings revealed that learning data, such as learning time or learners’ prior knowledge, are considered important by stakeholders, and that learning facilitators, students, and learning designers are seen as the main beneficiaries of learning analytics. LMS managers and learning designers are also common users of learning analytics data, yet many institutions did not assign specific roles, such as a learning analytics specialist. These findings from a higher education context provide useful perspectives for the implementation of learning analytics in various forms of educational organisation. Hamada et al. (2020) suggested a set of important aspects of a learning analytics cycle for learning improvement in education, including regular learning analytics events with stakeholders for reflection, appointing learning design supporters, and changing teachers’ traditional teaching beliefs.
To sum up the literature, positive evidence has been found on the use of learning analytics to support learning (Ifenthaler and Yau 2020). However, in terms of practical learning analytics, further evidence is required not only about the effects of learning analytics systems (e.g. learning dashboards) but also about the communication of data with the decision-makers who will use learning analytics, such as teaching assistants and department heads. System development research also remains important, ideally with an emphasis on user-centred design in development processes and methods (Gibson and Ifenthaler 2020). There is still a need for more evidence concerning the link between learning analytics, intervention measures and indicators to facilitate learning (Ifenthaler et al. 2019; Wong et al. 2019; Yau and Ifenthaler 2020). Therefore, this study used a Delphi design with a panel of international experts to investigate current trends in research, practice and policy focussing on analytics for learning, which will be described next.
The purpose of this study was to identify global trends and/or institutional development and practice in the educational arena which are related to learning analytics. The study employed a Delphi method (Brown 1968; Scheibe et al. 1975) to arrive at a consensus, on a set of important trends, among a panel of international experts from higher education institutions and industry.
The Delphi method is a robust approach for determining forecasts or policy positions considered to be the most essential (Scheibe et al. 1975). A Delphi study can be conducted using paper-and-pencil instruments, computer- or web-based approaches as well as face-to-face communication processes. For this study, the researchers applied a mixed Delphi design including (a) computer-based and (b) face-to-face communication methods.
In a first phase, using the computer-based method, a panel of international experts in learning analytics were invited to submit five trends and/or institutional practices in the educational arena which are related to learning analytics. The initial list of trends was then aggregated through expert agreement, resulting in a final list of five important areas of development in learning analytics.
In preparation for the second, face-to-face discussion phase, participants were asked to provide three relevant scientific literature resources related to the five areas of development identified in the first phase and to explain their contribution to the respective area of development. Next, participants met face-to-face for a two-day workshop. During the face-to-face session, the experts discussed and came to a consensus on several trends, challenges and conclusions with respect to research gaps and important implications for educational stakeholders including policy makers and practitioners.
A total of N = 12 participants from higher education institutions (nhe = 10) and education organisations (neo = 2) took part in the study. The international experts had at least five years of experience in research and development in learning analytics and were based in Australia (n = 2), Canada (n = 1), Finland (n = 1), France (n = 2), Germany (n = 1), Japan (n = 2), Switzerland (n = 1), UK (n = 1), and USA (n = 1).
The first phase (i.e., computer-based method) resulted in a preliminary list of N = 40 trends and/or institutional practices from the educational arena which are related to learning analytics. The results of the computer-based method are shown in Table 1. The trends and/or institutional practices identified can be categorised as either learning sciences related (e.g. assessment and feedback, adaptive learning), technical challenges (e.g. dashboards, data models), or policy issues (e.g. privacy and ethics regulations).
The following final list of five important areas of development of learning analytics was determined through expert agreement:
Dashboards and visualisation methods and tools for learning and teaching, adaptive and real-time systems, assessment analytics;
Evidence-based practices of learners, teachers and other educational stakeholders using learning analytics including educational data literacy;
Conceptualisation and technological implementation of the relationship between instructional design and learning analytics, course and curriculum analytics;
Combining different data types, data models, data resources and analytics methods, standardised variables, AI and methodology;
Role of vendors in analytics solutions, adoption and implementation of analytics systems.
In the second phase (i.e. face-to-face communication method), the experts reflected on trends, themes and challenges in these areas of development and identified and discussed various missed opportunities for effective use of learning analytics systems to drive improvements in student learning and success at scale, with corresponding impacts on society as a whole, due to a number of problems, tensions and barriers. During the face-to-face communication method, the experts agreed that (a) there is a widespread lack of knowledge and understanding regarding learning analytics and the need to select and use learning analytics systems to support learning, teaching and assessment, tracking progress and informing decision-making. Further, the experts suggested that (b) guiding principles and policies need to be updated to help institutions make use of learning analytics. In addition, the experts point out that (c) standards are needed for ethical design and use of learning analytics systems by educational data service providers and users, ensuring quality (e.g., auditing, transparency, reporting), sustainability and scalability. The experts also recommend (d) flexible, user-centred tools designed for different learning levels, ages and stakeholder groups in their unique educational contexts. Last, the experts emphasise (e) the need to apply and advance educationally relevant research-based knowledge to:
Engage key stakeholders of learning (e.g., students, parents, teachers, school leaders);
Create and make ethical use of rich data models and methodologies to advance learning;
Integrate instructional theory, design and delivery with analytics data and insights;
Safeguard security, privacy and control of data;
Understand the impacts of combining data types from all sectors (i.e., health, socio-emotional, SES, etc.) on interactions with the individual.
Initial learning analytics approaches were limited to analysing trace data or web statistics in order to describe learner behaviour in online learning environments (Veenman 2013). With increased investigation of educational data, potentials for a broader educational context have been recognised, such as the identification of potential dropouts from study programmes (Sclater et al. 2016). More recently, research on learning analytics has seen an extensive diversification of initial learning analytics approaches (Prieto et al. 2019). However, an emphasis on supporting learning processes whenever the learner needs it appears to be underrepresented (Ifenthaler and Yau 2020). Therefore, this project critically reflected on how to put learning at the centre of learning analytics by (a) undertaking a literature review focussing on the impact of learning analytics on supporting learning and teaching, and (b) conducting a Delphi study involving international expert discussion on current opportunities and challenges of learning analytics. In the discussion section we will summarise the findings, (1) reporting the identified alignment issues and challenges, (2) suggesting six strategies and actions for involved stakeholders, and (3) outlining a possible research agenda for closing identified research gaps.
The Delphi study resulted in a list of key trends/areas of development that provide a frame of reference for the development of adoption strategies (Gibson and Ifenthaler 2020) and change management processes (Ifenthaler 2020) for educational organisations. Although the field of learning analytics receives a lot of attention for its capacity to provide lead indicators of potential student failure and support learning processes (Dawson et al. 2017; Gašević et al. 2015; Joksimović et al. 2018), it has to date primarily focused on individual courses in isolation (Gašević et al. 2017), rather than the capabilities of higher education institutions as learning organisations as a whole (Ifenthaler 2020; Knobbout and van der Stappen 2020). The implementation of learning analytics at higher education institutions may have broad implications for the organisation (e.g. technological infrastructure, policies and regulations) and its stakeholders (e.g. students, academic staff, administrators), including changes in learning culture and educational decision-making (Hilliger et al. 2020). However, an international study on the readiness of higher education institutions for adopting and implementing learning analytics identified various deficits at the organisational, technical, and staff levels (Ifenthaler 2017a). Only a small number of higher education institutions meet the high staff and technology requirements for implementing actionable learning analytics frameworks (Kevan and Ryan 2016).
Reflecting on the key trends and areas of development of learning analytics, the expert panel identified emerging learning analytics alignment issues and challenges, which need to be addressed and resolved in order to develop and implement learning analytics that effectively improve learning. These are described in the next section.
Alignment issues and challenges
In the initial phase of learning analytics adoption, a variety of problems, tensions, barriers and missed opportunities stand in the way of the effective use of learning analytics systems (Prieto et al. 2019). These alignment issues and challenges consequently impede improvements in student learning and success at scale, as well as the corresponding educational impacts on society as a whole (see Table 2).
There is a widespread lack of knowledge and understanding regarding learning analytics and the concomitant need to select and use learning analytics systems for supporting learning, teaching and assessment; tracking progress; and, informing decision-making.
How should data inform practice? Who has the capacity to analyse big data and who is the data analysed for? Ethicists have pointed out that the aims, actions and actors in an educational setting form a complex context of overlapping and sometimes competing interests (West et al. 2016a, b). This implies a need for a certain level of literacy to be achieved by all stakeholders in the system in order to support informed decision-making. What knowledge and skills are needed to understand the role of new data science methods and fit them within conventional qualitative and quantitative traditions of research? Some writers have called for a re-examination of the foundations of educational research, in order to introduce data science methods into the open space that can potentially integrate qualitative and quantitative methods with AI-driven computational assistance and assistants (Nouri et al. 2019). These writers have pointed out the current status and the existing gaps in the readiness of higher education to leverage learning analytics (West et al. 2016a, b). In particular, what do students need to know to understand and be critical consumers of their own data and that of others? What can teachers do with all these data for their teaching activities, and what feedback and monitoring of learning might students expect from learning analytics?
Guiding principles and policies need to be updated to help institutions make use of learning analytics.
Learning analytics can provide three kinds of information to students and teachers: summative, real-time or formative, and predictive or prescriptive insights from information prepared for decision-making and action (Ifenthaler 2015). Today, with the emerging potential to map sequences of the tools, communications and information utilised to solve a problem, the capability to build dynamic networks of the relationships of collaborating team members, and the computational resources to automatically classify and adapt curriculum materials in response to user interactions, the fields of learning design and analytics can be brought together as a new field of ‘learning analytics design’ (Ifenthaler 2017b; Lockyer et al. 2013). The new field integrates learning or instructional design informed by data analytics and the design of learning analytics interactive dashboards guided by learning design. Advancements in learning analytics design have the potential for mapping the cognitive, social and physical states of the learner and optimising learning environments on the fly (Ifenthaler et al. 2018a, b). Three analytics layers have been proposed for data-informed learning design (Hernández-Leo et al. 2019): (a) analytics with a focus on learning decisions to be made by the learner (e.g. has the design helped someone to learn), (b) analytics for decision-making by designers and teachers-as-designers (e.g. what aspects of the learning design were effective), and (c) analytics of the impact of community-based pedagogy for teachers (e.g. co-design of learning, peer learning).
Standards are needed for ethical design and use of learning analytics systems by educational data services providers and users for ensuring quality (e.g., auditing, transparency, reporting, security, privacy, compliance, sustainability, and scalability).
One of the main concerns about learning analytics applications is the handling of data privacy issues (Prinsloo and Slade 2014). As almost every learning analytics feature collects and processes user data by default, learning analytics designers need to consider each country’s data privacy legislation, such as the European General Data Protection Regulation (EU-GDPR). A principle of learning analytics developed by several authors is that a person will never be fully understood through their data trail, no matter how rich that trail becomes: a digital data trail cannot encompass all individual characteristics of a person and may also neglect the context of data collection (Prinsloo and Slade 2014). Such issues have been documented in recent research studies regarding privacy issues and ethical dilemmas in learning analytics (Ifenthaler and Schumacher 2016, 2019; Slade and Prinsloo 2013; West et al. 2016a, b). However, it is also well understood that the improvement of automated decision-making, personalisation of learning and adaptation of the curriculum requires a complex, multifaceted and distributed data model of the learner (Behrens et al. 2012). Many questions remain regarding the features and constraints of such a model, how to deploy relevant features as needed in different contexts, and how to re-integrate features into more complex and dynamic pictures of learning progress and achievement.
Flexible and user-centred tools are needed for different learning levels, ages and stakeholder groups in their unique educational contexts.
Real-time analytics are increasingly feasible, for example as support systems for teaching. Research has reported on systems that track and analyse online readings as lecture support services (Shimada et al. 2018a, b), student response systems for attention and engagement (Heaslip et al. 2014), and dashboards that visualise student progress and achievement (Kokoç and Altun 2019; Roberts et al. 2017; Schumacher and Ifenthaler 2018a). Dashboards can be powerful learning tools for both teachers and learners if developed with user-centred design, for example with functions that help teachers interpret learning data before making decisions (Roberts et al. 2017).
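The aggregation step behind such a progress dashboard can be sketched as follows; the event format, course structure and values are invented for illustration and do not reflect any particular system cited above.

```python
from collections import defaultdict

# Hypothetical activity-completion events streamed from a learning platform.
events = [
    {"student": "A", "activity": "quiz1", "completed": True},
    {"student": "A", "activity": "quiz2", "completed": False},
    {"student": "B", "activity": "quiz1", "completed": True},
    {"student": "B", "activity": "quiz2", "completed": True},
]
activities = {"quiz1", "quiz2"}  # all activities in the (invented) course

# Collect the set of completed activities per student.
done = defaultdict(set)
for e in events:
    if e["completed"]:
        done[e["student"]].add(e["activity"])

# Per-student completion rate: the kind of figure a progress dashboard
# would visualise for teachers and learners in near real time.
progress = {s: len(a) / len(activities) for s, a in done.items()}
# → {'A': 0.5, 'B': 1.0}
```

User-centred design then decides how such figures are presented: a teacher-facing view might rank students needing support, while a learner-facing view might show only the individual's own progress against the cohort.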
The expert panel identified further learning analytics alignment issues which deserve additional investigation: (a) literacy, fluency and control over data are linked; (b) global differences in learning analytics impact uses, meanings and methods; (c) advancing educational research is needed for analytics theory and methodology; (d) bridging data science and learning science requires multidisciplinary collaborations and integrated frameworks from these fields of research.
Proposed strategies and actions
Considering the identified issues of alignment and the challenges facing education worldwide with the advent of learning analytics, six strategies and actions are hereby proposed for three key stakeholder groups: policy makers, researchers and practitioners.
Evidence-based practice led by analytics
According to the expert panel data, in order to make the most of analytics for learning, researchers need to deepen the knowledge base to inform the development of new practices that lead to positive impacts on learning. Policymakers can then develop learning analytics policies that focus on leadership, professional learning, enabling mechanisms, and data governance with added confidence. Practitioners need these two measures—the deepening of research-based knowledge and building of professional practice policies on that knowledge—to take effect before they can develop sufficient skills and confidence to utilise practices led by evidence-based insights from learning analytics. The focus on analytics for learning is a critical commitment that must be maintained in this strategy. All stakeholders need to work in concert to ensure open access to the required resources and best practices so that everyone can benefit educationally. These three measures within the strategy of informing evidence-based practice with learning analytics insights directly address the widespread lack of knowledge regarding the support of learning, teaching and assessment.
Promote the adoption of learning analytics
Two actions in particular promote the adoption of learning analytics, according to the expert panel. Practitioners can take the lead within their schools in enabling local organisational change, which can, in turn, support teachers, school leaders, students and the parent community to appreciate and advocate for learning analytics in learning. Local action and readiness for cultural change should precede the development of local policy, because it sets the stage for acceptance, supports the stages of adoption, and helps guide the later development of standards, principles and procedures by policymakers. These actions also address the challenge of updating principles and policies by engaging the impacted communities in the continual process of adapting and improving the organisational response to change.
Inform and guide data services providers and users
Trustworthy, ethical learning analytics practices are supported by policy mechanisms such as standards, accreditation processes, audits and evidence-based recommendations informed by practice. Researchers play a critical role here in promoting sustainability and scalability of policy and practice, for example by producing the knowledge needed to effectively embed analytics and provide just-in-time data services that support good decision-making focused on learning. This strategy of balancing investment in both data services and users supports both the supply and demand sides of the flow of information, which accelerates adoption and positive change.
Impact learning via analytics tools
A core implication of the expert-panel data is the consensus that learning analytics should focus first on its use ‘for learning,’ which contrasts with other potential foci such as ‘accountability,’ ‘testing,’ ‘organisational change’ or ‘financial efficiency.’ These alternative objectives have powerful forces aligned to advocate for them, but only the use of ‘analytics for learning’ will help achieve the most equitable and effective educational system. All stakeholders, including practitioners, researchers and policymakers, need new levels of data literacy to use the new tools of educational decision-making that leverage the knowledge, understanding and capabilities of dynamic learning analytics information flowing through the complex systems of education. A second measure under the ‘for learning’ strategy is to provide specific user-centred analytics tools for different stakeholders (e.g., age groups, learning levels), using evidence-informed context and impact insights, again with a focus on enhancing learning, as opposed to other goals. This strategy acknowledges that flexible user-centred tools are needed for all audiences, but the expert group specifically advocates to prioritise the learner-as-audience.
Leverage the relationship between instructional design and learning analytics, and then extend to course and curriculum analytics
The expert-group data and discussion have highlighted some of the benefits of using learning analytics to inform the advancement of instructional design for quality learning, teaching and assessment. These include transparency of information via new near-real-time dashboards and other flexible, user-centred tools that bring actionable insights directly to the learner and the instructor. For designers of learning materials, media and activity organisation, this new era of data makes visible not only the impacts of a design on outcomes, but also its effects on the processes of learning and teaching. New, flexible re-design approaches include offering multiple paths for progress and a stronger role for learner and instructor choices during the processes of learning. Maintaining the focus on enhancing learning (as opposed to maximising other goals of education and schooling) requires actions by all stakeholders that enable multidisciplinary and participatory research for quality assurance as well as for keeping pace with the technology lifecycle of technology-enabled learning environments.
Combine data types from several sectors (e.g. health, socio-emotional, SES, etc.) to improve interactions with individuals; improve data models and leverage AI and related technologies
A Chinese expression combining ‘challenge’ and ‘opportunity’ could aptly denote the coming era of learning analytics. Big data and artificial intelligence bring into focus the inherent conflicts of goals across the complex system of education. For example, the goals of school accountability and efficiency are not in sync with the goal of high standards of learning for all, nor with the individual needs of particular people. Strong and focused actions are needed to provide data privacy and security in the context of interoperability; for example, to ensure that the use of health, socio-economic, behavioural, social-emotional and academic data actually advances learning goals rather than other goals of education. This strategy must guarantee that the control and ownership of data are clear, transparent and in the hands of the person who is the subject of the data.
The presented Delphi study and expert panel discussion as well as recent research on learning analytics have outlined a broad and high-level research agenda with four themes and six strategy and action areas indicating knowledge requirements for underpinning organisational and educational system changes needed to put learning back into learning analytics. There are numerous in-depth possibilities within each of the research strands for more detailed exploration, discovery of basic facts and theories in the learning sciences, and areas where massive development efforts are needed—with many currently underway.
The four challenges suggest areas for future research that combine what is known about the adoption of innovations and organisational behaviour with the new challenges of big data and complexity in the science of learning, including data collection and affordances for search, exploration and expression enhanced by linked technologies. The expert panel saw implications for unique insights, new ways to intervene in a timely manner during the process of learning, and new powers to provide formative information to the key actors: learners and teachers. Research questions arising from this discussion can be organised around the four primary issues and challenges:
Need for knowledge to select and use analytics for learning-focused decision making.
Need for guiding principles and policies for institutional practices that enhance learning.
Need for standards for ethical use of learning analytics.
Need for flexible, user-focused analyses focused on enhancing learning.
As outlined above and discussed in the context of the literature and an international panel of experts, the four primary issues and challenges are best addressed by an overlapping system of action recommendations. The mapping is many to many: one action can impact more than one challenge and, vice versa, one challenge area can interact with and be impacted by more than one action. In addition, there are cross-cutting impacts within and across the actions. Impacting all stakeholders’ knowledge and awareness, for example, requires content that is surfaced by the actions supporting the development and use of the guiding principles and their related actions.
In conclusion, as new methods and models for data analysis and data representation (e.g. advanced statistics, dashboards, graphical visualisations of learner paths, semantic graphs or social networks, timing diagrams, etc.) continue to be elaborated and put to use, it is critical that stakeholders are supported to understand these methods and models in order to know what is involved and to act accordingly. To enable effective action, related literacies, in particular graphicacy and educational data literacy, should be promoted. Graphicacy refers to the way in which spatial information is communicated other than by words or numbers alone (Boardman 1990). Educational data literacy is defined as the ethically responsible collection, management, analysis, comprehension, interpretation, and application of data from educational contexts (Ifenthaler and Yau 2020). With more and more learning analytics systems becoming available, some teachers may start using data to inform their practice, while their learners may have access to analytics performance information that helps them set their own pace and objectives. However, education institutions have different practices and regulations regarding data sharing and the use of processed data. Some institutions, for example, allow commercial providers to access data, but the level of trust in sharing data between institutions and providers currently varies widely (Ifenthaler and Schumacher 2016; Klein et al. 2019).
Hence, putting learning back into learning analytics requires a complex set of actions and strategies for policy makers, researchers, and practitioners. We hope to have provided the reader with information and ideas for starting conversations, engaging with others, and having a framework for reflecting, knowing that each aspect of this system has partial evidence mounting in the research literature and professional practice of using analytics for learning.
Baker, R. S., & Siemens, G. (2015). Educational data mining and learning analytics. In R. K. Sawyer (Ed.), The Cambridge handbook of the learning sciences (2nd ed., pp. 253–272). Cambridge, UK: Cambridge University Press.
Behrens, J., Mislevy, R., Dicerbo, K., & Levy, R. (2012). Evidence centered design for learning and assessment in the digital world. In M. Mayrath, J. Clarke-Midura, D. Robinson, & G. Schraw (Eds.), Technology-based assessments for 21st century skills (pp. 13–54). Charlotte, NC: Information Age Publishers.
Berland, M., Baker, R. S., & Blikstein, P. (2014). Educational data mining and learning analytics: Applications to constructionist research. Technology, Knowledge and Learning, 19(1–2), 205–220. https://doi.org/10.1007/s10758-014-9223-7.
Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5(1), 7–74. https://doi.org/10.1080/0969595980050102.
Blikstein, P., & Worsley, M. (2016). Multimodal learning analytics and education data mining: Using computational technologies to measure complex learning tasks. Journal of Learning Analytics, 3(2), 220–238. https://doi.org/10.18608/jla.2016.32.11.
Boardman, D. (1990). Graphicacy revisited: mapping abilities and gender differences. Educational Review, 42(1), 57–64. https://doi.org/10.1080/0013191900420106.
Bodily, R., Ikahihifo, T. K., Mackley, B., & Graham, C. R. (2018). The design, development, and implementation of student-facing learning analytics dashboards. Journal of Computing in Higher Education, 30(3), 572–598.
Brown, B. B. (1968). Delphi process: A methodology used for the elicitation of opinions of experts. Santa Monica, CA: RAND Corporation.
Chatti, M. A., & Muslim, A. (2019). The PERLA framework: Blending personalization and learning analytics. The International Review of Research in Open and Distributed Learning. https://doi.org/10.19173/irrodl.v20i1.3936.
Costa, E. B., Fonseca, B., Santana, M. A., de Araújo, F., & Rego, J. (2017). Evaluating the effectiveness of educational data mining techniques for early prediction of students’ academic failure in introductory programming courses. Computers in Human Behavior, 73, 247–256. https://doi.org/10.1016/j.chb.2017.01.047.
Dawson, S., Jovanović, J., Gašević, D., & Pardo, A. (2017). From prediction to impact: Evaluation of a learning analytics retention program. In I. Molenaar, X. Ochoa, & S. Dawson (Eds.), Proceedings of the seventh international learning analytics & knowledge conference (pp. 474–478). New York, NY: ACM.
Dawson, S., Macfadyen, L., Lockyer, L., & Mazzochi-Jones, D. (2011). Using social network metrics to assess the effectiveness of broad-based admission practices. Australasian Journal of Educational Technology, 27(1), 16–27.
Gašević, D., Dawson, S., & Siemens, G. (2015). Let’s not forget: Learning analytics are about learning. TechTrends, 59(1), 64–71. https://doi.org/10.1007/s11528-014-0822-x.
Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting learning strategies with analytics: Links with self-reported measures and academic performance. Journal of Learning Analytics, 4(2), 113–128.
Gašević, D., Tsai, Y.-S., Dawson, S., & Pardo, A. (2019). How do we start? An approach to learning analytics adoption in higher education. International Journal of Information and Learning Technology, 36(4), 342–353. https://doi.org/10.1108/IJILT-02-2019-0024.
Gibson, D. C., & Ifenthaler, D. (2020). Adoption of learning analytics. In D. Ifenthaler & D. C. Gibson (Eds.), Adoption of data analytics in higher education learning and teaching (pp. 3–20). Cham: Springer.
Gibson, D. C., & Webb, M. (2015). Data science in educational assessment. Education and Information Technologies, 20(4), 697–713. https://doi.org/10.1007/s10639-015-9411-7.
Gibson, D. C., Webb, M., & Ifenthaler, D. (2019). Measurement challenges of interactive educational assessment. In D. G. Sampson, J. M. Spector, D. Ifenthaler, P. Isaias, & S. Sergis (Eds.), Learning technologies for transforming teaching, learning and assessment at large scale (pp. 19–33). New York, NY: Springer.
Goda, Y., Yamada, M., Kato, H., Matsuda, T., Saito, Y., & Miyamaga, H. (2015). Procrastination and other learning behavioral types in e-learning and their relationship with learning outcomes. Learning and Individual Differences, 37, 72–80. https://doi.org/10.1016/j.lindif.2014.11.001.
Goldhammer, F., Naumann, J., Stelter, A., Toth, K., Rölke, H., & Klieme, E. (2014). The time on task effect in reading and problem solving is moderated by task difficulty and skill. Insights from a computer-based large-scale assessment. Journal of Educational Psychology, 106, 608–626.
Greller, W., & Drachsler, H. (2012). Translating learning into numbers: A generic framework for learning analytics. Educational Technology & Society, 15(3), 42–57.
Gurevych, I., & Kim, J. (Eds.). (2013). The people’s web meets NLP. Collaboratively constructed language resources. Berlin: Springer.
Hamada, S., Xu, Y., Geng, X., Chen, L., Ogata, H., Shimada, A., & Yamada, M. (2020). For evidence-based class design with learning analytics: A proposal of preliminary practice flow model in high school. Paper presented at the companion proceedings of the Learning Analytics and Knowledge Conference.
Heaslip, G., Donovan, P., & Cullen, J. G. (2014). Student response systems and learner engagement in large classes. Active Learning in Higher Education, 15(1), 11–24. https://doi.org/10.1177/1469787413514648.
Heath, J. (2014). Contemporary privacy theory contributions to learning analytics. Journal of Learning Analytics, 1(1), 140–149.
Hernández-Leo, D., Martinez-Maldonado, R., Pardo, A., Muñoz-Cristóbal, J. A., & Rodríguez-Triana, M. J. (2019). Analytics for learning design: A layered framework and tools. British Journal of Educational Technology, 50(1), 139–152. https://doi.org/10.1111/bjet.12645.
Hilliger, I., Ortiz-Rojas, M., Pesántez-Cabrera, P., Scheihing, E., Tsai, Y.-S., Muñoz-Merino, P. J., & Pérez-Sanagustín, M. (2020). Identifying needs for learning analytics adoption in Latin American universities: A mixed-methods approach. Internet and Higher Education. https://doi.org/10.1016/j.iheduc.2020.100726.
Hoel, T., & Chen, W. (2018). Privacy and data protection in learning analytics should be motivated by an educational maxim—towards a proposal. Research and Practice in Technology Enhanced Learning. https://doi.org/10.1186/s41039-018-0086-8.
Howell, J. A., Roberts, L. D., Seaman, K., & Gibson, D. C. (2018). Are we on our way to becoming a “helicopter university”? Academics’ views on learning analytics. Technology, Knowledge and Learning, 23(1), 1–20. https://doi.org/10.1007/s10758-017-9329-9.
Ifenthaler, D. (2015). Learning analytics. In J. M. Spector (Ed.), The SAGE encyclopedia of educational technology (Vol. 2, pp. 447–451). Thousand Oaks, CA: Sage.
Ifenthaler, D. (2017a). Are higher education institutions prepared for learning analytics? TechTrends, 61(4), 366–371. https://doi.org/10.1007/s11528-016-0154-0.
Ifenthaler, D. (2017b). Learning analytics design. In L. Lin & J. M. Spector (Eds.), The sciences of learning and instructional design: Constructive articulation between communities (pp. 202–211). New York, NY: Routledge.
Ifenthaler, D. (2020). Change management for learning analytics. In N. Pinkwart & S. Liu (Eds.), Artificial intelligence supported educational technologies (pp. 261–272). Cham: Springer.
Ifenthaler, D., Gibson, D. C., & Dobozy, E. (2018). Informing learning design through analytics: Applying network graph analysis. Australasian Journal of Educational Technology, 34(2), 117–132. https://doi.org/10.14742/ajet.3767.
Ifenthaler, D., Gibson, D. C., & Zheng, L. (2020). Attributes of engagement in challenge-based digital learning environments. In P. Isaias, D. G. Sampson, & D. Ifenthaler (Eds.), Online teaching and learning in higher education (pp. 81–91). Cham: Springer.
Ifenthaler, D., Greiff, S., & Gibson, D. C. (2018). Making use of data for assessments: harnessing analytics and data science. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International handbook of IT in primary and secondary education (2nd ed., pp. 649–663). New York, NY: Springer.
Ifenthaler, D., Mah, D.-K., & Yau, J.Y.-K. (2019). Utilising learning analytics for study success. Reflections on current empirical findings. In D. Ifenthaler, J.Y.-K. Yau, & D.-K. Mah (Eds.), Utilizing learning analytics to support study success (pp. 27–36). Cham: Springer.
Ifenthaler, D., & Schumacher, C. (2016). Student perceptions of privacy principles for learning analytics. Educational Technology Research and Development, 64(5), 923–938. https://doi.org/10.1007/s11423-016-9477-y.
Ifenthaler, D., & Schumacher, C. (2019). Releasing personal information within learning analytics systems. In D. G. Sampson, J. M. Spector, D. Ifenthaler, P. Isaias, & S. Sergis (Eds.), Learning technologies for transforming teaching, learning and assessment at large scale (pp. 3–18). Cham: Springer.
Ifenthaler, D., & Widanapathirana, C. (2014). Development and validation of a learning analytics framework: Two case studies using support vector machines. Technology, Knowledge and Learning, 19(1–2), 221–240. https://doi.org/10.1007/s10758-014-9226-4.
Ifenthaler, D., & Yau, J.Y.-K. (2020). Utilising learning analytics to support study success in higher education: a systematic review. Educational Technology Research and Development, 68(4), 1961–1990. https://doi.org/10.1007/s11423-020-09788-z.
Joksimović, S., Poquet, A., Kovanovic, V., Dowell, N., Millis, C., Gasevic, D., & Brooks, C. (2018). How do we model learning at scale? A systematic review of research on MOOCs. Review of Educational Research, 88(1), 43–86.
Jones, K. M. L. (2019). Learning analytics and higher education: a proposed model for establishing informed consent mechanisms to promote student privacy and autonomy. International Journal of Educational Technology in Higher Education. https://doi.org/10.1186/s41239-019-0155-0.
Kevan, J. M., & Ryan, P. R. (2016). Experience API: Flexible, decentralized and activity-centric data collection. Technology, Knowledge and Learning, 21(1), 143–149. https://doi.org/10.1007/s10758-015-9260-x.
Klein, C., Lester, J., Rangwala, H., & Johri, A. (2019). Learning analytics tools in higher education: Adoption at the intersection of institutional commitment and individual action. The Review of Higher Education, 42(2), 565–593. https://doi.org/10.1353/rhe.2019.0007.
Knobbout, J., & van der Stappen, E. (2020). A capability model for learning analytics adoption: Identifying organizational capabilities from literature on big data analytics, business analytics, and learning analytics. International Journal of Learning Analytics and Artificial Intelligence for Education, 2(1), 47–66.
Kokoç, M., & Altun, A. (2019). Effects of learner interaction with learning dashboards on academic performance in an e-learning environment. Behaviour & Information Technology. https://doi.org/10.1080/0144929X.2019.1680731.
Lin, L., Mills, L., & Ifenthaler, D. (2016). Collaboration, multi-tasking and problem solving performance in shared virtual spaces. Journal of Computing in Higher Education, 28(3), 344–357. https://doi.org/10.1007/s12528-016-9117-x.
Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing pedagogical action: Aligning learning analytics with learning design. American Behavioral Scientist, 57(10), 1439–1459. https://doi.org/10.1177/0002764213479367.
Mah, D.-K. (2016). Learning analytics and digital badges: potential impact on student retention in higher education. Technology, Knowledge and Learning, 21(3), 285–305. https://doi.org/10.1007/s10758-016-9286-8.
Mah, D.-K., Yau, J.Y.-K., & Ifenthaler, D. (2019). Future directions on learning analytics to enhance study success. In D. Ifenthaler, J.Y.-K. Yau, & D.-K. Mah (Eds.), Utilizing learning analytics to support study success (pp. 313–321). Cham: Springer.
Martin, F., & Whitmer, J. C. (2016). Applying learning analytics to investigate timed release in online learning. Technology, Knowledge and Learning, 21(1), 59–74. https://doi.org/10.1007/s10758-015-9261-9.
Marzouk, Z., Rakovic, M., Liaqat, A., Vytasek, J., Samadi, D., Stewart-Alonso, J., & Nesbit, J. C. (2016). What if learning analytics were based on learning science? Australasian Journal of Educational Technology, 32(6), 1–18. https://doi.org/10.14742/ajet.3058.
McLoughlin, C., & Lee, M. J. W. (2010). Personalized and self regulated learning in the Web 2.0 era: International exemplars of innovative pedagogy using social software. Australasian Journal of Educational Technology, 26(1), 28–43.
Nouri, J., Ebner, M., Ifenthaler, D., Saqr, M., Malmberg, J., Khalil, M., & Berthelsen, U. D. (2019). Efforts in Europe for data-driven improvement of education: A review of learning analytics research in seven countries. International Journal of Learning Analytics and Artificial Intelligence for Education, 1(1), 8–27. https://doi.org/10.3991/ijai.v1i1.11053.
Perry, N. E., & Winne, P. H. (2006). Learning from learning kits: gStudy traces of students’ self-regulated engagements with computerized content. Educational Psychological Review, 18, 211–228. https://doi.org/10.1007/s10648-006-9014-3.
Prieto, L. P., Rodríguez-Triana, M. J., Martínez-Maldonado, R., Dimitriadis, Y., & Gašević, D. (2019). Orchestrating learning analytics (OrLA): Supporting inter-stakeholder communication about adoption of learning analytics at the classroom level. Australasian Journal of Educational Technology, 35(4), 14–33. https://doi.org/10.14742/ajet.4314.
Prinsloo, P., & Slade, S. (2014). Student data privacy and institutional accountability in an age of surveillance. In M. E. Menon, D. G. Terkla, & P. Gibbs (Eds.), Using data to improve higher education. Research, policy and practice (pp. 197–214). Rotterdam: Sense Publishers.
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333–341. https://doi.org/10.1016/j.chb.2016.02.074.
Roberts, L. D., Howell, J. A., & Seaman, K. (2017). Give me a customizable dashboard: personalized learning analytics dashboards in higher education. Technology, Knowledge and Learning, 22(3), 317–333. https://doi.org/10.1007/s10758-017-9316-1.
Scheibe, M., Skutsch, M., & Schofer, J. (1975). Experiments in Delphi methodology. In H. A. Linestone & M. Turoff (Eds.), The Delphi method - techniques and applications (pp. 262–287). Boston, MA: Addison-Wesley.
Schumacher, C., & Ifenthaler, D. (2018a). Features students really expect from learning analytics. Computers in Human Behavior, 78, 397–407. https://doi.org/10.1016/j.chb.2017.06.030.
Schumacher, C., & Ifenthaler, D. (2018b). The importance of students’ motivational dispositions for designing learning analytics. Journal of Computing in Higher Education, 30(3), 599–619. https://doi.org/10.1007/s12528-018-9188-y.
Sclater, N., & Mullan, J. (2017). Learning analytics and student success – assessing the evidence. Bristol: JISC.
Sclater, N., Peasgood, A., & Mullan, J. (2016). Learning analytics in higher education: A review of UK and international practice. Bristol: JISC.
Seufert, S., Meier, C., Soellner, M., & Rietsche, R. (2019). A pedagogical perspective on big data and learning analytics: a conceptual model for digital learning support. Technology, Knowledge and Learning, 24(4), 599–619. https://doi.org/10.1007/s10758-019-09399-5.
Shimada, A., Konomi, S., & Ogata, H. (2018). Real-time learning analytics system for improvement of on-site lecture. Interactive Technology and Smart Education, 15(4), 314–331. https://doi.org/10.1108/ITSE-05-2018-0026.
Shimada, A., Okubo, F., Yin, C., & Ogata, H. (2018). Automatic summarization of lecture slides for enhanced student preview-technical report and user study. IEEE Transaction of Learning Technologies, 11(2), 165–178. https://doi.org/10.1109/TLT.2017.2682086.
Shute, V. J., & Rahimi, S. (2017). Review of computer-based assessment for learning in elementary and secondary education. Journal of Computer Assisted Learning, 33, 1–19.
Shute, V. J., Rahimi, S., & Emihovich, B. (2018). Assessment for learning in immersive environments. In D. Lui, C. Dede, R. Huang, & J. Richards (Eds.), Virtual, augmented, and mixed realities in education (pp. 71–89). Heidelberg: Springer.
Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529. https://doi.org/10.1177/0002764213479366.
Spector, J. M., Ifenthaler, D., Sampson, D. G., Yang, L., Mukama, E., Warusavitarana, A., & Gibson, D. C. (2016). Technology enhanced formative assessment for 21st century learning. Educational Technology & Society, 19(3), 58–71.
Teasley, S. D. (2017). Student facing dashboards: One size fits all? Technology, Knowledge and Learning, 22(3), 377–384. https://doi.org/10.1007/s10758-017-9314-3.
Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning analytics in a data-rich context. Computers in Human Behavior, 47, 157–167. https://doi.org/10.1016/j.chb.2014.05.038.
Tsai, Y.-S., Poquet, S., Gašević, D., Dawson, S., & Pardo, A. (2019). Complexity leadership in learning analytics: drivers, challenges, and opportunities. British Journal of Educational Technology. https://doi.org/10.1111/bjet.12846.
Veenman, M. V. J. (2013). Assessing metacognitive skills in computerized learning environments. In R. Azevedo & V. Aleven (Eds.), International handbook of metacognition and learning technologies (pp. 157–168). New York, NY: Springer.
Vieira, C., Parsons, P., & Byrd, V. (2018). Visual learning analytics of educational data: A systematic literature review and research agenda. Computers & Education, 122, 119–135. https://doi.org/10.1016/j.compedu.2018.03.018.
Webb, M., & Ifenthaler, D. (2018). Assessment as, for and of 21st Century learning using information technology: An overview. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International handbook of IT in primary and secondary education (2nd ed., pp. 1–20). Cham: Springer.
Webb, M., Prasse, D., Phillips, M., Kadijevich, D. M., Angeli, C., Strijker, A., & Laugesen, H. (2018). Challenges for IT-enabled formative assessment of complex 21st Century Skills. Technology, Knowledge and Learning, 23(3), 441–456. https://doi.org/10.1007/s10758-018-9379-7.
West, D., Heath, D., & Huijser, H. (2016). Let’s talk learning analytics: A framework for implementation in relation to student retention. Online Learning, 20(2), 1–21. https://doi.org/10.24059/olj.v20i2.792.
West, D., Huijser, H., & Heath, D. (2016). Putting an ethical lens on learning analytics. Educational Technology Research and Development, 64(5), 903–922. https://doi.org/10.1007/s11423-016-9464-3.
Whitelock, D., & Bektik, D. (2018). Progress and challenges for automated scoring and feedback systems for large-scale assessments. In J. Voogt, G. Knezek, R. Christensen, & K.-W. Lai (Eds.), International handbook of IT in primary and secondary education (2nd ed., pp. 617–634). New York, NY: Springer.
Wilson, A., Watson, C., Thompson, T. L., Drew, V., & Doyle, S. (2017). Learning analytics: challenges and limitations. Teaching in Higher Education, 22(8), 991–1007.
Winne, P. H. (2017). Learning analytics for self-regulated learning. In C. Lang, G. Siemens, A. Wise, & D. Gašević (Eds.), Handbook of Learning Analytics (1st ed., pp. 241–249). New York: Society for Learning Analytics Research.
Wong, J., Baars, M., de Koning, B. B., van der Zee, T., Davis, D., Khalil, M., & Paas, F. G. (2019). Educational theories and learning analytics: from data to knowledge. In D. Ifenthaler, J.Y.-K. Yau, & D.-K. Mah (Eds.), Utilizing learning analytics to support study success (pp. 3–25). Cham: Springer.
Yau, J., & Ifenthaler, D. (2020). Reflections on different learning analytics indicators for supporting study success. International Journal of Learning Analytics and Artificial Intelligence for Education, 2(2), 4–23. https://doi.org/10.3991/ijai.v2i2.15639.
The authors would like to acknowledge the contributions during the EDUsummIT 2019 meeting of the Thematic Working Group 6, including co-leader Jonathan San Diego and members Jill Downie, Sandra Elliott, Monique Baron, and Séverine Parent, which helped to frame the presented research.
Open Access funding enabled and organized by Projekt DEAL.
Conflict of interest
Authors Dirk Ifenthaler, David Gibson, Doreen Prasse, Atsushi Shimada, Masanori Yamada declare that they have no conflict of interest.
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. This article does not contain any studies with animals performed by any of the authors.
Informed consent was obtained from all individual participants included in the study. Additional informed consent was obtained from all individual participants for whom identifying information is included in this article.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Ifenthaler, D., Gibson, D., Prasse, D. et al. Putting learning back into learning analytics: actions for policy makers, researchers, and practitioners. Education Tech Research Dev (2020). https://doi.org/10.1007/s11423-020-09909-8
- Learning analytics
- Policy recommendation
- Learning support
- Data literacy
- Data privacy