In this special section, we feature six articles that examine the learning analytics ecosystem. Four articles offer empirical analyses of learning analytics initiatives, and two provide conceptual approaches to understanding the learning analytics ecosystem. The two conceptual articles focus on how to create the foundation for inclusive learning analytics initiatives and for the type of empirical research highlighted in the other four articles in the section. Together, these articles offer insights into how to engage stakeholders to build out initiatives (Blackmon & Moore, 2023; Motz & Morrone, 2023), a heuristic for analyzing learning analytics frameworks (Prinsloo et al., 2023), examinations of self-regulated learning behaviors and course engagement (Çakiroğlu et al., 2024; Yang et al., 2024), and research focused on inclusiveness (Khalil et al., 2023).

Conceptual approaches

The focus of Motz and Morrone’s (2023) piece “Wild brooms and learning analytics” is institutional-level learning analytics initiatives, with particular attention to the faculty role. The authors draw on the analogy of the sorcerer’s apprentice to highlight that success means having the right skills in the right sequence. They begin by establishing that the IT division needs to be involved and has the expertise to support institutional-level analytics initiatives. They then point out that a critical factor for implementation success is also having engaged stakeholders outside of IT; the challenge can be identifying those stakeholders and engaging them at the right time within the project (Moore & Johnson, 2017). The authors distinguish learning analytics from the sorcerer’s apprentice by emphasizing that learning analytics projects succeed only when stakeholders work together, whereas the sorcerer’s apprentice has more autonomy in establishing their spell book (Motz & Morrone, 2023).

Blackmon and Moore’s (2023) article “Using networked learning to improve learning analytics implementation” offers a networked learning analytics logic model that higher education institutions can implement when building their learning analytics infrastructure. The article builds on the authors’ prior research, which proposed an interdisciplinary framework for learning analytics (Blackmon & Moore, 2020). The logic model complements that established framework and provides essential details to aid in the implementation and adaptation of their work. The model incorporates ethics and care as key components, and this intentional inclusion connects to the recommended next steps from Khalil et al.’s (2023) systematic review, which also appears in this special section.

Empirical analysis

Yang et al.’s (2024) paper “Investigating the mechanisms of analytics-supported reflective assessment for fostering collective knowledge” draws on the Knowledge Building pedagogical model (Scardamalia, 2002; Scardamalia & Bereiter, 2006, 2014, 2021). The authors note that while prior work has examined how this model can aid students’ collaborative knowledge development, capturing the collective knowledge gained through the Knowledge Building model has remained a limitation. To address this gap, the authors designed a quasi-experimental study of analytics-supported reflective assessment. Their participants were Chinese undergraduate students enrolled in a course titled Scientific Inquiry and Knowledge Creation, and they examined the interrelationships among participants’ levels of cognitive and metacognitive engagement, individual domain understanding, and collective knowledge advancement (Yang et al., 2024). The 55 participants in the experimental condition received access to a learning analytics tool, whereas the control group of 38 participants used a portfolio-supported reflective assessment (Yang et al., 2024). Key findings included a positive effect of analytics-supported reflective assessment in the experimental group, which supports the value of integrating reflective assessment practices into educational settings (Yang et al., 2024). Additionally, the experimental group demonstrated greater collective knowledge advancement.

Prinsloo et al.’s (2023) article “Learning analytics as data ecology: a tentative proposal” examines existing frameworks for learning analytics implementation, with a particular interest in how the various stakeholders interact with and depend on one another. Together, these relationships form what the authors describe as a data ecosystem that connects to broader data ecologies (Prinsloo et al., 2023). The authors thoroughly review taxonomies of ecologies, ecosystems, and data interests, and they link these three concepts within learning analytics contexts. They then used a systematic approach to scrutinize 46 learning analytics frameworks, narrowing them through a rigorous screening process to the 11 that met their criteria, and applied a coding heuristic to extract key findings from those 11 frameworks. Applying the heuristic across the key areas of data ecologies, data ecosystems, and data interests provides a helpful roadmap for understanding the state of existing frameworks and for identifying opportunities for adaptation and implementation that will interest stakeholders across the continuum of learning analytics implementations.

Khalil et al.’s (2023) article “Learning analytics in support of inclusiveness and disabled students: a systematic review” identified 26 peer-reviewed articles and conference proceedings, published between 2011 and 2022, that focused on learning analytics in relation to inclusiveness, disabilities, and disadvantaged groups. The authors used PRISMA principles (Liberati et al., 2009) to guide their article selection process. Their synthesis identified six themes: improvement of LA methods, increasing inclusion, reducing discrimination, supporting validated learning design, improvement of learning design, and adaptive/personalized teaching and learning (Khalil et al., 2023). The authors conclude their systematic review with a call to action to expand research into inclusiveness and disability access.

Çakiroğlu et al.’s (2024) article “Online learners’ self-regulated learning skills regarding LMS interactions: a profiling study” reports a clustering analysis of a convenience sample of undergraduate students. The context was a 16-week blended, third-year computing course offered at a public university in Turkey. To build the profiles, the authors combined interaction data from the Moodle LMS with students’ responses to the Online SRL Scale (Barnard et al., 2009). Using the SRL skills and the learners’ interaction behaviors, they developed three clusters: actively engaging, assessment-oriented, and passively engaging. With the clusters established, they then examined how the profiles differed from one another.