1 Introduction

The need to improve the quality of higher education has fostered an interest in integrating technology tools to support teachers' pedagogical practices (Bennett et al. 2015). Over the past decade, this interest has led to the development of tools to help create improved online learning experiences, specifications to underpin the design of educational systems and repositories to share examples (Bennett et al. 2015). A more recent strand of research, learning analytics (LA), is receiving increased attention, partly because of its promising potential to assist educational institutions and teachers in improving their design practices (Rienties and Toetenel 2016). Emerging from educational technology, artificial intelligence, educational data mining and related fields, LA concerns the collection, measurement, analysis and reporting of data about learners and their context to optimize learning and teaching (Siemens 2013). LA systems use and analyze learners' behavioral and interaction data, mainly from online learning systems, which enables ecologically more valid research because no interruptions to authentic student learning processes are needed to collect the data (Berland et al. 2014). From this perspective, the promises of LA are timely, given the increasing spread of online learning environments (e.g. learning management systems) in higher education institutions, which are used for purposes such as identifying low-performing students (Saa et al. 2019), monitoring students' online social learning behaviors (Kaliisa et al. 2019) and supporting course design practices (e.g. planning, sequencing, feedback, assessment and redesigning of learning activities) (Rienties and Toetenel 2016).

The research reported herein was conducted with 16 teachers at two Norwegian universities (the University of Oslo and Oslo Metropolitan University) using semi-structured interviews. The aim is to advance our understanding of teachers' course design practices and their perspectives on LA as a potential tool to support those practices. As a constructive contribution, this study proposes a conceptual framework that clarifies key elements for the proper alignment of LA with design practices, based on the views expressed by teachers. Next, we review relevant literature on teachers' LA experiences and design practices. We then state the aim and research questions, followed by a brief description of the study's conceptual framing, including the technology acceptance model (TAM). Subsequently, we present the methodology and findings, before discussing the findings and presenting the proposed bi-directional LA–course design model together with an exemplary case study detailing its application in practice. The paper ends with a conclusion highlighting key implications and suggestions for future research.

2 Background

2.1 Teachers’ Learning Analytics Perspectives

Research investigating teachers' experiences with LA has revealed mixed reactions (Howell et al. 2018; Ifenthaler and Yau 2019). In a mixed-methods study of 276 academics in Australia and New Zealand, West et al. (2016) found that 37% of participants regarded LA as a potential tool to assist teachers' decision-making. Other empirical studies have shown that teachers found LA reports useful for diagnosing and intervening during student activities (van Leeuwen 2019). Research has also shown that deeper analysis of summative reports generated by LA tools helps teachers understand students' individual and collaborative learning behaviors and the domain-specific effects of courses (Ifenthaler 2017). More recently, Muljana and Luo (2020) explored instructors' perceptions of the intent and actual practice of LA; the majority acknowledged that LA holds great potential (e.g. gaining insights into students' learning behaviors), even though adoption remained limited.

Nonetheless, other studies have reported negative experiences with LA from the teachers' perspective. For instance, in a study of staff associated with teaching and learning at the University of Melbourne, Corrin et al. (2013) noted that teachers found it challenging to track students' online engagement accurately (e.g. counting the number of times students visited course pages for learning purposes). The same concern was reported in a large-scale longitudinal study of 1159 teachers using LA over four years: although most teachers found the LA system relatively easy to use, the harder question was how to act on the provided analytics to intervene effectively (Herodotou et al. 2019).

Teachers have also identified insufficient support (e.g. training) and communication as challenges to implementing LA (Tsai and Gasevic 2017). For example, in an embedded case study of 95 teachers at a large distance-learning university, Rienties et al. (2018) found that teachers indicated a need for training and follow-up support to use LA tools. In the same vein, adequate time to learn and implement LA has been reported as another challenge, with teachers having insufficient time to review LA visualizations (Herodotou et al. 2019). This implies that teachers may demand extra resources and peer support to adopt LA, as noted by Ifenthaler and Yau (2019) in their study involving 37 German higher education teachers. Similarly, Muljana and Luo (2020) concluded that teachers' adoption and perception of LA are largely affected by social influences and facilitating conditions.

2.2 Teachers’ Design Practices

Early studies identified several factors influencing teachers' course design practices. For instance, researching the design context of 30 Australian university teachers, Bennett et al. (2011) found that institutional policies and practices were key factors determining the way teachers plan, assess and revise courses. Similarly, Nguyen et al. (in press) conducted a mixed-methods study into how teachers design for learning in online and distance education. The findings showed that institutional policies and management are important factors affecting teachers' course design processes. Other research has reported that teachers rely on insights from summative and formative assessments (e.g. mid- and end-of-term assessments), as well as on different assumptions about students, to make necessary design changes (Bakharia et al. 2016; Black and Wiliam 2009; Lockyer et al. 2013). However, as identified in previous studies (Lockyer et al. 2013; Berland et al. 2014), such approaches provide limited support for real-time course adaptation because they are less precise and take much longer to compile. This calls for a rethinking of the design process through appropriate mechanisms (e.g. LA) that support teachers in making timely and pedagogically informed course design decisions.

To address these concerns, researchers have recently begun to align course design with LA to enable a reciprocal relationship between the two elements (Nguyen et al. 2018). Several authors have argued that, in effective teaching practice, course design establishes learning objectives and pedagogical plans, which can then be evaluated against outcomes captured through LA in contexts where relevant technological tools (e.g. an LMS) are used for course design purposes. Simply put, LA can inform teachers about the success and outcomes of their course designs by providing real-time evidence of design impact, such as engagement patterns, learning paths and time taken to complete activities (Mor et al. 2015; Nguyen et al. 2018). If teachers take advantage of this evidence, they can respond with adaptive teaching and micro-interventions (Ifenthaler et al. 2018).

2.3 Aims and Research Questions

Although research suggests the potential of combining LA and course design practices, before this integration can be fully realized, it is necessary to understand teachers' course design practices, because the failure to understand teachers' practices and constraints is a hallmark of many educational technology innovations that fail at the practitioner level (Shibani et al. 2019). More precisely, questions remain about what shapes university teachers' course design practices, at which stage of the design process LA support would be most helpful, and which factors should be considered to support this integration. Answering these questions would ground the integration of LA and course design in empirical evidence rather than presumptions about the design process (Bennett et al. 2015). The need for accounts of teachers' course design practices aligns with Bennett et al.'s (2011) argument for understanding teachers' challenges and their scope before suggesting the implementation of new ideas in teaching practice. A recent systematic literature review of learning design tools found that, even though teachers are the final users of the proposed methods and tools, only three of the 20 reviewed studies (e.g. Bennett et al. 2015; Arpetti et al. 2014; Laurillard 2013) explored teachers' actual design practices and needs (Dagnino et al. 2018). At the same time, Sergis and Sampson (2017) concluded that few LA studies have addressed the aspect of supporting teachers' reflections on course design. The lack of studies specifically dedicated to understanding teachers' design practices and needs can be considered a gap in the field. Therefore, the first research question addresses this gap by identifying the driving factors that shape university teachers' design decisions.

  • RQ1: What are the driving factors behind university teachers’ course design practices?

Furthermore, to enable greater opportunities for integrating LA into teachers' practice, it is important to unpack teachers' LA perspectives, particularly as they relate to supporting course design practices. It is also important to uncover why some teachers might be more willing and able than others to adopt LA for course design purposes, and which kinds of LA they perceive as important for supporting course design. At the same time, as noted by Howell et al. (2018) and Ifenthaler (2017), LA's successful implementation and maintenance depend on involving the intended end-users (e.g. teachers and students). However, very little attention has been devoted to understanding what teachers think about LA, what their practice is and how LA could address everyday pedagogical challenges (e.g. making informed design decisions, giving timely feedback). More importantly, there is a need to examine teachers' contextual pre-conditions for using LA in everyday practice. Such understanding is particularly important for advancing research that seeks to align LA with course design and for guiding the large number of institutions exploring whether to start using LA (Ferguson et al. 2016). Consequently, by considering these important but underrepresented views, our second and third research questions explore teachers' LA perspectives and derive their implications for LA to support design practices.

  • RQ2: What is the current state of university teachers' awareness, acceptance, needs and perspectives regarding LA?

  • RQ3: What are teachers’ expectations for using LA to support course design?

We employed the technology acceptance model (TAM) (Venkatesh and Davis 2000) as the theoretical lens to interpret teachers' beliefs about LA. TAM distinguishes perceived ease of use (e.g. the effort a teacher expects to need to use LA) and perceived usefulness (e.g. the extent to which a teacher believes that using LA will enhance their teaching quality) as key drivers of teachers' adoption. TAM has proved influential in previous LA-related research (Rienties et al. 2018), making it a suitable model for this study. By using TAM, we expect to gain unique insights into the potential reasons for teachers' beliefs and the implementation requirements needed to support LA's successful adoption.

3 Methodology

3.1 Research Design, Context, Participants and Demographics

This study used a qualitative approach to elicit context-rich, teacher-focused perspectives and practices of LA and course design. The sample consisted of 16 university teachers (male, n = 10; female, n = 6) from different disciplines and academic levels at two campus-based Norwegian universities (one research-oriented, the University of Oslo, and one profession-oriented, Oslo Metropolitan University). Both institutions are at the initial stages of exploring possibilities for implementing LA, with small-scale implementations (e.g. pilots by individual researchers at the course level) but without institution-wide uptake. We assumed teacher norms and practices could vary between educational institutions and disciplines. Thus, participants were selected from different universities and disciplines to enrich our analysis, as previous studies recommended (Howell et al. 2018). Participants were strategically selected (Bryman 2016) through institutional networks based on discipline, affiliation and teaching experience. Table 1 summarizes the teachers' profiles. Potential participants were emailed a letter explaining the study's purpose and seeking their informed written consent. Of the 20 teachers invited, 16 accepted and were interviewed. All interviews were face-to-face. Written informed consent, approved by the Norwegian National Center for Research Data, was obtained before the interviews started. All personal information was anonymized.

Table 1 Demographic characteristics of participating teachers

3.2 Methods of Data Collection and Analysis

3.2.1 Interviews

Data were collected using a semi-structured interview protocol organized around four sections (supplementary material, Appendix 1). Teachers were initially asked about their role and teaching experience to establish rapport. They were then asked about their course design practices, followed by questions about their LA perspectives. The fourth section stimulated discussion about the ways teachers perceived LA as a potential tool to support design practices. The interview protocol was devised based on a prior review of research on teachers' design practices (Bennett et al. 2015) and LA (Howell et al. 2018). The protocol was piloted with two teachers based at the authors' department before the interviews. The interviews were audio-recorded and transcribed verbatim. The first author conducted all interviews, which lasted between 30 and 60 min (median: 35 min). As LA is an emerging area of research and practice, we introduced visual mediating artefacts (e.g. LA visualizations) during interviews to encourage in-depth discussion and reflection about the field. The visualizations shown to participants were adapted from empirical studies (Rienties and Toetenel 2016). The vignettes were evaluated by an LA expert, who judged their suitability for the study. After transcribing, member validation (Bryman 2016) was conducted by seeking additional input during the analysis process: we shared the interview transcripts with individual participants to provide opportunities for questions, critique, feedback and affirmation. Through this process, six participants elaborated on and provided new data for some questions, which offered an opportunity for reflexive elaboration (Bryman 2016). The remaining 10 participants approved the transcripts as true and accurate, with no further additions or subtractions, which strengthened the qualitative credibility of the findings (Bryman 2016).

3.2.2 Mini-Survey

We administered a mini-survey to explore teachers’ beliefs about the ease of use and usefulness of LA tools on five-point Likert scales (Table 2). The survey was administered after each interview, enabling us to capture teachers’ LA perceptions using TAM, a well-researched model reported as reliable in predicting technology use (Rienties et al. 2018). As most TAM questionnaires have focused on users and students, not teachers, we adapted the mini-survey in line with Rienties et al. (2018). Consequently, we rephrased items to fit our teacher context. A think-aloud approach was used; participants were asked to explain their thoughts as they reviewed each survey item. This process helped clarify questions some participants found challenging to interpret, thus ensuring construct validity (Bryman 2016).

Table 2 LA ease of use and usefulness survey

3.2.3 Data Analysis

Interviews were transcribed by the first author and coded using a deductive thematic analysis approach (Braun and Clarke 2012). The development of codes was theoretical and structural, meaning they were formulated according to the study's research questions and TAM. The three researchers familiarized themselves with the data and repeatedly examined the raw material to determine how to reduce it into smaller units, such as categories and themes. The initial categorization was performed by the first author using qualitative analysis software (NVivo). All three authors then discussed the codes and themes with close reference to the raw data until consensus was reached. The unit of analysis in our coding system was one paragraph (i.e. one full response to an interview question). Paragraphs could receive multiple codes if they addressed different issues. Altogether, our thematic analysis identified six themes. The codebook is available as supplemental material. The mini-survey, which sought to capture teachers' beliefs about LA, was analyzed using descriptive statistics (Table 2).
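For readers interested in how the descriptive summary of the mini-survey can be reproduced, the following minimal sketch illustrates one way the Likert responses could be tabulated. It is illustrative only: the construct names and response values are hypothetical placeholders, not the actual instrument or data reported in Table 2.

```python
import pandas as pd

# Hypothetical five-point Likert responses (1 = strongly disagree ... 5 = strongly agree);
# the item names and values are placeholders, not the actual instrument in Table 2.
responses = pd.DataFrame({
    "perceived_usefulness": [4, 3, 5, 3, 4, 2, 3, 4, 5, 3, 4, 3, 2, 4, 3, 5],
    "perceived_ease_of_use": [3, 3, 4, 2, 4, 3, 3, 5, 4, 3, 3, 2, 3, 4, 4, 3],
})

# Frequency of each response option per TAM construct (how counts are reported in the findings)
counts = responses.apply(lambda col: col.value_counts().sort_index())

# Central tendency and spread per construct
summary = responses.agg(["median", "mean", "std"]).round(2)

print(counts)
print(summary)
```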

4 Findings

4.1 Teachers’ Course Design Practices

The first research question explored teachers' course design practices, which fell into two themes: (1) course design influences and (2) limitations. Tables 3 and 4 summarize these findings, supported by illustrative quotes that represent common perspectives and alternative views from participants.

Table 3 Theme 1: course design influences, sub-themes and sample excerpts

4.1.1 Theme 1: Course Design Influences

The theme ‘course design influences’ includes three sub-themes: situational factors, teachers’ beliefs and experience, and feedback and peer influence. Table 3 presents these sub-themes with sample excerpts exemplifying the identified themes.

Table 4 Theme 2: course design limitations sub-themes and sample excerpts

4.1.2 Theme 2: Course Design Limitations

The theme ‘course design limitations’ includes two sub-themes: institutional limitations and unreliable feedback mechanisms. Table 4 presents these sub-themes with sample excerpts exemplifying the themes.

4.2 Teachers’ LA Perspectives

The second research question focused on teachers’ LA perspectives. Three themes emerged from this topic: (1) teachers’ awareness of LA; (2) LA challenges and (3) teachers’ LA needs.

4.2.1 Teachers’ Awareness of LA

Teachers expressed mixed familiarity with LA. Of the 16 teachers, 12 reported some awareness of what LA is about, while four had no idea what LA was until visual examples were provided. For example, ‘I have a very poor idea of LA. I probably should know more, but it is not something I have encountered very much, I have to say’ (R10); ‘No idea’ (R7). Teachers who knew about LA described it as context-dependent. They highlighted that LA can be conceptualized at the macro (institutional), micro (course, teacher, student) and societal levels, or from a research or teaching perspective; hence, its application depends on the intended level of use. ‘If you separate it by micro, macro, societal level and course level, then it is a very different thing’ (R3). Another teacher stated, ‘I think LA can be as simple as knowing how much time do students use online in one course compared to other courses and up to knowing specific details’ (R8).

4.2.2 LA Challenges

Teachers highlighted several challenges to LA’s potential use. First, all 16 teachers agreed that using LA daily would be a time-consuming imposition with the potential for extra workload. Teachers reported that they work under considerable pressure with many demands from their employers, making it hard to implement LA, which requires teacher follow-up. Teachers claimed they lack the motivation and incentives to implement LA on top of the heavy workload they already have, as illustrated by the following quote: ‘What is the motivation for me? There is no incentive from the perspective of, like, doing the analytics’ (R1). Unsurprisingly, such comments were especially prominent among teachers from the more research-oriented university, who tended to perceive teaching as extra work on top of their normal research activities.

Privacy and data ownership concerns were also raised as obstacles to LA’s application. Teachers mentioned a lack of clarity at many institutions concerning data ownership, how these data can be used and whether students’ property rights are violated. For example, ‘Here at XX, we are not very comfortable with using systems hidden in the cloud—not very clear what happens to the information stored, so control is really an important thing’ (R3). To deal with privacy concerns, one teacher stated, ‘We have to be able to anonymize students’ data so they are not worried’ (R2).

Some teachers disagreed with using LA to follow up with students, especially in higher education, where students are expected to be self-driven. R10 used the ‘big-brother effect’ metaphor to illustrate this concern. R3 further echoed this view: ‘If you’re responding to students based on the analytics, maybe you’re cultivating a new culture of running after the students. … I do not want to run after the students, so that is the drawback’ (R3). Another teacher added, ‘Maybe with primary children, but these adults, I am not their mother, so they do what they want’ (R6).

Teachers acknowledged that LA’s evaluative nature is an additional concern. This is particularly serious when LA is viewed as capable of evaluating teachers’ or students’ performance based on unnuanced data with limited observational depth. According to R15, this view of using LA might lead to incorrect educational decisions and be a reason for teachers and students to resist LA.

Last, five of the 16 teachers expressed extreme skepticism about LA’s potential to provide meaningful interpretations of the learning process. This concern was mainly emphasized by teachers from the arts and social science disciplines (n = 7), who claimed LA was incongruent with their teaching style. According to these teachers, such data validity issues could result in inaccurate, incomplete evaluations of students’ learning processes. ‘I need a way of ensuring that the analytics in question are suitable for qualitative teaching because the kind of materials LA uses are poorly suited to my teaching’ (R10). Teachers claimed that, unless different sources of information are utilized, LA alone cannot account for the processes behind the numbers. Another teacher attributed the challenge of getting meaning from LA to the lack of theorizing in LA, claiming the field is ‘highly data-driven’ (R15). Other representative quotes expressing these concerns are highlighted below:

My point of departure for analytics is extreme skepticism. … I am not confident if Canvas analytics and some plugins can provide me with reliable findings about the quality of the text produced by students (R1).

Until we have sensors in the heads to tell us what they are learning, I think LA will always be based on data but not giving the whole picture of what is happening (R13).

No real scope for the automation of assignments, as it will provide dictionary meanings to the text (R10).

This skepticism was corroborated by the mini-survey results: six out of 16 teachers neither agreed nor disagreed that LA has the potential to enhance the effectiveness of their teaching.

4.2.3 LA Needs

Teachers were asked about the kind of data, pre-conditions and environment they would need to facilitate LA’s use in their everyday professional practice. This question provoked a range of responses from teachers, with their needs spanning practical and technical considerations.

From a practical perspective, a recurring call throughout the interviews was for incentives, particularly increased time allocation, to account for the time and extra effort involved in using LA to improve course designs. Teachers noted that, if they were to take a dynamic approach to embracing LA, they would demand extra teaching resources. For example, ‘I think I would make a cynical point, but it is going to be a very important point. What you are talking about here is a massive time investment on the side of the teachers. Are you aware of that?’ (R10). R2 added, ‘It is important to have a preparation count attached to LA application. For example, at least five hours can be suggested to have a better account for the time teachers use to apply the LA’. In this way, the institution makes visible that course design is important, so teachers can put more effort into leveraging LA in everyday practice. Teachers also called for rewards for quality teaching and for improving teachers’ status, to make teaching attractive and encourage the use of innovative approaches like LA. One teacher stated, ‘I think that half of the lecturers don’t like to lecture, but they have to do it, so they don’t really care. … Now, employees are set to teach, instead of them wanting to teach, and that is not good’ (R8).

From a technical perspective, teachers were interested in different forms of LA output that would enable them to capture and gain an overview of students’ learning processes. The kinds of analytics demanded by teachers can be categorized as social analytics (e.g. student participation and interaction patterns), feedback analytics (e.g. automated feedback to students), process analytics (e.g. time spent on task, access and submission metrics), discourse analytics (e.g. language used by students), text analytics (e.g. topics discussed), peer analytics (e.g. comparison with other students in the class and with previous years) and temporal analytics (e.g. when materials are downloaded). The need for these analytic forms is illustrated in the following statements, and a brief sketch of how some of them could be derived from LMS logs is given after the excerpts:

I need a metric on the quality of the language used by the students and some sophistication of their ideas; I think that could be huge regarding being able to tailor the course (R1).

It is important to know what they talked about, and maybe the machine can tell me what the topics discussed were. Was the discussion heated at some point? (R13).

I would like to know how quickly students download the slides (R5).
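To make these analytic forms more concrete, the following minimal sketch shows how a few of them (process, temporal and peer analytics) might be derived from a generic LMS activity log. It is illustrative only: the data, column names, event labels and release time are assumptions, not an actual Canvas export or part of the study's data.

```python
import pandas as pd

# Hypothetical LMS activity export; the schema is an assumption for illustration.
log = pd.DataFrame({
    "student": ["s1", "s1", "s2", "s3", "s3", "s3"],
    "event": ["file_download", "discussion_post", "file_download",
              "discussion_post", "discussion_post", "assignment_submit"],
    "timestamp": pd.to_datetime([
        "2021-09-01 09:00", "2021-09-02 14:30", "2021-09-05 20:15",
        "2021-09-01 10:05", "2021-09-03 11:40", "2021-09-06 23:59",
    ]),
})

# Process analytics: overall activity volume per student
activity = log.groupby("student").size().rename("events")

# Temporal analytics: hours between material release (assumed time) and first download
release = pd.Timestamp("2021-09-01 08:00")
downloads = log[log["event"] == "file_download"].copy()
downloads["delay_hours"] = (downloads["timestamp"] - release).dt.total_seconds() / 3600
download_delay = downloads.groupby("student")["delay_hours"].min()

# Peer analytics: each student's activity relative to the class mean
relative_activity = (activity / activity.mean()).rename("vs_class_mean")

print(pd.concat([activity, download_delay, relative_activity], axis=1))
```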

Finally, teachers saw the need for data analytics courses for all university teachers to provide them with the right LA competencies to deal with data interoperability issues. ‘Not everybody is into statistics; most people cannot relate to a graph and cannot read or understand it because they do not have the mindset’ (R8). This argument was supported by the survey responses, in which 11 teachers, irrespective of their level of LA acceptance, strongly agreed and three agreed that teachers need training to use LA in everyday practice.

4.2.4 Teachers’ Expectations of LA to Support Course Design Practices

The third research question explored teachers’ expectations of using LA as a tool to support course design practices. The benefits highlighted by participants are categorized broadly into two forms: normative and formative perspectives.

From the normative perspective, teachers highlighted that LA would offer an opportunity to generate descriptive information about students’ learning, which could be used to inform timely course design interventions and recommendations. For example, teachers viewed using LA to identify non-active students, difficult assignments, students’ attendance, submissions and page views, among other indicators, as providing an opportunity to intervene and make the necessary adjustments.

From a formative perspective, teachers viewed LA as a baseline tool to understand and shape learning processes (e.g. to check the quality of learning materials and design and to provide feedback in the form of visualizations). Teachers perceived LA as capable of providing quality checks regarding learning materials, activities and overall design. ‘It would be an interesting proxy for how your learning material is perceived’ (R14). Another teacher commented: ‘I also see the potential of using such inputs as a baseline to understand certain processes, at least in quantitative terms, and perhaps guide some learning and teaching activities based on that input’ (R15).

Also welcomed was LA’s potential to provide temporal statistics about students’ online engagement and how this relates to performance. ‘It would be interesting to ask whether there is any relationship between time spent and the quality of the text which would lead to potential interventions like reminding students, for example, the more time you spend on the assignment, the better the quality of your assignment can be’ (R1). Another teacher said, ‘If I see something like this [pointing to vignettes during the interview] and can see that the student has been online very often, then I start thinking, why hasn’t the student handed in, yet always online? And then I can send them an email’ (R11).

The teachers also perceived LA as a tool that can identify differences in students’ learning processes, which could inform personalized, adaptive teaching. ‘You can do more data mining of what they have handed in and try to share them beautifully and help others who have not been able to perform at the same level, so it is a huge benefit for feedback purposes’ (R3).

In addition, LA’s ability to provide less biased assessments of students’ learning processes was deemed important by teachers. As one teacher commented: ‘If you ask students, they may not give you right [adequate] responses, but LA can be helpful regarding knowing what exactly happens’ (R9). These claims are supported by the mini-survey findings: of the 16 teachers, nine agreed and four strongly agreed that LA can potentially assist with making timely, informed educational decisions. A snapshot of the themes and sub-themes from RQ2 and RQ3 is illustrated in Fig. 2 (appendix).

5 Discussion and Implications

This work’s objective was to explore and better understand university teachers’ current course design practices and their perspectives on LA as a tool to support those practices. Within the context of this objective, exploring RQ1 revealed that teachers’ course design practices are underpinned by situational factors (e.g. nature of the course, size of the class), feedback sources (e.g. course evaluations, summative assessments, informal reflections and discussions with students and fellow teachers) and teachers’ intuition and experience (e.g. based on in-classroom assessments and personal beliefs about the learning process). These findings align with related work asserting that teachers’ course design practices closely relate to their experience, expectations and practical issues like student characteristics (Nguyen et al. in press; Bennett et al. 2015; Arpetti et al. 2014), which prompt adjustments to course elements. However, as previous studies noted (Bakharia et al. 2016) and teachers emphasized during the interviews, such approaches are prone to personal bias, fail to capture students’ learning behavior in online learning environments and cannot provide timely feedback to teachers. The time lag between receiving feedback and using it to modify a course thus limits teachers’ ability to make real-time design adjustments. This finding suggests a context in which there could be opportunities for further scaffolds (e.g. using LA) to support the way teachers approach their normal practice (Laurillard 2012), to gain insights into students’ learning outside the classroom and to promptly support design adaptations on the fly.

RQ2 targeted teachers’ awareness, needs and beliefs about LA. Some teachers admitted during the interviews that they did not fully know what LA is about until visual examples were introduced. This finding raises questions about the extent to which LA implementation will soon be a reality in university teachers’ course designs. It suggests that efforts to promote LA implementation at scale should involve increasing LA awareness among relevant stakeholders (e.g. teachers, HE administrators, educational policy-makers), since extensive research in the field has shown that LA implementation is a multifaceted process involving multiple stakeholders (Ifenthaler 2017). Conversely, some teachers were skeptical about LA’s possibilities. For example, even though there is promising research on how LA can provide insights into students’ online discourse (Kaliisa et al. 2019), teachers were concerned about the evaluative role of LA, which one teacher called ‘unnuanced’, and whether it captures the most reliable predictors of student learning and teacher performance. This concern was mostly raised by teachers from the arts and social science disciplines and those who indicated lower perceived usefulness of LA in the survey. These teachers claimed LA was incongruent with their teaching style and could not reliably explain students’ learning behaviors. This concern is not unique to this study but is a common challenge raised across the LA field (Ifenthaler and Yau 2019). Efforts to reduce this concern have included contextualizing LA within learning theory to help teachers interpret LA outputs according to well-grounded learning theories (Shibani et al. 2019).

Furthermore, reflecting TAM’s perceived ease of use principle, teachers worried about the increased workload, training and time demands LA may impose, a concern especially prominent among teachers from the more research-oriented institution. Implied within these findings, and explicit in the LA literature, is the importance of considering the institutional context and providing incentives and training for teachers to enable the effective, easy utilization of LA in everyday practice (Rienties et al. 2018). Moreover, building on previous findings (e.g. Corrin et al. 2013), the concerns raised by teachers confirm the assertion that effective adoption of LA in higher education will depend on the ability of universities to provide the necessary institutional infrastructure and remuneration (Howell et al. 2018). Institutions should thus motivate teachers to integrate LA into their everyday practice without reservations and be explicit to teachers and students about the purpose of collecting student data.

RQ3 analyzed teachers’ perceptions of LA as a tool to support course design practices. Building on the theoretical lens of TAM (e.g. LA’s perceived usefulness), teachers highlighted LA’s potential from normative and formative perspectives. From the normative perspective, teachers saw LA as a tool providing more objective and timely evidence about students’ learning behavior, which could be used for design adjustments. This finding supports the narrative that LA might support teachers’ design practices by addressing the limitations of traditional feedback approaches (e.g. summative assessments and course evaluations), which research has identified as less precise and slower to compile (Berland et al. 2014; Ifenthaler et al. 2018; Nguyen et al. 2018). Underlying the theme of LA as a formative tool was an emphasis that LA can support personalized learning based on evidence collected from students’ online trajectories. This finding indicates that teachers would like to use LA as a diagnostic tool to understand individual students’ learning processes and hence provide customized educational experiences during the course (e.g. personalized feedback), as also noted in previous research (van Leeuwen 2019). To support these processes, teachers demanded different forms of LA, such as automated feedback and metrics on students’ participation and engagement with the implemented design. The nature of the LA highlighted by teachers has important implications for the kind of LA support to be provided to them. Currently, though, few LA tools offer subject-specific course design support, which calls for the development of relevant tools providing the kind of analytics required by teachers. This was also noted by Arpetti et al. (2014), who reported that teachers need systems that provide simple, quickly visible guidance to support course designs.

Overall, given teachers’ current course design practices, several opportunities exist to leverage LA, especially because of the recognized challenges concerning current influences on course design practices, such as summative assessments and student evaluations, which, according to teachers, cannot provide timely feedback or an objective assessment of students’ learning behaviors. Accordingly, using LA to inform course design decisions could give teachers a better understanding of their students before and during the course, which could lead to more informed, timely course design decisions. Moreover, the integration of LA and teachers’ course design practices addresses the known concern that LA is highly data-driven and lacks an explicit pedagogical approach (Shibani et al. 2019). If integrated, LA could provide the lens to interpret the pedagogical assumptions embedded within a course design (Shibani et al. 2019). This answers our third RQ, as it shows the potential value of linking course design with LA to overcome such barriers by providing teachers with more objective evidence about students’ learning behaviors, which they currently cannot determine in any concrete way other than through their own observations and less objective, untimely student evaluations. Thus, the findings indicate that LA could form the basis of an important reflective tool for improving teaching practices and allow an adaptive, rather than reactive, teaching practice (Howell et al. 2018).

Meanwhile, for LA to effectively guide teachers’ course design practices, teachers need a framework to support this integration. Thus, to unpack how these two components can work together, we synthesize the findings from the teacher interviews and related literature to propose the ‘Bi-directional LA—course design conceptual model’ (Fig. 1), presented in the next section.

5.1 The Bi-directional LA—Course Design Conceptual Framework

The study’s findings revealed that the necessary conditions for connecting LA to course design are broad and revolve around several factors and issues. Thus, to contribute to this alignment, we propose the ‘Bi-directional LA—course design conceptual framework’ (Fig. 1) to provide teachers, researchers, administrators and technology developers with a usable model for making informed decisions about the nature of the analytics needed and the requirements for connecting LA and course design. The name ‘Bi-directional’ was chosen to convey that LA and course design can reciprocally determine each other and therefore both need to be considered equally when planning an LA—course design implementation. For instance, the data captured from learning environments may inform course design adaptation, while the design itself informs the type of data to collect and how it is structured and captured (i.e. data capturing, sense-making, etc.). Taking this perspective, the framework emphasizes that an outcome within one element (e.g. LA) precipitates changes in outcomes within the other element (e.g. course design). The framework operationalizes the connection between LA and course design through four interconnected dimensions: context, digital tools, technical/functional requirements and stakeholders. These dimensions are informed by the empirical findings of the current study with teachers as well as by existing LA literature and frameworks (e.g. Seufert et al. 2019; Kaliisa et al. 2021; Muljana and Luo 2020). Next, a conceptual description and theoretical rationale for these dimensions is given.

Dimension 1: Context. This dimension emphasizes the learning context (e.g. online, blended or face-to-face), the specific course context (e.g. course objectives, discipline) and course design influences (e.g. class size, student characteristics, teacher experience). These elements are important to consider beforehand since they will affect the nature of the LA that can be gathered and the conclusions that can be drawn. Moreover, the meaningful interpretation of LA outputs depends on understanding the pedagogical context in which data are collected (Shibani et al. 2019; Ifenthaler and Widanapathirana 2014; Hernández-Leo et al. 2019; Muljana and Luo 2020). In this regard, defining and understanding the context before designing, collecting and interpreting LA is necessary because different pedagogical practices imply particular epistemological and pedagogic assumptions (Knight et al. 2014).

Dimension 2: Digital Tools. This dimension considers the need for the necessary technologies and tools (e.g. LMS, LD technologies, multimodal LA technologies such as sensors and eye scanners) to support course enactment and to log machine-readable course data (e.g. access reports, page views, discussion analytics, eye movements) for LA purposes, in line with the intentions of a given course. Digital tools thus represent an important element in supporting the connection between LA and course design, especially within blended learning (BL) contexts where teachers tend to deliver course designs with limited use of technology (Rodríguez-Triana et al. 2015).

Dimension 3: Technical and Functional Requirements. This dimension highlights the necessary resources and support mechanisms (e.g. LA institutional regulations, training, policy on LA ethics) to support the integration of LA and course design practices. The teachers revealed that they need training in the use of LA tools, which implies that institutions should provide the necessary support in the form of training as well as local policies on ethics. This dimension also highlights the need for LA tools and approaches that can support the analysis of the course analytics needed by teachers, for example Epistemic Network Analysis (ENA). A recent review of existing LA adoption frameworks revealed that a large number of frameworks are not concretized into technological artefacts and concrete data streams, which could make it hard for teachers to use them to make pedagogically informed learning and teaching decisions based on the analytics (Kaliisa et al. 2021). Thus, the technical and functional requirements dimension represents an important element in supporting teachers’ use of LA for course design.

Dimension 4: Stakeholders. This dimension recognizes the mediating role of different players (e.g. teachers, students, institutional managers and IT personnel, among others) in supporting LA-course design adoption in practice. As noted by Rienties et al. (2016) and Seufert et al. (2019), and emphasized by teachers during the interviews, LA implementation requires the involvement of different stakeholders. In particular, institutional leaders should be at the forefront of LA adoption, since teachers expressed concern that institutional structures and guidelines limit them from implementing the necessary innovations and design adjustments on a needs basis. Moreover, integrating LA tools requires institutional multi-level approval due to security and ethical issues (e.g. securing student privacy). At the same time, in this framework, the teacher and the student are both placed at the center of the LA—course design process, as both play a key role. For example, students should be involved as proactive stakeholders throughout the LA adoption process so that analytics systems do not run the risk of diminishing students’ ability to exercise judgment (e.g. who is using my data?). As emphasized by Wise et al. (2016), teachers should position students’ analytics as an integral part of the learning process and create an environment where LA outputs are discussed between the teacher and students.
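To suggest how technology developers or institutional teams might operationalize these four dimensions, the following is a minimal, non-prescriptive sketch that expresses the framework as a simple planning checklist. All field names, defaults and the readiness check are our own illustrative assumptions, not a formal specification of the framework.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LACourseDesignPlan:
    """Illustrative operationalization of the framework's four dimensions;
    field names and defaults are assumptions, not a formal specification."""
    # Dimension 1: Context
    learning_mode: str = "blended"  # online, blended or face-to-face
    course_influences: List[str] = field(
        default_factory=lambda: ["class size", "student characteristics"])
    # Dimension 2: Digital tools
    digital_tools: List[str] = field(default_factory=lambda: ["Canvas LMS"])
    logged_data: List[str] = field(
        default_factory=lambda: ["page views", "discussion posts"])
    # Dimension 3: Technical and functional requirements
    requirements: List[str] = field(
        default_factory=lambda: ["teacher training", "ethics policy", "analysis tools (e.g. ENA)"])
    # Dimension 4: Stakeholders
    stakeholders: List[str] = field(
        default_factory=lambda: ["teachers", "students", "institutional managers", "IT personnel"])

    def readiness_gaps(self, available: List[str]) -> List[str]:
        """Return requirements not yet in place, as a simple planning aid."""
        return [r for r in self.requirements if r not in available]

# Example: check which requirements are still missing at an institution
plan = LACourseDesignPlan()
print(plan.readiness_gaps(available=["teacher training"]))
```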

Fig. 1 The Bi-directional LA-course design conceptual framework for linking LA and course design

5.2 An Illustrative Example for Application of the Bi-directional LA-course Design Conceptual Framework

This section describes a use case (Armour and Miller 2000) that illustrates how the dimensions of the Bi-directional LA-course design framework can be translated into practice. Specifically, we illustrate how a teacher can design relevant activities and make use of the different elements depicted in the framework to make timely, data-informed decisions.

5.2.1 Use Case: Using Social Learning Analytics to Support Course Design

This use case illustrates a teacher using social learning analytics (SLA), which involves the collection and analysis of students’ learning behaviors and traces gathered from online social interaction activities and contexts (Buckingham Shum and Ferguson 2012). The context of this case is a blended learning environment using a learning management system such as Canvas to support the instruction process and the logging of machine-readable data. To connect LA and course design, the teacher begins by identifying design features or designing relevant course activities based on their context and related factors (e.g. course size, course objectives, student characteristics) and theoretical or pedagogical perspectives (e.g. behaviorist, constructivist). It is also important for the teacher to consider the LA indicators (e.g. number of logins, discussion contributions) that can be used to monitor whether the planned learning activities are going as intended, according to a metric or model. For example, a teacher can design online discussions to be completed through the Canvas LMS, with the socio-constructivist assumption that more active participation in the discussion indicates better learning, operationalized as a certain number of postings of a certain length and depth of discourse. Once the course is enacted, the collection of relevant LA begins. As highlighted in Fig. 1, the teacher needs a tool to capture students’ digital traces in line with the pedagogical needs (e.g. students’ interactions and content posted). In this case, relevant social network analysis tools such as NodeXL (Smith et al. 2009) and discourse analysis tools such as Infranodus (Paranyushkin 2019) could be used to analyze students’ online interactions and contributions. To support timely course adaptation, however, these tools should provide feedback before the unit or lesson is completed, enabling the teacher to generate quicker insights into the topics discussed by students and the areas where they are struggling. The teacher could then modify the course design based on the feedback from the LA or make targeted interventions for specific students. This implies a shift from traditional summative approaches (e.g. end-of-semester course evaluations), which, according to teachers, cannot provide timely feedback or an objective assessment of students’ learning behaviors. For this process to be successful, several stakeholders need to be involved, as emphasized in the Bi-directional LA-course design framework: the teachers, who are at the center of the design and redesign experience; the students, who are the data subjects; course administrators; and technical staff to assist with the design or installation of relevant LA tools. Thus, for the LA-course design process to succeed, it should be a dynamic, continuous and holistic process covering all stakeholders involved in the learning and teaching process (Conole 2012, p. 33).
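As an illustration of the analysis step in this use case, the sketch below builds a reply network from hypothetical discussion data and computes simple participation indicators. It uses the open-source NetworkX library as a stand-in for the tools named above (NodeXL, Infranodus), and the data structure, thresholds and flagging rule are assumptions rather than an actual Canvas export or the study's procedure.

```python
import networkx as nx

# Hypothetical discussion replies: (author, replied_to, word_count); not a real Canvas export.
replies = [
    ("alice", "bob", 120), ("bob", "alice", 95), ("carol", "alice", 40),
    ("alice", "carol", 150), ("dave", "bob", 20),
]

# Build a directed reply network; edge weights count interactions between each pair.
G = nx.DiGraph()
for author, target, words in replies:
    if G.has_edge(author, target):
        G[author][target]["weight"] += 1
    else:
        G.add_edge(author, target, weight=1)

# Social analytics: who is central in the interaction network?
centrality = nx.degree_centrality(G)

# Process analytics: number of postings and average length per student (proxy for engagement depth).
posts, words_total = {}, {}
for author, _, wc in replies:
    posts[author] = posts.get(author, 0) + 1
    words_total[author] = words_total.get(author, 0) + wc

# Flag students below an assumed participation threshold for possible follow-up.
for student in G.nodes:
    avg_len = words_total.get(student, 0) / posts.get(student, 1)
    flag = "follow up" if posts.get(student, 0) < 2 else "on track"
    print(f"{student}: centrality={centrality[student]:.2f}, posts={posts.get(student, 0)}, "
          f"avg_words={avg_len:.0f} -> {flag}")
```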

6 Limitations

This study is restricted to two specific university cases and a limited sample of teachers, which limits the generalizability of the findings. A further limitation is that we did not interview other stakeholders, such as administrators, researchers and IT staff; it may be worthwhile to expand future multi-stakeholder analyses to include their perspectives. Thus, the insights should be treated as preliminary, beginning to shed light on teachers’ perceptions of LA and its potential to support teaching practices in higher education. In addition, the self-reported nature of the interview results means that what participants say may not always represent what they do. Another limitation is that the proposed framework has not yet been empirically validated, and we do not know how the intended users (e.g. teachers) perceive it. To this end, the focus of follow-up studies is to design different interventions with teachers to examine the relevance of the framework in supporting teachers’ practice.

7 Conclusion

This paper began by making a case for research into teachers’ design practices and their beliefs about LA as a tool to support these practices. The findings revealed the prominence of situational factors, feedback systems and teachers’ personal experiences as key influences on teachers’ course design practices. Moreover, teachers reported that their course design practices were constrained by institutional regulations as well as biased and untimely sources of feedback. These findings imply that supporting tools such as LA could have the most potential to improve teachers’ course design practices by engaging with the key influences and challenges that shape those practices.

At the same time, the findings revealed that teachers recognize the potential value of LA to support course design practices (e.g. from normative and formative perspectives). Meanwhile, skepticism over the evaluative role of LA, its failure to meaningfully represent students’ learning behavior and the perceived increase in workload that LA may impose were common concerns identified by the teachers.

Based on these findings, we suggest that the issues raised by LA skeptics and documented in the existing literature, such as time, motivation and LA relevance, should be addressed if the field of LA is to make a significant step in impacting teachers’ practices. For example, to support the meaningful interpretation of students’ learning behaviors through LA, it is important to employ pedagogically driven rather than data-driven LA interventions, so that teachers can make meaningful, contextualized interpretations and conclusions about LA data and derive relevant actionable insights and interventions. This also implies that the teacher or learning designer should play an active role in redesigning elements of the course based on the feedback provided by the LA, as illustrated by the proposed Bi-directional LA—course design conceptual framework (Fig. 1).

We also suggest the need to include all stakeholders (e.g. administrators, technology developers, teachers and students) at all levels of LA implementation to ensure transparency, fairness and the ethical use of student data. Similarly, the study identified differences between teachers’ course design practices and perceptions of LA based on institutional and disciplinary affiliations. As an implication, it is important to recognize the different constraints and conditions within which teachers work and to consider context-specific LA approaches. In this study, we contribute to this direction by proposing the ‘Bi-directional LA—course design conceptual framework’, which might offer institutions and teachers an accessible tool to clarify the necessary conditions and guide discussions about how LA can be implemented to support teachers’ course design practices. An exemplary case has also been included to illustrate how the four-dimensional framework can be operationalized in authentic practice. We regard this framework as the main contribution to the LA and course design integration knowledge base from a socio-cultural perspective.

The findings reported in this paper are part of a larger project exploring how teachers can leverage LA to support their course design practices. The findings extend our understanding of the factors that are fundamental to teachers’ course design practices, with important implications for the nature of the analytics teachers need based on their current practices. This study will thus inform the next phases of our research, particularly by addressing some of the LA needs and course design challenges highlighted by the teachers. The proposed framework will also be further examined and refined through empirical cases during the next phases of the research. We hope that the results and the proposed conceptual framework offer theoretical and practical insights into the understanding and application of LA, which teachers, researchers and higher education managers can leverage as they try to promote excellence in teaching, learning and curriculum planning.