Educational Technology Research and Development, Volume 67, Issue 5, pp 1043–1064

Teachers’ perceptions of the usability of learning analytics reports in a flipped university course: when and how does information become actionable knowledge?

  • Anouschka van Leeuwen
Open Access
Research Article

Abstract

The flipped classroom model is a form of blended learning in which delivery of content occurs through online materials, and face-to-face meetings are used for teacher-guided practice. It is important that teachers stay up to date with the activities students engage in, which may be accomplished with the help of learning analytics (LA). This study investigated university teachers’ perceptions of whether weekly LA reports that summarized student activities supported their teaching activities (n = 7). The teachers reported using the LA reports to diagnose and intervene in student activities, and said the reports encouraged them to initiate interaction with students. Teachers did sometimes find it difficult to connect the information from the LA reports to concrete interventions, which depended in part on the teacher’s level of experience. LA reports might support teachers further by not only offering information, but also by suggesting interventions.

Keywords

Flipping the classroom · Learning analytics · Teaching in higher education · Scaffolding · Adaptive support · Actionable knowledge

Introduction

The flipped classroom model is a form of blended learning in which delivery of content occurs primarily by means of online materials, and face-to-face meetings are used for teacher-guided practice (O’Flaherty and Phillips 2015; Staker and Horn 2012). To provide useful support during the face-to-face meetings, teachers must be able to adequately monitor the activities of individual students or groups of students (Vogt and Rogalla 2009). Monitoring can be demanding precisely because student activities are distributed over online and face-to-face contexts (Xin et al. 2013). A possible way to support teachers is by means of learning analytics (LA), which refers to collecting traces of learning and analyzing them in order to improve learning processes (Siemens and Gasevic 2012). Empirical studies concerning the affordances of LA to support teachers in higher education are scarce. This study investigated university teachers’ perceptions of whether weekly LA reports with summaries of student activities could support their teaching activities.

Flipping the classroom: challenges for teachers

Blended learning (BL) is defined as “a formal education program in which a student learns at least in part through online delivery of content and instruction with some element of student control over time, place, path, and/or pace and at least in part at a supervised brick-and-mortar location away from home” (Staker and Horn 2012, p. 3). Thus, BL involves the integration of online and face-to-face learning activities (Garrison and Kanuka 2004). A commonly employed form of BL is the flipped classroom model, in which delivery of content primarily occurs by means of online materials (rather than face-to-face lectures), and the subsequent face-to-face meetings with teachers are used for guided individual or collaborative practice and processing of the course materials (Staker and Horn 2012).

Student and teacher roles change in the flipped classroom compared to more traditional approaches to teaching and learning, in the sense that students have control over when and at what pace they study the course materials (Condie and Livingston 2007). For teachers, it means there are differences between students, classes or cohorts concerning the amount of time and the type of activities students engage in (Abeysekera and Dawson 2015; Yen and Lee 2011). Also, since the traditional lecture is no longer the dominant format, student–teacher contact is primarily centered on the face-to-face guided practice sessions (Staker and Horn 2012). Because BL enables more freedom for students to study at their own pace, teachers may encounter students or groups of students with differing prior knowledge during these sessions. Consequently, the teacher’s primary task changes from transferring knowledge to facilitating knowledge acquisition, with more emphasis on monitoring and adapting to students’ learning processes (Condie and Livingston 2007; Partridge et al. 2011; Xu 2012). Literature concerning teacher scaffolding (Wood et al. 1976) of student learning establishes the importance of teacher support that adapts to the needs of a student or a group of students (Vogt and Rogalla 2009). Teachers must monitor and diagnose student or group activities in order to choose the correct intervention at any given time (Van de Pol et al. 2010). In the flipped classroom model, the content of the face-to-face meetings is largely determined by student input and by students’ learning needs. Therefore, adequate teacher monitoring of student or group activities is important in order for teachers to act adaptively during these meetings (Xin et al. 2013).

Previous research has given attention to how teachers transition to their new role during BL, for example in terms of adjusting their pedagogical views of learning and teaching, and in terms of developing technological skills (e.g., Çardak and Selvi 2016; Shelley et al. 2013). From these studies, it can be deduced that teachers think BL requires great effort and flexibility from them; the teacher in that sense remains a central figure in the learning process despite students’ self-directed independence (Wanner and Palmer 2015; Xin et al. 2013). Empirical studies regarding how teachers monitor and diagnose student or group activities in the context of blended learning are scarce, and their findings vary (e.g., Comas-Quinn 2011; Greenberg et al. 2011; Roehl et al. 2013; Vaughan 2007; Xin et al. 2013). On the one hand, the face-to-face meetings in the flipped classroom model are primarily used for teacher–student interaction instead of teacher lecturing, which offers teachers more opportunities to ask students or groups of students about their progress (Roehl et al. 2013). Several case studies (Greenberg et al. 2011; Xin et al. 2013) indeed reported on flipped classroom pilots in which the teachers spent more time interacting with students, thereby providing more opportunity for feedback on and detection of misconceptions. On the other hand, the variety of (online) learning activities in BL can make it more difficult for teachers to keep up with student activities. Vaughan (2007) reported that teachers in a course with a BL design were in “fear of losing control over the course” (p. 88). Similarly, Comas-Quinn (2011) found that teachers experienced time pressure and “a feeling that learning was too distributed, that there were too many places to check and contribute to” (p. 227). A challenge for teachers in the flipped classroom thus seems to be to monitor student activities and progress both inside and outside the classroom.
In the next section, the possible role that learning analytics could play to aid teachers is discussed.

Learning analytics to support teachers

As explained above, the teacher’s primary task in the flipped classroom changes from transferring knowledge to facilitating knowledge acquisition, for which being able to monitor and adapt to students’ learning processes is a key competence. Because the flipped classroom allows more diversity in the time students spend on course activities, it can be demanding for teachers to maintain an overview of students’ progress. At the same time, because student activities occur online to some extent, they can be automatically logged and analyzed. The teacher could subsequently be informed of what activities students or groups of students have engaged in when they enter the classroom. This way of using data about students is a form of learning analytics.

The field of learning analytics (LA) is defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs” (Siemens and Gasevic 2012, p. 1). LA can be seen as a process in which data is collected from learners, an analysis is performed on these data, and then an intervention is derived which in some way (positively) impacts the learning process (Clow 2012). LA is a relatively young field, and the interested reader is referred to the recently published handbook (Lang et al. 2017) for a discussion of its possible affordances for education, as well as several common criticisms.

Important for the purposes of the present study is that summaries of student activities can be used to inform teachers about the learning processes that take place in the groups they teach, and teachers can subsequently use this information to make pedagogical decisions (e.g., Van Leeuwen 2015). In the flipped classroom model, students’ self-study of the course materials in preparation for the face-to-face meetings is distributed, and sometimes multiple media are employed. While these varied online activities make diagnosis of student activities more complex, they can also be tracked and summarized to inform the teacher on a regular basis (Van Leeuwen 2015). The affordances of LA can thus be related to scaffolding: the hypothesis is that when teachers are informed by LA, they can offer more appropriate support to students, classes, or even subsequent cohorts of students.

Thus, LA are hypothesized to deliver actionable knowledge (Dix and Leavesley 2015; Siemens 2013) to teachers when the provided information enables teachers to enact a pedagogical intervention. Recently, first steps have been taken to develop LA for teachers (for a review, see Sergis and Sampson 2017). For example, in secondary education, LA have been shown to lead to more teacher interventions when teachers use LA during a lesson (e.g., Van Leeuwen et al. 2015), and have been used by teachers to select appropriate tasks for students between lessons (e.g., Tissenbaum et al. 2016). A theoretical discussion of the role of LA for instructors (Wise and Vytasek 2017) points out that certain design principles make LA an effective support tool for teachers, namely (1) coordination, i.e., making the LA an integral part of the educational context; (2) comparison, i.e., providing a frame of reference to interpret the meaning of an analytic; and (3) customization, i.e., the recognition that each situation requires specific LA.

Research on LA in higher education has mainly focused on the relationship between student behavior and achievement to identify students that may be at risk of dropout (e.g., Ognjanovic et al. 2016; Tanes et al. 2011; Tempelaar et al. 2015), thereby using LA as a predictive tool (Wise and Vytasek 2017). However, none of these studies have thoroughly investigated how teachers use information obtained through LA to shape their decisions while teaching a course. Furthermore, none of these studies in the context of higher education addresses the role of LA in the specific case of the flipped classroom model. In the flipped classroom, the variety of students’ (online) learning activities can make it more difficult for teachers to keep up with student activities, and thus make teachers’ diagnosing of student activities a demanding task. At the same time, the flipped classroom offers opportunities for collecting data about students, which can be used to inform teachers’ instructional decisions. The hypothesis of this study is therefore that the combination of the flipped classroom and the application of LA could support teachers by providing summaries of student activities.

The present study

Existing review studies concerning LA in higher education emphasize that LA should deliver actionable knowledge (Dix and Leavesley 2015), especially during a course: “if they [LA] are to directly benefit students’ learning… they must be used by academics in their on-going relationships with students and management of teaching material” (Dix and Leavesley 2015, p. 48, emphasis added). Yet, it is exactly this aspect of teachers’ use of LA that is not yet documented in empirical studies. There is a lack of studies that examine which strategies teachers use in a flipped course to diagnose student progress in combination with LA, and whether teachers adjust their interventions based on LA.

The aim of the present study is to examine which functions weekly learning analytics reports serve for teachers during a flipped university course, and which challenges teachers experience in using these reports. Teachers were provided with weekly LA reports that showed summaries of student activities, and the focus was on teachers’ reports of how these LA were incorporated into their teaching practices. By answering these questions, the wider aim of the present study is to contribute to supporting adaptive teaching practices in the flipped classroom, as well as to contribute to our knowledge about the affordances of LA in higher education (such as delivering actionable knowledge). The following two research questions were formulated:

RQ1

Which functions do weekly learning analytics reports serve for teachers during a flipped university course?

RQ2

What challenges do teachers experience concerning weekly learning analytics reports during a flipped university course?

Method

Design

The study used a qualitative, exploratory methodology. Using teacher logbooks and interviews during an 8-week period in an undergraduate university course, the study investigated how teachers made use of learning analytics.

Participants

The seven participating teachers formed the teaching team for the course Designing Educational Materials (DEM) during one semester (10 weeks) at a university in the Netherlands. Teachers 1 and 2 were the coordinators of the course. Teachers 1, 2, and 3 alternated in providing the weekly lectures in the course. Teachers 2, 3, 4, 5, 6, and 7 taught the weekly working groups. The mean age of the teachers was 35.9 years (SD = 9.7). On average, they had 6.9 years of teaching experience (SD = 5.9) in higher education and 3.4 years of teaching experience (SD = 2.3) in the DEM course. Five teachers were female. None of the teachers had experience with using learning analytics (LA) as part of their teaching. All teachers gave informed consent at the start of the course, indicating they agreed their data would be used for research purposes.

One hundred and fifty students were enrolled in the course and divided into seven working groups (average working group size was 21.4 students, SD = 2.0). One hundred and forty-six students signed informed consent for their data to be used for research purposes. The mean age of these students was 23.5 years (SD = 4.6); 114 students (78.1%) were female. The course involved both individual and collaborative activities. The collaborative activities were carried out in groups of four or five students; each working group therefore consisted of five or six small groups.

Course structure

Course structure in terms of student activity

The DEM course had two learning objectives, namely for students to gain (1) knowledge about design models, and (2) skills concerning DEM. Students’ mastery of the first objective was assessed using an individual final exam, and the second objective was assessed using a collaborative assignment in which groups of four or five students designed course materials (e.g., a customer service training for bank employees).

The course used a flipped classroom model (Staker and Horn 2012). Students were expected to study the course materials before coming to the face-to-face meetings, during which time was spent on processing and deepening knowledge. Figure 1 outlines students’ weekly activities. To study the course materials, students could read the book in which DEM was explained or watch one of the 24 web lectures (or both), which were based on the content of the book. In addition, students were required to process the learning material by submitting at least one formative multiple-choice question about the materials for that specific week. The questions were handed in through PeerWise, an application that enables students to see and practice with each other’s questions (Hardy et al. 2014; PeerWise 2016).
Fig. 1

Outline of weekly student activities in the DEM course

Every week, two face-to-face meetings were scheduled that were aimed at processing and deepening knowledge of DEM. In the 90-min interactive lecture, students discussed with a guest speaker who presented a real-life example of DEM. Also, additional background was provided about DEM that students could subsequently use in the group assignment. The weekly working groups, guided by one of the working group teachers, lasted 4 h. In the first hour of the working group, the formative questions submitted by the students were answered and discussed as a whole group. The remaining time was used to work on small group assignments, during which the teacher consulted with each small group and answered questions when needed.

In the first 8 weeks of the course, the course followed the structure outlined in Fig. 1. In week 9, no face-to-face meetings were scheduled, and students handed in the group assignment. In week 10, the individual exam took place. Students’ online activities (PeerWise and web lectures) were automatically logged. Attendance at the face-to-face meetings was registered on paper. A further source of information about students’ activities was a short weekly questionnaire that students filled in at the start of each working group (see “Content and design of the weekly LA reports” section). Students’ activities served as input for the weekly teacher LA report, see “Content and design of the weekly LA reports” section.

Course structure in terms of teacher activity

Teachers met with the students during lectures and working groups (see Fig. 1) in weeks 1–8 of the course. Every Monday, before the lectures and working groups took place, the teaching staff met to discuss the planning for that week. During these meetings, teachers were provided with weekly LA reports about students’ activities. Section “Content and design of the weekly LA reports” details the content and design of the LA reports. The first report was handed out in week 2, after the first data about students’ activities had been collected. The LA reports were printed on paper for each member of the teaching team separately (corresponding to the working group they taught). To investigate teachers’ use of the reports, two instruments were used. First, teachers were asked to fill in a weekly logbook. Starting in week 2, after each staff meeting on Monday, teachers were sent an invitation with a link to the online logbook and were requested to fill it in that same day. In week 9, after all face-to-face meetings had ended, the teachers were interviewed about their experiences with and opinions of the LA reports. The logbooks and interviews are described in the “Teacher logbooks and interviews” section.

Content and design of the weekly LA reports

Content

An important decision was which data to display on the weekly LA reports concerning students’ activities. A criticism of LA has been that the choice of indicators of student learning “has been controlled by and based on the data available in learning environments” (Dyckhoff et al. 2013, p. 220, see also Slade and Prinsloo 2013), even though these measures do not always represent the information teachers are interested in or adequately measure student learning (Tempelaar et al. 2015). As has been increasingly advocated in the field of LA, data should be collected and interpreted by means of educational theory (Wise and Shaffer 2015). An explicit goal of this study was to provide meaningful LA reports to the teachers. Instead of relying on data that were automatically logged by the online systems, the researchers adopted the stance from action research to start by “think[ing] about the questions that have to be answered before deciding about the methods and data sources” (Dyckhoff et al. 2013, p. 220). Thus, the leading questions to determine the content of the LA reports were: what data are available; what data are considered important in literature; and what data are deemed important by the teaching staff.

Based on literature, the assumption was made that more student online activity would indicate more task effort and predict better learning outcomes on the exam and the group assignment (e.g., this relation was found for creating formative questions, Hardy et al. 2014, for study time, Lim and Morris 2009, and for engaging in online activities, Lowes et al. 2015). It was therefore important to gather information about which topics of the course materials students studied, as well as for how long they did so. In addition to individual study of the course materials, a large part of the course concerned the group assignment. Literature indicates that within groups, variables such as communication, trust, and perception of reliability of group members are important for successful collaboration because these factors enhance the social cohesion that is needed for group members to contribute equally to the task (Janssen et al. 2010; Kyndt et al. 2013). Therefore, these indicators were also added to the list of initial data types that were to be included on the LA reports.

This initial list of indicators was presented to the teaching staff, after which they could suggest additional types of information they wanted to have about their students. Based on the teachers’ experiences in previous cohorts of the DEM course, they suggested including the following additional information: (1) a measure indicating whether students feel they are able to translate the theoretical DEM model into their practical group assignment. This suggestion was interpreted theoretically to mean that teachers would like information about students’ self-efficacy; (2) a measure indicating whether there was agreement within the small groups about how to realize this translation; (3) a measure indicating whether students feel like they are on schedule with their assignment or are falling behind; and (4) an open question in which students could comment on the group collaboration.

The indicators based on literature and teacher suggestions were not automatically logged like the online activities, and thus had to be collected purposefully. This was done by means of a weekly questionnaire that the students filled in at the start of each working group. Filling in the questionnaire took about 5 min and was a fixed part of the routine of the working groups. The average number of completed questionnaires over the 8 weeks was 130.71 (SD = 3.5), indicating a high and steady completion rate of about 90% of the sample of 146.

Design

The final types of student information that the LA reports consisted of are displayed in Table 1, which distinguishes between information concerning activities students carried out individually and information concerning the group assignment. The reports also displayed what percentage of students had watched the web lectures. Given that each web lecture started with an outline of its content and ended with a short outro, we estimated that students had seen the core of a web lecture if they had watched at least 75% of it; only those students were counted in this percentage.
Table 1

Overview of student information on the weekly LA reports

Data source: Automatically logged online activities
  Measures of individual activity:
    Frequency of viewing web lectures
    Frequency of submitting formative questions
  Measures of perceptions regarding group assignment: (none)

Data source: Information gathered by weekly questionnaire
  Measures of individual activity:
    Number of hours studying the course book during the past week
    Course topic studied most during the past week
    Open comments about individual progress
  Measures of perceptions regarding group assignment:
    Perception of collaboration on group assignment (score on a scale from 1 to 7):
      Communication within group
      Participation of group members
      Reliability of group members
      Agreement within group
    Progress on assignment (answer options ‘behind schedule’, ‘on schedule’, or ‘ahead of schedule’)
    Self-efficacy: ability to translate theory into assignment (answer options ‘yes’ or ‘no’)
    Open comments about group assignment

The measures were aggregated to working group level, so that information about individual students could not be identified by the teachers. While from a scaffolding perspective it might be more beneficial for teachers to obtain information about each individual student separately, there were two reasons why it was decided to aggregate the data. First, the information might be considered sensitive and should therefore be treated carefully (Slade and Prinsloo 2013). Second, the LA reports might have become overly long and complex if they detailed information about every student separately. To secure students’ privacy and to keep the LA reports manageable for teachers, the data were therefore aggregated to working group level.

The LA reports had a basic, straightforward design using tables to display the quantitative information, which included the mean, standard deviation, and minimum and maximum for the working group (see Fig. 2) as well as for the whole course, so that teachers could compare their group to the course average. The open comments the students submitted in the weekly questionnaire were copied onto the LA report in a bulleted list (see Fig. 2).
Fig. 2

Excerpt from example of LA report (with fictitious data)
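To make the report design concrete, the aggregation described above can be sketched in a few lines of code. The snippet below is purely illustrative (the paper does not specify how the reports were generated); the numbers and function names are invented, but the logic mirrors the reported design: a per-group mean, standard deviation, minimum, and maximum for each questionnaire measure, and the 75% viewing cutoff for counting a web lecture as watched.

```python
from statistics import mean, stdev

def pct_watched(viewing_fractions, threshold=0.75):
    """Share of students counted as having watched a web lecture.

    Following the report's rule, a student counts only if they watched
    at least 75% of the lecture (to exclude intro/outro-only views).
    """
    counted = [f for f in viewing_fractions if f >= threshold]
    return 100 * len(counted) / len(viewing_fractions)

def group_summary(values):
    """Mean, SD, minimum, and maximum as displayed per working group."""
    return {
        "mean": round(mean(values), 1),
        "sd": round(stdev(values), 1),
        "min": min(values),
        "max": max(values),
    }

# Invented example data for one working group in one week.
fractions = [0.1, 0.8, 1.0, 0.76, 0.5]  # fraction of one lecture watched
hours = [2, 4, 3, 5, 4]                 # self-reported hours studying the book

print(pct_watched(fractions))   # 60.0 (three of five students passed the cutoff)
print(group_summary(hours))
```

The same summary could be computed over all students in the course to produce the course average that teachers used as a frame of reference.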

Teacher logbooks and interviews

Teachers were asked to fill in a weekly logbook about how they had used the LA reports. The logbook first contained a question about how the teachers had used the LA reports in the previous week to make choices concerning their teaching activities in the DEM course. They were then asked about their intentions for the coming week, based on the LA report they had received that same day prior to filling in the logbook. In total, 37 logbook entries were collected (on average 5.3 per teacher; min = 3, max = 7), yielding a response rate of 75%. The logbooks recorded the teachers’ responses to the LA reports during the course, and were used as input for the main data source, the teacher interviews.

In the teacher interviews, general questions were asked about the functions the LA reports fulfilled for the teachers in terms of preparing for meetings with students, during meetings with students, and for reflecting on meetings with students. Teachers were also asked about their general perceptions and opinions of adding the LA reports as a part of the course. Teachers were furthermore asked to look at one LA report as an example and to think aloud about what they could infer from such a report and how they valued the information displayed on the report. The logbooks were analyzed prior to the interviews and served as prompts during the interview when asking teachers to reflect on the functions the LA report had served for them. Therefore, the interviews were a form of cued retrospective reporting (Van Gog et al. 2005), where the logbooks served as cues to elicit teachers’ thoughts and experiences concerning the LA reports. For example, teachers were asked to report in the interview what functions the LA reports had served for them. For each teacher, the answers from the logbooks were selected in which teachers had mentioned such functions. During the interview, the logbook answers could then be shown to the teacher to prompt further explanation and reflection on the LA reports.

The interviews were semi-structured and lasted 30 min on average. Teachers were asked for permission to record the interviews, so that the interviewer could focus on the conversation and transcribe and analyze the interviews at a later point.

Analysis

During analysis, bottom-up and top-down approaches were combined, using both the raw data and a theoretical lens to identify relevant codes and themes. The grounded theory approach was used to code the transcribed interviews in three phases: open coding, axial coding, and selective coding (Boeije 2010; Strauss and Corbin 1994). Open coding means the data are read, meaningful information is identified, and codes are created and assigned. Open codes were created for two dimensions, corresponding to the research questions: which functions of the LA reports the teachers mentioned, and which challenges they raised. After five interviews, saturation occurred and no new codes were needed. Interviews six and seven were also open coded and used to further refine the list of open codes.

During axial coding, the relationships between the codes were further examined and main and sub-codes were distinguished. The theoretical lens of scaffolding was used in this process. In particular, the distinction between diagnosing students’ needs and intervening in the learning process became relevant. After thematically sorting the open codes, reliability of the coding scheme was established by calculating interrater reliability. For the codes for functions and challenges, 20% of the fragments were coded by two researchers, resulting in 90% agreement. The resulting axial coding scheme consisted of 10 codes for functions and 9 for challenges. After establishing reliability, the whole set of fragments was recoded with the final axial code book.
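The 90% figure is a simple percent-agreement statistic: the share of double-coded fragments to which both researchers assigned the same code. A minimal sketch, with invented codes and fragments:

```python
def percent_agreement(codes_a, codes_b):
    """Percentage of fragments that two coders labeled identically."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100 * matches / len(codes_a)

# Hypothetical codes assigned by two coders to the same ten fragments.
coder1 = ["overview", "detect", "address", "overview", "interact",
          "detect", "overview", "interact", "detect", "address"]
coder2 = ["overview", "detect", "address", "detect", "interact",
          "detect", "overview", "interact", "detect", "address"]

print(percent_agreement(coder1, coder2))  # 90.0
```

Note that raw percent agreement does not correct for chance agreement, unlike statistics such as Cohen's kappa; the paper reports the uncorrected percentage.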

In the final stage of selective coding, connections between codes were sought in order to determine the core themes within the data. Core themes not only appear frequently, but are also central in the sense that they are connected to many other codes (Boeije 2010). A Teacher × Code table was created. For each code, the corresponding fragments were read and compared to identify the relations between the functions and challenges of the LA reports. Thus, the result of this phase was the identification of the core themes and the relations between them.

Results

The teacher interviews were analyzed on two dimensions in line with the research questions, namely which functions the LA reports served for the teachers, and which challenges arose concerning teachers’ use of the LA reports. The core themes that emerged and the relations between them are further discussed in the sections below.

Functions of LA reports

The functions that the LA reports served for the teachers were categorized broadly into the two phases of scaffolding, namely diagnosing and intervening. Table 2 displays how many of the seven teachers mentioned each function.
Table 2

Frequencies of functions of LA reports and their distribution among teachers

Function                                   Teachers mentioning   Total mentions
Diagnosis
  Overview of student activities                    7                 23
  Overview of student self-assessment               5                  7
  Detecting problems                                7                 18
  Comparison to face-to-face information            5                 11
  Teacher team level                                2                  2
  Course level                                      2                  6
  Outlet for students                               2                  2
Intervention
  Address issues                                    4                 12
  Starting interaction with students                6                 28
  Course level                                      2                  3
Total                                                                112

Diagnosis

The first and most frequently mentioned function of the LA reports was diagnosing, which was mentioned by all teachers. The LA reports were used for diagnosing in several ways and on several levels.

First of all, the LA reports allowed teachers to diagnose student activities at working group level. In one glance, it provided them with an overview of the activities that students had engaged in that week. T2: “Even if you don’t use the information, it is still good information to have, it makes you feel more in control of what is happening.” The reports also included information on how students assessed themselves, for example whether they thought they were on schedule concerning the group assignment. Teachers valued this information. For example, T5 stated, “My [working] group was constantly behind schedule. That was quite an insight for me, to see they constantly had the feeling they were falling behind.” Some teachers also specifically looked at the progress students were making by comparing the LA reports from several weeks. T3 expressed, “It was informative in the sense that you could see it change week by week. You could deduce that some students became more confident about themselves, and others became more insecure.”

Having an overview of student activities and student self-assessments also helped teachers detect problems. T6 explained, “It is good to know, especially tension within [collaborating] groups. It occurs, you are prepared for it, you know that it was happening.” One strategy teachers used to identify problems was to compare the working group score to the course average that was also displayed on the LA reports. Students’ open comments on the LA reports were also valuable in detecting problems. T1 gave this example: “… the qualitative part with the student comments, that was especially useful. You could see whether there were any issues, because it simply said so on the report.”

The information teachers gained from the LA reports differed from what a teacher would gather purely from face-to-face contact with students. In such contexts, students sometimes feel pressured or insecure and do not always indicate what problems they experience. T2 said, “If something goes wrong, you have an early warning as a teacher [because of the LA report], and that is valuable. Of course you can also ask students directly, but the question is whether they will answer truthfully.” For students, it is therefore easier to mention something in a weekly, anonymous questionnaire than in front of their peers during the working group. T3 explained further, “During a working group, of course they keep quiet. They won’t be telling me that they are the only one in their group that does not feel confident of what to do next.”

To a lesser extent, the LA reports facilitated diagnosis at teacher team and course level. T4 mentioned that some of the information on the LA reports was discussed in the weekly staff meeting. The two coordinators of the course (T1 and T2) mentioned that the LA reports provided them with valuable information about what was going on in the course. T2 explained, “Usually, you don’t have this type of information about a course, except for the stories you hear from the teachers. And these are ‘hard numbers’. I think it has added value to have this information.”

Intervention

A second function the LA reports served for the teachers was to inform them of when intervention might be needed. This function was mentioned less often than diagnosing (see Table 2), but was still mentioned by almost all teachers. Again, a distinction can be made here between several levels: intervening at working group level or at course level.

Within the working groups, four of the teachers mentioned that the LA reports allowed them to address issues they noticed in the reports. Sometimes the open student comments contained a clear request for additional explanation. Teacher T7, for example, once adjusted her working group slides to incorporate information about time planning and to discuss what strategies students were using to work on the assignment. Conversely, the information could also motivate teachers to give compliments when students were doing well.

A strategy for intervening mentioned more often was to start interaction with students by mentioning the results of the LA report. T2 gave the following example: “During the working group I took the LA report and told the students: ‘This is the image of you that I get from the report.’ I thought, ‘Let’s see whether they respond to that.’” Other teachers asked specific questions based on the LA report, such as T3, who said, “You could tell them: ‘I see that some of you find this difficult.’ And I could ask questions about why that was the case, why it was so difficult.” The LA reports allowed teachers to start the discussion about particular issues that students themselves may not have brought up during the face-to-face meetings. Also, when teachers noticed a downward trend in a particular score, it allowed them to inform the students and offer them help. T4 explained, “In one of the final working groups I told them ‘I saw [the scores for confidence were] going down, and if there are any problems, you know I am available for you.’” Thus, the LA reports were also a means for teachers to start interacting with students, since they offered an objective piece of information. Teacher T5 mentioned that the reports made her focus more precisely on what students were saying and also made her spend more time on issues she otherwise would not have focused on thoroughly. T5 stated, “It makes you sharper in asking follow up questions, to ask for the reason why a student feels he is falling behind. Are you really falling behind, or is that the feeling you have? Where does it come from? So it offers me more reason to ask questions.”

Intervening rarely occurred at the course level. T1 mentioned she sometimes addressed issues during the lectures, such as questions about planning that arose from the LA (similar to teachers addressing issues in the working groups), but making other changes at course level was difficult. Even though some students had suggestions for course improvement in the open comments on the weekly questionnaire, these could not all be dealt with immediately, but only taken up for the subsequent cohort (such as the ordering or content of lectures).

Challenges concerning LA reports

The functions of the LA reports as discussed above also gave rise to a number of obstacles teachers encountered while using the reports and to issues that require further attention in future projects. These challenges were categorized broadly into three topics: coupling diagnosis and intervention; struggling with the design of the LA reports; and reconciling student and teacher expectations and experiences. Table 3 displays how many of the seven teachers mentioned each challenge.
Table 3

Frequencies of challenges concerning LA reports and their distribution among teachers

Challenge                                   Number of teachers that    Total mentions
                                            mentioned this point
From diagnosis to intervention
  Interpretation and actionable knowledge             7                     48
  Role of teacher characteristics                     2                      5
  Team level agreements                               1                      3
  Value of LA report at course level                  2                      2
Design of LA reports
  Form and lay-out                                    7                     17
  Ideas for future content
    More in-depth information                         3                      9
    Additional information                            3                     13
Expectations and experiences
  Communicating goal of LA reports                    3                      4
  Student perceptions                                 7                     10
Total                                                                      111

From diagnosis to intervention

The challenge mentioned most frequently was the transition from diagnosis to intervention, which was raised by all seven teachers. Although teachers used the LA reports for diagnosis and intervention, a diagnosis did not always lead to an intervention. Sometimes this was because intervention simply was not needed: the LA reports did not show deviating or alarming scores. T3 stated, “The data get especially interesting when it contradicts your diagnosis of how your group is doing. In my case, that did not really happen. But I can imagine that it could happen.” As long as the scores were average or above a certain threshold, there was no reason for teachers to intervene (except in some cases to pay students a compliment).

There were also cases, however, when a teacher did detect a problem but was unable to act upon it. This could happen for several reasons. First of all, the types of information on the LA reports differed in how valuable they were to the teachers, and teachers differed in the extent to which they were able to deduce students’ need for support from these data. For example, one of the elements on the LA reports was how often students had viewed the web lectures. Some teachers indicated they found this information useful but did not know how to act upon this knowledge. T6 stated, “On the one hand I find all of this very interesting, but on the other hand I asked myself: how much does it tell me? Some students viewed the web lectures and others didn’t. But that does not tell me whether they would benefit from viewing them when they didn’t.” In comparison, T2 used this information to reason about students’ time planning, stating: “At a certain point I could see they were falling behind viewing the web lectures. You would expect the viewing patterns to be congruent with the subjects we discuss in the lectures. Apparently, at some point students were busy with another subject than the one discussed in that week.” Similarly, some teachers found the information about student collaboration useful and used it to start the discussion with students, whereas others found these social aspects less relevant. Teachers also differed in the extent to which they thought the LA report provided information they could not gather from interacting with the students during the face-to-face meetings. T3 indicated: “The student comments [on the LA report] were always interesting. But they did not add that much in the sense that usually you also notice these things during the working groups. So you have some sort of confirmation, but whether it was new information… I doubt that.” This contrasts with some of the other teachers, who indicated they sometimes acted directly based on the issues they encountered in the report (see “Intervention” section). Thus, teachers’ abilities to glean additional information from the LA reports determined whether they would follow up with an intervention.

A second factor in whether teachers would intervene was the granularity of the information on the LA report. At course level it was hard to act upon suggestions for course improvements. At working group level, teachers were sometimes unsure whom to address about a detected problem, because the data on the LA report were aggregated at working group level and not traceable to individual students. Teachers sometimes dealt with this by announcing to the whole group that they suspected there was a problem, and by encouraging students to come talk to them if they wanted to.

Lastly, a detected problem sometimes did not lead to intervention because teachers were unsure of how to address the problem. In hindsight they acknowledged they would have liked to do something with the information they had. This may also have been due to their relatively limited teaching experience, as one of the teachers suggested: “The fact I couldn’t act on it [the LA report] may be due to the fact I am inexperienced as a teacher. Maybe a more experienced teacher has more tools to do so.”

T5 suggested that the group may have benefitted from team agreements, i.e., agreements or discussions before the start of the course about what to do when particular situations arose. As a teaching team, they could have sat down, looked at an example LA report, and come up with hypothetical situations and teaching strategies. T5 stated: “As teachers, we may have gotten more out of the LA reports for the students… For example, when the communication scores for a group drop below a certain level, we could have thought of some activities that you suggest to students in that situation.” According to T5, such guidelines could have been relevant for the teachers who were unsure about how to connect a diagnosis to an intervention.

Design of LA reports

The second challenge concerned the design of the LA reports. The reports were distributed on paper and consisted of relatively simple tables (see Fig. 2). All teachers indicated they thought the present version was fine. The majority of teachers also thought that it was very useful to receive the weekly reports on paper during the staff meeting. T1 said: “Everyone really looked at it [at the LA report]. If it is not part of the teacher meeting, I’m not sure everyone would still do so.”

The teachers did have suggestions for future versions of the LA reports concerning the types of data to be displayed. On the one hand, they suggested making the information on the report more in-depth. For example, although some teachers laid the reports out next to each other to see progress over time, a display of how scores developed over the weeks would have been useful. Another suggestion was a comparison between working groups, especially for the coordinators of the course. Some teachers would have liked information at student level instead of information aggregated at working group level, so that they could, for example, see which students were less active than others.

Another type of suggestion concerned additional types of information that were not on the LA reports in their present form. For example, one suggestion was to have students evaluate their teacher weekly. T4 stated: “Whether you are an approachable teacher, that could be interesting. Your functioning as a working group teacher at a moment that you are still able to adjust.” T1 suggested also asking students about their interest in the course and their motivation. Finally, T5 suggested changing the function of the report: instead of only reporting on student behavior, the LA report could also indicate whether student behavior is problematic, and thus whether intervention of some kind is necessary. As T5 said: “I’m thinking about whether it would be a good idea to provide a ‘margin’ of desirable student behavior.”
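T5’s idea of a “margin” of desirable student behavior could, in principle, be built into report generation. The sketch below is purely illustrative and was not part of the study’s reports: the indicator names, scales, and thresholds are invented, and in practice such margins would have to be set by the teaching team.

```python
# Hypothetical "margin of desirable behavior": each indicator on the report
# is checked against a band set in advance, so the report itself signals
# whether a score may warrant attention. All names and thresholds are invented.
DESIRABLE_RANGES = {
    "confidence": (3.0, 5.0),   # 5-point scale; below 3.0 may warrant support
    "communication": (3.5, 5.0),
    "on_schedule": (0.6, 1.0),  # fraction of the group reporting being on schedule
}

def flag_scores(group_scores):
    """Return one report line per indicator, marking values outside the margin."""
    lines = []
    for indicator, value in group_scores.items():
        low, high = DESIRABLE_RANGES[indicator]
        status = "OK" if low <= value <= high else "CHECK"
        lines.append(f"{indicator}: {value} [{status}]")
    return lines

report = flag_scores({"confidence": 2.4, "communication": 4.1, "on_schedule": 0.8})
```

A report line flagged `CHECK` would not prescribe an intervention by itself, but it would make the transition from data to diagnosis easier for less experienced teachers.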

Expectations and experiences

The last challenge concerned students’ expectations and experiences regarding the fact that their data were gathered and used as input for the LA reports. Although students were informed about the present project, the participating teachers emphasized that it was important for future projects to inform students of the goal of the project even more thoroughly. The students were asked to fill in a short questionnaire every week, and teachers mentioned differing student sentiments about this requirement. T2 for example noted that “students gladly contributed to quality improvement of their education,” whereas T4 had the impression that “they [students] sighed more as the course progressed and they had to fill it in again.” Some teachers explicitly used and mentioned the reports during the working groups more often than others, which also contributed to students’ awareness of how the LA reports were used.

Discussion

This study investigated how teachers monitor students’ activities in a university course with a flipped classroom design, and specifically whether teachers perceived weekly LA reports with summaries of student activities to influence their teaching practices. This study was among the first to investigate the possible role of LA in a flipped university course. As such, this study has contributed to expanding our knowledge on how teachers can use LA as actionable knowledge in flipped classrooms to further their students’ learning.

Research question 1: functions of LA reports

The first research question was which functions the weekly learning analytics reports fulfilled for the teachers. The results showed that all teachers used the reports to check whether students engaged in the activities as set out in the course structure. Previous studies indicated that monitoring student activities in the flipped classroom might be more difficult because activities are spread and individualized (e.g., Comas-Quinn 2011), but might also be easier because there is more face-to-face contact with students during which the teacher can find out how students are doing (e.g., Roehl et al. 2013). Findings from the present study indicate that the teachers indeed used face-to-face contact to hear how students were doing, but that the LA reports were a valuable tool for achieving this goal as well, and a necessary tool to gain a complete understanding of students’ activities. The LA reports provided a more complete and objective overview of what students do and perceive than they would tell a teacher directly in a meeting. Also, the LA reports provided structure in keeping up with the multitude of activities that students engaged in as part of the course.

The overview and understanding of student activities also led to pedagogical intervention. The LA reports were actionable in the sense that they opened up the interaction between the teachers and the students, enabling teachers to use the information on the report to start the conversation (Siemens 2013). The types of information the teachers used to start the conversation were not only the automatically collected data such as how often web lectures were viewed, but also students’ perceptions of their own progress. Students’ perceptions about their individual and collaborative learning activities were obtained from the weekly student questionnaire, which confirms that meaningful data should also be purposefully collected instead of only derived from what is logged automatically (Dyckhoff et al. 2013). This additional information also concerned affective data such as students’ self-efficacy. Acquiring additional data about students therefore is a promising approach to support teachers.

The findings also showed how teachers judged the information on the LA reports. The teachers were, for example, inclined to pay students a compliment when they were doing better than the course average, or to ask what was going on when a value dropped below the course average. The teachers used the course average as a baseline to decide whether intervention was necessary, which is in line with the principle of comparison (Wise and Vytasek 2017). Showing the course average as a baseline was thus an example of how to present information in an actionable way. In future versions of the LA reports, specific attention could be paid to the visual display of the data to make them more functional. For example, the teachers suggested plotting students’ progress longitudinally in a graph. Using cues and visualizations such as these could help to make the information easier to interpret (Mayer 2009) and therefore more actionable. In this respect, as LA also involves a phase of data visualization, it could be informed by other fields such as human–computer interaction (HCI, see Dix et al. 2004). HCI is concerned with the physical characteristics of machines and visual interfaces, and uses design principles derived from the capabilities and limitations of human information processing to optimize interface design (Dix et al. 2004). In the case of supporting teachers, LA should enable them to quickly and easily identify which student activities might be in need of further guidance. Future research could focus on how specific ways of visualizing information lead to more or less actionable teacher knowledge.
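The principle of comparison can be illustrated with a minimal sketch. The data layout and numbers are hypothetical; the point is only that displaying each working group’s score relative to the course average makes deviations stand out immediately.

```python
# Hypothetical weekly questionnaire scores per working group: week -> {group: score}.
weekly_scores = {
    1: {"A": 4.1, "B": 3.2, "C": 3.9},
    2: {"A": 4.0, "B": 2.8, "C": 4.2},
}

def compare_to_course_average(week):
    """Return each group's deviation from that week's course average."""
    scores = weekly_scores[week]
    course_avg = sum(scores.values()) / len(scores)
    return {group: round(score - course_avg, 2) for group, score in scores.items()}

deviations = compare_to_course_average(2)  # group B sits below the course average
```

A negative deviation is exactly the kind of cue that prompted teachers to start a conversation; plotting these deviations over the weeks would additionally show the longitudinal trend the teachers asked for.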

To summarize, the LA reports served several functions and influenced teacher behavior in several ways. Although teacher behavior was influenced by the LA reports, the question remains whether providing teachers with the reports was also beneficial for students. Although we did not investigate the effect on students directly, we can reflect on possible consequences for student learning. For example, teachers reported using the information about students’ reported self-efficacy (their confidence of being able to apply the theoretical model to their group assignment) by trying to boost students’ confidence when this variable was low. This teacher strategy to motivate students to focus and persist has indeed been shown to increase students’ self-efficacy (Koh and Frick 2009). Here, the findings thus provide initial evidence of how the LA reports could indirectly influence student achievement, for example by a cycle of increased teacher support, leading to increased student self-efficacy, leading to increased student achievement (Van Dinther et al. 2011).

Research question 2: challenges of LA reports

The second research question was what challenges teachers experienced concerning the LA reports. As discussed in the previous section, providing teachers with LA reports could (indirectly) influence student learning when they stimulated teachers to take action in ways that are beneficial for student learning. At the same time, the core issue reported by teachers was the question of how to translate information from the LA reports into interventions during the face-to-face meetings with students. Besides the functionalities of the reports mentioned above, instances were also reported of the teachers not being able to deduce relevant information from the reports, or not being able to deduce relevant pedagogical actions. In fact, there appeared to be two separate teacher competencies that related to whether the LA reports were actionable or not, namely (1) detecting issues and (2) choosing appropriate interventions. The first ability refers to being able to infer from the reports what might be going on with students, i.e., to form a diagnosis of the current state of students’ activities (Klug et al. 2013). Some teachers showed more proficiency at reasoning about the data than others. As suggested above, variations in visual presentation of data (such as providing a mean value to compare to) may help to make it easier to deduce students’ needs from the data. Other factors of influence seemed to be what the teachers themselves found to be important indicators of students’ progress, and whether they were able to connect specific information to broader concepts and principles of learning (Sherin and Van Es 2005). For example, the number of times students had watched web lectures led to different inferences about student learning, depending on which teacher judged it. This finding is in line with research that investigates teachers’ transition from a purely face-to-face course setting to a blended context (Shelley et al. 2013). 
Some teachers may have relied more heavily on the face-to-face meetings to diagnose the progress of their students instead of figuring out what students’ online activity could tell them about students’ cognitive progress in the course.

On the other hand, information about social aspects of student collaboration on the assignment, such as students’ perceptions of the group process, was also inferred by the teachers during the face-to-face meetings. When a small group of students had a problem, it usually became apparent during these meetings. In this case, the information about student collaboration on the LA reports became more a confirmation of teachers’ impressions of students than a source of new information. When there was information on the LA reports about these social processes that the teachers had not considered yet or were not aware of, they addressed it in class and investigated further.

Thus, there was an interplay between different types of information (cognitive, affective, and social), the available sources from which this information could be obtained (the LA reports and the face-to-face meetings), and the teachers’ abilities to deduce what each source of information meant. This finding suggests that for LA to be actionable, the specific characteristics of the intended users should be carefully considered. Teacher experience with blended learning contexts, and in particular with the combination of face-to-face meetings and online activities that is at the core of the flipped classroom model (Staker and Horn 2012), is one of those important characteristics.

The second competency that seemed to underlie teachers’ use of the LA reports was determining which intervention could best address a particular situation, given a diagnosis of that situation (Van de Pol et al. 2010). Again, there was variation among teachers, which was partly related to teacher experience. The less experienced teachers were more hesitant about what to do with the information offered on the LA report. Again, this finding confirms that taking user characteristics into account is important for LA to be actionable. To support less experienced teachers in connecting pedagogical interventions to the LA reports, the reports could be discussed more thoroughly in the weekly teacher team meetings. In the context of secondary education, so-called “data teams” have for example been explored to increase data-based decision making within schools (Schildkamp et al. 2016). Another option is to extend the LA reports so that they do not only contain information about students, but also suggestions for how to intervene (Sergis and Sampson 2017). For example, when students indicate low levels of self-efficacy, the LA report could recommend that teachers offer motivational support and create more mastery experiences for students, as this has been shown to be one of the most effective strategies for improving self-efficacy (Van Dinther et al. 2011). In this case, the role of the LA reports changes from merely providing information to advising teachers (Sergis and Sampson 2017).
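The shift from informing to advising could, in a future system, take the form of simple diagnosis-to-intervention rules attached to the report. The sketch below is purely illustrative: the issue labels and suggested actions are invented, and real rules would be derived from the teaching team’s agreements and the literature.

```python
# Hypothetical rule set mapping a diagnosed issue to a suggested intervention,
# so the report advises rather than only informs. Labels and advice are invented.
INTERVENTION_RULES = {
    "low_self_efficacy": "Offer motivational support; create mastery experiences.",
    "behind_schedule": "Discuss time planning strategies in the working group.",
    "group_tension": "Open a conversation about the collaboration process.",
}

def advise(detected_issues):
    """Attach a suggested action to each diagnosed issue, if a rule exists."""
    fallback = "No suggestion; discuss in the team meeting."
    return [(issue, INTERVENTION_RULES.get(issue, fallback)) for issue in detected_issues]

suggestions = advise(["low_self_efficacy", "unknown_issue"])
```

Whether teachers follow such advice would remain a professional judgment; the rules only lower the threshold for connecting a diagnosis to a possible action.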

In other cases, there were external factors that explained why the LA reports did not always lead to intervention. For example, the unit of analysis was sometimes not small enough, so that teachers could not deduce which student or project group was struggling. As discussed in the “Design” section, the choice was made to aggregate data to working group level to preserve students’ privacy. When data about individual students are displayed on the LA reports, students should be well informed of what happens with these data (Slade and Prinsloo 2013). One of the findings in this study was that teachers gained a more complete overview of student activities because students were honest in the weekly questionnaires. The question is whether students are still willing to share their information when they know the data are directly traceable to them. In general, however, even with aggregated data, the teachers found the LA reports meaningful. With even more detailed information, the benefits of LA might be even greater.
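The aggregation choice described here can be sketched as follows. The data format is hypothetical, but the idea is that individual responses are averaged per working group before they reach the report, so no value is traceable to a single student.

```python
from collections import defaultdict

def aggregate_to_group_level(responses):
    """Average individual responses per working group.

    responses: list of (group_id, score) tuples, one per student response.
    Returns {group_id: mean score}; individual values never reach the report.
    """
    by_group = defaultdict(list)
    for group_id, score in responses:
        by_group[group_id].append(score)
    return {group: sum(scores) / len(scores) for group, scores in by_group.items()}

means = aggregate_to_group_level([("A", 4), ("A", 2), ("B", 5)])
```

The trade-off discussed above is visible in this sketch: the teacher sees that group A averages lower than group B, but cannot tell which student in group A reported the low score.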

In sum, there were both internal and external factors that determined to what extent the LA reports were actionable.

Practical implications

Building on previous research, the results of this study suggest some recommendations for educators who consider supporting teachers in the flipped classroom by means of LA. As these recommendations are based on this small-scale exploratory study, they are preliminary in nature and require further research. This study suggests that LA designers should:
  • Determine in advance which information about students is useful for teachers. This choice can be made, for example, by consulting relevant literature, but also by asking teachers from previous cohorts about typical problems or difficulties that are likely to occur with students who follow this particular course. Consider what information teachers already gather from face-to-face meetings and which additional information is useful to collect purposefully.

  • Consider at what unit of analysis to collect information. Designers should consider how the course is structured (are there small group activities?) and also what is agreed upon with the students. Students might be more comfortable submitting data that they know will be aggregated at the working group level rather than reported at the student level. Similarly, to avoid possible “participant fatigue” from having students repeatedly fill in questionnaires, consider how often data should be collected. For students to understand the purpose of collecting LA, and their role in this process, it is necessary to keep them informed about the project.

  • Build in a system that enables teachers to draw conclusions about the data. Teachers might be in doubt about what the information means (diagnosing), but also about what to do once the information is interpreted (intervention). These questions could be discussed during regular team meetings. Additionally, designers could create a list of best practices with the teaching staff, which could serve as ideas for action for beginning teachers.

Limitations and directions for future research

The findings of this study must be regarded in light of a number of limitations. First of all, the study reported here includes a relatively small sample of seven participating teachers, and was undertaken in an undergraduate university course within a specific academic discipline. The types of indicators valuable to teachers are likely to vary according to the subject being taught and the specific pedagogical design of the course. Furthermore, the study was a descriptive one without a control group, which makes it hard to compare the findings to teachers without LA reports. Lastly, the paper took the perspective of the teacher, and it was not investigated whether changes in teacher behavior also had an impact on student behavior or on students’ learning outcomes (although we did present evidence for an indirect relation, see “Research question 1: functions of LA reports” section).

Despite these limitations, the fact that this was a small-scale study did allow us to construct the LA reports in collaboration with the teachers and subsequently, to closely follow them during the course. It also offers a detailed case of how to construct a flipped classroom design with incorporated teacher LA reports. As a result, a number of interesting initial hypotheses can be formulated as input for future research. First of all, it could be examined whether students are indeed indirectly affected when teachers are provided with LA. As a first step, students’ perceptions of the teacher could be gauged when teachers have access to LA, followed by investigations of the impact on students’ learning outcomes. Furthermore, these findings suggest that teachers’ perceptions of the usefulness of LA reports as well as their abilities to use them depend on teacher characteristics such as teaching experience. It could be investigated with more controlled experiments whether novice and expert teachers reason about LA differently, and thus whether it is better to provide different kinds of information accordingly. These types of studies would provide further information about the core issue stemming from this study, namely which information to provide to teachers that is directly applicable in the flipped classroom, so that they can stay updated on and meet the diverse needs of their students.

Conclusion

To conclude, the key findings of this study are that weekly LA reports for teachers in the flipped classroom were used to detect and further investigate issues that students might be facing, and were thus considered a valuable tool in scaffolding students’ learning processes. The extent to which the LA reports were valuable to teachers was related to teachers’ competence in deducing what each source of information meant, and to their competence in determining which intervention could best address a particular situation given a diagnosis of that situation. Implications for implementing LA in higher education aimed at informing teachers include the recommendations to determine the contents of LA reports in collaboration with teachers, to support teachers in translating the LA reports into concrete interventions, and to inform students of how their data are used.

Acknowledgements

The author would like to thank Christa Asterhan, Lisette Hornstra, Brianna Kennedy, and Marieke van der Schaaf for feedback on earlier versions of this article.

Funding

This study was funded by Utrecht University Incentive Fund for Education (Utrechts Stimuleringsfonds Onderwijs). The author declares no conflict of interest.

References

  1. Abeysekera, L., & Dawson, P. (2015). Motivation and cognitive load in the flipped classroom: Definition, rationale and a call for research. Higher Education Research & Development, 34(1), 1–14.
  2. Boeije, H. (2010). Analysis in qualitative research. London: SAGE.
  3. Çardak, C. S., & Selvi, K. (2016). Increasing teacher candidates’ ways of interaction and levels of learning through action research in a blended course. Computers in Human Behavior, 61, 488–506.
  4. Clow, D. (2012). The learning analytics cycle: Closing the loop effectively. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 134–138).
  5. Comas-Quinn, A. (2011). Learning to teach online or learning to become an online teacher: An exploration of teachers’ experiences in a blended learning course. ReCALL, 23, 218–232.
  6. Condie, R., & Livingston, K. (2007). Blending online learning with traditional approaches: Changing practices. British Journal of Educational Technology, 38(2), 337–348.
  7. Dix, A., Finlay, J., Abowd, G. D., & Beale, R. (2004). Human–computer interaction (3rd ed.). Harlow: Pearson.
  8. Dix, A., & Leavesley, J. (2015). Learning analytics for the academic: An action perspective. Journal of Universal Computer Science, 21(1), 48–65.
  9. Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013). Supporting action research with learning analytics. In Proceedings of the third international conference on learning analytics and knowledge (LAK ’13) (pp. 220–229).
  10. Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. Internet and Higher Education, 7(2), 95–105.
  11. Greenberg, B., Medlock, L., & Stephens, D. (2011). Blend my learning: Lessons from a blended learning pilot. Oakland, CA: Envision Schools. Retrieved from http://www.blendmylearning.com/2011/12/06/white-paper/
  12. Hardy, J., Bates, S. P., Casey, M. M., Galloway, K. W., Galloway, R. K., Kay, A. E., et al. (2014). Student-generated content: Enhancing learning through sharing multiple-choice questions. International Journal of Science Education, 36(13), 2180–2194.
  13. Janssen, J., Kirschner, F., Erkens, G., Kirschner, P. A., & Paas, F. (2010). Making the black box of collaborative learning transparent: Combining process-oriented and cognitive load approaches. Educational Psychology Review, 22(2), 139–154.
  14. Klug, J., Bruder, S., Kelava, A., Spiel, C., & Schmitz, B. (2013). Diagnostic competence of teachers: A process model that accounts for diagnosing learning behavior tested by means of a case scenario. Teaching and Teacher Education, 30, 38–46.
  15. Koh, J. H. L., & Frick, T. W. (2009). Instructor and student classroom interactions during technology skills instruction for facilitating preservice teachers’ computer self-efficacy. Journal for Educational Computing Research, 40(2), 211–228.
  16. Kyndt, E., Raes, E., Lismont, B., Timmers, F., Cascallar, E., & Dochy, F. (2013). A meta-analysis of the effects of face-to-face cooperative learning. Do recent studies falsify or verify earlier findings? Educational Research Review, 10, 133–149.
  17. Lang, C., Siemens, G., Wise, A. F., & Gasevic, D. (2017). Handbook of learning analytics. Beaumont, AB: Society for Learning Analytics Research.
  18. Lim, D. H., & Morris, M. L. (2009). Learner and instructional factors influencing learning outcomes within a blended learning environment. Educational Technology & Society, 12(4), 282–293.
  19. Lowes, S., Lin, P., & Kinghorn, B. (2015). Exploring the link between online behaviours and course performance in asynchronous online high school courses. Journal of Learning Analytics, 2(2), 169–194.
  20. Mayer, R. E. (2009). Multimedia learning (2nd ed.). New York: Cambridge University Press.
  21. O’Flaherty, J., & Phillips, C. (2015). The use of flipped classrooms in higher education: A scoping review. The Internet and Higher Education, 25, 85–95. https://doi.org/10.1016/j.iheduc.2015.02.002
  22. Ognjanovic, I., Gasevic, D., & Dawson, S. (2016). Using institutional data to predict student course selections in higher education. The Internet and Higher Education, 29, 49–62.
  23. Partridge, H., Ponting, D., & McCay, M. (2011). Good practice report: Blended learning. Australian Teaching & Learning Council. Retrieved from http://eprints.qut.edu.au/47566/1/47566.pdf
  24. PeerWise. (2016). Retrieved July 20, 2016 from http://peerwise.cs.auckland.ac.nz
  25. Roehl, A., Reddy, A. L., & Shannon, G. J. (2013). The flipped classroom: An opportunity to engage millennial students through active learning strategies. Journal of Family & Consumer Science, 105(2), 44–49.
  26. Schildkamp, K., Poortman, C. L., & Handelzalts, A. (2016). Data teams for school improvement. School Effectiveness and School Improvement, 27(2), 228–254.
  27. Sergis, S., & Sampson, G. (2017). Teaching and learning analytics to support teacher inquiry: A systematic literature review. In A. Peña-Ayala (Ed.), Learning analytics: Fundaments, applications, and trends: A view of the current state of the art. New York: Springer.
  28. Shelley, M., Murphy, L., & White, C. J. (2013). Language teacher development in a narrative frame: The transition from classroom to distance and blended settings. System, 41(3), 560–574.
  29. Sherin, M. G., & Van Es, E. A. (2005). Using video to support teachers’ ability to notice classroom interactions. Journal of Technology and Teacher Education, 13(3), 475–491.
  30. Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behavioral Scientist, 57(10), 1380–1400.
  31. Siemens, G., & Gasevic, D. (2012). Guest editorial—learning and knowledge analytics. Educational Technology & Society, 15(3), 1–2.
  32. Slade, S., & Prinsloo, P. (2013). Learning analytics: Ethical issues and dilemmas. American Behavioral Scientist, 57(10), 1510–1529.
  33. Staker, H., & Horn, M. B. (2012). Classifying K-12 blended learning. CA: Innosight Institute.
  34. Strauss, A., & Corbin, J. (1994). Grounded theory methodology. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 217–285). Thousand Oaks, CA: Sage.
  35. Tanes, Z., Arnold, K. E., King, A. S., & Remnet, M. A. (2011). Using signals for appropriate feedback: Perceptions and practices. Computers & Education, 57(4), 2414–2422.
  36. Tempelaar, D. T., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning analytics in a data-rich context. Computers in Human Behavior, 47, 157–167.
  37. Tissenbaum, M., Matuk, C., Berland, M., Lyons, L., Cocco, F., Linn, M., et al. (2016). Real-time visualization of student activities to support classroom orchestration. In Symposium at the international conference of the learning sciences (ICLS), Singapore.
  38. Van de Pol, J., Volman, M., & Beishuizen, J. (2010). Scaffolding in teacher–student interaction: A decade of research. Educational Psychology Review, 22(3), 271–296.
  39. Van Dinther, M., Dochy, F., & Segers, M. (2011). Factors affecting students’ self-efficacy in higher education. Educational Research Review, 6(2), 95–108.
  40. Van Gog, T., Paas, F., van Merriënboer, J., & Witte, P. (2005). Uncovering the problem-solving process: Cued retrospective reporting versus concurrent and retrospective reporting. Journal of Experimental Psychology: Applied, 11(4), 237–244.
  41. Van Leeuwen, A. (2015). Learning analytics to support teachers during synchronous CSCL: Balancing between overview and overload. Journal of Learning Analytics, 2, 138–162.
  42. Van Leeuwen, A., Janssen, J., Erkens, G., & Brekelmans, M. (2015). Teacher regulation of cognitive activities during student collaboration: Effects of learning analytics. Computers & Education, 90, 80–94.
  43. Vaughan, N. D. (2007). Perspectives on blended learning in higher education. International Journal on E-Learning, 6(1), 81–94.
  44. Vogt, F., & Rogalla, M. (2009). Developing adaptive teaching competency through coaching. Teaching and Teacher Education, 25(8), 1051–1060.
  45. Wanner, T., & Palmer, E. (2015). Personalising learning: Exploring student and teacher perceptions about flexible learning and assessment in a flipped university course. Computers & Education, 88, 354–369.
  46. Wise, A. F., & Shaffer, D. W. (2015). Why theory matters more than ever in the age of big data. Journal of Learning Analytics, 2(2), 5–13.
  47. Wise, A. F., & Vytasek, J. (2017). Learning analytics implementation design. In C. Lang, G. Siemens, A. F. Wise, & D. Gasevic (Eds.), Handbook of learning analytics (pp. 151–160). Edmonton, AB: Society for Learning Analytics Research.
  48. Wood, D., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem-solving. Journal of Child Psychology and Psychiatry and Allied Disciplines, 17(2), 89–100.
  49. Xin, C., Mudholland, J., Jugic, V., & Kaur, H. (2013). On instructor experiences in three flipped large undergraduate calculus courses. Journal of Chemical Information and Modeling, 53, 1689–1699.
  50. Xu, Z. (2012). The blended ELT environment and the changing roles of teachers and students in Hong Kong. ELT Research Journal, 1(1), 3–10.
  51. Yen, J.-C., & Lee, C.-Y. (2011). Exploring problem solving patterns and their impact on learning achievement in a blended learning environment. Computers & Education, 56, 138–145.

Copyright information

© The Author(s) 2018

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  1. Utrecht University, Utrecht, The Netherlands
