1 Introduction

Gamification has been pointed out as a valuable approach to improve students’ engagement, motivation, and learning outcomes [1, 2, 6, 13, 15]. However, previous studies have reported that using gamification in educational technologies does not always ensure that the expected results are achieved [5, 8, 14, 18]. A promising way to maximise the benefits of gamification is to monitor users’ behaviour in the gamified environment and adapt its gamification design when the expected outcomes are not achieved [9, 10]. This approach is named gamification analytics and was defined by Heilbrunn, Herzig, and Schill [10] as “the data-driven processes of monitoring and adapting gamification designs”.

Nevertheless, there is a lack of studies applying the gamification analytics approach in education and, particularly, in the AIED field [3, 9, 21]. Therefore, we propose a gamification analytics model for teachers, to support them in monitoring the impact of gamification in gamified adaptive learning systems and in adapting the gamification design when they consider it necessary. Based on this model, we developed a tool and conducted a case study to investigate the impact of teachers’ use of the model, through the proposed tool, on students’ engagement, learning, and motivation.

2 Gamification Analytics Model for Teachers and GamAnalytics Tool

In the Gamification Analytics Model, teachers may define interaction goals they expect their students to achieve and monitor, during the learning process, whether those goals are being achieved, by visualising students’ interaction with the system’s learning resources and game elements. If the outcome is not as expected, teachers may adapt the gamification design by creating missions.

GamAnalytics is a tool based on this model, and the design concepts it implements were validated with teachers with respect to their needs and opinions [20]. The GamAnalytics tool is integrated into a gamified adaptive educational environment named Avance (https://avance.eyeduc.com/). It includes a class dashboard and an individual student dashboard. In the class dashboard, visualisations are shown through descriptive data and graphs for each topic of a course, such as the number of students registered in the course; the period within which students are expected to achieve the interaction goals; the class’s progress over time in interacting with learning resources; the number and names of students who achieved (or did not achieve) the interaction goals; the number and names of students who interacted (successfully or not) with each learning resource; and the number and names of students at each gamification level. The individual student dashboard provides further visualisations, such as the student’s basic information; the student’s gamification data, including points, current level, and position in the ranking; the student’s progress over time in interacting with learning resources; and the student’s interaction with each learning resource (see Fig. 1).

Fig. 1. GamAnalytics tool: class (a) and individual (b) student dashboards showing the topic’s interaction goals, students’ interaction with resources, and game elements.
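
To make concrete the kind of aggregation behind the class dashboard described above, the following minimal sketch counts which students interacted (successfully or not) with each resource and how many students are at each gamification level. The data structures, student names, and resource identifiers are illustrative assumptions and not part of Avance or GamAnalytics.

```python
from collections import Counter, defaultdict

# Hypothetical interaction log: (student, resource, success) records.
log = [
    ("ana", "video_1", True), ("ana", "quiz_1", False),
    ("bia", "video_1", True), ("bia", "quiz_1", True),
    ("caio", "quiz_1", True),
]
# Hypothetical gamification state: current level of each student.
levels = {"ana": 1, "bia": 3, "caio": 2}

# Class dashboard view: who interacted (with success or not) with each resource.
by_resource = defaultdict(lambda: {"success": set(), "failure": set()})
for student, resource, success in log:
    by_resource[resource]["success" if success else "failure"].add(student)

# Class dashboard view: number of students at each gamification level.
students_per_level = Counter(levels.values())

for resource, groups in by_resource.items():
    print(resource, "success:", sorted(groups["success"]),
          "failure:", sorted(groups["failure"]))
print("students per level:", dict(students_per_level))
```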

3 Method

A case study was conducted to explore the impact of teachers’ use of the gamification analytics model, through the GamAnalytics tool, on students’ engagement, learning, and motivation. Ten undergraduate and graduate students of the Federal University of Alagoas, enrolled in the “Gamification in Education” course, participated in this case study. The study lasted four weeks, the time expected for students to master the “Frameworks, Models and Process” and “Gamiflow” topics.

To conduct the case study, the GamAnalytics tool integrated into the gamified adaptive educational environment (Avance) was used. First, the teacher defined the interaction goals he expected students to achieve for each topic (e.g., students were expected to interact with at least 60% of the resources of the “Gamiflow” topic within 3 weeks). After the teacher’s preparation, students completed a demographic questionnaire and signed the informed consent form. Students also took a pre-test on the two topics, reviewed by the teacher. Pre-tests were designed according to the levels of the revised Bloom’s taxonomy [12] so as to be balanced with the post-tests.
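
As an illustration of how such an interaction goal can be expressed and checked, the sketch below encodes the “at least 60% of the topic’s resources within 3 weeks” goal and flags which students have met it by the deadline. The field names, dates, and the `goal_met` helper are our own illustrative assumptions, not part of GamAnalytics.

```python
from datetime import date, timedelta

# Hypothetical interaction goal for the "Gamiflow" topic.
topic_resources = {"r1", "r2", "r3", "r4", "r5"}   # resources of the topic
goal_fraction = 0.60                                # interact with >= 60% of them
deadline = date(2024, 5, 1) + timedelta(weeks=3)    # within 3 weeks (dates assumed)

# Hypothetical log: resources each student interacted with, and when.
interactions = {
    "ana": {"r1": date(2024, 5, 5), "r2": date(2024, 5, 10), "r3": date(2024, 5, 12)},
    "bia": {"r1": date(2024, 5, 20)},
}

def goal_met(student_log):
    """Return True if the student reached the goal fraction before the deadline."""
    on_time = {r for r, d in student_log.items() if r in topic_resources and d <= deadline}
    return len(on_time) / len(topic_resources) >= goal_fraction

for student, student_log in interactions.items():
    print(student, "achieved goal:", goal_met(student_log))
```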

Afterwards, students started using Avance, and the teacher could visualise students’ data through the GamAnalytics tool. When the teacher realised that the outcomes were not as expected, he assigned missions to groups or to a specific student by sending emails. In each email, the teacher indicated the expected period of time for the mission, the reward, and the set of resources students should interact with to complete it. After that, he could visualise the impact of the intervention through GamAnalytics. For each topic, the teacher created three different missions, depending on students’ interaction. At the end, students answered the post-tests, as well as the IMI (Intrinsic Motivation Inventory) [7, 16, 17] and IMMS (Instructional Materials Motivation Survey) [11, 19] questionnaires to measure their motivation; both questionnaires have been validated in Portuguese [4].
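
Each mission created in this adaptation step can be thought of as a small record: who it targets, which resources to interact with, by when, and for what reward. The sketch below builds such a record for students flagged as behind on a topic; the field names, reward value, and selection rule are illustrative assumptions, not the tool’s actual schema.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Mission:
    """Illustrative mission record emailed to students or a group."""
    recipients: list[str]   # students (or group members) the mission is sent to
    resources: list[str]    # resources they should interact with
    deadline: date          # expected period to complete the mission
    reward_points: int      # reward granted on completion

# Hypothetical: students flagged by the dashboard as below the interaction goal.
behind = ["bia", "caio"]

mission = Mission(
    recipients=behind,
    resources=["r2", "r3", "r4"],
    deadline=date(2024, 5, 22),
    reward_points=50,
)
print(mission)
```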

4 Results and Discussion

4.1 Effects on Engagement

To investigate students’ engagement, we measured the number of students’ interactions with each topic’s resources before and after the teacher’s intervention (the creation of missions). The results of a Shapiro-Wilk normality test indicate that the data for the two topics do not come from a normal population (first topic: W = 0.594, p-value = 0.000047 before and W = 0.618, p-value = 0.000091 after the intervention; second topic: W = 0.432, p-value = 0.020 before and W = 0.432, p-value = 0.000058 after). A non-parametric Wilcoxon signed-rank test was therefore performed to compare the number of students’ interactions before and after the intervention. For the first topic, the test indicates a statistically significant difference (Z = −2.121, p-value = 0.034) between the number of interactions before and after the teacher’s intervention. For the second topic, the test also indicates a statistically significant difference (Z = −2.214, p-value = 0.027). Therefore, students significantly increased their interaction with the resources of both topics after the teacher’s intervention, which was informed by the monitoring of students’ data.
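
This analysis can be reproduced with standard SciPy routines. The sketch below follows the same procedure (a Shapiro-Wilk normality check on each paired sample, then a Wilcoxon signed-rank test comparing interactions before and after the intervention), using made-up interaction counts since the raw data are not reported here.

```python
from scipy import stats

# Hypothetical per-student interaction counts for one topic (n = 10 students).
before = [2, 1, 0, 3, 1, 0, 2, 1, 0, 1]
after  = [5, 4, 3, 6, 4, 2, 5, 3, 2, 4]

# Normality check on each sample (Shapiro-Wilk).
w_before, p_before = stats.shapiro(before)
w_after, p_after = stats.shapiro(after)
print(f"before: W={w_before:.3f}, p={p_before:.4f}")
print(f"after:  W={w_after:.3f}, p={p_after:.4f}")

# Non-parametric paired comparison (Wilcoxon signed-rank test).
statistic, p_value = stats.wilcoxon(before, after)
print(f"Wilcoxon signed-rank: statistic={statistic:.3f}, p={p_value:.4f}")
```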

4.2 Effects on Learning

The results of the pre- and post-tests taken by students before and after studying each topic were used to measure the impact on students’ learning. Results of a Shapiro-Wilk test show that the data may come from a normal distribution (first topic: W = 0.965, p-value = 0.843 for the pre-test and W = 0.932, p-value = 0.473 for the post-test; second topic: W = 0.909, p-value = 0.271 for the pre-test and W = 0.916, p-value = 0.325 for the post-test). A paired t-test was therefore performed, indicating a statistically significant difference between pre- and post-test scores for the first topic (t(9) = −4.116, p-value = 0.003) and for the second topic (t(9) = −2.449, p-value = 0.037). Therefore, our results suggest that students improved their understanding of both topics of the “Gamification in Education” course after interacting with the resources assigned by the teacher through missions.
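
The learning analysis follows the same pattern with a parametric test. A minimal SciPy sketch, using invented scores for illustration, checks the normality of pre- and post-test scores and then runs a paired (dependent-samples) t-test.

```python
from scipy import stats

# Hypothetical pre- and post-test scores for the same 10 students (0-10 scale).
pre_test  = [4.0, 5.5, 3.0, 6.0, 5.0, 4.5, 3.5, 6.5, 5.0, 4.0]
post_test = [6.5, 7.0, 5.5, 8.0, 6.0, 7.5, 5.0, 8.5, 7.0, 6.0]

# Shapiro-Wilk normality checks on each set of scores.
print(stats.shapiro(pre_test))
print(stats.shapiro(post_test))

# Paired t-test between pre- and post-test scores (df = n - 1 = 9).
t_stat, p_value = stats.ttest_rel(pre_test, post_test)
print(f"paired t-test: t(9)={t_stat:.3f}, p={p_value:.4f}")
```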

4.3 Effects on Motivation

At the end of each topic, participants answered the IMI and IMMS questionnaires (7-point Likert scales). The internal consistency of all IMI and IMMS subscales was greater than .70. For the IMI questionnaire, the mean overall intrinsic motivation score was 4.52 for the “Frameworks, Models and Process” topic and 4.63 for the “Gamiflow” topic. These results suggest that students were more intrinsically than extrinsically motivated during the intervention in both topics. For the IMMS questionnaire, the mean overall motivation score was 5.19 for the first topic and 4.95 for the second. In summary, our results suggest that students were motivated (both intrinsically and extrinsically) during the intervention in the “Frameworks, Models and Process” and “Gamiflow” topics.
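
The internal-consistency check and subscale means can be computed as in the sketch below. The text does not name the coefficient used; Cronbach’s alpha is a common choice and is assumed here, and the Likert responses are fabricated for illustration only.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Hypothetical 7-point Likert responses: 10 students x 4 items of one subscale.
subscale = np.array([
    [6, 6, 5, 6],
    [5, 5, 5, 4],
    [7, 6, 7, 7],
    [4, 4, 3, 4],
    [6, 5, 6, 6],
    [3, 4, 3, 3],
    [5, 6, 5, 5],
    [7, 7, 6, 7],
    [4, 3, 4, 4],
    [6, 6, 6, 5],
])

print("internal consistency (alpha):", round(cronbach_alpha(subscale), 2))
print("mean overall subscale score:", round(subscale.mean(), 2))
```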

5 Conclusion

In this work, we conducted a case study to investigate the impact of a gamification analytics model that allows teachers to monitor and adapt the gamification design for students during the learning process. Our results suggest that a gamification analytics tool based on this model has a positive impact on students’ learning, engagement, and motivation. This is particularly important because it also shows that, with the aid of gamification analytics, teachers may be active users of gamified adaptive learning systems. As teachers may monitor and adapt the gamification design according to how individual students or groups of students interact with an adaptive system, they can make more effective, opportunistic pedagogical decisions (informed by gamification analytics) that may lead to increased student learning, engagement, and motivation.