
1 Introduction

There is growing interest in applying gamification in the technology-enhanced learning context [1, 7, 10, 27, 28, 31] to increase students’ motivation and engagement [3, 20]. However, despite the benefits of gamification for users’ psychological and behavioral outcomes [11], including in the educational context [24, 30], some studies have also reported unexpected outcomes after gamification was implemented in technology-enhanced learning environments [9, 12, 29]. Research has pointed to the design of gamification as one of the possible causes of negative results in educational settings [9, 19]. According to Heilbrunn, Herzig, and Schill [15], the process of designing gamification should incorporate different aspects, such as the personas of the involved users, the application’s domain, properties of the gamified application itself, and legal constraints. These aspects are subject to change over time, so gamification design must not be treated as a rigid artifact [15].

Therefore, monitoring data related to gamification and adapting its design can be an alternative to avoid negative outcomes and can give valuable insights for taking corresponding actions towards goal achievement [13, 14, 15]. Heilbrunn, Herzig, and Schill [15] named this process gamification analytics and defined it as “the data-driven processes of monitoring and adapting gamification designs”. Nonetheless, studies that address gamification in technology-enhanced learning environments are, in general, not concerned with monitoring and adapting the gamification design during the learning process, neither through automated adaptation nor through human decision-making, which increases the risk of obtaining unexpected results [32].

Considering that we are in an era in which data is used more in the service of human decision-making and design than of automated adjustment [2, 5], and that teachers should be at the heart of most ICT for education programs [34], teachers could also be in charge of monitoring and adapting the gamification design in gamified adaptive learning systems. In this sense, this paper proposes the “Gamification Analytics Model for Teachers”, which can be applied in gamified adaptive learning systems to allow teachers to adapt the gamification design during the learning process. The adaptation is based on monitoring data that show, in an intuitive and meaningful way, relevant information about students’ interaction with the system’s learning resources and gamification elements, aiming to increase the chances of obtaining positive results related to students’ motivation, engagement, and learning outcomes.

To implement this model as support for a gamified learning system, it is of utmost importance that the model-based design concepts are well designed with respect to the needs of teachers, the target audience of the model. In this paper, we use the “Speed Dating” method – a design method for rapidly exploring application concepts and their interactions and contextual dimensions without requiring any technology implementation [6, 17] – to validate the design concepts related to the Gamification Analytics Model for Teachers.

The remainder of this paper is structured as follows. In Sect. 2, we describe the Gamification Analytics Model for Teachers. In Sect. 3, we describe the planning and execution of the Speed Dating method. In Sect. 4, we present the results obtained from the Speed Dating sessions. Finally, in Sect. 5, we present the discussion, concluding remarks, and future work.

2 Gamification Analytics Model for Teachers

The “Gamification Analytics Model for Teachers” was developed to increase the chance of obtaining positive results concerning students’ engagement, learning outcomes, and motivation during the learning process in gamified adaptive learning systems. In this model, teachers can define interaction goals, monitor students’ interaction with the system’s learning resources and gamification elements, and adapt the gamification design through missions that motivate disengaged students to achieve the defined goals. The model is shown in Fig. 1 and its components are described in the following sections.

Fig. 1. Gamification Analytics Model for Teachers.

2.1 Model Components

Definition of Interaction Goals. The gamification literature stresses the importance of defining clear goals and measuring the success of the gamification design towards their achievement [14, 18, 35]. In the model presented in this paper, teachers may therefore define interaction goals that they expect students to achieve within a given time. An interaction goal represents the number/percentage of interactions that students are expected to have with the educational resources (e.g., videos, texts, questionnaires, forums, and so on) available in the system for a certain topic within a specific time. Hence, an interaction goal can be represented by two elements per topic (quantity of resources, expected time). For example, an interaction goal configured by a teacher could be: students are expected to interact with at least 70% of the learning resources available in the gamified learning system for a topic within 2 weeks.
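
As an illustration, a minimal sketch (in Python) of how an interaction goal could be represented and checked is shown below; the names InteractionGoal and met_goal and the two-element representation are illustrative assumptions on our part, as the model does not prescribe any concrete implementation.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class InteractionGoal:
    """One interaction goal per topic: a minimum share of the topic's
    resources that each student should interact with before a deadline."""
    topic: str
    min_fraction: float   # e.g., 0.70 means at least 70% of the resources
    start: date
    duration: timedelta   # e.g., timedelta(weeks=2)

    @property
    def deadline(self) -> date:
        return self.start + self.duration

def met_goal(goal: InteractionGoal, resources_in_topic: int,
             resources_accessed: int) -> bool:
    """True if a student interacted with enough of the topic's resources."""
    if resources_in_topic == 0:
        return True  # nothing to interact with
    return resources_accessed / resources_in_topic >= goal.min_fraction

# Example from the text: at least 70% of a topic's resources within 2 weeks.
goal = InteractionGoal("Topic 1", 0.70, date(2024, 3, 4), timedelta(weeks=2))
print(met_goal(goal, resources_in_topic=10, resources_accessed=7))  # True
```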

Monitoring of Students’ Interaction with Resources. In the Gamification Analytics Model for Teachers, teachers can visualize students’ interaction with the learning resources and check whether these interactions occur according to the interaction goals they defined. The previously defined interaction goals may serve as a metric for teachers to monitor the students’ learning process, since teachers can assess whether students are at the expected pace towards the defined goal. To present these data to teachers effectively, it is necessary to rely on research on Information Visualisation and Learning Dashboards. The positive effects of Information Visualisation and Learning Dashboards on teachers’ decision-making processes in the technology-enhanced learning context have been reported in several studies in the literature [21, 22, 25, 36].
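
As a hedged illustration of the kind of metric such a dashboard could show, the sketch below (reusing the hypothetical InteractionGoal and goal from the previous sketch) estimates the share of students currently at the expected pace; the linear pace heuristic is our own simplifying assumption, not part of the model.

```python
from datetime import date

def share_of_students_on_pace(goal, accessed_by_student: dict,
                              resources_in_topic: int, today: date) -> float:
    """Fraction of students whose progress is at least proportional to the
    time elapsed towards the goal deadline (simple linear pace heuristic)."""
    if not accessed_by_student:
        return 0.0
    elapsed_fraction = (today - goal.start).days / goal.duration.days
    expected = goal.min_fraction * min(max(elapsed_fraction, 0.0), 1.0)
    on_pace = sum(1 for accessed in accessed_by_student.values()
                  if accessed / resources_in_topic >= expected)
    return on_pace / len(accessed_by_student)

# One week into the two-week goal above, the expected share is 35%.
progress = {"Ana": 5, "Bruno": 2, "Carla": 4}   # resources accessed so far
print(share_of_students_on_pace(goal, progress, 10, date(2024, 3, 11)))  # ~0.67
```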

Monitoring of Students’ Interaction with Gamification Elements. There are different objectives in showing students’ interaction with gamification elements to teachers. First, teachers can visualise students’ interactions with the gamification elements implemented in the system in order to understand students’ engagement with these elements, increasing teachers’ awareness of students’ status (e.g., how many points each student has accumulated so far, students’ ranking, and current level). Moreover, this monitoring could increase the chance that teachers perceive the positive impact of gamification and, hence, become more motivated to use it. Furthermore, since the adaptation of the gamification design during the learning process is performed through the gamification element mission, teachers need to be able to visualise which missions are more effective in motivating the students. Through these visualisations, teachers could see which missions were most successful and assign missions appropriately along the learning process. This concept is based on the theoretical model of user requirements for supporting the monitoring and adaptation of gamification designs proposed by Heilbrunn, Herzig, and Schill [14]. However, there is a lack of studies that explore teachers’ visualisation of students’ interaction with gamification elements in the technology-enhanced learning context.
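
As an illustration of the kind of aggregation that could feed such a visualisation, the sketch below summarises, for a single mission, how many students completed it, attempted it without completing it, or did not try it; the status encoding is a hypothetical choice of ours, not something defined by the model.

```python
from collections import Counter

def mission_summary(mission_log: dict) -> Counter:
    """mission_log maps a student id to one of the (hypothetical) statuses
    'completed', 'attempted', or 'not_attempted' for a single mission."""
    return Counter(mission_log.values())

log = {"s1": "completed", "s2": "completed",
       "s3": "attempted", "s4": "not_attempted"}
print(mission_summary(log))
# Counter({'completed': 2, 'attempted': 1, 'not_attempted': 1})
```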

Adaptation of Gamification Design Through Missions. As previously explained, the adaptation of the gamification design in educational systems can be made by teachers through the gamification element mission, e.g., when students’ interaction is decreasing over time and students are not achieving the interaction goal defined by the teacher. In previous studies, missions have also been used effectively to motivate students during the learning process [25, 26]. Therefore, we propose the use of missions to adapt the gamification design during the learning process: when teachers perceive that students’ interactions are not as expected, they can assign missions to motivate students to increase their interaction with the educational resources available in the system. Hence, the design of the other gamification elements is also adapted, because when students complete a mission they also earn points, badges, and levels, and change their position on the leaderboard.
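
A minimal sketch of how a system could support this adaptation step is given below (reusing the hypothetical goal from the first sketch): it only lists the students who are behind the goal and packages a teacher-authored mission for them. Whether and which mission to assign remains a teacher decision, and all names (students_behind_goal, create_mission, reward_points) are illustrative assumptions.

```python
def students_behind_goal(goal, accessed_by_student: dict,
                         resources_in_topic: int) -> list:
    """Students whose interaction is below the goal threshold: candidates
    for a motivating mission, to be reviewed and assigned by the teacher."""
    return [student for student, accessed in accessed_by_student.items()
            if accessed / resources_in_topic < goal.min_fraction]

def create_mission(students: list, description: str, reward_points: int) -> dict:
    """A teacher-authored mission; completing it would also grant points,
    badges, and level progress, which adapts the other gamification elements."""
    return {"students": students, "description": description,
            "reward_points": reward_points}

behind = students_behind_goal(goal, {"Ana": 7, "Bruno": 3}, resources_in_topic=10)
mission = create_mission(behind, "Interact with the remaining resources of Topic 1", 50)
print(behind)  # ['Bruno']
```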

3 Method

The Speed Dating method, from HCI (Human-Computer Interaction) research, is designed to help researchers and designers explore a wide range of feature possibilities with users, elicit unmet needs, and probe the boundaries of what users will find acceptable, boundaries that are normally unknown until a technology prototype exists [16, 37]. The method begins with sessions in which participants receive hypothetical scenarios in rapid succession (for example, through storyboards) while researchers observe their immediate reactions [6, 23, 37]. Speed Dating leads to the discovery of unexpected design opportunities when unforeseen needs emerge from participants’ assessment of the given scenarios, and it can reveal needs and opportunities not easily discovered through field observations or other project activities [6, 8, 23, 37]. The method consists of two main stages: need validation and user enactment. In the validation stage, researchers present to the target users a variety of predefined storyboards and observe the needs that users demonstrate [6, 33]. The storyboards sample the innovation space, and this information is used to narrow the design space of the potential product. To this end, researchers identify a set of critical design problems and write short dramatic scenarios that address permutations of these problems. Participants then play a role they perform regularly (e.g., as a teacher) while running through the scenarios in a simulation [4, 6, 33].

3.1 Validation Through Speed Dating Method

As the Gamification Analytics Model for Teachers is a new contribution, how to design gamified educational systems that implement it is still an open question. Therefore, it is of utmost importance that the model-based design concepts are designed with teachers’ needs in mind. Hence, the “Speed Dating” method was used to validate the design concepts of this model. As the target audience of the model is teachers, we recruited 15 teachers (14 post-secondary teachers and 1 secondary education teacher, all living in Alagoas, Brazil) to participate in individual sessions; recruitment was done by email or in person. The sessions lasted from 30 to 60 min each; 14 were conducted at the university and 1 through video conference (via meet.google.com).

At the beginning of each session, one of the researchers gave a presentation contextualizing learning systems, gamification, and their challenges. Afterward, the “Gamification Analytics Model for Teachers” was presented. Moreover, to put all teachers on the same page regarding gamified educational environments, a gamified educational platform (https://avance.eyeduc.com/) and its functionalities were introduced, and teachers’ doubts about educational environments and gamification were clarified. This made it possible to equalize the knowledge level of all teachers, so they could form a more concrete opinion when evaluating the concepts embedded in the storyboards.

The participants were then introduced to design concepts based on the proposed model through storyboards. Teachers had time to read, reflect on, and analyze each concept presented and were encouraged to voice their immediate reactions to it. The teachers then evaluated each concept using three grades: grade 1 (if the teacher thought the concept would be relevant for them to use in a gamified educational environment), grade \(-1\) (if the teacher thought the concept would not be relevant for them to use in a gamified educational environment), and grade 0 (if the teacher could not decide whether or not the concept would be relevant to them). These grades are based on the work by [17].

The first design concepts presented to teachers were developed by the author of the model. However, teachers could at any time suggest new ideas for the formulation of new concepts based on their needs. When a teacher suggested a new concept, the researchers created a new storyboard for it, and that storyboard was included in the set shown to the next participant. After a concept was debated and evaluated, the next concept was presented, and this procedure was repeated until the last concept in the set. During this process, two supporting researchers were responsible for recording teachers’ opinions, ideas, and grades for each concept for later analysis.

The study started with 13 initial concepts, which grew as teachers suggested new ones, resulting in 20 concepts by the end of the study. After the sessions, a table was created with the average evaluation of each concept and the recorded opinions of each teacher. This information will be further analyzed so that the researchers can define what will be developed or adjusted in future gamified learning platforms.

4 Results

As previously explained, 20 (twenty) design concepts based on the gamification analytics model were evaluated by teachers to understand their needs in gamified adaptive learning systems. These concepts concern the visualizations the teachers judge most applicable for monitoring students’ interaction with resources and with gamification elements, as well as the most appropriate procedures for adapting the gamification design when they consider it necessary. In this section, we discuss the five best-rated design concepts in Sect. 4.1 and the three most poorly rated design concepts in Sect. 4.2. The list of all design concepts and their corresponding storyboards explored in this work can be visualized at the following site: sites.google.com.

Fig. 2. Validation results and average (Color figure online)

The quantitative evaluation made by the teachers for each design concept is shown in Fig. 2. The columns in this figure represent the teachers who participated in the research (listed in order of participation), and the rows represent the design concepts. The last seven design concepts listed in the figure were generated by the participants. Cells in red indicate that the teacher evaluated the corresponding concept negatively, cells in yellow indicate that the teacher was neutral about the corresponding concept, and cells in green show that the teacher rated the corresponding concept positively. The overall average rating of each design concept is listed in the rightmost column. The average grade was calculated as the sum of the grades the teachers assigned to the design concept divided by the number of teachers who evaluated that concept.
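
For clarity, a small sketch of this average computation is shown below, with hypothetical grades (not the actual data from Fig. 2): each concept is scored with \(-1\), 0, or 1 by the teachers who saw it, and concepts added later simply have no entries for earlier participants.

```python
def concept_averages(grades: dict) -> dict:
    """grades[concept][teacher] is -1, 0, or 1; the average of a concept is
    computed only over the teachers who actually evaluated it."""
    return {concept: sum(by_teacher.values()) / len(by_teacher)
            for concept, by_teacher in grades.items()}

# Hypothetical grades, for illustration only.
example = {
    "Concept A": {"T1": 1, "T2": 1, "T3": 0},   # evaluated by 3 teachers
    "Concept B": {"T2": -1, "T3": 0},           # added later, seen by 2 teachers
}
print(concept_averages(example))
# {'Concept A': 0.6666666666666666, 'Concept B': -0.5}
```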

4.1 Most Well-Rated Design Concepts

Concept 3: Visualization of the percentage of the students that reached interaction goals (Average: 0.93).

The vast majority of participants reported that this concept is fundamental for understanding the progress of the class, enabling the teacher to intervene and make decisions based on these results (T1, T2, T4, T9, T15). The purpose of the concept is to provide a visualization in the system showing how many students have already reached the interaction goals defined by the teacher. As pointed out by teacher T9, “This visualization is important for a quick overview of the class as we would know if we can move on to the next topic, or continue in the topic and intervene in the process to motivate students to achieve the goals”.

Concept 8: Visualization of each student’s interaction with the resources (Average: 0.93).

From the opinions captured in the sessions regarding this concept, we understand that teachers need a detailed view of each student, not just of the class, and want to visualize each student’s interaction with each resource available in the system (T2, T9, T15). This concept therefore enables the teacher to visualize the interaction of each student with each resource added to the activity plan of each topic. However, some teachers reported that while this concept would be ideal for classes with a small number of students, it would be impracticable for large classes.

Concept 9: Creation of personalized missions for a student or for a specific group (Average: 0.87).

In this concept, we investigate the teacher’s need to have the autonomy to intervene in or adapt the system when a specific student or group of students is not achieving an expected goal. In the teachers’ view, a mission makes it possible to motivate students to interact with the system’s resources and to pursue the interaction goals (T1, T3, T6, T13). Some teachers believe that missions might have a more positive impact if they involve rewards that affect students’ grades (T1, T6). In addition, one teacher reported, “The teacher could monitor groups by levels and could select from the most advanced group to assist the less advanced students as well, being possible to create a mission with this suggestion” (T2). From other points of view, we obtained negative opinions regarding the offering of rewards (such as trophies and points) to students who complete a mission; as reported by teacher T15, the reward should be the learning itself.

Concept 11: Show the status of each mission created (Average: 0.87).

This concept was considered relevant by most of the teachers who participated in the sessions. For teachers, once missions are created, it is important to be able to view the results of each one, such as the number of students who successfully completed the mission, the number of students who tried but did not complete it, and the number of students who have not tried. Teachers believe this visualization is useful for monitoring and evaluating which missions have the most positive impact on students (T12) and whether they are positively affecting students’ level of interaction with the resources (T9). In one teacher’s opinion, this concept allows him to gauge the difficulty of a mission (difficult, easy, or moderate) and to follow up with students who failed the mission to understand the reasons for the failure (T1).

Concept 13: Help button provided for each visualization describing its functionality (Average: 1.00).

This concept was the best rated among the teachers who participated in the sessions. For teachers, system support through help buttons describing the functionality of the graphics is important, especially at the beginning of the teacher’s interaction with the system, when the teacher is not yet familiar with it (T14, T11, T3). In addition, this functionality increases the chance of engaging users with little technological experience (T1, T9).

4.2 Most Poorly Rated Design Concepts

Concept 6: Visualization of the number of students who achieved each trophy (Average: 0.20).

Some teachers see potential in this concept, given that the trophies obtained by the students correspond to their achievements and ease in using the system: it “can be used to compare the evolution of the class through the trophies” (T2) and is “interesting to analyze the motivation or difficulty of the class with the trophies” (T4). However, the concept was poorly evaluated by most teachers because, according to teachers T3, T5, T9, T10, T13, and T14, this functionality would not affect the methodology applied by the teacher. As pointed out by teacher T10, “This kind of visualization would be most useful for designers or teachers with full control of course authorship, but apart from this use it can be more of a problem than a solution.”

Concept 16: Visualization of each student’s interaction with the trophies (Average: 0.21).

The purpose of this concept is to visualize each student’s trophy achievements. However, teachers expressed doubts about the achievement of trophies and its relationship with student performance: “I do not find the viewing of trophies per student as relevant” (T11). In contrast, teacher T3 affirmed the relevance of this concept, “being a way to track students’ performance”.

Concept 19: Visualization of student’s descriptive data (Average: 0.27).

Making students’ descriptive data (interaction with resources, trophies, missions completed) available to teachers in textual form was poorly rated due to the teachers’ clear preference for visualizing data through graphs. For teacher T10, “presentation as text may be a detriment to the teacher, a sensory noise.”

5 Discussion, Conclusion and Future Work

In this article, we introduced the “Gamification Analytics Model for Teachers”, a model that can be implemented in gamified adaptive learning systems to decrease the chances of obtaining negative outcomes concerning students’ engagement and motivation. In this model, teachers can define interaction goals, monitor students’ interaction with the system’s resources and gamification elements, and adapt the gamification design when they judge it necessary, using missions to motivate and engage students to achieve the interaction goals. Nonetheless, future gamified adaptive learning systems that adopt the “Gamification Analytics Model for Teachers” need to implement model-based design concepts that correspond to teachers’ needs. Therefore, to validate these design concepts, we used the “Speed Dating” method to understand teachers’ needs in gamified adaptive learning systems, and we presented the best-rated and most poorly rated design concepts related to the model. In general, most of the 20 design concepts evaluated by the participating teachers were well accepted and judged useful.

The best-rated concept was concept 13 (Help button provided for each visualization describing its functionality); teachers pointed out that this functionality is especially important at the beginning of teachers’ interaction with the system, supporting and facilitating the understanding of the visualizations provided in gamified adaptive learning systems. Other highly rated design concepts were concepts 3 (Visualization of the percentage of the students that reached interaction goals) and 8 (Visualization of each student’s interaction with the resources). Note that there was a high acceptance rate both for the more general, class-level visualizations (such as concepts 2, 3, 5, 7, 11) and for the more specific, individually focused visualizations (such as concepts 8, 14, 15, 17). The first type of visualization helps teachers because it is compact and straightforward, while the second type helps teachers act on isolated cases of underperforming students, as stated by teacher T3.

Furthermore, the most poorly rated design concept was concept 6 (Visualization of the number of students who achieved each trophy), followed by concept 16 (Visualization of each student’s interaction with the trophies) and concept 19 (Visualization of student’s descriptive data). It could thus be observed that the visualization of students’ interaction with the trophies available in the gamified learning system was not judged important or relevant by the teachers. However, students’ interactions with other gamification elements such as missions and levels (concepts 5, 11, 14, 15) were well-rated design concepts. Consequently, although teachers did not consider the visualization of students’ interaction with trophies relevant, they judged it useful and relevant to visualize students’ interaction with other gamification elements (missions, levels) to help them understand the students’ status. Teachers also showed that visualizing students’ data through graphs is more relevant to them than visualizing descriptive data in textual form. During the Speed Dating process, some teachers highlighted that it is better to visualize students’ data through graphs. For example, teacher T9 stated that visualizing students’ interaction through descriptive data could be relevant, but that visualizing it through graphs is more enjoyable and useful. Teachers T2 and T6 concluded that both visualizations could be relevant, but that they should not be shown together; rather, they should be shown on demand, at different levels.

This article has some limitations, such as the participants’ recruitment: 93% of the participants were post-secondary teachers, which is a threat to external validity. Another limitation is the subjectivity of the storyboard narratives through which the design concepts were presented to teachers, which may have led to different interpretations depending on the participating teacher. However, we tried to mitigate this limitation through explanations and the clarification of doubts during the conduct of the Speed Dating method. Our future work includes validating a prototype, to be developed based on the best-rated design concepts, with regard to teachers’ perceptions of perceived usefulness, perceived ease of use, behavioral intention, relevance, and perceived enjoyment. Afterward, a controlled experiment will be held in a real scenario within a gamified educational platform based on the validated prototype to evaluate the effectiveness of the use of the gamification analytics model by teachers on students’ learning outcomes, motivation, and engagement.