What We Can Learn from the Data: A Multiple-Case Study Examining Behavior Patterns by Students with Different Characteristics in Using a Serious Game
- Cite this article as: Liu, M., Lee, J., Kang, J. et al. Tech Know Learn (2016) 21: 33. doi:10.1007/s10758-015-9263-7
Using a multi-case approach, we examined students’ behavior patterns in interacting with a serious game environment using the emerging technologies of learning analytics and data visualization in order to understand how the patterns may vary according to students’ learning characteristics. The results confirmed some preliminary findings from our previous research, but also revealed patterns that would not be easily detected without data visualizations. Such findings provided insights about designing effective learning scaffolds to support the development of problem-solving skills in young learners and will guide our next-step research.
Keywords: Learning analytics · Data visualization · Serious games · Problem solving · Middle school science · Fantasy · Game engagement
In this study, we used a multi-case approach to examine through data visualization techniques how students with different characteristics accessed various tools built in a 3D immersive serious game environment designed for middle school science. By exploring and discovering the tool use patterns of learners with different characteristics, we aim to understand how students interact with the environment in order to gain insights about designing effective learning scaffolds to support the development of problem-solving skills in young learners.
2 Relevant Literature
In this section, we discuss the literature that provides the perspective and connection for this study, including learning analytics, analytics in serious games, data visualization, and learners’ characteristics.
2.1 Learning Analytics
Given the advancement of modern technologies, opportunities for collecting large educational datasets to understand how students learn have become increasingly available (Koedinger et al. 2008; Siemens 2013). As a result, a new field of Learning Analytics (LA) has emerged. Johnson et al. (2014) defined LA as “an educational application of ‘big data’” as educators search for new ways of using the big data available “to improve student engagement and provide a high-quality, personalized experience for learners” (p. 38). LA draws upon prior research and associated techniques in Educational Data Mining (EDM) and focuses on “understanding and optimizing learning and the environments in which it occurs” (Romero and Ventura 2013, p. 13). While EDM aims to develop methods or algorithms for analyzing educational data and discovering new patterns in it (Romero and Ventura 2013), LA focuses on sense-making and is concerned with “developing tools and techniques for capturing, storing, and finding patterns” (Martin and Sherin 2013, p. 512) in order to improve organizational processes or learning outcomes through data-driven decision making (Siemens and Baker 2012; Siemens 2013). The two fields nonetheless share many analytical interests and techniques, and both require an interdisciplinary approach: LA adapts techniques from multiple areas, including data mining (Buckingham 2012), and extends EDM techniques.
Recent literature shows that various methodologies are used in LA and EDM. Romero and Ventura (2013) categorized the primary methodologies, which include prediction, clustering, outlier detection, process mining, relationship mining, text mining, social network analysis, and others. Although there are more opportunities for collecting big data, various concerns about implementing learning analytics and data mining in educational settings remain, such as technical challenges and legal and ethical issues (Johnson et al. 2015). One challenge relevant to LA and EDM is the privacy and ethics of using personal information. Another is managing the large amount of data available for research: what data to collect, what questions to answer, and how to analyze and interpret the data (Verbert et al. 2012). In particular, researchers still need to aggregate data from multiple sources and databases across different systems and often analyze the data through manual processes.
2.2 Serious Game Analytics
While the focus of LA has been on learning management systems, and mostly on the use of intelligent tutoring systems and social media (Bienkowski et al. 2012), researchers in the serious games (SG) field have also begun to pay attention to the enormous volume of dynamic data generated by SG. Serious games leverage game mechanics for training through exer-games, management games, and simulations (the Serious Games Initiative, http://www.seriousgames.org/). In researching SG potential to enhance learning, researchers have typically relied on traditional data such as surveys and standardized tests, usually collected before and after playing a game; user-generated data was used mainly for usability testing to find flaws in a game (Moura et al. 2011). The emergence of LA, however, allows SG researchers to focus both on an individual's learning process during game play, without interruption, and on the behavioral patterns of game players (Drachen and Canossa 2009b; Linek et al. 2010; Scarlatos and Scarlatos 2010). Researchers can thus make use of user-generated data on player interactions with the game, both spatial (e.g., a player's location) and non-spatial (e.g., time to complete a task) (Wallner and Kriglstein 2013). Such data usually include multiple parameters, such as number of clicks, frequency of tool use, and duration of interaction, which can be interpreted as specific indicators of student behavior, including its subjective meaning (Linek et al. 2010). According to Linek et al. (2010), different types of player activity (e.g., confusion or nervousness versus enthusiasm) can be inferred from mouse-clicking rate, illustrating the potential meaning of behavioral indicators.
2.3 Data Visualization in Serious Games Analytics
Given the challenges of manipulating “big data,” data visualization has recently emerged as another technique for making sense of captured user-generated data when studying behavior patterns. Wallner and Kriglstein (2013) grouped the visualization techniques in SG analytics into five categories: charts and diagrams, heatmaps, movement visualization, self-organizing maps, and node-link approaches.
Charts and diagrams are among the most common graphic representations for reporting quantitative results. They are helpful for answering a specific research question, such as a completion rate or the average amount of time spent, or for addressing an exploratory question, such as player behavior patterns (Wallner and Kriglstein 2013). Bar charts, for example, can present the behavior patterns of specific players in sequential order. Scarlatos and Scarlatos (2010) proposed several promising representation techniques (e.g., glyph-based techniques, parallel coordinates, and layering and separation) to further improve algorithms that automatically analyze data for discovering new patterns. They introduced the concept of action shapes, a variation on parallel coordinates: glyphs that represent multivariate data graphically so that dissimilar behaviors produce dissimilar shapes. With action shapes, a player's positive or negative outcome can be represented in the context of learning progress.
A heatmap is another common way to visualize spatial data. Heatmaps are density- or location-based visualizations that use a color gradient (e.g., light green to dark red) to map how often a variable occurs at each coordinate in a two-dimensional space (Wallner and Kriglstein 2013). Heatmaps can be helpful for detecting detailed player behaviors. Chittaro and Ieronutti (2004) used heatmaps to track where players spent their time and what they did in order to examine a game environment and identify player characteristics. A single heatmap is suitable for visualizing one variable at a time; however, it cannot convey additional contextual information such as the nature of the behavior (Drachen and Canossa 2009a).
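As a minimal illustration of this density mapping, logged player positions can be binned into grid cells whose counts are then rendered with a color gradient. The cell size and coordinates below are hypothetical, not taken from any of the cited studies.

```python
from collections import Counter

def heatmap_grid(positions, cell_size=10):
    """Bin (x, y) player positions into grid cells and count visits.

    Each cell count can then be mapped onto a color gradient
    (e.g., light green for low counts, dark red for high counts).
    """
    counts = Counter()
    for x, y in positions:
        cell = (int(x // cell_size), int(y // cell_size))
        counts[cell] += 1
    return counts

# Hypothetical logged positions of one player
positions = [(3, 4), (5, 8), (12, 4), (14, 6), (4, 7)]
grid = heatmap_grid(positions)
# Cell (0, 0) covers x in [0, 10) and y in [0, 10): visited 3 times here
```

A plotting library would then color each cell by its count; the binning above is the part that turns raw spatial logs into heatmap data.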
Andersen et al. (2010) presented Playtracer, a node-link diagram that visualizes player paths through a game in order to show the progression of individual game play. The researchers applied a clustering method to group players and game states and attempted to provide an indicator of player success or failure. Movement visualizations have been used as a user-testing tool to track a player's path with different variables, such as position at spatial coordinates and orientation, during game development in order to better guide “the design of interactive experience for a wide range of players” (Dixit and Youngblood 2008, p. 2). According to Wallner and Kriglstein (2013), in this type of visualization “the path of each player is plotted individually by connecting the logged position as lines” (p. 149), with each path or trace rendered in a different color to visualize the movement.
2.4 Importance of Understanding Learner Characteristics
The literature indicates that students with different learning characteristics may exhibit different learning behaviors. For example, the level of competency in problem-solving environments has been reported as a salient indicator that distinguishes the learning behaviors of experts from those of novices (Dreyfus 2004; Ericsson et al. 2006). Novice students tend to follow rules to solve a problem, while experts are more inclined to break the rules and follow their intuition (Loh and Sheng 2013). Research identifying aspects of expert and novice students' programming styles found that more experienced students tended to be self-directed in their programming, while novice students tended to adapt external sources to code a problem (Blikstein 2011). There have also been attempts to investigate the relationship between learning behavior and learning performance. For example, Hwang et al. (2011) investigated learning behavior in cooperative programming and its relationship with learning performance. Their research showed that students who worked cooperatively tended to be more motivated and therefore performed better than the other groups. The results also showed that some students' learning behaviors changed across different problem-solving stages (e.g., initial, intermediate, outcome) and that motivated students made better transitions from one stage to another than other students.
In our previous research examining what tools were used and at what stages of the problem-solving process, using statistical cluster analyses, we found that high-performing and low-performing students differed in their patterns of tool use: students with higher performance scores seemed to make more productive use of the tools than students with lower performance scores (Liu and Bera 2005). In another study investigating the effects of motivation on learning for students with different characteristics, we found that students' achievement scores varied with their goal orientations (Hsieh et al. 2008). These preliminary findings suggest connections between more effective, strategic use of tools and students' characteristics. The new possibilities made available through data visualization in LA provide a promising and different way to examine and understand SG analytics at a micro level (Buckingham 2012) so that researchers and designers can create evidence-based scaffolds in learning environments.
3 Purpose of the Study
- What tool use patterns do students with different characteristics exhibit as they use a 3D immersive serious game environment?
  - What are the tool use patterns of students with different levels of learning performance (Case #1)?
  - What are the tool use patterns of students with different levels of fantasy proneness, game engagement, and alien information acquisition (Case #2)?
This study employed a multiple-case approach (Stake 2005). In both cases, the participants were sixth graders (the target audience) who used the same 3D serious game environment (as described below). But each case focused on a different student characteristic as these sixth graders interacted with the environment.
The learner characteristic of focus for Case #1 was students' learning performance. The ultimate goal of this serious game environment as a research context is to support students' development of problem-solving skills. We were interested in how the sixth graders interacted with the environment and what they learned from using it. Performance was measured by (a) how well the students solved the central problem and (b) the gain in their science knowledge after they used the environment. A student's performance indicates whether that game player has achieved his or her goal. We wanted to find out whether the students with higher performance scores would access the tools differently from the students with lower performance scores.
The learner characteristic for Case #2 was fantasy related. Fantasy is defined as an element that “evokes mental images of physical or social situations that are not actually presented” (Malone and Lepper 1987, p. 240). The fantasy embodied in this serious game comes through the design of 3D models of the aliens and alien-related elements (i.e., alien body, food, habitat, communication technology, etc.). A particular focus of this case was the use of the Alien Database tool. The literature indicates that adolescents are more likely to engage in a learning task when it is set in a fantasy context (Asgari and Kaufman 2010), and that fantasy can stimulate students' curiosity and imagination as well as cognitive activities that would be out of reach in traditional education settings (Asgari and Kaufman 2010; Cordova and Lepper 1996; Malone 1981; Wiest 2001). We were curious whether students with different levels of fantasy proneness and game engagement would use the tools differently and acquire different amounts of alien information. The assumption was that students with higher levels of fantasy proneness and game engagement would acquire more alien-related information.
Demographics of the participating schools:
- School 1: 52 % White, 36 % Hispanic, 5 % African American, 2 % Asian/Pacific Islander, and 5 % two or more races; 40 % economically disadvantaged
- School 2: 67 % White, 24 % Hispanic, 5 % African American, 2 % Asian/Pacific Islander, and 2 % two or more races; 23 % economically disadvantaged
4.2 Research Context
The research context is a serious game environment for sixth-grade space science called Alien Rescue (AR, http://alienrescue.edb.utexas.edu; Liu et al. 2013a; Liu et al. 2014). AR uses a problem-based learning approach to engage middle school students in solving a complex problem. Designed for 50-minute class sessions over approximately 15 days, it blends playful science fiction with a real-world problem-solving process. Its design is guided by key game elements that keep players engaged, such as interaction, communication, mystery, role-play, representation, goals, sensory stimuli, adaptation, and 3D (Garris et al. 2002; Malone and Lepper 1987; Wilson et al. 2009).
Students use these tools to research planets and the aliens' characteristics, identify the problem, generate and test hypotheses, and then create a solution with justifications. For example, as students learn about the aliens' and planets' characteristics in the Alien Database and Solar System Database, a Notebook tool is available for writing down the information they have gathered. Because these databases are intentionally incomplete, students discover that some information is missing and must collect additional data by designing and launching probes. Based upon their research, they can then form hypotheses about which planet(s) might be a possible relocation site and compare the data returned from launched probes with the information they have collected to confirm, revise, or reformulate their hypotheses. Once they have tested and confirmed their hypotheses, they submit a recommendation for each alien through the Solution Form tool.
Descriptions of the four stages of problem solving in Alien Rescue
Stage 1—Understanding the problem (Understanding)
Students watch a video introducing them to the central problem, explore and familiarize themselves with the environment and cognitive tools, and define the problem for themselves in order to form a general strategy for solving it. Students spend time exploring the rooms of the space station and opening the various tools. Use of the tools in this stage does not tend to conform to a discernible pattern
Stage 2—Identifying, gathering, and organizing information (Researching)
Students conduct research by gathering and organizing information in order to further refine the problem. Students spend the most time perusing the Alien and Solar System Databases, gathering notes in the Notebook, and accessing other tools such as the Concepts Database, Missions Database, Spectra, and Periodic Table as needed. These tools are designed to share cognitive load and support cognitive processes
Stage 3—Integrating information and hypothesis testing (Hypothesis testing)
Students continue to research and gather information, but in a more targeted manner. As they identify gaps in their knowledge, they begin to generate hypotheses. Students continue to consult the tools accessed in the second stage, but more strategically, and begin to design and launch probes using the Probe Design and Probe Launch Rooms
Stage 4—Evaluating the process and outcome (Evaluating)
Students engage in testing their hypotheses, interpreting the results, and revising their ideas. The Probe Design Center, Probe Launch Center, and Mission Control Center are most frequently accessed. Students draft and submit their solutions to the problem via the Solution Form
4.3 Data Sources and Analysis
To address the research questions, students’ log files were examined to see how students actually used the tools in connection to their learning characteristics. In each case, log data (frequency and duration of tool use) were used as dependent variables and student characteristics as independent variables.
4.3.1 Log Files
The log file consisted of all time- and date-stamped actions of each student as he or she used the program. It included the number of times a student accessed each built-in tool and the amount of time the student used that tool, providing a view of the frequency and duration of each student's tool use.
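To illustrate, per-tool frequency and duration can be aggregated from time-stamped records along the following lines. The record layout and field names here are an assumed simplification for illustration, not the actual schema of our log files.

```python
from collections import defaultdict

# Hypothetical log records: (student_id, tool, start_seconds, end_seconds)
log = [
    ("s1", "Notebook", 0, 120),
    ("s1", "Alien Database", 120, 400),
    ("s1", "Notebook", 400, 460),
]

def tool_usage(records):
    """Aggregate frequency (access count) and duration (seconds) per tool."""
    freq = defaultdict(int)
    dur = defaultdict(int)
    for _student, tool, start, end in records:
        freq[tool] += 1          # one more access to this tool
        dur[tool] += end - start # time spent in this access
    return dict(freq), dict(dur)

freq, dur = tool_usage(log)
# freq["Notebook"] == 2 accesses; dur["Notebook"] == 180 seconds
```

The same aggregation, grouped further by day or problem-solving stage, yields the quantities plotted in the visualizations described later.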
Two types of students’ characteristics were included in this study, one for each case.
4.3.2 Levels of Performance for Case #1
Students’ performance was evaluated by (a) the quality of their solution to the central problem, solution score, and (b) their science knowledge, science knowledge posttest score.
4.3.2.1 Solution Score
How well a student solved the central problem was determined by whether he or she found an appropriate relocation home for an alien species and by the rationale provided for each relocation site. If a student submitted more than one solution, we used only the last one, assuming it represented the student's best effort after he or she had worked on multiple sub-problems and gained more experience in solving the problem. The solution was assessed using an 8-point rubric from previous research (Bogard et al. 2013). The rubric considered both the suitability of the recommended home and the degree to which students justified their recommendation: 0 points if the recommended home was not suitable; 1 point if it was suitable but no reasons were provided to substantiate the choice; and 2–7 points if it was suitable and reasons were provided, one point for each reason.
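The rubric can be summarized as a simple scoring rule, sketched below. The function is our paraphrase of the rubric (with the 7-point ceiling implied by the 2–7 range made explicit), not the scoring instrument itself.

```python
def solution_score(home_suitable, num_valid_reasons):
    """Score a solution on the 0-7 rubric:
    0 if the recommended home is unsuitable;
    1 if suitable but unjustified;
    otherwise 1 point plus one per valid reason (assumed capped at 7).
    """
    if not home_suitable:
        return 0
    return min(1 + num_valid_reasons, 7)

# A suitable home with three substantiating reasons scores 4 points
```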
4.3.2.2 Science Knowledge Test Score
How well a student performed was also evaluated by a science knowledge test produced by the school district, which consisted of 26 items addressing both factual knowledge and application questions and reflecting what the district expected students to learn after completing the space curriculum unit. Scores ranged from 0 to 26. According to the school district, students who scored 90 % or higher were considered Commended (i.e., high performing).
4.3.3 Levels of Fantasy Proneness, Game Engagement, and Alien Information Acquisition for Case #2
4.3.3.1 Fantasy Proneness
As a construct of human cognition, fantasy proneness is a personal tendency to be deeply involved in imaginative thinking and gratifying fanciful activities experienced since childhood (Lynn and Rhue 1986). Fantasy proneness was measured using the Creative Experience Questionnaire (CEQ; Merckelbach et al. 2001), a self-report survey on a dichotomous (yes/no) scale asking about elaborate imaginative thinking, involvement in fantasy, daydreaming, and the consequences of fantasizing. A modified version of the CEQ consisting of 12 items was used, with scores ranging from 0 to 12 (Cronbach's alpha = 0.83).
4.3.3.2 Game Engagement
Students' immersive experience while using the serious game was measured using the Game Engagement Questionnaire (GEQ), which measures an individual's experience of game involvement during active participation in a game (Brockmyer et al. 2009). A modified version of the GEQ (Cronbach's alpha = 0.85) was used to assess absorption, flow, presence, and immersion (Brockmyer et al. 2009). It consists of 11 items, each with three response choices (“No”, “Sort of”, “Yes”), with total scores ranging from 0 to 22.
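Given 11 items and a 0–22 score range, the implied scoring assigns 0, 1, and 2 points to the three response choices; the sketch below assumes that mapping, which is our inference rather than a documented detail of the modified GEQ.

```python
# Assumed point values per response choice (inferred from the 0-22 range)
CHOICE_POINTS = {"No": 0, "Sort of": 1, "Yes": 2}

def geq_score(responses):
    """Total GEQ score: 11 items x 0-2 points each = 0 to 22."""
    return sum(CHOICE_POINTS[r] for r in responses)

# Answering "Yes" to every item yields the maximum score of 22
```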
4.3.3.3 Alien Information Acquisition
Because the main fantasy components in Alien Rescue are the features related to each of the six aliens shown in the Alien Database, we hypothesized that students who were more prone to fantasy and more engaged in the game would be likely to acquire and retain more information about the aliens. A test of 18 questions, three questions worth 12 points for each alien species (scores ranging from 0 to 72), was developed to measure how much information about these aliens the students learned while they played the game. The test went through several rounds of content validation by subject matter experts and middle school teachers and was also pilot-tested in sixth-grade classrooms in the previous year. In this study, only the questions related to two alien species were used because the participants had time to work on only two species; scores thus ranged from 0 to 24.
The Creative Experience Questionnaire was given before the use of AR; the Game Engagement Questionnaire was given midway through use, as it asks about game involvement during gameplay; and the Alien Information Acquisition test was given toward the end of use.
4.4 Data Cleaning and Analysis
Each log file contained student ID, teacher ID, timestamps (start time, end time, and duration), tools accessed, and solution texts. Students' log files were first cleaned, eliminating empty or incomplete records, and then prepared for analysis. Only matched data were included in each case. That is, in Case #1 we included only students with logging information for the entire period of use as well as a solution score and a science knowledge test score; in Case #2 we included only students with logging information for the entire period of use as well as fantasy proneness, game engagement, and alien information acquisition scores. Since both studies were conducted in real classroom settings and not all students completed all measures during the three-week period, incomplete records were excluded, which reduced the overall sample size in each case.
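This matching step amounts to keeping only the students who appear both in the logs and in every required measure, as sketched below with hypothetical IDs and scores.

```python
def matched_records(log_ids, measures):
    """Keep only students with complete logs and all required scores.

    log_ids:  set of student IDs with logging for the entire period
    measures: dict mapping measure name -> {student_id: score}
    Returns the IDs present in the logs and in every measure.
    """
    matched = set(log_ids)
    for scores in measures.values():
        matched &= set(scores)  # intersect with students who took this measure
    return matched

# Hypothetical Case #2 data: s3 and s4 are missing one measure each
logs = {"s1", "s2", "s3", "s4"}
case2 = {
    "fantasy_proneness": {"s1": 8, "s2": 5, "s3": 9},
    "game_engagement": {"s1": 14, "s2": 11, "s4": 16},
    "alien_information": {"s1": 20, "s2": 12},
}
# Only s1 and s2 have complete data across all three measures
```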
Grouping variables based upon students' characteristics (with number of students per group):
- Solution Score (score: 0–7)
- Science Knowledge Score (score: 0–26)
- Fantasy Proneness (score: 0–12)
- Alien Information (score: 0–24)
- Game Engagement (score: 0–22)
In Case #2, the matched log files of 64 students with fantasy proneness, game engagement, and alien information acquisition scores were used in the respective analyses. Students were grouped into high and low levels of each variable using its mean: given the spread of the scores, scores above the mean formed the high group and scores below the mean formed the low group.
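The mean split can be sketched as follows; the student IDs and scores are hypothetical. Scores falling exactly at the mean are left unassigned here, mirroring the above-the-mean/below-the-mean wording.

```python
from statistics import mean

def mean_split(scores):
    """Split students into high/low groups at the mean of the variable.

    scores: dict mapping student_id -> score.
    Returns (high_group, low_group) as sets of student IDs; scores
    exactly equal to the mean are assigned to neither group.
    """
    m = mean(scores.values())
    high = {s for s, v in scores.items() if v > m}
    low = {s for s, v in scores.items() if v < m}
    return high, low

# Hypothetical fantasy proneness scores (mean = 6.0)
high, low = mean_split({"s1": 3, "s2": 7, "s3": 10, "s4": 4})
# high == {"s2", "s3"}; low == {"s1", "s4"}
```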
For the analyses, we first examined the tool use patterns for all tools built into the environment. We then focused on the six tools used most often according to our previous research (Liu et al. 2015): Alien Database, Solar System Database, Notebook, Probe Design, Probe Launch, and Mission Control. For Case #2, we also looked specifically at the use of the Alien Database, given the research focus. We used Tableau Desktop (tableausoftware.com) to create the visualizations, as it allows the representation of multiple layers of information in a single view. In the graphs presented below, the X-axis represents time (log days or stages) and the Y-axis represents the average frequency (number of times a tool was accessed) or average total duration (total amount of time, in seconds, spent with a particular tool), averaged across the students in each group. We also ordered the tools used in each of the four conceptual problem-solving stages or log days, which helps reveal different behavior patterns across the stages or over the entire period. We performed descriptive analyses as well as ANOVAs with the grouping variables as independent variables and the frequency and duration of tool use as dependent variables. A correlation analysis was also performed in Case #2 to examine possible relationships. Any significant findings are reported.
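For readers unfamiliar with the ANOVA computation, the F statistic used to compare groups' tool use reduces to the ratio of between-group to within-group variance. In practice a statistical package (e.g., scipy.stats.f_oneway) would be used; the pure-Python sketch below, with hypothetical duration values, only shows what is being computed.

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA across groups of tool-use values
    (e.g., Alien Database durations for the high vs. low group)."""
    all_vals = [v for g in groups for v in g]
    grand = mean(all_vals)
    k, n = len(groups), len(all_vals)
    # Between-group sum of squares: how far group means sit from the grand mean
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: spread of values around their own group mean
    ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical per-student durations (seconds) for two groups
f = one_way_anova_f([320, 400, 380], [150, 210, 190])
```

A large F (relative to the F distribution with the corresponding degrees of freedom) indicates that the groups' mean tool use differs more than their within-group variation would suggest.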
5.1 Case #1: Findings
In Case #1, we addressed the research question “What are the tool use patterns of students with different levels of learning performance?” We examined the students' tool use in the log data in connection with their performance as reflected in their solution and science knowledge test scores.
5.1.1 Solution Score
The results also showed that HS students increased their use of the Probe Design and Probe Launch Rooms as they progressed, with tool use peaking on Day 8; their use of the Mission Control Room also increased, peaking on Day 10. These tools are needed for students to conduct further research, integrate information, and test their hypotheses, because the information provided in the Alien and Solar System Databases is intentionally incomplete. HS students also used the Notebook tool, which is designed to support cognitive processing, more often and for longer in the initial days than LS students did. Together, these patterns indicate that the HS group made more active use of the tools appropriate to the four stages outlined in Table 2.
5.1.2 Science Knowledge Test
5.2 Case #2: Findings
The focus of Case #2 was on fantasy-related factors. We examined the question “What are the tool use patterns of students with different levels of fantasy proneness, game engagement, and alien information acquisition?” We first looked at the patterns of use of all built-in tools and then focused on the six most frequently used ones, by each of the grouping variables: fantasy proneness, game engagement, and alien information acquisition.
5.2.1 Fantasy Proneness
The results also showed the High Fantasy Proneness group accessed Probe Design more often than the Low Fantasy Proneness group during the Researching Stage, and accessed Mission Control and Notebook more often than the other group during the Evaluating Stage.
5.2.2 Game Engagement
5.2.3 Alien Information Acquisition
5.2.4 Relationship Among Three Grouping Variables
Given the findings above, we further examined the relationships among the three fantasy-related variables. The results showed a significant positive, though moderate, correlation between fantasy proneness and game engagement (r = 0.39, p < 0.01) and between fantasy proneness and alien information acquisition (r = 0.28, p < 0.05), but not between game engagement and alien information acquisition. That is, students with higher fantasy proneness scores tended to be more engaged in the game and to acquire more information about the aliens.
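The reported correlations are Pearson coefficients. A library routine such as scipy.stats.pearsonr would normally be used; the sketch below, with hypothetical score lists, shows the underlying computation.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))  # covariance term
    sx = sqrt(sum((a - mx) ** 2 for a in x))              # spread of x
    sy = sqrt(sum((b - my) ** 2 for b in y))              # spread of y
    return cov / (sx * sy)

# Hypothetical paired fantasy proneness and game engagement scores
r = pearson_r([4, 7, 9, 5, 11], [10, 12, 18, 9, 20])
```

Values near +1 or -1 indicate a strong linear relationship; the moderate r = 0.39 reported above corresponds to a visible but far from deterministic association.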
6.1 Different Tool Use Patterns by Students with Different Characteristics
The analysis through data visualization in the two cases showed that students with different characteristics exhibited different patterns as they interacted with the tools built into the serious game environment. In Case #1, which focused on students' performance, the results showed that students with high performance (the High Solution and High Science Knowledge groups) used two critical tools, the Alien and Solar System Databases, significantly longer in the early stages than students with low performance, and increased their use of the Probe Design and Probe Launch Rooms and Mission Control in the later stages of the problem-solving process. This is appropriate use of the tools relevant to the problem-solving stages (Liu et al. 2013b). The low-performance students, on the other hand, used the more fun tools (Probe Design, Probe Launch, and Mission Control) more frequently during the early stages. These patterns indicate that the high-performance students selected tools more relevant to the problem-solving stage they were at, and presumably this use of the tools assisted them in solving the central problem, as reflected in their higher performance scores. This finding is consistent with our previous research, which found that students with higher performance scores seemed to make more productive use of the tools than students with lower performance scores (Liu and Bera 2005). Each built-in tool performs a specific function, as discussed above, and is needed at different times during the problem-solving process. Since all students had the same amount of time to solve the central problem, the higher-performance students seemed to be more intentional in their tool use and, therefore, more effective at solving the problem. In addition, when the tools are used at the problem-solving stages for which they are designed, they should provide the scaffolding they are intended to provide.
In Case #2, which focused on fantasy-related factors, the results showed that higher usage of the Alien Database, the tool that embodies most of the fantasy elements, was positively related to game engagement and alien information acquisition. Specifically, students with high fantasy proneness spent more time in the Alien Database during the first two stages than students with low fantasy proneness, and the High Game Engagement group spent significantly more time in the Alien Database than the Low Game Engagement group. Given the significant positive correlations between fantasy proneness and game engagement, between fantasy proneness and alien information acquisition, and between alien information acquisition and time spent in the Alien Database, these findings suggest that students who are more prone to fantasy elements tend to be more engaged in the game and to spend more time in the Alien Database, which allowed them to learn more about the alien species. That is, fantasy elements can possibly engage adolescents with high fantasy proneness more deeply in a task (Asgari and Kaufman 2010) and promote cognitive skills (Wilson et al. 2009). The literature indicates that fantasy can stimulate students' curiosity and imagination and lead to high engagement and possibly cognitive learning (Asgari and Kaufman 2010; Cordova and Lepper 1996; Malone 1981; Wiest 2001). If students are engaged in a learning task, they can be expected to learn more. However, given that the correlations between fantasy proneness and the other variables were only moderate, additional research is needed to replicate the study.
The findings from these two cases provide additional evidence for the literature showing that students with different learning characteristics may exhibit different learning behaviors and that understanding learners’ characteristics is important (Blikstein 2011; Dreyfus 2004; Ericsson et al. 2006; Hsieh et al. 2008; Loh and Sheng 2013). Identifying these patterns and examining the relationship between learning behavior and learning performance in relation to learner characteristics (Hwang et al. 2011) is essential to creating the scaffolding needed to facilitate learning.
6.2 Implications for Future Research
Learning analytics and visualization techniques offer new opportunities to examine dynamic user data in educational applications in ways not possible previously (Johnson et al. 2014). This multi-case study, built upon our previous research investigating learner behavior patterns through traditional statistical analysis (Liu and Bera 2005), utilized data visualizations that allowed us to examine the data along multiple dimensions and at a micro level (Buckingham 2012). For example, we were able to present the frequency and duration of tool use, broken down by day or stage, for students with different characteristics in a single view. Such fine-grained micro-examinations with multiple data points in one view revealed anticipated findings that confirmed our previous results (Liu and Bera 2005), namely that high-performing students used the tools more appropriate to the problem-solving stages they were at. At the same time, the visualizations also revealed unexpected findings. For example, in Case #2, Mission Control, a more fun tool (Liu et al. 2013a) that is not needed until the later stages of problem-solving, was used more often, and during the first 1–2 days, by the Low Fantasy group. We had hypothesized that the High Fantasy group would use it more often and earlier. Why did the students with low fantasy proneness instead use it early and more? This warrants future research. The finding that the High Game Engagement group’s use of Mission Control peaked during the Understanding, Hypothesis Testing, and Evaluating Stages (see Fig. 9) suggests it is a tool that can engage students. Combining visualization techniques with traditional statistical analysis enabled us to examine the relationships among the three fantasy-related variables and to identify shifts across different stages of problem-solving in a new, visual way, which can help us make sense of the data and create data-driven interventions to support learning (Siemens and Baker 2012).
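As a minimal illustration of the kind of aggregation underlying such views, consider reducing raw tool-access log events to frequency and total duration per group, tool, and stage. The event schema and values below are hypothetical, for illustration only, and do not reflect the study’s actual log format:

```python
from collections import defaultdict

# Hypothetical log events: (student_group, tool, stage, seconds).
# Groups, tools, and stage names mirror those discussed in the text,
# but the records themselves are invented for illustration.
events = [
    ("High", "Alien Database", "Understanding", 120),
    ("High", "Alien Database", "Understanding", 60),
    ("High", "Mission Control", "Evaluating", 90),
    ("Low", "Mission Control", "Understanding", 150),
    ("Low", "Probe Design", "Understanding", 45),
]

def summarize(events):
    """Aggregate frequency and total duration of tool use per
    (group, tool, stage) -- the table a single visualization view
    combining frequency, duration, and stage could be built from."""
    freq = defaultdict(int)
    duration = defaultdict(int)
    for group, tool, stage, seconds in events:
        key = (group, tool, stage)
        freq[key] += 1
        duration[key] += seconds
    return freq, duration

freq, duration = summarize(events)
print(freq[("High", "Alien Database", "Understanding")])      # 2 accesses
print(duration[("High", "Alien Database", "Understanding")])  # 180 seconds
```

A table in this shape can then be fed to a visualization tool (such as the Tableau software mentioned below) to plot groups side by side across stages.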
The visualizations also illustrated individual access patterns, as shown in Fig. 14, relative to other individuals in a large group (Drachen and Canossa 2009b; Linek et al. 2010), and provided a visual glimpse of progression from one stage to another that might have been missed had we examined only the overall patterns.
The findings from this multi-case study using visualizations provide insights for our next-step research and can help achieve our goal of designing effective learning environments (Romero and Ventura 2013). With the understanding that active and productive use of tools relevant to the problem-solving stages can lead to higher performance, and that rich, visual fantasy elements can engage students with certain characteristics, we can design new tools that provide relevant scaffolds at the appropriate times. Knowing that students with certain characteristics tend to gravitate toward the more fun tools suggests a need for targeted interventions. For example, if some students have not accessed the Alien and Solar System Databases by a certain point, the program can intervene through scaffolding features. Likewise, if students spend their time using only the fun tools, guidance can be provided.
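The two intervention conditions just described can be expressed as a simple trigger rule. The sketch below is illustrative only; the checkpoint day, tool names as strings, and the function itself are our assumptions, not the program’s actual logic:

```python
# Hypothetical scaffolding-trigger rule for the two conditions in the text:
# (1) critical databases not accessed by a checkpoint, or
# (2) only the "fun" tools used so far.
CRITICAL_TOOLS = {"Alien Database", "Solar System Database"}
FUN_TOOLS = {"Probe Design", "Probe Launch", "Mission Control"}

def needs_intervention(tools_used, day, checkpoint_day=2):
    """tools_used: names of tools the student has opened so far.
    Returns (flag, reason); checkpoint_day is an assumed threshold."""
    used = set(tools_used)
    if day >= checkpoint_day and not (used & CRITICAL_TOOLS):
        return True, "critical databases not yet accessed"
    if used and used <= FUN_TOOLS:
        return True, "only fun tools used so far"
    return False, ""

# A student who has only visited fun tools by day 3 would be flagged:
print(needs_intervention(["Probe Design", "Mission Control"], day=3))
```

In a real system the flag would drive a scaffolding feature (a hint, a prompt toward the databases) rather than simply being reported.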
We are currently creating a teacher’s dashboard that aims to apply the understanding gained from this research by presenting just-in-time data to teachers so they can intervene to facilitate students’ learning. We also intend to create just-in-time scaffolds for students in the program based on their learning behaviors. In addition, we plan to explore other visualization techniques used in serious games (Wallner and Kriglstein 2013). We are in the process of creating visualizations using customized tools other than Tableau, with the goal of mapping individual learning paths in the context of this serious game to further our understanding of how to create scaffolds that adapt and personalize the learning environment according to students’ learning characteristics. As indicated in the literature (Verbert et al. 2012), we also face the challenge of how best to collect, clean, and manage the data in a real-world setting. We will continue to research along these lines.
6.3 Limitations of the Study
This study is limited in that the sample reflected only those sixth graders who completed all corresponding measures. The research occurred in actual classrooms, and due to logistical factors such as student absences and school holidays, not all students completed all measures. Capturing and storing data is a challenge for learning analytics: although all mouse clicks were captured, computer crashes and students switching computers (not uncommon in a school setting) further reduced the number of students who could be included. In addition, the context of this research is confined to one serious game environment. Readers should keep these limitations in mind when interpreting our findings.
In this study, we attempted to capitalize on the emerging field of learning analytics by examining students’ behavior patterns in a serious game environment using data visualization techniques. The results not only confirmed some preliminary findings from our previous research but also revealed patterns that would not have been easily detected without visualizations. The findings provide new insights into how students with different characteristics interact with a rich-media educational application and will guide our next-step research to design effective scaffolds that help students develop their problem-solving skills.