Examining Through Visualization What Tools Learners Access as They Play a Serious Game for Middle School Science
This study used data visualization to examine learners’ behaviors in a 3D immersive serious game for middle school science in order to understand how players interact with various features to solve the central problem. The analysis combined game log data with measures of in-game performance and learners’ goal orientations. The findings indicated that students in the high performance and mastery-oriented groups tended to use the tools more appropriately, relative to the stage they had reached in the problem-solving process, and more productively than students in the low performance groups. Using data visualization on log data in combination with more traditional measures shows visualization to be a promising analytics technique for multiple data sets, one that can facilitate the interpretation of relationships among data points at no cost to the complexity of the data. Design implications and future applications of serious games analytics and data visualization to the serious game are discussed.
Keywords: Serious games · Problem-based learning · Middle school science · Learner behaviors · Goal orientation
The popularity of playing games has been increasing. According to a report by the Pew Research Center, the digital game industry “takes in about $93 billion a year” (Holcomb & Mitchell, 2014), and playing games continues to be an important form of how people, young and old, spend their leisure time. A Kaiser Family Foundation report stated, “In a typical day, 8- to 18-year-olds spend an average of 1:13 playing video games on any of several platforms” (Rideout, Foehr, & Roberts, 2010, p. 25). Therefore, it behooves educators to investigate how to employ techniques used in digital games to design digital learning environments.
The goal of this study was to examine learners’ behaviors in a 3D immersive serious game environment designed for middle school science to understand how the play-learners interact with various features of the environment to solve the central problem. We used data visualization as a way to represent patterns of learners’ behaviors. By applying data visualization techniques to serious games analytics, we hope to acquire insights on how serious game environments should be designed to facilitate learning.
2 Relevant Literature
2.1 Definition and Examples
Serious Games (SGs) are a type of game that includes simulated events or virtual processes designed for the purpose of real-world problem-solving (Djaouti, Alvarez, Jessel, & Rampnoux, 2011; Rieber, 1996; Sawyer & Smith, 2008). Abt stated that SGs have “an explicit and carefully thought-out educational purpose and are not intended to be played primarily for amusement” (1970, p. 9). According to the Serious Games Initiative (www.seriousgames.org), SGs leverage game mechanics for training through exer-games, management games, and simulations. Therefore, although serious games can be fun and entertaining, their main purposes are to train, educate, or change users’ attitudes in real-world situations. The applications for SGs are diverse. The term “serious” denotes an alteration of the context of gaming from fun and entertainment to engagement, efficiency, and pedagogical effectiveness for specific purposes such as training and performance enhancement (Djaouti et al., 2011). In this study, we were interested in using SGs to teach science concepts and problem-solving skills and create a fun learning experience for play-learners.
Many commercial games have been integrated into classroom settings for instructional purposes, such as SimCity (Tanes & Cemalcilar, 2010), Civilization (Squire, 2004), and Minecraft (List & Bryant, 2014). Some educational researchers also design and develop SGs themselves. For example, “Outbreak @ The Institute” is a role-play science game in which play-learners take on the roles of doctors, medical technicians, and public health experts to discover the cause of and develop a cure for a disease outbreak across a university campus (Rosenbaum, Klopfer, & Perry, 2007). Play-learners can interact with virtual characters and employ virtual diagnostic tests and medicines. In another science SG, Mad City Mystery, play-learners develop explanations of scientific phenomena in an inquiry-based learning environment (Squire & Jan, 2007).
2.2 Research Trends in Serious Games
Research on serious games typically focuses on their effects on learners’ engagement or effectiveness using traditional intervention studies with experimental designs or qualitative methods. The emergence of serious games analytics (SEGA) makes it possible to investigate beyond traditional research methodologies and focus on the learning processes of individuals as expressed through patterns of in-game behavior and accomplishments (Djaouti et al., 2011; Johnson et al., 2013; Scarlatos & Scarlatos, 2010).
The purpose of using analytics is to illuminate the process of performance improvement via in-game instructional resources (van Barneveld, Arnold, & Campbell, 2012). Studies in the field of SEGA for performance assessment primarily use game logs—unobtrusively saved records—on user activities with chronological and spatial tracking data (Johnson, Adams Becker, Estrada, & Freeman, 2014; Liu, Horton, Kang, Kimmons, & Lee, 2013; Macfadyen & Dawson, 2010; Wallner & Kriglstein, 2013). SEGA, therefore, is inherently an interdisciplinary field that links gaming data and student responses to statistics, computer science, data mining, and visualization (Baker & Yacef, 2009; Romero, Ventura, & García, 2008). The learning models and usage patterns are utilized to predict student knowledge-building trajectories through the categorization of levels of performance, engagement, and resource-processing sequences (U.S. Department of Education, Office of Educational Technology, 2012). Researchers are interested in using analytics to gain insights that can enable the design and validation of pedagogical scaffolding support in online learning environments.
There have been a number of research efforts to produce standardized analysis procedures, from planning the capture of learner activities to analyzing the data to finally visualizing the analysis, so that SEGA techniques can contribute to the field of SG as a solid methodology of learner evaluation (Loh, 2008, 2011; Romero & Ventura, 2010, 2013). Romero’s data mining model (2013) provides SG researchers with seven steps to follow to conduct a SEGA study with a clear hypothesis: hypothesis formation, raw data gathering, preprocessing, data modification, data mining, finding models and patterns, and interpretation/evaluation. Serrano, Marchiori, del Blanco, Torrente, and Fernández-Manjón (2012) also provided a similar framework containing seven elements: data selection, data capture, aggregation and report, assessment, knowledge creation, knowledge refinement, and knowledge sharing.
In studies involving serious games analytics (Linek, Marte, & Albert, 2008; Loh, 2011; Reese, Tabachnick, & Kosko, 2013; Scarlatos & Scarlatos, 2010), the learning processes of individual students have been tracked using diverse techniques in order to support the personalization of instruction. In these examples, game logs have been regarded as an important metric in examining topics ranging from knowledge domains to tool use (Dede, 2014; Wallner & Kriglstein, 2013).
2.3 Issues in SEGA Evaluation
The efficacy of SGs has often been evaluated using traditional tests (e.g., standardized tests or surveys), which may not sufficiently measure higher-order learning objectives such as application, analysis, or synthesis (Scarlatos & Scarlatos, 2010). Since most of these tests are administered before or after SG play, the obtained data can merely represent prospective or retrospective views (Linek, Öttl, & Albert, 2010). They cannot be used to assess how learners achieved learning objectives within the game environment or the decision-making processes undertaken to solve a given problem. In addition, Loh (2008) warned of the limitations of computer-based tests, since these cannot be used to evaluate the opinions of learners, but only to assess the accuracy of their choices. Other methods such as observations or interviews have also been used for evaluating and understanding gameplay (Garzotto, 2007; Sweetser & Wyeth, 2005). Yet, researchers assert that such methods are inefficient in terms of time and lose clarity with large numbers of learners (e.g., Andersen, Liu, Apter, Boucher-Genesse, & Popović, 2010; Drachen & Canossa, 2009).
These challenges highlight the need to use log data to understand the play-learners’ behaviors within the environment and examine log data in connection to learners’ performance. Game-generated data logs contain records of human behaviors during learning, which can include any interaction between a learner and a game such as mouse click or keystroke. Reese et al. (2013) emphasized that learning objectives align with game objectives; therefore, a player’s idiosyncratic trajectory towards the game goal can reveal the dynamics of the learning process. To understand how a learner achieves a learning goal requires the discovery and analysis of patterns of play-learner behaviors (Drachen & Canossa, 2009), and log data can provide insights into play-learner behavior in context (Scarlatos & Scarlatos, 2010). The emerging technology of data visualization allows researchers to present and examine data visually in order to discover patterns relating to what learners are doing in an SG context (Dixit & Youngblood, 2008; Milam & El Nasr, 2010; Scarlatos & Scarlatos, 2010). Therefore, using visualization in combination with more traditional measures should provide more targeted and nuanced information to gain a holistic view of play-learners’ behaviors (Linek et al., 2008).
2.4 Background of Research
We have conducted several studies to examine students’ usage patterns through statistical procedures such as descriptive analysis and cluster analysis with the same serious game used in this study, Alien Rescue. The study by Liu and Bera (2005) applied cluster analysis to sixth-graders’ log data to examine what tools were used and at what stages of their problem-solving process. The results showed that tools supporting cognitive processing and tools sharing cognitive load played a more central role early in the problem-solving process whereas tools supporting cognitive activities that would be out of students’ reach otherwise and tools supporting hypothesis generation and testing were used more in the later stages of problem-solving. The findings also indicated that the students increasingly used multiple tools in the later stages of the problem-solving process. The various tools appeared to enable students to coordinate multiple cognitive skills in a seamless way and, therefore, facilitated their information processing. Results also suggested that students with higher performance scores seemed to exercise more productive use of the tools than students with lower performance scores.
In a follow-up study in our investigation (Liu et al., 2009), log data were matched with surveys from a group of college students who played Alien Rescue in a laboratory setting. A researcher observed each student’s activity in the environment, and stimulated-recall interviews elicited information on students’ cognitive processes at specific points in the problem-solving process. Quantitative data (log files) and qualitative data together revealed deliberate and careful use of tools by the students. Students simultaneously used multiple tools while engaged in integrating and evaluating information, and different tools predominated during each problem-solving stage. This finding suggested that different types of tools were needed and used by the college students in this study, as they were by sixth graders in the previous research (Liu & Bera, 2005; Liu, Bera, Corliss, Svinicki, & Beth, 2004), but the results did not show evidence that students with higher performance used the tools more consistently or actively than the other groups, as in the previous research (Liu et al., 2004; Liu & Bera, 2005).
Given these preliminary findings and especially the technological advancements in our field, the purpose of this study was to further this research line by using data visualization techniques to examine the patterns of how sixth graders played the SG and identify factors contributing to individual variations.
3 Research Questions and Research Context
3.1 Research Questions
1. How do play-learners access different tools built into the game?
2. How do play-learners with different goal orientations access the tools?
3. How do play-learners with different performance scores access the tools?
3.2 Description of the Serious Game Environment
The serious game environment under investigation is called Alien Rescue (AR, alienrescue.edb.utexas.edu; Liu et al., 2013). AR is designed and developed by a research and development team in the Learning Technologies Program at the University of Texas at Austin. AR aspires to teach science and complex problem-solving skills to students in fun and interactive ways. Its development is guided by a design-based research framework which aims to generate and refine theories by evaluating iterative enhancements to an instructional innovation within authentic settings (Brown, 1992; Cobb, Confrey, Lehrer, & Schauble, 2003).
AR incorporates problem-based learning pedagogy into a 3D virtual environment to engage middle-school students in solving complex and meaningful scientific problems. Students take on the role of young scientists in a rescue operation to save a group of six distressed alien species displaced from a distant galaxy due to the destruction of their home worlds. The young scientists are challenged to find the most suitable relocation homes for these aliens in our solar system. Each alien species is unique in its characteristics and needs. Upon starting the program, students are not given explicit instructions on how to proceed. They must explore the available tools, discover their capabilities, and develop their own strategies for how and when to effectively use them. Learning occurs as a result of solving a complex, ill-structured problem; there is not one single correct solution, and play-learners must present evidence and justify the rationale for their solutions.
This real-world process of scientific inquiry is transformed into a playful experience and delivered through an immersive, discovery-based, and sensory-rich approach, in line with Salen and Zimmerman’s (2004) definition of play as “free movement within a more rigid structure” (p. 304). The element of fantasy evokes uncertainty, mystery, and curiosity, while the quest-based narrative situates students in the role of experts with an urgent mission, motivating them to acquire competence in the language, concepts, tools, and processes of space science in order to succeed. Furthermore, the students must exercise high-level cognitive and metacognitive skills such as goal setting, hypothesis generation, problem-solving, self-regulation, evaluation of various possible solutions, and the effective presentation of evidence. Thus, AR provides a learning experience with real-world authenticity that also accomplishes essential curricular goals, all within an engaging science fiction fantasy context.
3.3 Cognitive Tools and Their Corresponding Conceptual Categories
Descriptions of cognitive tools provided in AR:

Tools sharing cognitive load
- Alien Database: presents textual descriptions and 3D visuals of the aliens’ home solar system and journey to Earth, as well as the characteristics and needs of each species
- Solar System Database: provides information on the planets and selected moons in our solar system under consideration as habitats; intentionally incomplete data ensures the need to generate and test hypotheses
- Presents information on the mission, technology, and findings of historical NASA probe launches
- Provides interactive and highly visual supplemental instruction on selected scientific concepts presented elsewhere in the environment
- Helps students to interpret spectral data encountered in the environment
- Provides an interactive periodic table of the elements for reference

Tools supporting cognitive process
- Notebook: provides a place for students to record, summarize, and organize data as they engage in solving the central problem

Tools supporting otherwise out-of-reach activities
- Probe Design Center: allows students to design and build probes to send to gather data on worlds in our solar system
- Probe Launch Center: allows students to review built probes and make launch decisions in consideration of their remaining budget

Tools supporting hypothesis testing
- Mission Control Center: displays data from launched probes
- Message Tool: allows students to read messages from the Aliens and from the Interstellar Relocation Commission Director; provides the Solution Form, which allows students to submit their habitat relocation recommendations and rationales for review by teachers
The Notebook supports cognitive processes as students work to solve the problem. As the physical space within the serious game environment where information from disparate sources is integrated, the Notebook facilitates the students’ synthesis of knowledge. On a metacognitive level, the Notebook provides a way for students to monitor their own progress towards solving the central problem.
The tools that support cognitive activities that would otherwise be out of reach are the Probe Design Center (see Fig. 8.1d) and Probe Launch Center. Designing and launching probes are activities that most students will only ever experience in a virtual environment such as AR. These tools not only provide an exciting and novel experience to the student, but also preserve the authenticity of the scientific inquiry process and the consequentiality of the serious game environment, since students’ probe design decisions directly impact the data available to them (Barab, Gresalfi, & Ingram-Goble, 2010, p. 526).
The Mission Control Center and Message Tool support hypothesis testing. Since the information provided in the research databases is intentionally incomplete, only the data from deployed probes viewed in the Mission Control Center allow students to draw the inferences necessary to generate their own solutions to the central problem. The Solution Form housed in the Message Tool provides students with a mechanism to develop their hypotheses into well-formed rationales to be evaluated by their teacher.
These tools are accessed via a two-layer interface (see Fig. 8.1a). The first layer is the virtual space station itself, which consists of five rooms, each containing an instrument for students to use. The second layer of the interface consists of a collection of persistent tools available at the bottom of the screen. It is possible to have several of these overlay tools open at once, though a student can visit only one room in the navigation layer at a time.
AR is designed for approximately 3 weeks of 50-min class sessions as a sixth-grade science curriculum unit. Depending on specific needs and classroom situations, teachers can adapt and adjust the days accordingly. The open-ended, ill-structured framework of AR gives students the freedom to access any tool(s) they wish at any time.
Our previous research (Liu et al., 2004, 2009) has indicated the problem-solving process in AR can be grouped into four conceptual stages: (a) understanding the problem (roughly days 1–2), (b) identifying, gathering, and organizing information (days 3–7), (c) integrating information (days 8–10), and (d) evaluating the process and outcome (days 11–13). This four-stage process reflects the cognitive processes in the revised version of Bloom’s taxonomy (Anderson et al., 2001) and the five components of an IDEAL problem-solver (Bransford & Stein, 1984).
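The day-to-stage breakdown above is simple enough to express as a lookup, which is useful when labeling log entries by problem-solving stage. The sketch below is illustrative only; the function name and the stage label strings are our own, not part of the AR software:

```python
def stage_for_day(day: int) -> str:
    """Map a class day (1-13) of the AR unit to one of the four
    conceptual problem-solving stages reported in prior research."""
    if 1 <= day <= 2:
        return "1: understanding the problem"
    if 3 <= day <= 7:
        return "2: identifying, gathering, and organizing information"
    if 8 <= day <= 10:
        return "3: integrating information"
    if 11 <= day <= 13:
        return "4: evaluating the process and outcome"
    raise ValueError(f"day {day} falls outside the 13-day unit")
```

A mapping like this lets each time-stamped log entry be tagged with its stage before aggregation.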
4.1 Participants
Participants were sixth graders from a school in a mid-sized southwestern city. The teacher reported that most students were comfortable with computers, as computer activities were a common part of classroom instruction. These sixth graders used AR as their science curriculum for approximately 3 weeks in the spring of 2014.
4.2 Data Sources
4.2.1 Log Files
All student actions performed while using the program were logged to a data file, which contained time- and date-stamped entries for each student. The data set consisted of the number of times a student accessed each of the cognitive tools and the amount of time the student used each tool. The participants were introduced to the central problem by watching a video scenario together, and then used the program in their science classes. The log file data presented a view of which tools a student used and for how long during this 3-week period.
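As an illustration of how such time- and date-stamped entries can be reduced to the frequency and duration measures used in this study, the sketch below aggregates records per tool. The (tool, start, end) record layout is an assumption for demonstration purposes, not the actual AR log format:

```python
from collections import defaultdict
from datetime import datetime

def summarize_tool_use(entries):
    """Aggregate raw log entries into per-tool frequency (number of
    accesses) and duration (total seconds). Each entry is assumed to be
    a (tool, start, end) tuple with ISO-formatted timestamps."""
    freq = defaultdict(int)
    dur = defaultdict(float)
    for tool, start, end in entries:
        t0 = datetime.fromisoformat(start)
        t1 = datetime.fromisoformat(end)
        freq[tool] += 1
        dur[tool] += (t1 - t0).total_seconds()
    return dict(freq), dict(dur)
```

Running this over one student's 3-week log would yield the per-tool access counts and time-on-tool totals that the visualizations are built from.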
4.2.2 Solution Scores
Students’ performance was evaluated by the quality of their solutions to the central problem. A student’s solution score was determined by how well she solved the problem of finding an appropriate relocation home for each alien species. Because variations in pace of work resulted in students submitting different numbers of solutions, we used only one solution score per student. Assuming that the quality of solutions would increase as a student gained more experience in solving the problem, we chose to score the last solution a student submitted.
Rubric used for grading solution forms:
- The student recommends an unsuitable home for the alien species
- The student recommends a suitable home, but does not provide any reasons to substantiate their choice
- The student recommends a suitable home and is awarded one additional point for each reason provided to substantiate their choice
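The rubric can be illustrated as a small scoring function. The specific point values (0 for an unsuitable home, 1 for a suitable home, plus one point per reason) and the cap at 7 are assumptions inferred from the rubric wording and the 0-7 score range reported for solution scores, not values stated by the rubric itself:

```python
def score_solution(home_suitable: bool, num_reasons: int,
                   max_score: int = 7) -> int:
    """Score one solution form following the rubric.

    Assumed point values: an unsuitable home earns 0; a suitable home
    earns 1, plus one point per substantiating reason, capped at the
    reported maximum of 7.
    """
    if not home_suitable:
        return 0
    return min(1 + num_reasons, max_score)
```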
Two researchers who had recently scored a set of solutions from another school participated in this scoring task. They first reviewed the scoring rubric and scored five solutions together to ensure they applied the same criteria during scoring. Then, the researchers scored the remainder of the solutions independently.
4.2.3 Goal Orientation
Students’ goal orientation was measured by the revised Patterns of Adaptive Learning Scales (PALS; Midgley et al., 2000), which assesses personal achievement goal orientations through three subscales: mastery (r = .85), performance-approach (r = .89), and performance-avoidance (r = .74), with 4 items for each goal orientation and a total of 12 items. Each item was rated on a 5-point scale, with 1 being “Not at all true,” 3 being “Somewhat true,” and 5 being “Very true.” To fit this particular learning context, the general term “class” was replaced with “science class,” as in these sample statements:
My goal in this science class is to learn as much as I can (mastery).
My goal is to show others that I’m good at my science class work (performance-approach).
It’s important to me that I don’t look stupid in my science class (performance-avoid).
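Scoring the 12-item instrument amounts to averaging the 1-5 ratings within each 4-item subscale. The item-to-subscale assignment below is hypothetical, for illustration only; the actual PALS item ordering differs:

```python
# Hypothetical item numbering: PALS assigns 4 items to each of the
# three subscales, but the specific item-to-subscale mapping shown
# here is illustrative only.
SUBSCALES = {
    "mastery": [1, 2, 3, 4],
    "performance_approach": [5, 6, 7, 8],
    "performance_avoid": [9, 10, 11, 12],
}

def subscale_scores(responses):
    """Average the 1-5 ratings within each goal-orientation subscale.
    `responses` maps item number -> rating."""
    return {name: sum(responses[i] for i in items) / len(items)
            for name, items in SUBSCALES.items()}
```

Subscale means computed this way are what the high/mid/low grouping cut points would then be applied to.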
[Table: Grouping based upon students’ goal orientation scores and solution scores — reports the number of students in each group, with goal orientation measured on a 1–5 scale (cut points such as >2.75 and <3.75, and >3 and <4) and solution scores on a 0–7 scale.]
4.3 Data Processing and Analysis
4.3.1 Data Cleaning and Processing
Each log file contained the student ID, teacher ID, time stamp (including start time, end time, and duration), cognitive tools accessed, and solution texts. After the data were cleaned, students’ solution and goal orientation scores were matched with their log files. Only the matched data were included in this study. Since this study was conducted in a real classroom setting, not all students completed all measures, which necessitated dropping the non-matched data and reduced the overall sample size. Students who did not submit any solutions were also removed from the sample.
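The matching step described above is essentially an inner join on student ID, which can be sketched with pandas. The toy frames and column names below are illustrative, not the actual data:

```python
import pandas as pd

# Toy stand-ins for the real data sets; column names are assumptions.
logs = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "tool": ["Notebook", "Alien Database", "Notebook", "Probe Design"],
})
scores = pd.DataFrame({
    "student_id": [1, 2, 3],
    "solution_score": [6, 3, 5],
})

# An inner join keeps only students present in both sources, mirroring
# the decision to analyze matched data only; student 4, who has no
# solution score, drops out of the sample.
matched = logs.merge(scores, on="student_id", how="inner")
```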
For research question 1, we examined overall behavior patterns. The log files of 47 students with 7,404 lines of logs were included. To address the second and third research questions, the matched log files with solution scores of 38 students and the matched log files with goal orientation scores of 16 students comprised the respective analyses. Students’ solution and mastery goal orientation scores were grouped into high and low (see Table 8.3). Performance-approach and performance-avoid scores were grouped into high, mid, and low.
We selected Tableau Desktop (tableausoftware.com; Computer software, Seattle, WA) as our visualization tool, since it enables the representation of multidimensional data, or multiple layers of information, in a single view. To examine overall behavior patterns, we performed descriptive analyses on tool usage by Lajoie’s (1993) four conceptual categories over the entire 3-week period. For log data, we used measures of frequency (number of times a tool was accessed) and duration (total amount of time, in seconds, spent with a particular tool), averaged across students for a given time period. We then examined the tool use patterns by different grouping variables (i.e., performance or goal orientation). Specifically, we used action shapes (Scarlatos & Scarlatos, 2010) to indicate tool use by each group. For the X-axis, we ordered the tools used in each of the four conceptual problem-solving stages or log days to understand different behavior patterns across the stages and over the entire period. The Y-axis represents the average frequency or average total duration of tool use by the grouping variable. Among all available tools, we focused on the six most frequently used: the Alien Database, Solar System Database, Notebook, Probe Design, Probe Launch, and Mission Control. ANOVAs were performed with the grouping variables as independent variables and frequency and duration of tool use as dependent variables.
For research question one, we examined frequency and duration across all tools for the entire sample. The findings confirmed that play-learners tended to use the tools that were central to the problem-solving process more frequently and for longer. For research questions two and three, we concentrated on six essential tools, looking for patterns according to performance levels and goal orientations. The findings suggested that some patterns of tool use were related to these grouping variables, though at this time the causal mechanisms can only be speculated upon.
5.1 How Do Play-Learners Access Different Tools Built into the Game?
5.2 How Do Play-Learners with Different Goal Orientations Access the Tools?
5.2.1 Mastery Goal Orientation (Mastery GO)
Therefore, the Alien and Solar Databases are critical to performing these activities. What is interesting, however, is that during Stage 4 the Mastery GO High group also used the Alien Database and Solar Database significantly more: MeanAlienDB_High = 2.23, MeanAlienDB_Low = 1.69, F(1, 68) = 5.19, p < 0.05; MeanSolarDB_High = 4.38, MeanSolarDB_Low = 1.56, F(1, 42) = 21.46, p < 0.01. In fact, the Mastery GO High group used both the Solar System and Alien Databases consistently more throughout the four stages as compared to the Mastery GO Low group. It is possible they used these two content databases to help verify the information returned from launched probes. The findings also indicate that the Mastery GO Low group used the Probe Design significantly more (MeanProbeDesign_Low = 5, MeanProbeDesign_High = 3.29, F(1, 54) = 6.93, p < 0.01), which is appropriate to this stage.
5.2.2 Performance-Approach Goal Orientation (Performance GO)
5.2.3 Performance-Avoidance Goal Orientation (Performance-Avoid GO)
The Performance-Avoid GO Low group used these tools longer during Stage 3: Probe Design (MeanProbeDesign_Low = 405.30, MeanProbeDesign_Mid = 80.53, MeanProbeDesign_High = 216.13), Probe Launch (MeanProbeLaunch_Low = 476.78, MeanProbeLaunch_Mid = 10.24, MeanProbeLaunch_High = 10.79), and Mission Control (MeanMissionControl_Low = 196.52, MeanMissionControl_Mid = 115.77, MeanMissionControl_High = 75.93). These patterns suggest that students in the Performance-Avoid GO Low group used tools more appropriate to the problem-solving stages, while the Performance-Avoid GO High group seemed only to explore the more fun tools, such as Probe Design, Probe Launch, and Mission Control, in Stage 1.
5.3 How Do Play-Learners with Different Performance Scores Access the Tools?
6 Discussion and Implications
The visualizations revealed several patterns of relevance to our ongoing efforts to design and enhance serious games such as Alien Rescue. The ultimate goal is to design effective scaffolds based upon our growing understanding of learner behaviors.
6.1 General Patterns of Tool Use
In general, the results supported our previous research into the four stages of the problem-solving process of AR (Liu & Bera, 2005; Liu et al., 2009). This is significant because play-learners are allowed to move through the process at their own pace and are not guided in how to proceed. In addition, they more frequently accessed and spent more time with the six tools that are most vital for solving the central problem. That the play-learners generally play the game “as intended” stands testament to the pedagogical soundness of the design.
The Notebook, which supports cognitive processes related to the synthesis and application of knowledge, was only infrequently accessed by the students. We wondered why, since we consider the Notebook to be an integral part of the AR problem-solving process (Liu et al., 2009; Liu, Horton, Toprac, & Yuen, 2012). This finding can possibly be explained by our classroom observations over the years, which revealed that teachers often assigned worksheets, serving functions similar to the Notebook’s, for students to complete during the AR unit (Liu, Wivagg, Geurtz, Lee, & Chang, 2012). It is likely that students are doing the work of recording and organizing information on these paper worksheets rather than with the built-in Notebook tool, thereby achieving the same end by different means. However, such paper worksheets may or may not be designed with the problem-based learning pedagogical approach that is the foundation of this serious game, and they may take away from the immersive experience of the play-learners. For future improvements to AR, we hope to address this undesired outcome by making the content of students’ notes available to the teacher, thereby eliminating the impetus to assign paper-and-pencil work.
The Alien and Solar System Databases represent two critical tools for gathering information and were therefore frequently accessed, yet, as the visualization showed, students tended to stay in the Alien Database much longer than in the Solar System Database. This finding can possibly be explained by the fact that the Solar System Database can be accessed at any time via a pop-up window, whereas students must navigate to the Research Lab to view the Alien Database (see Fig. 8.1a). Students can thus navigate to the Alien Database first and then access the Solar System Database concurrently. Another possible explanation is that the Alien Database, with its 3D models and animations, may simply be more engaging for students, as our previous research has indicated (Liu et al., 2013).
6.2 Productive Tool Use by High-Performance and Mastery Goal Orientation Groups
Our previous research has indicated that high-performing and low-performing students differed in their patterns of tool use (Liu & Bera, 2005). The present study confirmed this finding and additionally linked the pattern of productive tool use shown by high-performing students to those with a mastery goal orientation, as might be expected from the previously established connection between goal orientation and performance (Hsieh, Cho, Liu, & Schallert, 2008). Students in the High Solution and Mastery GO High groups tended to use the tools more appropriately according to the problem-solving stages. High Solution students used cognitive load and processing tools more frequently and for longer during Stages 2 and 3, and the Probe Design and Launch centers during Stages 3 and 4, exactly when these tools are most pertinent. Since all students in a class are generally given the same amount of time to solve the central problem, less productive tool use can affect performance scores, as shown by the findings.
Concerning the two performance-related goal orientation groups, the patterns are less straightforward. In our sample, the same students appeared in both the Performance GO High and the Performance-avoid GO High groups. This puzzling result is perhaps due to the small sample size, which limits the conclusions that can be drawn. Moreover, although the performance-related goal orientation groups showed active tool use at times, they did not show a clear pattern of in-game productivity, in contrast to the high-performing and mastery-oriented groups.
Goal orientation reflects a student's motivations for completing an academic task, which play an influential role in behaviors and performance (Ames, 1992; Dweck, 1986). Students with a mastery goal orientation tend to focus more on mastering a task and acquiring new skills, and less on how competent they look in front of others (performance-approach goal) or on avoiding unfavorable judgments of their capabilities and embarrassment in front of peers (performance-avoidance goal) (Elliot, 1999; Elliot & Harackiewicz, 1996). The findings from this study offered some evidence in support of the literature on goal orientations (Middleton & Midgley, 1997; Midgley & Urdan, 1995; Pajares, Britner, & Valiante, 2000) in that students with a mastery goal orientation tended to show more positive patterns of learning, while students with performance-approach or performance-avoidance goals appeared to look for a quick way to solve the complex problem and did not exhibit purposeful learning patterns.
6.3 Visualization as a Promising Technique for Serious Games Analytics
Our experience of visually exploring log data in combination with data from traditional sources suggests that visualization is a promising technique in serious games analytics, especially with multidimensional data sets. Visualization facilitates interpretation of the relationships among multiple data points at no cost to the complexity of the data (Milam & El Nasr, 2010; Scarlatos & Scarlatos, 2010). In our study, by displaying data points (tool use frequency and duration) over days and across stages according to different grouping variables (performance levels and goal orientations), visualization helped present the data and revealed findings not easily detected using traditional measures. The findings confirmed some of our previous research findings and, more importantly, also revealed areas that call for further research. For example, the Mastery GO High group used both the Solar System and Alien Databases consistently more throughout the four stages than the Mastery GO Low group did. Why? Could this be attributed to their mastery goal orientation or to other factors? The Performance GO High group used only Probe Design, and little of the other tools, during Stage 1. Does their goal orientation have anything to do with this finding? These findings show the potential of visualization to facilitate the interpretation of how multiple data points may contribute to the patterns of play-learners' behaviors as they engage in an SG environment, and they provide empirical support for the use of multifaceted approaches to visually represent complex and sophisticated information (Drachen & Canossa, 2009; Linek et al., 2008; Wallner & Kriglstein, 2013).
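One way to operationalize this kind of multidimensional display is to pivot tool-use records into a group-by-stage matrix before plotting. The sketch below uses invented record names and values (not the study's actual data) to illustrate the shaping step that would typically precede a heatmap or line plot.

```python
from collections import defaultdict

# Hypothetical records: (group, stage, tool, frequency). These names and
# numbers are illustrative only, not the study's data.
records = [
    ("Mastery GO High", 1, "SolarSystemDB", 12),
    ("Mastery GO High", 2, "SolarSystemDB", 15),
    ("Mastery GO Low", 1, "SolarSystemDB", 5),
    ("Mastery GO Low", 2, "SolarSystemDB", 4),
]

def pivot(records):
    """Pivot records into {group: {stage: total frequency}}, the shape
    a plotting library would consume for a heatmap or per-group line plot."""
    table = defaultdict(lambda: defaultdict(int))
    for group, stage, tool, freq in records:
        table[group][stage] += freq
    return {g: dict(s) for g, s in table.items()}

matrix = pivot(records)
for group, by_stage in matrix.items():
    # Crude text rendering; in practice this row would feed a plotting call.
    print(group, [by_stage.get(s, 0) for s in (1, 2, 3, 4)])
```

The same pivoted matrix can be reused across grouping variables (performance level, goal orientation) simply by changing the first field of each record.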
6.4 Limitations and Future Directions
This study involved discovering patterns of play-learner behavior among students grouped by performance levels and goal orientations. We therefore limited the log data to students who completed at least one of the measures, which reduced the overall sample size; the small size of the matched data set used in this analysis is a limitation. For the log data, it was first necessary to manually compare the time stamps against the school calendar to calculate how long a class used AR while eliminating school holidays and testing days. As part of our future work, we intend to develop code that parses the log data into a more usable format; automating this log processing is a logical next step for our research on this topic.
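The manual comparison of time stamps against the school calendar could be automated along these lines. This is a minimal sketch under assumed formats (a comma-separated log line with a timestamp, student ID, tool name, and open/close action, plus a researcher-supplied set of excluded dates), not the project's actual pipeline.

```python
from datetime import datetime, date

# Hypothetical log format -- the real AR log schema may differ.
SAMPLE_LOG = [
    "2014-03-05 10:32:17,student_42,AlienDatabase,open",
    "2014-03-05 10:41:02,student_42,AlienDatabase,close",
    "2014-03-10 09:15:00,student_42,Notebook,open",    # assumed testing day
    "2014-03-10 09:16:30,student_42,Notebook,close",
]

# Days to exclude (school holidays, testing days), supplied from the calendar.
EXCLUDED_DATES = {date(2014, 3, 10)}

def tool_durations(lines, excluded=EXCLUDED_DATES):
    """Sum seconds each student spent in each tool, skipping excluded days."""
    open_events = {}   # (student, tool) -> timestamp of the matching "open"
    totals = {}        # (student, tool) -> accumulated seconds
    for line in lines:
        stamp, student, tool, action = line.split(",")
        t = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
        if t.date() in excluded:
            continue  # holiday or testing day: drop the event entirely
        key = (student, tool)
        if action == "open":
            open_events[key] = t
        elif action == "close" and key in open_events:
            totals[key] = totals.get(key, 0.0) + (t - open_events.pop(key)).total_seconds()
    return totals

print(tool_durations(SAMPLE_LOG))
# {('student_42', 'AlienDatabase'): 525.0}
```

The excluded-dates set plays the role of the manual school-calendar check; everything else is a straightforward open/close pairing pass over the log.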
Our research group plans to continue this line of inquiry in several ways. First, we are designing an interactive dashboard for teachers, which will enable them to monitor students' work more closely and to intervene in a play-learner's activity as needed. Visualizations, including some presented in this chapter, will allow teachers to monitor activities at both the classroom and the individual-student level, thereby facilitating both classroom management and grading. As we continue to refine our analytics and visualization techniques, we hope to replace the paper-and-pencil worksheets with more empirically tested analytics. We consider the exploration of visualization reported in this chapter an important initial step in our application of serious games analytics to AR.
A second application of serious games analytics to AR will involve providing cognitive feedback to play-learners within the environment through visualizations. Thus far, in-game scaffolding and teacher support have been, for practical reasons, restricted to information about the task itself. Analytics-based visual feedback can give play-learners insight into their decision-making processes and the effectiveness of those decisions, thus increasing the potential for success for all students (Balzer, Doherty, & O'Connor, 1989).
We have reported on a study using serious games analytics and data visualizations to discover patterns of play-learner behaviors in Alien Rescue, a serious game for sixth-grade space science. Play-learners’ use of built-in cognitive tools was visually presented in multiple formats and discussed according to trends among all students in the sample and between groups that differed according to performance levels and goal orientations. The results showed that specific patterns of tool use do indeed correlate with successful performance. The results were discussed in terms of the pedagogical implications for the design of the serious game and the integral role that serious games analytics and data visualization will play in that effort.
We would like to acknowledge the help of Damilola Shonaike in creating the image in Figure 14 as part of her 2014 summer CERT REU internship program. We also appreciate the help of Divya Thakur and Kelly Gaither from the Texas Advanced Computing Center at the University of Texas at Austin in exploring the use of the Processing language to create visualizations in this specific game environment.
- Abt, C. C. (1970). Serious games. New York: The Viking Press.
- Ames, C. (1992). Achievement goals and classroom motivational climate. In J. Meece & D. Schunk (Eds.), Students' perceptions in the classroom (pp. 327–348). Hillsdale, NJ: Erlbaum.
- Andersen, E., Liu, Y. E., Apter, E., Boucher-Genesse, F., & Popović, Z. (2010). Gameplay analysis through state projection. In Proceedings from The Fifth International Conference on the Foundations of Digital Games, Pacific Grove, CA (pp. 1–8). doi:10.1145/1822348.1822349.
- Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., et al. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. New York: Longman.
- Baker, R. S., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining, 1(1), 3–17.
- Bransford, J. D., & Stein, B. S. (1984). The IDEAL problem solver. New York: W.H. Freeman and Company.
- Dede, C. (2014, May 6). Data visualizations in immersive, authentic simulations for learning [Flash slides]. Retrieved from http://www.edvis.org/tuesday-presentations/
- Dixit, P. N., & Youngblood, G. M. (2008). Understanding playtest data through visual data mining in interactive 3D environments. In Proceedings from 12th International Conference on Computer Games: AI, Animation, Mobile, Interactive Multimedia and Serious Games (CGAMES) (pp. 34–42).
- Drachen, A., & Canossa, A. (2009). Towards gameplay analysis via gameplay metrics. In Proceedings from the 13th International MindTrek Conference: Everyday Life in the Ubiquitous Era (pp. 202–209). ACM. doi:10.1145/1621841.1621878.
- Garzotto, F. (2007). Investigating the educational effectiveness of multiplayer online games for children. In Proceedings from the 6th International Conference on Interaction Design and Children, Aalborg, Denmark (pp. 29–36). doi:10.1145/1297277.1297284.
- Holcomb, J., & Mitchell, A. (2014, March). The revenue picture for American journalism and how it is changing. Retrieved from http://www.journalism.org/2014/03/26/the-revenue-picture-for-american-journalism-and-how-it-is-changing/
- Hsieh, P., Cho, Y., Liu, M., & Schallert, D. (2008). Examining the interplay between middle school students' achievement goals and self-efficacy in a technology-enhanced learning environment. American Secondary Education, 36(3), 33–50.
- Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2013). NMC horizon report: 2013 higher education edition. Austin, TX: The New Media Consortium.
- Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC horizon report: 2014 higher education edition. Austin, TX: The New Media Consortium.
- Lajoie, S. P. (1993). Computer environments as cognitive tools for enhancing learning. In S. P. Lajoie & S. J. Derry (Eds.), Computers as cognitive tools (pp. 261–288). Hillsdale, NJ: Lawrence Erlbaum Associates.
- Linek, S. B., Marte, B., & Albert, D. (2008). The differential use and effective combination of questionnaires and logfiles. In Computer-Based Knowledge & Skill Assessment and Feedback in Learning Settings (CAF), Proceedings from The International Conference on Interactive Computer Aided Learning (ICL), Villach, Austria.
- Linek, S. B., Öttl, G., & Albert, D. (2010). Non-invasive data tracking in educational games: Combination of logfiles and natural language processing. In L. G. Chova & D. M. Belenguer (Eds.), Proceedings from INTED 2010: International Technology, Education and Development Conference, Valencia, Spain.
- List, J., & Bryant, B. (2014, March). Using Minecraft to encourage critical engagement of geography concepts. In Society for Information Technology & Teacher Education International [Conference Proceedings] (pp. 2384–2388). Jacksonville, FL.
- Liu, M., Bera, S., Corliss, S., Svinicki, M., & Beth, A. (2004). Understanding the connection between cognitive tool use and cognitive processes as used by sixth graders in a problem-based hypermedia learning environment. Journal of Educational Computing Research, 31(3), 309–334.
- Loh, C. S. (2008). Designing online games assessment as "Information Trails". In V. Sugumaran (Ed.), Intelligent information technologies: Concepts, methodologies, tools, and applications (pp. 553–574). Hershey, PA: Information Science Reference. doi:10.4018/978-1-59904-941-0.ch032.
- Loh, C. S. (2011). Using in situ data collection to improve the impact and return of investment of game-based learning. In Old Meets New: Media in Education—Proceedings of the 61st International Council for Educational Media and the XIII International Symposium on Computers in Education (ICEM & SIIE'2011) Joint Conference (pp. 801–811). doi:10.4018/jvple.2013010101.
- Midgley, C., Maehr, M. L., Hruda, L. Z., Anderman, E., Anderman, L., Freeman, K. E., et al. (2000). Patterns of adaptive learning scales (PALS). Ann Arbor, MI: University of Michigan.
- Milam, D., & El Nasr, M. S. (2010, July). Design patterns to guide player movement in 3D games. In Proceedings of the 5th ACM SIGGRAPH Symposium on Video Games (pp. 37–42). ACM. doi:10.1145/1836135.1836141.
- Rideout, V. J., Foehr, U. G., & Roberts, D. F. (2010, January). Generation M2: Media in the lives of 8- to 18-year-olds. Kaiser Family Foundation. Retrieved from http://kff.org/other/poll-finding/report-generation-m2-media-in-the-lives/
- Salen, K., & Zimmerman, E. (2004). Rules of play: Game design fundamentals. Cambridge, MA: MIT Press.
- Sawyer, B., & Smith, P. (2008). Serious games taxonomy [PDF document]. Retrieved from http://www.dmill.com/presentations/serious-games-taxonomy-2008.pdf
- Scarlatos, L. L., & Scarlatos, T. (2010). Visualizations for the assessment of learning in computer games. In 7th International Conference & Expo on Emerging Technologies for a Smarter World (CEWIT 2010), September 27–29, 2010, Incheon, Korea.
- Serrano, A., Marchiori, E. J., del Blanco, A., Torrente, J., & Fernández-Manjón, B. (2012, April). A framework to improve evaluation in educational games. In Proceedings from Global Engineering Education Conference (EDUCON), 2012 IEEE (pp. 1–8). IEEE. doi:10.1109/EDUCON.2012.6201154.
- U.S. Department of Education, Office of Educational Technology. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. Washington, DC.
- van Barneveld, A., Arnold, K. E., & Campbell, J. P. (2012). Analytics in higher education: Establishing a common language. EDUCAUSE Learning Initiative. Retrieved from https://qa.itap.purdue.edu/learning/docs/research/ELI3026.pdf