Examining Through Visualization What Tools Learners Access as They Play a Serious Game for Middle School Science

  • Min Liu
  • Jina Kang
  • Jaejin Lee
  • Elena Winzeler
  • Sa Liu
Part of the Advances in Game-Based Learning book series (AGBL)

Abstract

This study used data visualization to examine learners’ behaviors in a 3D immersive serious game for middle school science in order to understand how players interact with various features of the environment to solve the central problem. The analysis combined game log data with measures of in-game performance and learners’ goal orientations. The findings indicated that students in the high-performance and mastery-oriented groups tended to use the tools more appropriately for the stage they had reached in the problem-solving process, and more productively, than students in the low-performance groups. The use of data visualization with log data, in combination with more traditional measures, shows visualization to be a promising technique for analytics with multiple data sets, one that can facilitate the interpretation of relationships among data points at no cost to the complexity of the data. Design implications and future applications of serious games analytics and data visualization to this serious game are discussed.

Keywords

Serious games · Problem-based learning · Middle school science · Learner behaviors · Goal orientation

1 Introduction

The popularity of playing games has been increasing. According to a Pew Research Center report, the digital game industry “takes in about $93 billion a year” (Holcomb & Mitchell, 2014), and playing games continues to be an important form of how people, young and old, spend their leisure time. A Kaiser Family Foundation report stated, “In a typical day, 8- to 18-year-olds spend an average of 1:13 playing video games on any of several platforms” (Rideout, Foehr, & Roberts, 2010, p. 25). Therefore, it behooves educators to investigate how to employ techniques used in digital games to design digital learning environments.

The goal of this study was to examine learners’ behaviors in a 3D immersive serious game environment designed for middle school science to understand how the play-learners interact with various features of the environment to solve the central problem. We used data visualization as a way to represent patterns of learners’ behaviors. By applying data visualization techniques to serious games analytics, we hope to acquire insights on how serious game environments should be designed to facilitate learning.

2 Relevant Literature

2.1 Definition and Examples

Serious Games (SGs) are a type of game that includes simulated events or virtual processes designed for the purpose of real-world problem-solving (Djaouti, Alvarez, Jessel, & Rampnoux, 2011; Rieber, 1996; Sawyer & Smith, 2008). Abt stated that SGs have “an explicit and carefully thought-out educational purpose and are not intended to be played primarily for amusement” (1970, p. 9). According to the Serious Games Initiative (www.seriousgames.org), SGs leverage game mechanics for training through exer-games, management games, and simulations. Therefore, although serious games can be fun and entertaining, their main purposes are to train, educate, or change users’ attitudes in real-world situations. The applications for SGs are diverse. The term “serious” denotes a shift in the context of gaming from fun and entertainment to engagement, efficiency, and pedagogical effectiveness for specific purposes such as training and performance enhancement (Djaouti et al., 2011). In this study, we were interested in using SGs to teach science concepts and problem-solving skills and to create a fun learning experience for play-learners.

Many commercial games have been integrated into classroom settings for instructional purposes, such as SimCity (Tanes & Cemalcilar, 2010), Civilization (Squire, 2004), and Minecraft (List & Bryant, 2014). Some educational researchers also design and develop SGs themselves. For example, “Outbreak @ The Institute” is a role-play science game in which play-learners take on the roles of doctors, medical technicians, and public health experts to discover the cause of and develop a cure for a disease outbreak across a university campus (Rosenbaum, Klopfer, & Perry, 2007). Play-learners can interact with virtual characters and employ virtual diagnostic tests and medicines. In another science SG, Mad City Mystery, play-learners develop explanations of scientific phenomena in an inquiry-based learning environment (Squire & Jan, 2007).

2.2 Research Trends in Serious Games

Research on serious games typically focuses on their effects on learners’ engagement or effectiveness using traditional intervention studies with experimental designs or qualitative methods. The emergence of serious games analytics (SEGA) makes it possible to investigate beyond traditional research methodologies and focus on the learning processes of individuals as expressed through patterns of in-game behavior and accomplishments (Djaouti et al., 2011; Johnson et al., 2013; Scarlatos & Scarlatos, 2010).

The purpose of using analytics is to illuminate the process of performance improvement via in-game instructional resources (van Barneveld, Arnold, & Campbell, 2012). Studies in the field of SEGA for performance assessment primarily use game logs—unobtrusively saved records of user activities that include chronological and spatial tracking data (Johnson, Adams Becker, Estrada, & Freeman, 2014; Liu, Horton, Kang, Kimmons, & Lee, 2013; Macfadyen & Dawson, 2010; Wallner & Kriglstein, 2013). SEGA, therefore, is inherently an interdisciplinary field that links gaming data and student responses to statistics, computer science, data mining, and visualization (Baker & Yacef, 2009; Romero, Ventura, & García, 2008). The resulting learning models and usage patterns are used to predict students’ knowledge-building trajectories through the categorization of levels of performance, engagement, and resource-processing sequences (U.S. Department of Education, Office of Educational Technology, 2012). Researchers are interested in using analytics to gain insights that can enable the design and validation of pedagogical scaffolding support in online learning environments.

There have been a number of research efforts to produce standardized analysis procedures, from planning the capture of learner activities, to analyzing the data, to finally visualizing the analysis, so that SEGA techniques can contribute to the field of SG as a solid methodology of learner evaluation (Loh, 2008, 2011; Romero & Ventura, 2010, 2013). Romero’s data mining model (2013) provides SG researchers with seven steps for conducting a SEGA study with a clear hypothesis: hypothesis formation, raw data gathering, preprocessing, data modification, data mining, finding models and patterns, and interpretation/evaluation. Serrano, Marchiori, del Blanco, Torrente, and Fernández-Manjón (2012) provided a similar framework containing seven elements: data selection, data capture, aggregation and report, assessment, knowledge creation, knowledge refinement, and knowledge sharing.

In studies involving serious games analytics (Linek, Marte, & Albert, 2008; Loh, 2011; Reese, Tabachnick, & Kosko, 2013; Scarlatos & Scarlatos, 2010), the learning processes of individual students have been tracked using diverse techniques in order to support the personalization of instruction. In these examples, game logs have been regarded as an important metric in examining topics ranging from knowledge domains to tool use (Dede, 2014; Wallner & Kriglstein, 2013).

2.3 Issues in SEGA Evaluation

The efficacy of SGs has often been evaluated using traditional tests (e.g., standardized tests or surveys), which may not sufficiently measure higher learning objectives such as application, analysis, or synthesis (Scarlatos & Scarlatos, 2010). Since most of these tests are collected before or after SG play, the obtained data can merely represent prospective or retrospective views (Linek, Öttl, & Albert, 2010). They cannot be used to assess how learners achieved learning objectives within the game environment or the decision-making processes undertaken to solve a given problem. In addition, Loh (2008) warned of the limitations of computer-based tests since these cannot be used to evaluate opinions of learners, but only to assess the accuracy of their choices. Other methods such as observations or interviews have also been used for evaluating and understanding gameplay (Garzotto, 2007; Sweetser & Wyeth, 2005). Yet, researchers assert that such methods are inefficient in terms of time and lose clarity with large numbers of learners (e.g., Andersen, Liu, Apter, Boucher-Genesse, & Popović, 2010; Drachen & Canossa, 2009).

These challenges highlight the need to use log data to understand play-learners’ behaviors within the environment and to examine log data in connection with learners’ performance. Game-generated data logs contain records of human behaviors during learning, which can include any interaction between a learner and a game, such as a mouse click or keystroke. Reese et al. (2013) emphasized that learning objectives align with game objectives; therefore, a player’s idiosyncratic trajectory towards the game goal can reveal the dynamics of the learning process. Understanding how a learner achieves a learning goal requires the discovery and analysis of patterns of play-learner behaviors (Drachen & Canossa, 2009), and log data can provide insights into play-learner behavior in context (Scarlatos & Scarlatos, 2010). The emerging technology of data visualization allows researchers to present and examine data visually in order to discover patterns relating to what learners are doing in an SG context (Dixit & Youngblood, 2008; Milam & El Nasr, 2010; Scarlatos & Scarlatos, 2010). Therefore, using visualization in combination with more traditional measures should provide more targeted and nuanced information and a more holistic view of play-learners’ behaviors (Linek et al., 2008).

2.4 Background of Research

We have conducted several studies to examine students’ usage patterns through statistical procedures such as descriptive analysis and cluster analysis with the same serious game used in this study, Alien Rescue. The study by Liu and Bera (2005) applied cluster analysis to sixth-graders’ log data to examine what tools were used and at what stages of their problem-solving process. The results showed that tools supporting cognitive processing and tools sharing cognitive load played a more central role early in the problem-solving process whereas tools supporting cognitive activities that would be out of students’ reach otherwise and tools supporting hypothesis generation and testing were used more in the later stages of problem-solving. The findings also indicated that the students increasingly used multiple tools in the later stages of the problem-solving process. The various tools appeared to enable students to coordinate multiple cognitive skills in a seamless way and, therefore, facilitated their information processing. Results also suggested that students with higher performance scores seemed to exercise more productive use of the tools than students with lower performance scores.

In a follow-up study (Liu et al., 2009), log data were matched with surveys from a group of college students who played Alien Rescue in a laboratory setting. A researcher observed each student’s activity in the environment, and stimulated-recall interviews elicited information on students’ cognitive processes at specific points in the problem-solving process. Together, the quantitative data (log files) and qualitative data revealed deliberate and careful use of tools by the students. Students used multiple tools simultaneously while integrating and evaluating information, and different tools predominated during each problem-solving stage. This finding suggested that different types of tools were needed and used by the college students in this study, as they were by the sixth graders in the previous research (Liu & Bera, 2005; Liu, Bera, Corliss, Svinicki, & Beth, 2004), but the results did not show evidence that students with higher performance used the tools more consistently or actively than the other groups, as in the previous research (Liu et al., 2004; Liu & Bera, 2005).

Given these preliminary findings and especially the technological advancements in our field, the purpose of this study was to further this research line by using data visualization techniques to examine the patterns of how sixth graders played the SG and identify factors contributing to individual variations.

3 Research Questions and Research Context

3.1 Research Questions

The following research questions guided this study:
  • How do play-learners access different tools built into the game?

  • How do play-learners with different goal orientations access the tools?

  • How do play-learners with different performance scores access the tools?

3.2 Description of the Serious Game Environment

The serious game environment under investigation is called Alien Rescue (AR, alienrescue.edb.utexas.edu; Liu et al., 2013). AR is designed and developed by a research and development team in the Learning Technologies Program at the University of Texas at Austin. AR aspires to teach science and complex problem-solving skills to students in fun and interactive ways. Its development is guided by a design-based research framework which aims to generate and refine theories by evaluating iterative enhancements to an instructional innovation within authentic settings (Brown, 1992; Cobb, Confrey, Lehrer, & Schauble, 2003).

AR incorporates problem-based learning pedagogy into a 3D virtual environment to engage middle-school students in solving complex and meaningful scientific problems. Students take on the role of young scientists in a rescue operation to save a group of six distressed alien species displaced from a distant galaxy due to the destruction of their home worlds. The young scientists are challenged to find the most suitable relocation homes for these aliens in our solar system. Each alien species is unique in its characteristics and needs. Upon starting the program, students are not given explicit instructions on how to proceed. They must explore the available tools, discover their capabilities, and develop their own strategies for how and when to effectively use them. Learning occurs as a result of solving a complex, ill-structured problem; there is not one single correct solution, and play-learners must present evidence and justify the rationale for their solutions.

This real-world process of scientific inquiry is transformed into a playful experience and delivered through an immersive, discovery-based, and sensory-rich approach, in line with Salen and Zimmerman’s (2004) definition of play as “free movement within a more rigid structure” (p. 304). The element of fantasy evokes uncertainty, mystery, and curiosity, while the quest-based narrative situates students in the role of experts with an urgent mission, motivating them to acquire competence in the language, concepts, tools, and processes of space science in order to succeed. Furthermore, the students must exercise high-level cognitive and metacognitive skills such as goal setting, hypothesis generation, problem-solving, self-regulation, evaluation of various possible solutions, and the effective presentation of evidence. Thus, AR provides a learning experience with real-world authenticity that also accomplishes essential curricular goals, all within an engaging science fiction fantasy context.

3.3 Cognitive Tools and Their Corresponding Conceptual Categories

To assist students’ problem-solving, a set of tools are provided. These cognitive tools in the AR environment align with Lajoie’s (1993) four conceptual categories (see Table 8.1): tools that (a) share cognitive load, (b) support cognitive and metacognitive processes, (c) support cognitive activities that would otherwise be out of reach, and (d) support hypothesis generation and testing. Table 8.1 outlines the tools according to Lajoie’s categorization (1993).
Table 8.1

Descriptions of cognitive tools provided in AR

Tools sharing cognitive load
  • Alien Database: Presents textual descriptions and 3D visuals of the aliens’ home solar system and journey to Earth, as well as the characteristics and needs of each species
  • Solar System Database: Provides information on the planets and selected moons in our solar system under consideration as habitats. Intentionally incomplete data ensures the need to generate and test hypotheses
  • Missions Database: Presents information on the mission, technology, and findings of historical NASA probe launches
  • Concepts Database: Provides interactive and highly visual supplemental instruction on selected scientific concepts presented elsewhere in the environment
  • Spectra: Helps students to interpret spectral data encountered in the environment
  • Periodic Table: Provides an interactive periodic table of the elements for reference

Tools supporting cognitive processes
  • Notebook: Provides a place for students to record, summarize, and organize data as they engage in solving the central problem

Tools supporting otherwise out-of-reach activities
  • Probe Design Center: Allows students to design and build probes to send to gather data on worlds in our solar system
  • Probe Launch Center: Allows students to review built probes and make launch decisions in consideration of their remaining budget

Tools supporting hypothesis testing
  • Mission Control Center: Displays data from launched probes
  • Message Tool: Allows students to read messages from the Aliens and from the Interstellar Relocation Commission Director. Provides the Solution Form, which allows students to submit their habitat relocation recommendations and rationales for review by teachers

Of the tools that share cognitive load, the Alien Database (see Fig. 8.1c) and Solar System Database are the most central to the problem-solving process. Together all of these tools provide students with a wealth of information to assist them in solving the problem (see Fig. 8.1b). They share cognitive load by reducing the need to memorize facts; the information is always available to the student. Thus, these tools shift the focus of learning from remembering to understanding, applying, and analyzing.
Fig. 8.1

Screenshots of various cognitive tools in AR that support the problem-solving process. (a) A view of the space station with tools panel overlay. (b) Students can open several tools, such as the Concepts, Solar System and Missions Databases, at a time. (c) The Alien Database contains 3D visuals and descriptions of the aliens, their former homes, and their journey to Earth. (d) Students can design, launch, and view data collected from their own simulated probes to test their hypotheses

The Notebook supports cognitive processes as students work to solve the problem. As the physical space within the serious game environment where information from disparate sources is integrated, the Notebook facilitates the students’ synthesis of knowledge. On a metacognitive level, the Notebook provides a way for students to monitor their own progress towards solving the central problem.

The tools that support cognitive activities that would otherwise be out of reach are the Probe Design Center (see Fig. 8.1d) and Probe Launch Center. Designing and launching probes are activities that most students will only ever experience in a virtual environment such as AR. These tools not only provide an exciting and novel experience to the student, but also preserve the authenticity of the scientific inquiry process and the consequentiality of the serious game environment, since students’ probe design decisions directly impact the data available to them (Barab, Gresalfi, & Ingram-Goble, 2010, p. 526).

The Mission Control Center and Message Tool support hypothesis testing. Since the information provided in the research databases is intentionally incomplete, only the data from deployed probes viewed in the Mission Control Center allow students to draw the inferences necessary to generate their own solutions to the central problem. The Solution Form housed in the Message Tool provides students with a mechanism to develop their hypotheses into well-formed rationales to be evaluated by their teacher.

These tools are accessed via a two-layer interface (see Fig. 8.1a). The first layer is the virtual space station itself, which consists of five rooms, each containing an instrument for students to use. The second layer of the interface consists of a collection of persistent tools available at the bottom of the screen. It is possible to have several of these overlay tools open at once, though a student can visit only one room in the navigation layer at a time.

AR is designed for approximately 3 weeks of 50-min class sessions as a sixth-grade science curriculum unit. Depending on specific needs and classroom situations, teachers can adapt and adjust the days accordingly. The open-ended, ill-structured framework of AR gives students the freedom to access any tool(s) they wish at any time.

Our previous research (Liu et al., 2004, 2009) has indicated the problem-solving process in AR can be grouped into four conceptual stages: (a) understanding the problem (roughly days 1–2), (b) identifying, gathering, and organizing information (days 3–7), (c) integrating information (days 8–10), and (d) evaluating the process and outcome (days 11–13). This four-stage process reflects the cognitive processes in the revised version of Bloom’s taxonomy (Anderson et al., 2001) and the five components of an IDEAL problem-solver (Bransford & Stein, 1984).
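For the analyses reported later in the chapter, each log day can be mapped onto one of these four stages. A minimal sketch of that mapping, assuming the rough day boundaries given above (the function name is ours, not part of the original study):

```python
def problem_solving_stage(day: int) -> int:
    """Map a class day (1-13) onto the four conceptual AR problem-solving stages."""
    if day <= 2:
        return 1  # understanding the problem
    if day <= 7:
        return 2  # identifying, gathering, and organizing information
    if day <= 10:
        return 3  # integrating information
    return 4      # evaluating the process and outcome

# Example: day 8 falls in Stage 3 (integrating information).
assert problem_solving_stage(8) == 3
```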

4 Method

4.1 Participants

Participants were sixth graders from a school in a mid-sized southwestern city. The teacher reported that most students were comfortable with computers as computer activities were a common part of classroom instruction. These sixth graders used AR as their science curriculum for approximately 3 weeks in the spring of 2014.

4.2 Data Sources

4.2.1 Log Files

All student actions performed while using the program were logged to a data file, which contained time- and date-stamped entries for each student. The data set consisted of the number of times a student accessed each of the cognitive tools and the amount of time the student used each tool. The participants were introduced to the central problem by watching a video scenario together, and then used the program in their science classes. The log file data presented a view of which tools a student used and for how long during this 3-week period.
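To make these two measures concrete, the sketch below shows how time- and date-stamped tool-access records might be aggregated into per-student frequency and duration counts. The file name and column names (student_id, tool, start_time, end_time) are hypothetical; the chapter does not specify the log schema.

```python
import pandas as pd

# Hypothetical log schema: one row per tool-access event per student.
log = pd.read_csv("alien_rescue_log.csv", parse_dates=["start_time", "end_time"])

# Duration of each tool access in seconds.
log["duration_sec"] = (log["end_time"] - log["start_time"]).dt.total_seconds()

# Frequency = number of accesses; duration = total seconds, per student per tool.
usage = (
    log.groupby(["student_id", "tool"])
       .agg(frequency=("tool", "size"), duration_sec=("duration_sec", "sum"))
       .reset_index()
)

print(usage.head())
```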

4.2.2 Solution Scores

Students’ performance was evaluated by the quality of their solution to the central problem. A student’s solution score reflected how well she solved the problem of finding an appropriate relocation home for each alien species. Because students worked at different paces, they submitted different numbers of solutions, so we used only one solution score per student. Assuming that solution quality would increase as a student gained more experience in solving the problem, we scored the last solution each student submitted.

Students’ performance was assessed using an 8-point rubric (scores 0–7) that considers both the suitability of the recommended home and the degree to which students justify their recommendation based upon the evidence they have collected (see Table 8.2).
Table 8.2

Rubric used for grading solution forms

  • 0 points: The student recommends an unsuitable home for the alien species
  • 1 point: The student recommends a suitable home but does not provide any reasons to substantiate the choice
  • 2–7 points: The student recommends a suitable home and is awarded one additional point for each reason provided to substantiate the choice

Two researchers who had recently scored a set of solutions from another school participated in this scoring task. They first reviewed the scoring rubric and scored five solutions together to ensure they applied the same criteria during scoring. Then, the researchers scored the remainder of the solutions independently.
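As a worked illustration of the rubric in Table 8.2, a solution scores 0 if the recommended home is unsuitable, 1 if it is suitable but unsupported, and 1 additional point per substantiating reason up to the 7-point maximum. A minimal sketch of that arithmetic (the function and its inputs are ours, not part of the study’s scoring procedure):

```python
def rubric_score(home_is_suitable: bool, num_reasons: int) -> int:
    """Score one solution per the 0-7 point rubric in Table 8.2."""
    if not home_is_suitable:
        return 0
    # 1 point for a suitable home, plus 1 per substantiating reason, capped at 7.
    return min(1 + num_reasons, 7)

# Example: a suitable home justified with three reasons earns 4 points.
assert rubric_score(True, 3) == 4
```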

4.2.3 Goal Orientation

Students’ goal orientation was measured with the revised Patterns of Adaptive Learning Scales (PALS; Midgley et al., 2000), which assesses personal achievement goal orientations through three subscales: mastery (r = .85), performance-approach (r = .89), and performance-avoidance (r = .74) goals, with 4 items per goal orientation and 12 items in total. Each item was rated on a 5-point scale, with 1 being “Not at all true,” 3 being “Somewhat true,” and 5 being “Very true.” To fit this particular learning context, the general term “class” was replaced with “science class,” as in these sample statements:

My goal in this science class is to learn as much as I can (mastery).

My goal is to show others that I’m good at my science class work (performance-approach).

It’s important to me that I don’t look stupid in my science class (performance-avoid).

We looked for natural groupings of the goal orientation scores, which resulted in two groups for mastery and three groups each for performance-approach and performance-avoid (see Table 8.3).
Table 8.3

Grouping based upon students’ goal orientation scores and solution scores

Goal orientation (score: 1–5)
  • Mastery: High = 5 (n = 9); Low < 5 (n = 7)
  • Performance-approach: High ≥ 3.75 (n = 3); Mid > 2.75 and < 3.75 (n = 7); Low ≤ 2.75 (n = 6)
  • Performance-avoidance: High ≥ 4 (n = 3); Mid > 3 and < 4 (n = 7); Low ≤ 3 (n = 6)

Solution score (score: 0–7)
  • High ≥ 4 (n = 11); Low < 4 (n = 27)
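Assuming each subscale score is the mean of its four PALS items, the groupings in Table 8.3 amount to simple cutoff rules. The sketch below expresses those rules; the function names are illustrative only:

```python
import statistics

def subscale_mean(item_ratings):
    """Average the four 5-point item ratings for one PALS subscale."""
    return statistics.mean(item_ratings)

def mastery_group(score):
    return "High" if score == 5 else "Low"           # Table 8.3: High = 5, Low < 5

def performance_approach_group(score):
    if score >= 3.75:
        return "High"
    return "Mid" if score > 2.75 else "Low"          # Mid > 2.75 and < 3.75, Low <= 2.75

def performance_avoid_group(score):
    if score >= 4:
        return "High"
    return "Mid" if score > 3 else "Low"             # Mid > 3 and < 4, Low <= 3

def solution_group(score):
    return "High" if score >= 4 else "Low"           # out of 7 possible points

# Example: mastery items rated 5, 5, 5, 5 give a subscale mean of 5 -> "High" group.
print(mastery_group(subscale_mean([5, 5, 5, 5])))
```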

4.3 Data Processing and Analysis

4.3.1 Data Cleaning and Processing

Each log file contained the student ID, teacher ID, time stamps (start time, end time, and duration), the cognitive tools accessed, and solution texts. After the data were cleaned, students’ solution and goal orientation scores were matched with their log files. Only the matched data were included in this study. Because this study was conducted in a real classroom setting, not all students completed all measures, which necessitated dropping the non-matched data and reduced the overall sample size. Students who did not submit any solutions were also removed from the sample.

For research question 1, we examined overall behavior patterns using the log files of 47 students, comprising 7,404 lines of logs. To address the second and third research questions, we used the log files matched with solution scores (38 students) and the log files matched with goal orientation scores (16 students), respectively. Students’ solution and mastery goal orientation scores were grouped into high and low (see Table 8.3); performance-approach and performance-avoid scores were grouped into high, mid, and low.
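A sketch of this matching step, assuming the log summary, solution scores, and goal orientation scores sit in separate per-student tables keyed by a student ID (all file and column names are hypothetical). Inner joins keep only students with matched data, which is what reduced the samples to 38 and 16 students:

```python
import pandas as pd

# Hypothetical per-student tables keyed by student_id.
usage = pd.read_csv("tool_usage_summary.csv")        # aggregated from the log files
solutions = pd.read_csv("solution_scores.csv")       # last submitted solution, 0-7
goals = pd.read_csv("goal_orientation_scores.csv")   # PALS subscale means

# Inner joins drop students who lack one of the measures.
usage_with_solutions = usage.merge(solutions, on="student_id", how="inner")
usage_with_goals = usage.merge(goals, on="student_id", how="inner")

print(usage_with_solutions["student_id"].nunique())  # 38 students in the study
print(usage_with_goals["student_id"].nunique())      # 16 students in the study
```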

4.3.2 Analysis

We selected Tableau Desktop (tableausoftware.com; computer software, Seattle, WA) as our visualization tool because it enables the representation of multidimensional data, or multiple layers of information, in a single view. To examine overall behavior patterns, we performed descriptive analyses of tool usage by Lajoie’s (1993) four conceptual categories during the entire 3-week period. For the log data, we used measures of frequency (number of times a tool was accessed) and duration (total amount of time, in seconds, spent with a particular tool), averaged across students for a given time period. We then examined the tool use patterns by different grouping variables (i.e., performance or goal orientation). Specifically, we used action shapes (Scarlatos & Scarlatos, 2010) to indicate tool use by each group. For the X-axis, we ordered the tools used in each of the four conceptual problem-solving stages or by log day to understand different behavior patterns across the stages and over the entire period. The Y-axis represents the average frequency or average total duration of tool use by the grouping variable. Among all available tools, we focused on the six most frequently used: the Alien Database, Solar System Database, Notebook, Probe Design, Probe Launch, and Mission Control. ANOVAs were performed with the grouping variables as independent variables and the frequency and duration of tool use as dependent variables.
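The group comparisons reported below are one-way ANOVAs with the grouping variable as the factor and tool-use frequency or duration as the dependent variable. A minimal sketch with SciPy, using a hypothetical merged table like the one described above (column names are illustrative):

```python
import pandas as pd
from scipy import stats

# Hypothetical merged table: one row per student per tool, with a duration column
# and a solution_group label ("High"/"Low").
df = pd.read_csv("usage_with_solutions.csv")

alien_db = df[df["tool"] == "Alien Database"]
high = alien_db.loc[alien_db["solution_group"] == "High", "duration_sec"]
low = alien_db.loc[alien_db["solution_group"] == "Low", "duration_sec"]

# One-way ANOVA: does time spent in the Alien Database differ by solution group?
f_stat, p_value = stats.f_oneway(high, low)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```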

5 Findings

For research question one, we examined frequency and duration across all tools for the entire sample. The findings confirmed that play-learners tended to use the tools that were central to the problem-solving process more frequently and for longer. For research questions two and three, we concentrated on six essential tools, looking for patterns according to performance levels and goal orientations. The findings suggested that some patterns of tool use were related to these grouping variables, though at this time the causal mechanism remains a matter of speculation.

5.1 How Do Play-Learners Access Different Tools Built into the Game?

Figure 8.2 presents an overall picture of tool use patterns. The visualization indicates tools in the cognitive load category, especially the Solar System and Alien Databases, were used for significantly longer periods of time than those in the other tool categories (MeanSolarDB = 382.03, MeanAlienDB = 525.80, F(9, 5154) = 154.64, p < 0.001). The cognitive-processing tool, the Notebook, was used for a longer time on day 2 and then again on days 9–12. The Probe Design tool was used frequently, especially on day 8, and for longer on day 5 and often towards the end of the program. Tools for hypothesis testing were used most frequently on days 8–10, coinciding with increased activity with the Probe Design tool. It appears the most active period of overall tool use was around day 8.
Fig. 8.2

Average frequency and duration of tool use

Of all the tools, the students used Probe Design (frequency = 3.695) and Mission Control (frequency = 3.804) most often, while they stayed in the Alien Database (525.80 s) and Solar System Database (382.03 s) the longest (see Fig. 8.3). During the problem-solving process, the Alien Database is needed to understand alien characteristics and the Solar System Database is needed to understand what each planet in our solar system can offer. Probably the most fun tool is Probe Design, a simulation allowing students to equip a probe with scientific instruments. Mission Control presents the data from a launched probe. As Fig. 8.3 shows, students accessed these latter tools often, but not for long periods.
Fig. 8.3

Average frequency and duration of tool use by categories

5.2 How Do Play-Learners with Different Goal Orientations Access the Tools?

5.2.1 Mastery Goal Orientation (Mastery GO)

In examining tool use patterns by different goal orientation groups, we focused on six tools the students tended to use the most as shown above: Alien Database, Solar System Database, Notebook, Probe Design, Probe Launch, and Mission Control. In Figs. 8.4 and 8.5, each point in a shape represents the average frequency or duration of tool use according to its value on the Y-axis. During Stage 2, the Mastery GO High group used the Alien DB significantly more often (MeanAlienDB_High = 2.25, MeanAlienDB_Low = 1.83, F(1, 110) = 4.135, p < 0.05) and for longer (MeanAlienDB_High = 727.62, MeanAlienDB_Low = 586.80) than the Mastery GO Low group. They also stayed in the Solar System DB significantly longer (MeanSolarDB_High = 245.03, MeanSolarDB_Low = 84.18, F(1, 64) = 5.435, p < 0.05). As discussed above, these two tools are critical for this stage of problem solving. Stage 2 activities center on identifying, gathering, and organizing information in order to further refine the problem.
Fig. 8.4

Average frequency of tool use across four stages by mastery goal orientation groups

Fig. 8.5

Average duration of tool use across four stages by mastery goal orientation groups

Therefore, the Alien and Solar Databases are critical to performing these activities. What is interesting, however, is that during Stage 4 the Mastery GO High group also used the Alien Database and Solar Database significantly more: MeanAlienDB_High = 2.23, MeanAlienDB_Low = 1.69, F(1, 68) = 5.19, p < 0.05; MeanSolarDB_High = 4.38, MeanSolarDB_Low = 1.56, F(1, 42) = 21.46, p < 0.01. In fact, the Mastery GO High group used both the Solar System and Alien Databases consistently more throughout the four stages as compared to the Mastery GO Low group. It is possible they used these two content databases to help verify the information returned from launched probes. The findings also indicate that the Mastery GO Low group used the Probe Design significantly more (MeanProbeDesign_Low = 5, MeanProbeDesign_High = 3.29, F(1, 54) = 6.93, p < 0.01), which is appropriate to this stage.

5.2.2 Performance-Approach Goal Orientation (Performance GO)

The Performance GO High group used mainly Probe Design and few other tools during Stage 1, yet used the Solar System Database more during Stage 4 (see Fig. 8.6, MeanSolarDB_High = 5.33, MeanSolarDB_Mid = 2.67, MeanSolarDB_Low = 3.1, F(2, 41) = 3.05, p = 0.06). The Performance GO Mid group showed high usage of Probe Design in Stage 2 (see Fig. 8.6, MeanProbeDesign_High = 3.56, MeanProbeDesign_Mid = 4.91, MeanProbeDesign_Low = 3.64). These patterns indicate tool use that was inappropriate to the problem-solving stage. On the other hand, the Performance GO Low group used the Alien Database significantly longer in Stage 3 (see Fig. 8.7, MeanAlienDB_High = 355.52, MeanAlienDB_Mid = 853.18, MeanAlienDB_Low = 1042.01, F(2, 69) = 3.678, p < 0.05). The Performance GO Mid and Low groups also used the Solar System Database longer in Stage 3 (MeanSolarDB_High = 689.60, MeanSolarDB_Mid = 966.43, MeanSolarDB_Low = 993.89) and used Probe Design significantly more frequently in Stage 4 (see Fig. 8.6, MeanProbeDesign_High = 1.75, MeanProbeDesign_Mid = 4.00, MeanProbeDesign_Low = 4.07, F(2, 53) = 4.061, p < 0.05). These patterns indicate more appropriate tool use for the problem-solving stages.
Fig. 8.6

Average frequency of tool use across four stages by performance-approach goal orientation groups

Fig. 8.7

Average duration of tool use across four stages by performance-approach goal orientation groups

5.2.3 Performance-Avoidance Goal Orientation (Performance-Avoid GO)

Figures 8.8 and 8.9 present tool use patterns by groups according to their degree of performance-avoidance. Since the students in the Performance GO High group were also the students in the Performance-Avoid GO High group, the pattern for this group was the same as above. The Performance-Avoid GO Low group showed significantly greater use of the Solar System Database in Stage 2 (MeanSolarDB_High = 2.14, MeanSolarDB_Mid = 1.50, MeanSolarDB_Low = 2.94, F(2, 63) = 4.991, p < 0.05), while the Performance-Avoid GO Mid group showed high usage of the Probe Design tool in this stage (MeanProbeDesign_High = 3.56, MeanProbeDesign_Mid = 5.11, MeanProbeDesign_Low = 3.65). The Performance-Avoid GO High group also used the Solar System Database significantly more during the last stage (MeanSolarDB_High = 5.33, MeanSolarDB_Mid = 2.44, MeanSolarDB_Low = 3.30, F(2, 41) = 3.617, p < 0.05).
Fig. 8.8

Average frequency of tool use across four stages by performance-avoidance goal orientation groups

Fig. 8.9

Average duration of tool use across four stages by performance-avoidance goal orientation groups

The Performance-Avoid GO Low group used the following tools longer during Stage 3: Probe Design (MeanProbeDesign_Low = 405.30, MeanProbeDesign_Mid = 80.53, MeanProbeDesign_High = 216.13), Probe Launch (MeanProbeLaunch_Low = 476.78, MeanProbeLaunch_Mid = 10.24, MeanProbeLaunch_High = 10.79), and Mission Control (MeanMissionControl_Low = 196.52, MeanMissionControl_Mid = 115.77, MeanMissionControl_High = 75.93). These patterns suggest that students in the Performance-Avoid GO Low group used tools more appropriate to the problem-solving stages, while the Performance-Avoid GO High group seemed only to explore the more fun tools, such as Probe Design, Probe Launch, and Mission Control, in Stage 1.

5.3 How Do Play-Learners with Different Performance Scores Access the Tools?

Students in the High Solution (HS) group (n = 11, with scores ≥4 out of 7) used the cognitive load tools, specifically the Alien and Solar System Databases, significantly longer than students in the Low Solution (LS) group (n = 27, with scores <4): MeanSolarDB_High = 492.70, MeanSolarDB_Low = 311.71, F(1, 490) = 11.94, p < 0.01; MeanAlienDB_High = 705.31, MeanAlienDB_Low = 438.15, F(1, 714) = 30.572, p < 0.001 (see Figs. 8.10 and 8.11). For HS students, use of the tools supporting otherwise out-of-reach activities increased and peaked on day 8, and use of the hypothesis-testing tools increased and peaked on day 10, indicating that they had begun to integrate information and test their hypotheses. HS students also utilized the Notebook tool more often and for longer in the initial days than did LS students. Together these patterns indicated more active use of the tools appropriate to the four stages by the HS group. The HS group also used most of the cognitive load tools longer than LS students did, indicating that these HS students took more advantage of the domain-knowledge scaffolding provided by the serious game.
Fig. 8.10

Average frequency and duration of tool use by the four tool categories and solution groups (lines representing frequency and areas representing duration)

Fig. 8.11

Average frequency and duration of individual tool use by solution groups (lines representing frequency and areas representing duration)

6 Discussion and Implications

The visualizations revealed several patterns of relevance to our ongoing efforts to design and enhance serious games such as Alien Rescue. The ultimate goal is to design effective scaffolds based upon our growing understanding of learner behaviors.

6.1 General Patterns of Tool Use

In general, the results supported our previous research into the four stages of the problem-solving process of AR (Liu & Bera, 2005; Liu et al., 2009). This is significant because play-learners are allowed to move through the process at their own pace and are not guided in how to proceed. In addition, they more frequently accessed and spent more time with the six tools that are most vital for solving the central problem. That the play-learners generally play the game “as intended” stands testament to the pedagogical soundness of the design.

The Notebook, which supports cognitive processes related to the synthesis and application of knowledge, was only infrequently accessed by the students. We wondered why, since we consider the Notebook to be an integral part of the AR problem-solving process (Liu et al., 2009; Liu, Horton, Toprac, & Yuen, 2012). This finding can possibly be explained by our classroom observations over the years, which revealed that teachers often assigned worksheets during the AR unit that perform functions similar to the Notebook’s (Liu, Wivagg, Geurtz, Lee, & Chang, 2012). It is likely that students are doing the work of recording and organizing information on these paper worksheets rather than with the built-in Notebook tool, thereby achieving the same end by different means. However, such paper worksheets may or may not be designed with the problem-based learning pedagogical approach that is the foundation of this serious game, and they may take away from the immersive experience of the play-learners. For future improvements to AR, we hope to address this undesired outcome by making the content of students’ notes available to the teacher, thereby eliminating the impetus to assign paper-and-pencil work.

The Alien and Solar System Databases represent two critical tools for gathering information and were therefore frequently accessed, yet the visualization showed that students tended to stay in the Alien Database much longer than in the Solar System Database. This finding can possibly be explained by the fact that the Solar System Database can be accessed at any time via a pop-up window, whereas students must navigate to the Research Lab to view the Alien Database (see Fig. 8.1a); students can therefore open the Alien Database first and consult the Solar System Database concurrently. Another possible explanation is that the Alien Database, with its 3D models and animations, may simply be more engaging for students, as our previous research has indicated (Liu et al., 2013).

6.2 Productive Tool Use by High-Performance and Mastery Goal Orientation Groups

Our previous research has indicated that high-performing and low-performing students differed in their patterns of tool use (Liu & Bera, 2005). The present study confirmed this finding and additionally linked the similar pattern of productive tool use shown by high-performing students to those with a mastery goal orientation, as might be expected from the previously established connection between goal orientation and performance (Hsieh, Cho, Liu, & Schallert, 2008). Students in the High Solution and Mastery GO High groups tended to use the tools more appropriately according to the problem-solving stages. HS students used cognitive load and processing tools more and longer during Stages 2 and 3 and Probe Design and Launch centers during Stages 3 and 4, exactly when these tools are most pertinent. Since all students in a class are generally given the same amount of time to solve the central problem, less productive tool use can affect performance scores, as shown by the findings.

Concerning the other two goal orientation groups related to performance, the patterns are less straightforward. In our sample, the same students appeared in both the Performance GO High and the Performance-Avoid GO High groups. This puzzling result is perhaps due to the small sample size, which limits the conclusions that can be drawn. What is more, although the performance-related goal orientation groups showed active use of tools at times, they did not show a clear pattern of in-game productivity, in contrast to the high-performing and mastery-oriented groups.

Goal orientation indicates a student’s motivations for completing an academic task, which play an influential role on behaviors and performance (Ames, 1992; Dweck, 1986). Students with a mastery goal orientation tend to focus more on mastering a task and acquiring new skills, and less on how competent they look in front of others (performance-approach goal) or on avoiding unfavorable judgments of capabilities and embarrassment in front of peers (performance-avoidance goal) (Elliot, 1999; Elliot & Harackiewicz, 1996). The findings from this study offered some evidence in support of the literature on goal orientations (Middleton & Midgley, 1997; Midgley & Urdan, 1995; Pajares, Britner, & Valiante, 2000) in that students with a mastery goal orientation tend to show more positive patterns of learning, while students with performance-approach or performance-avoidance goals appear to try to find a quick way to solve the complex problem and do not exhibit purposeful learning patterns.

6.3 Visualization as a Promising Technique for Serious Games Analytics

Our experience of visually exploring log data in combination with data from traditional sources indicates that visualization is a promising technique in serious games analytics, especially with multidimensional data sets. Visualization facilitates interpretation of the relationships among multiple data points at no cost to the complexity of the data (Milam & El Nasr, 2010; Scarlatos & Scarlatos, 2010). In our study, by displaying data points (tool use frequency, duration) over days and across stages according to different grouping variables (performance levels and goal orientations) in a multidimensional way, visualization helped present the data and reveal findings not easily detected using traditional measures. The findings confirmed some of our previous research findings and, more importantly, also revealed areas that call for further research. For example, the Mastery GO High group used both the Solar System and Alien Databases consistently more throughout the four stages than the Mastery GO Low group did. Why? Could this be attributed to their mastery goal orientation or to other factors? The Performance GO High group used mainly Probe Design and few other tools during Stage 1. Does their goal orientation have anything to do with this finding? The findings showed the potential of using visualization to facilitate the interpretation of how multiple data points may contribute to the patterns of play-learners’ behaviors as they engage in an SG environment, and they provided empirical support for the use of multifaceted approaches to visually represent complex and sophisticated information (Drachen & Canossa, 2009; Linek et al., 2008; Wallner & Kriglstein, 2013).

6.4 Limitations and Future Directions

This study involved discovering patterns of play-learner behavior among students grouped by performance levels and goal orientations. Therefore, we limited the log data to the students who completed at least one of the measures, which reduced the overall sample size; the small size of the matched data set used in this analysis is a limitation. For the log data, it was first necessary to manually compare the time stamps with the school calendar to calculate how long a class used AR while eliminating school holidays and testing days. As part of our future work, we intend to develop code that parses the log data into a more usable format and automates this processing.
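One way this calendar comparison might be automated is to drop log entries whose dates fall on known non-instructional days and then count the remaining distinct class days. The sketch below is illustrative only; the dates and column names are placeholders, not the school’s actual calendar or log schema:

```python
import pandas as pd

# Placeholder calendar of non-instructional days (school holidays, testing days).
excluded_days = pd.to_datetime(["2014-03-10", "2014-03-11", "2014-04-01"])

log = pd.read_csv("alien_rescue_log.csv", parse_dates=["start_time"])
log["date"] = log["start_time"].dt.normalize()

# Keep only entries from instructional days, then count distinct days of AR use.
instructional = log[~log["date"].isin(excluded_days)]
class_days = instructional["date"].nunique()
print(f"Days of AR use after removing holidays/testing days: {class_days}")
```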

Our research group plans to continue this line of inquiry in several ways. First, we are designing an interactive dashboard for teachers, which will enable them to monitor students’ work more closely and to facilitate or intervene in a play-learner’s activity as needed. Visualizations, including some presented in this chapter, will allow teachers to monitor activities at the level of the classroom and of the individual student, thereby facilitating both classroom management and grading. As we continue to refine our analytics and visualization techniques, we hope to replace the paper-and-pencil worksheets with more empirically tested analytics. We consider the exploration of visualization reported in this chapter an important initial step in our application of serious games analytics to AR.

A second application of SEGA to AR will involve the provision of cognitive feedback to play-learners in the environment through visualizations. Thus far, in-game scaffolding and teacher support have been, for practical reasons, restricted to information about the task itself. The introduction of analytics-based visual feedback to play-learners can provide feedback on their decision-making processes and the effectiveness of those decisions, thus increasing the potential of success for all students (Balzer, Doherty, & O’Connor, 1989).

We used the commercial data visualization software Tableau for the data analysis in this study. The ready-made visualizations created with this software facilitated our exploratory analysis, but the output cannot be fully customized to fit our future needs for displaying just-in-time visualizations within the context of this SG. We are therefore also exploring in-house development of visualizations that can convey the data in forms consistent with the serious game context. Figure 8.12 represents our initial effort: we used Processing (software, retrieved from https://www.processing.org/download/, 2001) to visualize the overall tool use patterns using a solar system metaphor aligned with the theme of AR. There are four solar systems, in which each sun represents a tool category and each revolving planet signifies a tool in that category. The size of every object, including suns, planets, and moons, indicates the average frequency of tool use. This is our preliminary attempt to situate data visualization within the specific serious game context. We will continue to pursue this endeavor, particularly in conjunction with efforts to provide gameplay data to teachers and students, as outlined above.
Fig. 8.12

Average frequency of tool use by four categories
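As a rough illustration of the mapping Fig. 8.12 uses, the sketch below draws one "sun" per tool category and sizes each "planet" by a tool’s average access frequency. It is a static matplotlib approximation, not the animated Processing visualization described above, and the frequency values are placeholder numbers, not the study’s reported means:

```python
import matplotlib.pyplot as plt

# Placeholder average access frequencies per tool, grouped by tool category.
categories = {
    "Cognitive load": {"Alien DB": 2.0, "Solar System DB": 3.0},
    "Cognitive process": {"Notebook": 1.5},
    "Out-of-reach activities": {"Probe Design": 4.0, "Probe Launch": 2.5},
    "Hypothesis testing": {"Mission Control": 4.0},
}

fig, ax = plt.subplots(figsize=(8, 3))
for x, (category, tools) in enumerate(categories.items()):
    ax.scatter(x, 0, s=2000, alpha=0.3)            # the "sun": one per category
    for offset, (tool, freq) in enumerate(tools.items(), start=1):
        ax.scatter(x, offset, s=freq * 300)        # "planets": size encodes frequency
        ax.annotate(tool, (x, offset), textcoords="offset points", xytext=(10, 0))
ax.set_xticks(range(len(categories)))
ax.set_xticklabels(categories.keys())
ax.set_yticks([])
plt.tight_layout()
plt.show()
```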

7 Conclusion

We have reported on a study using serious games analytics and data visualizations to discover patterns of play-learner behaviors in Alien Rescue, a serious game for sixth-grade space science. Play-learners’ use of built-in cognitive tools was visually presented in multiple formats and discussed according to trends among all students in the sample and between groups that differed according to performance levels and goal orientations. The results showed that specific patterns of tool use do indeed correlate with successful performance. The results were discussed in terms of the pedagogical implications for the design of the serious game and the integral role that serious games analytics and data visualization will play in that effort.


Acknowledgments

We would like to acknowledge the help of Damilola Shonaike in creating the image in Fig. 8.12 as part of her 2014 summer CERT REU internship program. We also appreciate the help of Divya Thakur and Kelly Gaither from the Texas Advanced Computing Center at the University of Texas at Austin in exploring the use of the Processing language to create visualizations in the specific game environment.

References

  1. Abt, C. C. (1970). Serious games. New York: The Viking Press.
  2. Ames, C. (1992). Achievement goals and classroom motivational climate. In J. Meece & D. Schunk (Eds.), Students’ perceptions in the classroom (pp. 327–348). Hillsdale, NJ: Erlbaum.
  3. Andersen, E., Liu, Y. E., Apter, E., Boucher-Genesse, F., & Popović, Z. (2010). Gameplay analysis through state projection. In Proceedings from The Fifth International Conference on the Foundations of Digital Games, Pacific Grove, CA (pp. 1–8). doi:10.1145/1822348.1822349.
  4. Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., et al. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Longman.
  5. Baker, R. S., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining, 1(1), 3–17.
  6. Balzer, W. K., Doherty, M. E., & O’Connor, R. (1989). Effects of cognitive feedback on performance. Psychological Bulletin, 106(3), 410.
  7. Barab, S. A., Gresalfi, M., & Ingram-Goble, A. (2010). Transformational play: Using games to position person, content, and context. Educational Researcher, 39(7), 525–536.
  8. Bransford, J. D., & Stein, B. S. (1984). The IDEAL problem solver. New York: W.H. Freeman and Company.
  9. Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141–178. doi:10.1207/s15327809jls0202_2.
  10. Cobb, P., Confrey, J., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13. doi:10.3102/0013189X032001009.
  11. Dede, C. (2014, May 6). Data visualizations in immersive, authentic simulations for learning [Flash slides]. Retrieved from http://www.edvis.org/tuesday-presentations/
  12. Dixit, P. N., & Youngblood, G. M. (2008). Understanding playtest data through visual data mining in interactive 3D environments. In Proceedings from the 12th International Conference on Computer Games: AI, Animation, Mobile, Interactive Multimedia and Serious Games (CGAMES) (pp. 34–42).
  13. Djaouti, D., Alvarez, J., Jessel, J. P., & Rampnoux, O. (2011). Origins of serious games. In M. Ma, A. Oikonomou, & L. C. Jain (Eds.), Serious games and edutainment applications (pp. 25–43). Berlin, Germany: Springer.
  14. Drachen, A., & Canossa, A. (2009). Towards gameplay analysis via gameplay metrics. In Proceedings from the 13th International MindTrek Conference: Everyday Life in the Ubiquitous Era (pp. 202–209). ACM. doi:10.1145/1621841.1621878.
  15. Dweck, C. S. (1986). Motivational processes affecting learning. American Psychologist, 41, 1040–1048.
  16. Elliot, A. J. (1999). Approach and avoidance motivation and achievement goals. Educational Psychologist, 34, 169–189.
  17. Elliot, A. J., & Harackiewicz, J. M. (1996). Approach and avoidance achievement goals and intrinsic motivation: A mediational analysis. Journal of Personality and Social Psychology, 70, 461–475.
  18. Garzotto, F. (2007). Investigating the educational effectiveness of multiplayer online games for children. In Proceedings from the 6th International Conference on Interaction Design and Children, Aalborg, Denmark (pp. 29–36). doi:10.1145/1297277.1297284.
  19. Holcomb, J., & Mitchell, A. (2014, March). The revenue picture for American journalism and how it is changing. Retrieved from http://www.journalism.org/2014/03/26/the-revenue-picture-for-american-journalism-and-how-it-is-changing/
  20. Hsieh, P., Cho, Y., Liu, M., & Schallert, D. (2008). Examining the interplay between middle school students’ achievement goals and self-efficacy in a technology-enhanced learning environment. American Secondary Education, 36(3), 33–50.
  21. Johnson, L., Adams Becker, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2013). NMC horizon report: 2013 Higher Education Edition. Austin, TX: The New Media Consortium.
  22. Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC horizon report: 2014 Higher Education Edition. Austin, TX: The New Media Consortium.
  23. Lajoie, S. P. (1993). Computer environments as cognitive tools for enhancing learning. In S. P. Lajoie & S. J. Derry (Eds.), Computers as cognitive tools (pp. 261–288). Hillsdale, NJ: Lawrence Erlbaum Associates.
  24. Linek, S. B., Marte, B., & Albert, D. (2008). The differential use and effective combination of questionnaires and logfiles. In Computer-Based Knowledge & Skill Assessment and Feedback in Learning Settings (CAF), Proceedings from The International Conference on Interactive Computer Aided Learning (ICL), Villach, Austria.
  25. Linek, S. B., Öttl, G., & Albert, D. (2010). Non-invasive data tracking in educational games: Combination of logfiles and natural language processing. In L. G. Chova & D. M. Belenguer (Eds.), Proceedings from INTED 2010: International Technology, Education and Development Conference, Valencia, Spain.
  26. List, J., & Bryant, B. (2014, March). Using Minecraft to encourage critical engagement of geography concepts. In Society for Information Technology & Teacher Education International [Conference Proceedings] (pp. 2384–2388). Jacksonville, FL.
  27. Liu, M., & Bera, S. (2005). An analysis of cognitive tool use patterns in a hypermedia learning environment. Educational Technology Research and Development, 53(1), 5–21. doi:10.1007/BF02504854.
  28. Liu, M., Bera, S., Corliss, S., Svinicki, M., & Beth, A. (2004). Understanding the connection between cognitive tool use and cognitive processes as used by sixth graders in a problem-based hypermedia learning environment. Journal of Educational Computing Research, 31(3), 309–334.
  29. Liu, M., Horton, L. R., Corliss, S. B., Svinicki, M. D., Bogard, T., Kim, J., et al. (2009). Students’ problem solving as mediated by their cognitive tool use: A study of tool use patterns. Journal of Educational Computing Research, 40(1), 111–139.
  30. Liu, M., Horton, L., Kang, J., Kimmons, R., & Lee, J. (2013). Using a ludic simulation to make learning of middle school space science fun. The International Journal of Gaming and Computer-Mediated Simulations, 5(1), 66–86. doi:10.4018/jgcms.2013010105.
  31. Liu, M., Horton, L., Toprac, P., & Yuen, T. T. (2012). Examining the design of media-rich cognitive tools as scaffolds in a multimedia problem-based learning environment. In Educational media and technology yearbook (pp. 113–125). New York: Springer.
  32. Liu, M., Wivagg, J., Geurtz, R., Lee, S.-T., & Chang, H. M. (2012). Examining how middle school science teachers implement a multimedia-enriched problem-based learning environment. Interdisciplinary Journal of Problem-Based Learning, 6(2), 46–84.
  33. Loh, C. S. (2008). Designing online games assessment as “Information Trails”. In V. Sugumaran (Ed.), Intelligent information technologies: Concepts, methodologies, tools, and applications (pp. 553–574). Hershey, PA: Information Science Reference. doi:10.4018/978-1-59904-941-0.ch032.
  34. Loh, C. S. (2011). Using in situ data collection to improve the impact and return of investment of game-based learning. In Old Meets New: Media in Education—Proceedings of the 61st International Council for Educational Media and the XIII International Symposium on Computers in Education (ICEM & SIIE’2011) Joint Conference (pp. 801–811). doi:10.4018/jvple.2013010101.
  35. Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588–599. doi:10.1016/j.compedu.2009.09.008.
  36. Middleton, M. J., & Midgley, C. (1997). Avoiding the demonstration of lack of ability: An underexplored aspect of goal theory. Journal of Educational Psychology, 89, 710–718.
  37. Midgley, C., Maehr, M. L., Hruda, L. Z., Anderman, E., Anderman, L., Freeman, K. E., et al. (2000). Patterns of adaptive learning scales (PALS). Ann Arbor, MI: University of Michigan.
  38. Midgley, C., & Urdan, T. (1995). Predictors of middle school students’ use of self-handicapping strategies. The Journal of Early Adolescence, 15, 389–411.
  39. Milam, D., & El Nasr, M. S. (2010, July). Design patterns to guide player movement in 3D games. In Proceedings of the 5th ACM SIGGRAPH Symposium on Video Games (pp. 37–42). ACM. doi:10.1145/1836135.1836141.
  40. Pajares, F., Britner, S., & Valiante, G. (2000). Relation between achievement goals and self-beliefs of middle school students in writing and science. Contemporary Educational Psychology, 25, 406–422.
  41. Reese, D. D., Tabachnick, B. G., & Kosko, R. E. (2013). Video game learning dynamics: Actionable measures of multidimensional learning trajectories. British Journal of Educational Technology. doi:10.1111/bjet.12128.
  42. Rideout, V. J., Foehr, U. G., & Roberts, D. F. (2010, January). Generation M2: Media in the lives of 8- to 18-year-olds. Kaiser Family Foundation. Retrieved from http://kff.org/other/poll-finding/report-generation-m2-media-in-the-lives/
  43. Rieber, L. (1996). Seriously considering play: Designing interactive learning environments based on the blending of microworlds, simulations, and games. Educational Technology Research and Development, 44(2), 43–58. doi:10.1007/BF02300540.
  44. Romero, C., & Ventura, S. (2010). Educational data mining: A review of the state of the art. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 40(6), 601–618. doi:10.1109/TSMCC.2010.2053532.
  45. Romero, C., & Ventura, S. (2013). Data mining in education. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 3(1), 12–27. doi:10.1002/widm.1075.
  46. Romero, C., Ventura, S., & García, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51(1), 368–384. doi:10.1016/j.compedu.2007.05.016.
  47. Rosenbaum, E., Klopfer, E., & Perry, J. (2007). On location learning: Authentic applied science with networked augmented realities. Journal of Science Education and Technology, 16(1), 31–45. doi:10.1007/s10956-006-9036-0.
  48. Salen, K., & Zimmerman, E. (2004). Rules of play: Game design fundamentals. Cambridge, MA: MIT Press.
  49. Sawyer, B., & Smith, P. (2008). Serious games taxonomy [PDF document]. Retrieved from http://www.dmill.com/presentations/serious-games-taxonomy-2008.pdf
  50. Scarlatos, L. L., & Scarlatos, T. (2010). Visualizations for the assessment of learning in computer games. In 7th International Conference & Expo on Emerging Technologies for a Smarter World (CEWIT 2010), September 27–29, 2010, Incheon, Korea.
  51. Serrano, A., Marchiori, E. J., del Blanco, A., Torrente, J., & Fernández-Manjón, B. (2012, April). A framework to improve evaluation in educational games. In Proceedings from Global Engineering Education Conference (EDUCON), 2012 IEEE (pp. 1–8). IEEE. doi:10.1109/EDUCON.2012.6201154.
  52. Squire, K. D. (2004). Review. Simulation & Gaming, 35(1), 135–140. doi:10.1177/1046878103255490.
  53. Squire, K. D., & Jan, M. (2007). Mad City Mystery: Developing scientific argumentation skills with a place-based augmented reality game on handheld computers. Journal of Science Education and Technology, 16(1), 5–29. doi:10.1007/s10956-006-9037-z.
  54. Sweetser, P., & Wyeth, P. (2005). GameFlow: A model for evaluating player enjoyment in games. Computers in Entertainment, 3(3), 3–3.
  55. Tanes, Z., & Cemalcilar, Z. (2010). Learning from SimCity: An empirical study of Turkish adolescents. Journal of Adolescence, 33(5), 731–739.
  56. U.S. Department of Education, Office of Educational Technology. (2012). Enhancing teaching and learning through educational data mining and learning analytics: An issue brief. Washington, DC.
  57. van Barneveld, A., Arnold, K. E., & Campbell, J. P. (2012). Analytics in higher education: Establishing a common language. EDUCAUSE Learning Initiative. Retrieved from https://qa.itap.purdue.edu/learning/docs/research/ELI3026.pdf
  58. Wallner, G., & Kriglstein, S. (2013). Visualization-based analysis of gameplay data—A review of literature. Entertainment Computing, 4(3), 143–155. doi:10.1016/j.entcom.2013.02.002.

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Min Liu (1)
  • Jina Kang (1)
  • Jaejin Lee (2)
  • Elena Winzeler (3)
  • Sa Liu (1)
  1. The University of Texas at Austin, Austin, USA
  2. The University of Texas at Austin, Austin, USA
  3. The University of Texas at Austin, Austin, USA
