Introduction

Historically, students in the USA have trailed other industrialized nations in mathematics proficiency and achievement (Rave and Golightly 2014). Indeed, recent reports by the National Center for Education Statistics (Aud et al. 2011) and the National Mathematics Advisory Panel (NMAP 2008) agree that, although nationwide proficiency scores have recently improved, math proficiency continues to decrease substantially as students progress from one grade to the next. These same reports suggest that knowledge of basic math facts may be at the core of these math proficiency deficits. Researchers commonly agree that math fact fluency is essential for later success in more complex mathematics such as algebra (Geary 2011; Nelson et al. 2016). In fact, one report suggests that students who do not achieve fact fluency by the end of fifth grade are unlikely to develop fluency and automaticity in later grades (Steel and Funnell 2001). Therefore, improvements in early math facts education may provide the foundation necessary for later proficiency in math.

Math fact fluency is the ability to respond rapidly and accurately to basic math facts across the four operations (addition, subtraction, multiplication, and division) (Musti-Rao et al. 2015; Nelson et al. 2016). The development of fluency is a multi-step process in which a student progresses from basic counting to calculation and then to automatic recall (Baroody 2006). As a student becomes more fluent, older methods of calculation such as finger counting fall away until the student relies entirely on semantic memory (Lemaire and Siegler 1995).

Automaticity in math fact recall is particularly important for later math success because the development of automaticity is directly related to reduced demands on working memory and, relatedly, reduced cognitive load (Fuchs et al. 2005). An individual (young or old) who cannot automatically recall basic math facts must expend additional cognitive resources to solve a complex math problem: resources must first go to the mental calculation of basic facts before the solver can address other aspects of the problem, thus increasing the cognitive load or demand. In contrast, automatic recall of basic math facts reduces cognitive load by eliminating those extra calculations and freeing cognitive resources for the more complex aspects of math problems (Parkhurst et al. 2010).

Considering the importance of mastering math facts for advancing mathematical thinking, researchers have identified effective practices for building fluency. Specifically, effective fluency-building instruction should incorporate modeling (Codding et al. 2011), provide ample drill and practice with high rates of response (Hawkins et al. 2017; Riccomini et al. 2017), include immediate and corrective feedback (NMAP 2008), and incorporate an appropriate ratio of known to unknown facts (Riccomini et al. 2017).

In classrooms, incorporating these facets of effective instruction can be challenging. To provide enough drill and practice for students to master math facts, curricula must include practice activities with ample opportunities to respond, and teachers need to ensure that students have adequate time to engage with those activities. And yet, the NMAP (2008) documented that few curricula in the USA include the amount of practice necessary for students to develop math fact fluency. If curricula do not include adequate practice for mastery, teachers may not develop their own mastery materials or allocate adequate time for fluency building, thereby limiting students’ opportunities to become fluent with math facts.

When students practice math facts, practice should focus on appropriate ratios of known to unknown facts. Achieving this can be difficult. Students acquire math fact fluency at different rates and therefore need varying degrees of practice before achieving fluency and automaticity for all math facts (Burns et al. 2015). Also, the ratio of known to unknown facts will vary by student and change over time. For example, students with disabilities may initially require a 9:1 ratio to master math facts; that ratio can later be lowered to 3:1 as mastery increases (Riccomini et al. 2017). Other students may need lower initial ratios, with adjustments over time to account for mastery. Instructional methods that adapt to a student’s ability level and specific needs are more likely to be effective in teaching basic math facts than methods that do not adapt to individual differences.
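To make the ratio concrete, the sketch below assembles a drill list at a target known-to-unknown ratio. It is an illustrative Python sketch only; the function name, set sizes, and fact format are our own assumptions, not part of any cited curriculum or program.

    import random

    def build_practice_set(known, unknown, ratio=(9, 1), size=20):
        """Mix known and unknown facts at a target (known, unknown) ratio."""
        parts = sum(ratio)
        n_unknown = max(1, round(size * ratio[1] / parts))
        n_known = size - n_unknown
        items = (random.choices(known, k=n_known)
                 + random.choices(unknown, k=n_unknown))
        random.shuffle(items)  # interleave so unknown facts are spread out
        return items

    # A student early in mastery drills at 9:1; the ratio tightens to 3:1 later.
    known = [f"{a} x {b}" for a in range(1, 6) for b in range(1, 6)]
    unknown = ["7 x 8", "9 x 6", "8 x 6"]
    print(build_practice_set(known, unknown, ratio=(9, 1)))
    print(build_practice_set(known, unknown, ratio=(3, 1)))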

Feedback is a critical component of programs designed to build math fact fluency (NMAP 2008). When students practice math facts, they should have ample opportunities to practice with immediate feedback to prevent them from practicing incorrect responses. Mastery develops and strengthens as students practice responding correctly to math fact prompts (Fuchs et al. 2008). Without immediate feedback, if students answer math fact items incorrectly, they may assume that their incorrect responses are correct and then risk becoming fluent with wrong answers (Hawkins et al. 2017). In classrooms, teachers must ensure that all students receive immediate corrective feedback when practicing math facts. Feedback given after students have completed a practice session (e.g., after students complete an entire worksheet) is not likely to be as effective.

Another important consideration related to math fact fluency may be students’ interest, motivation, or engagement in the learning process (Kebritchi et al. 2010; Luo et al. 2009). Sustaining motivation can be challenging for young children, who may have trouble dedicating full attention to the memorization of math facts. However, attention is vital for the successful encoding of new information. Therefore, methods that can improve or maintain student attention are more likely to be successful in developing fluency and automaticity (Fuchs et al. 2005; Plass et al. 2013). As technology use continues to increase among younger children, traditional paper-and-pencil learning methods may be less appealing to incoming generations of students. Therefore, in anticipation of a misalignment between student expectations or interests and current teaching methods and tools, many educators are looking to educational technology as a means of increasing motivation and engagement (Plass et al. 2013). A survey of teachers in a southeastern region of the USA (Carver 2016) revealed that, despite some remaining barriers to technology integration, the most often reported benefit of classroom technology was an increase in student engagement. Although multiple educational technologies are already incorporated in classrooms, many lag behind entertainment technologies, such as high-end video games, in complexity and design. Therefore, more interesting, immersive, and high-quality digital tools may better maintain student engagement in the classroom (Levine and Vaala 2013; Boyle et al. 2015).

Computer-assisted instruction (CAI) is a potential educational tool capable of accounting for many of the previously mentioned challenges associated with delivering effective math facts instruction (Gross and Duhon 2013). CAI includes any type of computer software or technology designed to display instructional material and monitor learning progress in any educational topic (Cates 2005). When used with fidelity, CAI can aid teachers as a supplementary tool by providing opportunities for added practice and by differentiating the educational experience of each child. Other advantages of incorporating CAI in the classroom include immediate feedback, automated progress monitoring and adaptive instruction, increased engagement, and high accessibility. Multiple studies, including meta-analyses, aimed at assessing the efficacy of CAI as a supplementary educational tool have demonstrated positive learning outcomes for children who use CAI alongside standard classroom instruction for math facts learning (Burns et al. 2012; Gross and Duhon 2013; Hawkins et al. 2017).

Recent reports suggest that CAI and educational technologies improve student learning outcomes in mathematics (Musti-Rao and Plati 2015; Rave and Golightly 2014). The NMAP has recommended CAI as a promising means of developing math fact fluency and automaticity, recommending specifically that “high-quality CAI drill and practice, implemented with fidelity, be considered as a useful tool in developing students’ automaticity…” (NMAP 2008). Burns et al. (2012) demonstrated that the use of one computer-based math facts learning tool nearly doubled student growth in math fact fluency over approximately 11 weeks. In another report, by Rave and Golightly (2014), students who used a computerized math fact fluency program achieved approximately a 22% increase in test scores. Importantly, the authors observed growth both for students classified as needing special educational services and for students without special education needs. Despite the evidence that some CAI programs may improve math fact learning, considerable disagreement remains about the extent to which particular CAI programs benefit student learning. Indeed, a need remains for more and better empirical evidence demonstrating the impact of specific CAI programs on math facts education (Shin et al. 2012).

One CAI program specifically designed to assist in the development of fluency and automaticity in math facts is Imagine Math Facts by Imagine Learning. The Imagine Math Facts program consists of multiple educational video games designed to improve math fact fluency and automaticity by differentiating instruction for each user and focusing practice on unlearned math facts. Timez Attack is the Imagine Math Facts game designed to teach multiplication facts and is the focus of this study. Timez Attack may be particularly effective in teaching math facts because of several features designed to address common limitations of math facts teaching methods. Namely, its video game style promotes continuous engagement and attention. Also, because the game adapts to performance, each student receives individualized instruction that focuses on automaticity for known facts and mastery of unknown facts. As students engage with Timez Attack, the program models correct answers and provides immediate, corrective feedback for errors. Finally, Timez Attack provides ample practice for all math facts and is unconstrained by the usual progression from smaller numbers to larger numbers. Despite the apparent advantages of the Imagine Math Facts program, research is currently lacking to determine the efficacy of Timez Attack as a supplementary CAI program for teaching multiplication math facts. Therefore, we sought to determine whether the Timez Attack program was effective in teaching multiplication fact fluency and automaticity to third-grade children. We hypothesized that, in contrast to baseline performance, students who consistently played Timez Attack would demonstrate immediate and consistent improvement in multiplication fact fluency and automaticity.

Method

Study Design

Using a multiple baseline across groups design, we randomly assigned students from three third-grade classes to one of three equally sized study groups. We staggered the baseline-intervention schedules for each group to better establish causality between the intervention and learning outcomes and for replication within the study. Students assigned to group 1 completed five baseline assessments (spanning approximately two and a half weeks) and then used Timez Attack for the remaining seven and a half weeks of the study period. Students in group 2 completed seven baseline assessments (approximately three and a half weeks) and then used Timez Attack for the remaining six and a half weeks. Finally, students in group 3 completed nine baseline assessments and then used Timez Attack for the remaining five and a half weeks. Following the baseline and intervention phases, students discontinued use of the Timez Attack program and completed four more assessments during a maintenance phase. Throughout the baseline phase, students in each group used Imagine Language and Literacy, a CAI program that teaches English language and literacy, to prevent confounding of the intervention effect.

During the week in which students transitioned between the baseline and intervention phases, students completed one assessment immediately before playing Timez Attack for the first time. Therefore, the first assessment for that week still reflected baseline phase performance. Students completed the next assessment immediately before playing Timez Attack for the second time in the week thus reflecting intervention phase performance.

Participants and Setting

A total of 63 students who attended a charter school in a suburban area of the western USA participated in this study. All students were in third grade. Most enrolled students had previously used the Imagine Math Facts program during the same school year, but none had played Timez Attack (whether at school or at home) for training in multiplication fact fluency. Further, all students had already received classroom instruction for all multiplication math facts and continued to receive such instruction throughout the duration of the study.

Nearly equal numbers of male (n = 33) and female (n = 30) students participated in this study. The participating charter school serves approximately 700 students enrolled in grades K–8. Twenty percent of the student body belongs to an ethnic minority, 32% come from low socioeconomic backgrounds, 14% have some form of learning or physical disability, and less than 2% are English language learners.

Based on results from the most recent administration of a state-specific, standardized assessment administered annually (assessment name not provided to protect against possible deduction of student identities), 11 of the students were below proficient, 21 near proficient, 15 proficient, and 16 above proficient in mathematics. The 2015–2016 school federal accountability report for the participating charter school indicates that third-grade students at the school typically perform on par with local education agencies but better than state averages (values and citation not reported for confidentiality).

Measures

To assess multiplication fact fluency, we used an online multiplication fact generator from Intervention Central (“Math Work—Math Worksheet Generator” 2017) to randomly generate 22 paper-and-pencil assessments with 30 questions each. Each assessment was unique but comparable in content. The assessments included multiplication facts for digits 1 through 9. After handing out the assessments, the classroom teachers instructed the students to write their assigned identification numbers on the front side of the assessment and then immediately turn the assessment over to hide the math fact problems. The teachers then instructed the students to complete as many of the 30 multiplication facts as possible in 1 min. Once all students were ready, teachers gave a cue to turn the assessment over and began the 1-min timer. After exactly 1 min had passed, teachers cued the students to stop, and the assessments were immediately collected.
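We used Intervention Central’s online generator to produce the forms; purely for illustration, the Python sketch below reproduces comparable logic, drawing 30 random multiplication facts over the digits 1 through 9. The sampling scheme shown is an assumption, not the tool’s actual algorithm.

    import random

    def generate_assessment(n_items=30, digits=range(1, 10), seed=None):
        """Draw n_items random multiplication facts with factors 1-9."""
        rng = random.Random(seed)  # seed for a reproducible form
        pool = list(digits)
        return [f"{rng.choice(pool)} x {rng.choice(pool)} = ____"
                for _ in range(n_items)]

    for item in generate_assessment(seed=1)[:5]:  # preview the first 5 items
        print(item)

Note that unconstrained random draws allow form difficulty to drift (e.g., an excess of ×1 items), the very issue discussed under Visual Analysis below; stratifying the draws across factors would control this.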

The study authors scored the completed assessments. To ensure accurate and reliable scoring, 25% of the completed assessments were rescored by two other members of our research team. Ultimately, a student’s score for each assessment was the number of correctly answered multiplication fact problems in 1 min. We averaged the assessment scores for each study group for group level analyses and visualization.

Intervention

Imagine Math Facts creates educational games designed to improve math fact fluency and automaticity in addition, subtraction, multiplication, and division. The games adapt to student performance, tailoring each student’s learning experience to his or her ability level. Timez Attack is an Imagine Math Facts game designed to teach multiplication facts by placing the learner in an immersive 3D environment and requiring them to navigate through the virtual world, answering multiple sets of multiplication facts to progress through the game. Prior to standard game play, each student completes a pretest designed to identify which math facts are unmastered and require additional practice. This pretest acts as the first level of differentiated instruction. As users progress through the game, the program uses performance data to modify the order and content of practice sessions to focus on unmastered facts. Users also navigate through multiple environments designed to maintain interest and engagement. After demonstrating mastery of all multiplication facts, students complete a posttest. Due to the adaptive nature of the program, some students progress through the game faster than others. In this study, we expected that most students would complete the game after approximately 3 months of use with two 20-min sessions per week. Therefore, to ensure that most students would still be using Timez Attack for the full duration of the study, we limited the combined baseline-intervention phases to 10 weeks. Students who completed Timez Attack before the end of the intervention phase restarted the program with “Ninja Mode” activated, which reduces the allowed response time for each math problem and further supports automaticity.

Procedures

Students completed two multiplication fact assessments per week for the duration of the study. During the baseline phase, students completed assessments immediately before their regularly scheduled computer time. At the start of the intervention phase, students completed the assessments immediately prior to each Timez Attack session. By assessing students immediately before a session rather than after one, we ensured that assessment performance would not reflect immediate carryover from a just-completed Timez Attack session. Because students used Timez Attack during their regular computer time, they completed all assessments on the same days and at the same times each week for the duration of the study.

During the intervention phase, students played Timez Attack twice per week with each session lasting between 20 and 30 min. As per teacher instruction, students could only use Timez Attack during the scheduled times in school and not at home. All students played on standard PCs and used either Google Chrome or Mozilla Firefox Web browsers to play the most recent Web-accessible version of the game.

To determine the persistence of math facts learning, a 3-week maintenance phase followed the intervention phase. The first week of the maintenance phase spanned the school’s regularly scheduled spring break thereby providing a natural separation between the intervention and maintenance phases. In the 2 weeks following the spring break, students completed two assessments per week for a total of four additional assessments. Teachers instructed their students to not play Timez Attack in school or at home throughout the entirety of the maintenance phase. Teachers also emailed parents at the beginning of the spring break to ensure students did not play Timez Attack at home until after the completion of the maintenance phase. To confirm that students did not access the program, the research team accessed usage records and verified participants did not use the program during the maintenance phase or during spring break.

Results

Typically, visual inspection and analysis are used to present the results of multiple baseline across groups studies. Data trends, variability, and other characteristics are used to judge the effect of the intervention on student learning over the course of the study. In our study, we used Stata version 14.2 (2015) to create all graphics. For each study group, we computed mean scores for each assessment and plotted them to show the change in average assessment scores over the duration of the study.
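As a language-neutral illustration of that computation (our actual graphics were produced in Stata), a minimal Python sketch follows that averages scores by group and assessment and plots the three series; the file and column names are assumptions.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Assumed layout: one row per student per assessment, with columns
    # "group" (1-3), "assessment" (1-22), and "score" (0-30).
    df = pd.read_csv("assessment_scores.csv")

    # Mean score per group per assessment, one column per group
    means = df.groupby(["group", "assessment"])["score"].mean().unstack(level=0)

    fig, ax = plt.subplots()
    for g in means.columns:
        ax.plot(means.index, means[g], marker="o", label=f"Group {g}")
    ax.set_xlabel("Assessment number")
    ax.set_ylabel("Mean score (max = 30)")
    ax.legend()
    plt.show()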

Missing Data

Teachers administered each assessment only once. Therefore, some students could miss an assessment administration due to absence or other unforeseeable reasons. Following data collection, we determined that approximately 62% of the students were missing data for one or two of the 22 assessments administered in the study (including the maintenance phase); the maximum number of missed assessments for any student was five (n = 2).

As groups were relatively small, missing data for any of the assessments could produce substantial variation in average performance depending on which students had available data. Based on the study design and the structure of the data, we concluded that mean imputation would be the most appropriate and conservative method for accounting for missing data. More sophisticated methods, such as multiple imputation, are not well suited to time series data of this kind, particularly when the primary means of analysis is graphic visualization. Mean imputation, by contrast, allowed us to impute at the individual student level: if a student was missing data for an assessment, the imputed value was that student’s average score across all assessments administered during the baseline and intervention phases.
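A minimal sketch of that per-student mean imputation follows; it assumes a wide table with one row per student and one column per assessment (the file and column names are illustrative).

    import pandas as pd

    # Assumed layout: one row per student, columns a1..a22 for the 22
    # assessments, with NaN marking a missed administration.
    scores = pd.read_csv("scores_wide.csv", index_col="student_id")

    # Each student's mean over the 18 baseline and intervention assessments
    practice_cols = [f"a{i}" for i in range(1, 19)]
    student_means = scores[practice_cols].mean(axis=1, skipna=True)

    # Fill each student's missing scores with that student's own mean
    imputed = scores.apply(lambda row: row.fillna(student_means[row.name]),
                           axis=1)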

To determine whether data were missing at random, we regressed a binary missing data indicator (1 = missing, 0 = non-missing) on teacher, standardized state assessment proficiency level, and study group. None of the regressions was statistically significant, suggesting that the missing data followed no obvious pattern.
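The paper does not specify the regression model used for this check; under the assumption of a logistic regression on the binary indicator, a sketch might look like this (file and column names are again illustrative):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Assumed layout: one row per student per assessment, with a 0/1
    # "missing" flag plus "teacher", "proficiency", and "group" columns.
    long = pd.read_csv("scores_long.csv")

    model = smf.logit("missing ~ C(teacher) + C(proficiency) + C(group)",
                      data=long).fit()
    print(model.summary())  # non-significant predictors -> no obvious pattern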

Visual Analysis

For each group, we observed a noticeable intervention effect at the transition between the baseline and intervention phases, with groups 1 and 2 showing the greatest change in performance over time. However, average scores on assessments in the baseline phase were less consistent than expected. After inspecting the assessments administered during that phase, we found that assessments 6 and 8 appeared to be easier than all others. For example, compared to the other forms, assessments 6 and 8 contained nearly twice as many problems that multiplied the number 1 by another number. Therefore, we can attribute some of the inconsistency in baseline performance to this variation in test difficulty, a risk inherent in generating each assessment randomly.

Figure 1 displays the average assessment scores for groups 1 through 3. Despite the variability in baseline performance, group 1 scores were relatively low during baseline (M = 13.0; range = 11.4–16.3). Scores immediately and consistently increased following the implementation of the intervention (M = 18.8; range = 12.2–24.6) and remained high following the removal of the intervention during maintenance (M = 23.1; range = 22.1–24.1).

Fig. 1

Average scores of the three study groups for each of the 22 multiplication fact assessments. The assessments are visually separated into three sections for the baseline, intervention, and maintenance phases. The maximum score was 30

In group 2, baseline scores were initially low and remained so, with relatively low variability, throughout the baseline phase (M = 11.7; range = 10.2–14.9). After the implementation of the intervention, average scores markedly increased and continued to increase with good stability for the remainder of the intervention phase (M = 16.4; range = 14.5–19.4). Scores also remained high throughout the maintenance phase (M = 18.2; range = 17.8–18.7).

Baseline variability was higher for group 3, largely due to the variation in difficulty for assessments 6 and 8. However, initial performance was low and remained lower, on average, than during the intervention phase (M = 14.1; range = 11.1–16.4). Once students began using Timez Attack, average scores improved consistently over the duration of the intervention phase (M = 17.5; range = 14.9–19.8). Group 3 performance remained consistently high during the maintenance phase (M = 21.2; range = 20.6–22.0).

While not an original study objective, we further disaggregated performance on each assessment by prestudy math proficiency levels (“Below Proficient”, “Near Proficient”, “Proficient”, and “Above Proficient”) as determined by a standardized state math assessment (Supplementary Table 1). This descriptive information is useful for crudely visualizing any effect differences due to prior ability level. Students classified as “Below Proficient” obtained an average assessment 1 score of 11.4 and an assessment 22 score of 16.4 indicating an average growth of 5.0 points. Subjects classified as “Near Proficient”, “Proficient”, and “Above Proficient” improved, on average, by 5.5, 7.4, and 11.1 points, respectively, between the first and last administered assessments. Therefore, this trend suggests that students who are generally more proficient in math before using Timez Attack will likely achieve greater gains in multiplication fact fluency from using the Timez Attack program.

Effect Sizes

While visual analysis is typically sufficient to demonstrate differences between the baseline and intervention phases, the calculation of effect sizes assists in interpretability and provides more conclusive evidence of a real effect. We determined that the calculation of effect sizes was particularly important for this study because of the higher variability in the baseline phase that could mask the degree to which Timez Attack might be improving performance.

Typically, effect sizes for studies that use a baseline-intervention design are computed by calculating the percentage of nonoverlapping data between the baseline and intervention phases. A widely accepted and preferred nonoverlap technique recently proposed by Parker and Vannest (2009) is the nonoverlap of all pairs (NAP). In contrast to more generalized nonoverlap methods such as percent of nonoverlapping data (PND) or the percent of data exceeding the mean (PEM), NAP requires the comparison of every baseline data point with every intervention data point to determine whether the intervention scores are above, below, or the same as the baseline scores. Most importantly, NAP is relatively unaffected by heteroscedasticity and other data distortions such as the relatively high variability of baseline scores in this study. We calculated the NAP effect size by dividing the total number of nonoverlap pairs by the total possible comparisons. By investigating all possible comparisons, the resulting NAP fraction is “the probability that a score drawn at random from a treatment phase will exceed (overlap) that of a score drawn at random from a baseline phase” (Parker and Vannest 2009). A more detailed description of how to calculate the NAP effect size and how it compares to other nonoverlap methods is in a review by Parker et al. (2011).
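To make the computation concrete, here is a minimal Python sketch of NAP as described by Parker and Vannest (2009), with ties conventionally weighted 0.5; the scores in the example are invented for illustration and are not the study’s data.

    from itertools import product

    def nap(baseline, intervention):
        """Nonoverlap of all pairs (Parker and Vannest 2009).

        Compare every baseline score with every intervention score,
        scoring an improvement as 1 and a tie as 0.5, then divide by
        the total number of pairs.
        """
        pairs = list(product(baseline, intervention))
        wins = sum(1.0 if i > b else 0.5 if i == b else 0.0 for b, i in pairs)
        return wins / len(pairs)

    # Invented scores: 5 baseline and 7 intervention assessments
    print(nap([11, 13, 12, 16, 13], [12, 17, 19, 21, 22, 24, 24]))  # 0.9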

For reference, an NAP value of .50 indicates no difference, or complete overlap, between the baseline and intervention phases, while values below or above .50 indicate increasingly worse or better performance relative to baseline. For students in group 1, NAP was .97, while the NAP values for groups 2 and 3 were .96 and .88, respectively. These NAP values suggest a very strong treatment effect and indicate that a student who uses Timez Attack is very likely to experience significant improvements in multiplication math fact fluency.

Fidelity of Implementation

To monitor program implementation, the principal investigator was present at the participating school for the first 2 weeks of the study, during which each group began using Timez Attack, and periodically throughout the remainder of the study to ensure that students were using the program correctly and consistently. Further, we generated program usage reports each week to monitor the amount of time (in minutes) each student spent on the program. At the end of the intervention phase, students in group 1 had played Timez Attack for an average of 210 min, while groups 2 and 3 had played for 193 and 177 min, respectively. By the end of the intervention phase, students in group 1 had, on average, completed 67% of the program, students in group 2 had completed 53%, and students in group 3 had completed about 61%. Due to the staggered start design of the study, we expected differences in usage levels between study groups. However, the fact that group 3, on average, progressed through more of the program than group 2 suggests that there may have been some differences in prior ability level between the study groups. Among the three study groups, group 3 included the highest proportion of students who had received prior state math assessment placements of “proficient” or higher. Therefore, this small advantage in prior math proficiency may have allowed students in group 3 to progress farther in the program despite playing Timez Attack for fewer weeks.

One student played Timez Attack once during the maintenance phase for approximately 10 min but did not progress any further in the program. A total of five students required activation of “Ninja Mode” due to completion of the base version of Timez Attack before the end of the intervention phase.

Social Validity

Virtually all students who participated in this study regularly expressed enjoyment and excitement as they played Timez Attack. For example, teachers reported that the students randomly assigned to groups 2 and 3 frequently expressed their desires to start using Timez Attack as soon as possible, particularly as they observed the students in group 1 using the program. Further, when asked during observations, students indicated that they preferred using Timez Attack to learn multiplication facts because it was fun, easy to use, and they felt like they were learning. Students often included unsolicited, written messages on the back sides of the assessments such as “I love [Imagine Math Facts]!” or “[Imagine Math Facts] rocks!” For the final assessment, students wrote short letters on the back of the assessment to share their appreciation for Imagine Math Facts. While many of the letters acknowledged that learning the multiplication facts could be difficult at times, all were positive in tone and message. Below is an unmodified sampling of those notes.

  • I like [Imagine Math Facts]! It helps me to learn multiplication. It is also a fun way to learn math. It can be hard, but it helps a lot.

  • I love [Imagine Math Facts]. It has helped me alout. I am a lot better at multiplication. I also like the levels. It is hard to do but fun.

  • I would just like to thank you for [Imagine Math Facts]. It has really helped me with multiplication. [It] helped me memorize all of the 6’s! I think that [Imagine Math Facts] is the funnest math game!

  • I love [Imagine Math Facts]. It has help me alot with my moutapulcasnon. thanck for picking our school to do this porject. it was verey, verey fun. facts are so much fun.

During observations, each of the three third-grade teachers who monitored the students in this study expressed overall satisfaction with Timez Attack and indicated that the students were eager to use the program each week. In a visit with the teachers after the conclusion of the intervention phase, each expressed positive impressions of the impact of Timez Attack on multiplication fact fluency. The teachers were most impressed with how the program tailored each student’s learning experience to their specific needs. All three teachers indicated that they would be interested in continuing to use Imagine Math Facts games to supplement math fact teaching in future school years.

Discussion

In this multiple baseline across groups study, we found that third-grade students who used Timez Attack, an Imagine Math Facts game, improved in multiplication fact fluency over a 12-week intervention and maintenance period, as determined by visual analysis and the calculation of effect sizes. Despite greater-than-expected variability in baseline measures, each of the three study groups demonstrated a consistent trend of improved performance following the onset of the intervention phase. Further, the positive trends observed during the intervention phase continued during the maintenance phase, when students discontinued use of Timez Attack. Based on these findings, we believe that Timez Attack is an effective tool for improving multiplication fact fluency in third-grade students and that educational agencies that use this program for multiplication fact education would likely observe similar results. While the efficacy of CAI options may differ, programs such as Imagine Math Facts may provide reliable solutions for addressing some of the limitations associated with delivering effective math facts instruction.

The results of this study emphasize the potentially valuable role of CAI in improving math fact fluency, particularly in third-grade students. Indeed, these results are consistent with other reports demonstrating improved math fact fluency with the use of CAI (Gross and Duhon 2013; Musti-Rao and Plati 2015; Rave and Golightly 2014). The game’s video game style makes it engaging, and our results suggest it is also effective as a training tool for multiplication fact fluency. Indeed, throughout the study, students regularly commented on the fun and engaging style of the game. At the beginning of the study, after observing the first group of students scheduled to begin using Timez Attack, students assigned to the second or third baseline groups frequently expressed their eagerness to begin using the program. After the conclusion of the study, we received written and verbal comments from students indicating their enjoyment in using the program.

Beyond student engagement, Timez Attack incorporates many of the recommended, effective practices for improving math fact fluency, such as modeling, drill and practice, immediate and regular feedback, and adaptive, individualized presentation. In the Timez Attack program, students encounter a unique series of multiplication fact problems based on their performance on a pretest and on previously completed practice sets. Indeed, prior performance determines the presentation and testing order of specific math facts. As in other academic subjects, students achieve math fact fluency at different rates and master some facts faster than others (Burns et al. 2015). Therefore, based on the principles of item response theory and computer-adaptive testing, the Timez Attack program continuously monitors a student’s performance so that instructional time is focused on unmastered math facts (Shapiro et al. 2015). However, to foster conceptual understanding, practice for more complex math facts includes training on simpler but related facts. For example, a student presented with the problem 9 × 8 will build toward the solution by first solving 9 × 2, 9 × 4, and so on until the student can repeatedly solve 9 × 8 without errors. In this way, students learn to conceptualize and memorize more complex math facts through an interactive modeling and practice process. Students receive regular feedback throughout the entirety of the Timez Attack program: correct responses are followed by a positive reaction from the main character and progress through the level, while an incorrect response halts progress and draws a flat reaction from the main character. This regular feedback encourages students to give their best effort throughout the game. For errors, the program provides immediate corrective feedback, followed by multiple opportunities to practice and ultimately master the missed problems. By incorporating these recommended practices (Riccomini et al. 2017), Timez Attack may be a particularly effective CAI program for improving math fact fluency in elementary-age students. Indeed, the results of the current study suggest that the Timez Attack program, as currently designed, is an effective tool for math facts education.

The results of this study generate multiple insights regarding the use and implementation of Timez Attack as a CAI option for training multiplication fact fluency. First, because all three study groups demonstrated significant improvement in fluency, we infer that playing Timez Attack for as little as 5 weeks, or approximately 180 min, may be sufficient to realize some benefit from the program. However, increased use of the program may be associated with greater improvement in multiplication fact fluency, since the study groups with the greatest usage demonstrated the greatest increases in performance. Second, the Timez Attack program may improve multiplication fact fluency regardless of prior ability in, or knowledge of, multiplication facts. While the students recruited for the study had already received classroom instruction and practice for multiplication facts, all three study groups achieved significant improvements in math fact fluency after using the program. Further, a post hoc exploration of student performance revealed that students at all levels of prior math proficiency improved in multiplication fact fluency following use of the program. In fact, students who were more proficient in mathematics before the study onset seemed to benefit the most. Therefore, students at all levels of experience and prior proficiency could potentially benefit from using Timez Attack to improve multiplication fact fluency.

Several factors require consideration in interpreting the findings of this study. First, nearly all students were missing scores for at least one assessment. Though we did not observe patterns in the missing data and we applied mean imputation to account for it, some uncertainty in true performance inevitably remains. In a similar vein, we also observed greater-than-expected between-assessment variability in scores. The greatest variability occurred during the baseline phase, in which students performed better than expected on at least two of the assessments. The random generation of each assessment likely led to differences in assessment difficulty, which would directly relate to variations in performance. Using a standardized assessment could have addressed this issue; however, to our knowledge, no standardized assessment currently exists that assesses multiplication fact fluency and has enough forms for a multiple baseline across groups design. We also observed a minor ceiling effect for some students, who answered all 30 questions correctly on multiple consecutive assessments. Therefore, had each assessment included more problems, study group score averages might have been larger than reported here. Also, despite random assignment of students to study groups, the study sample was heterogeneous with respect to prior math proficiency level. We generated a supplementary table to display average performance by prior proficiency level (Supplementary Table 1), but the small sample size and study design prohibited an in-depth analysis of how prior math proficiency may have related to the study outcome. Finally, during the first 2 weeks of the study, some usability and technical issues occurred that may have delayed or interrupted learning opportunities for some students; however, we resolved all technical issues by the third week of the study.

In conclusion, this study adds to the accumulating evidence that CAI programs such as Timez Attack by Imagine Math Facts may be viable, supplementary options for math facts education. However, as more CAI options become available, additional research is necessary to demonstrate the effectiveness of each option. The results of this study stand as empirical evidence of the effectiveness of Timez Attack in developing multiplication fact fluency in third-grade students. We recommend additional research to determine the effectiveness of other Imagine Math Facts games in developing math fact fluency for other operations such as addition and subtraction. Additionally, future research could explore the effectiveness of Timez Attack or other Imagine Math Facts games in other populations, such as students in different grades or students with learning disabilities.