Introduction

Response to intervention (RTI) has been advocated as a practical method to prevent academic deficits, identify learning disabilities, and provide early intervention (Robinson et al., 2013). Further, RTI involves consistent evaluation of a student’s progress within evidence-based interventions with varying individualization and intensity (Daly et al., 2007). There is sizable evidence to suggest RTI is effective in facilitating academic growth (e.g., Ardoin et al., 2005; Barnett et al., 2004; Codding et al., 2009; Fuchs et al., 2007), but a key factor in an effective RTI model is fidelity of implementation of the intervention (McKenna et al., 2014).

Rural schools can face many challenges and barriers to implementing RTI with fidelity. Barriers to RTI implementation can include a lack of resources, geographical isolation, and inability to afford training programs for teachers and staff (Barrio & Hollingshead, 2017). In addition, rural schools may have difficulty recruiting and retaining highly qualified teachers, which could result in staff who are unfamiliar with current evidence-based practices in education (Robinson et al., 2013). This lack of access to adequate resources can hinder the fidelity of RTI implementation due to a lack of training in evidence-based practices and interventions (Oram et al., 2016). In a survey regarding the needs of rural schools, teachers and administrators indicated that they lack “the resources, expertise, and support to meet the needs of struggling students” (Farmer et al., 2018, p. 166). Further, school staff members in rural schools often have multiple responsibilities beyond traditional teaching and administration duties, making the implementation of evidence-based interventions more challenging (Gross & Jochim, 2015).

No Child Left Behind legislation, and later the Every Student Succeeds Act, emphasized accountability for schools to improve student outcomes and recommended that schools use empirically based supports (Stoiber & Kratochwill, 2000). In order to assist rural schools in the provision of empirically supported academic interventions to their students, it is crucial to understand how rural schools can use their resources efficiently to ensure academic success (National Rural Education Association, 2016). The purpose of the present study was to examine the degree to which high school students can be taught to implement academic intervention in the area of math facts fluency.

Math Facts Fluency

According to the National Assessment of Educational Progress (NAEP), only 34% of students in the fourth grade achieved proficient status in mathematics (NAEP, 2019). While many factors contribute to deficits in math, deficits in computational fluency significantly impact a student’s overall performance. Chong and Siegel (2008) reported that many of the math difficulties present in the USA involve deficits in accuracy or computational fluency, both of which are recognized as fundamental mathematical skills based on research examining their relation to the hierarchy of math performance (Daly et al., 1996; Farmer et al., 2018; Nelson et al., 2016). Students with fluency deficits often lack the ability to meet the time limits of assignments even after they attain mastery criteria in accurate responding (Bliss et al., 2010; Skinner, 2002). Students who possess good math fluency may more easily develop the subsequent skills to solve complex math problems (Nelson et al., 2016). Additionally, when students acquire fluency in math fact computation, they appear to be more likely to maintain that knowledge (e.g., Schutte et al., 2015), and some research also suggests that fluency can lead to greater generalization to new math concepts (Binder, 1996; Brady & Kubina, 2010; Johnson & Street, 2013; Kubina & Wolfe, 2005). Mastery levels of computation fluency are often associated with better skill retention, better problem-solving ability, higher scores across math assessments, and stronger daily living skills (Duhon et al., 2015; Geary et al., 2012; Shapiro, 2011).

Conversely, a lack of fluency in basic computations can lead to mathematics difficulties that can persist across the lifespan (Nelson et al., 2016). Students in the early elementary grades with deficits in fact fluency appear to have difficulties later in their education (Geary, 2004), so providing academic support to address these concerns is a crucial component of student success (Fuchs et al., 2002; Nelson et al., 2016). Additionally, the literature has demonstrated the importance of repeated practice in the development of fluency (e.g., Burns et al., 2012; Codding et al., 2007, 2009; Daly et al., 2007). Numerous interventions have been developed to increase fluency, and many have demonstrated effectiveness. Of these fluency-based interventions, explicit timing (ET; Van Houten & Thompson, 1976) is one that is commonly used across the literature.

Explicit Timing

One reason explicit timing is widely recognized as an effective fluency-based intervention is its ease of implementation. In the ET procedure, the interventionist presents a student with a set of single-skill math facts, and the student is encouraged to complete as many problems as possible within the allotted time frame (Poncy et al., 2010). ET can be administered through one-on-one interventions or with large groups of students in a classroom setting (Duhon et al., 2012). The procedures in ET allow repeated practice of math facts and have been shown to be effective for increasing rates of academic responding (Duhon et al., 2012; Gross & Duhon, 2013; Powell et al., 2021).

Interventions in Rural Schools

Under ideal circumstances, interventions are provided by content area specialists. However, for many rural school districts, the reality is that teachers are generally expected to provide interventions to their students throughout the school day. While this can be an efficient and resource-saving approach, it can also be a hurdle. Rural schools typically have fewer staff, many of whom must fill multiple roles to support the sustainability of the school as a whole (Barrio & Hollingshead, 2017). Considering this, it can be extremely difficult for teachers in rural schools to spare additional time to conduct academic interventions for their students.

Introducing consultants to assist with implementation of academic interventions at rural schools is another way to address this problem. Through directed consultation, schools can adapt evidence-based practices to their needs (Farmer et al., 2018). The goal of this approach is to identify the needs of the schools as well as the strengths and perspectives of stakeholders (i.e., teachers, parents, students, and administrators), and then use data to ensure evidence-based practices are adapted for the specific school. Directed consultation is typically conducted by an interventionist trained in fields related to education (e.g., school psychology). While consultants provide valuable services, their costs can be prohibitive for rural schools. Some rural districts across large geographic areas pool resources with others to share consultant availability, which allows more access to consultants, but makes daily direct intervention challenging (Barrio & Hollingshead, 2017). Likewise, rural school psychologists are often responsible for the provision of services across multiple buildings, forcing practitioners to prioritize some services (i.e., psychoeducational evaluation, special education eligibility determination, IEP-required services) over other intervention services. While some research has shown paraprofessionals to be a promising option for the delivery of intervention services (Musti-Rao & Cartledge, 2007), these personnel are frequently needed to complete other work duties that are time consuming and can be difficult to predict.

Despite limited staff availability, lack of funding, insufficient facilities, lack of technology, and high staff turnover, rural schools must determine how they can provide necessary services to their students. Even with fewer resources, rural schools are expected to provide high-quality, beneficial support to their students to ensure equal opportunity for success. With these considerations, a rarely considered option is the training of high school students in the district to implement interventions in elementary school settings. Though the concept of mentoring relationships between older and younger students is not new (Besnoy & McDaniel, 2016) and there is evidence for the effectiveness of peer-assisted learning (Fuchs et al., 1995), sparse literature exists for the application of these relationships to aid in intervention implementation by older students for younger students. Therefore, the purpose of this study was to examine the effectiveness and feasibility of high school students as implementers of academic interventions to remediate fact fluency deficits in elementary students. The research questions were the following:

  1. Is there a significant difference in elementary students’ pre/post math facts fluency CBM after receiving interventions from high school students?

  2. Is there a significant difference in elementary students’ pre/post math facts fluency CBM after receiving interventions from graduate students?

  3. Is there a significant difference between the outcomes of elementary students who received intervention from high school students and graduate students?

Methods

Participants

English-speaking children in second to fifth grades attending a rural elementary school in the south-central USA were identified as candidates for the study based upon their fall 2018 school-wide math screening data. Students chosen for the study all fell below the school-determined benchmark of the 25th percentile on the beginning-of-year screener, Acadience Math (Acadience Learning, 2022). In total, 30 elementary students took part in the study. Five high school students in the 11th grade who were completing an internship class in the same school district were identified as the high school interventionists and ran the intervention with half of the elementary students, while an additional five graduate students in school psychology ran interventions with the other half. All five high school students were White girls. The graduate student interventionists were doctoral (80%) and specialist (20%) students in school psychology. All five graduate students were women. One of the graduate students was Black, and four were White.

Due to the study taking place in 2018, data regarding demographics of the sample are no longer available. However, school-wide demographics indicate that at the time of the study, total enrollment of the school building included 129 students (44.2% girls), with 57.4% Native American, 40.3% White, and 2.3% students of Hispanic origin. Of the total enrollment, 77.5% were identified as economically disadvantaged, and 17.1% of the total enrollment were students with disabilities.

Materials

Baseline data, intervention data, and posttest data were collected using math probes targeting basic math facts ranging from addition with sums to 10 to division facts from 81. All probes consisted of an array of randomly generated single-skill math problems. All problems were presented vertically in Arial font, size 14. Appendix A depicts the scope and sequence of skills used in this study.
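As an illustration of what a randomly generated single-skill probe might look like, the sketch below generates addition facts with sums to 10 (the first skill in the sequence). The function name and parameters are ours; the study does not describe its probe-construction tooling.

```python
import random

def addition_probe(n_problems=30, max_sum=10, seed=None):
    """Hypothetical generator for a single-skill probe:
    addition facts with sums up to max_sum."""
    rng = random.Random(seed)
    probe = []
    for _ in range(n_problems):
        s = rng.randint(2, max_sum)   # target sum for this fact
        a = rng.randint(0, s)         # first addend
        probe.append((a, s - a))      # fact presented as a + b
    return probe
```

Each tuple would then be rendered vertically on the printed probe, as described above.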

Dependent Variable

The dependent variable was digits correct per minute (DCPM) for all participants. DCPM was calculated by summing the number of digits each student answered correctly by the end of their 1-min timing. In scoring DCPM, a digit is correct if the student places the correct number in its corresponding column. For example, a student would be scored as having two digits correct if they responded to “5 + 11” with “16.” They would be scored as having one digit correct if they responded with “11” or “26” because only one column contains a correct digit. In order to assess overall student growth, we summed digits correct per minute across all four addition and subtraction skills for both the pre- and post-tests. While doing so reduced the interpretability of the composite scores within individual skills, this approach allowed us to compare relative student growth at the pre- and post-tests across interventionists using a mixed factorial ANOVA model.
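The column-wise scoring rule can be expressed compactly. The sketch below is our own illustration (not the study’s scoring materials): it aligns the student’s response and the correct answer from the ones column and counts matching digits.

```python
def digits_correct(response: str, answer: str) -> int:
    """Count digits written in the correct column, aligning both
    numbers from the rightmost (ones) column."""
    return sum(1 for rd, ad in zip(reversed(response.strip()),
                                   reversed(answer.strip()))
               if rd == ad)

def dcpm(digit_counts, minutes=1.0):
    """Digits correct per minute for one timed probe."""
    return sum(digit_counts) / minutes
```

For “5 + 11,” `digits_correct("16", "16")` returns 2, while `digits_correct("11", "16")` and `digits_correct("26", "16")` each return 1, consistent with the scoring example.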

Independent Variables

There were two independent variables in this study, each with two levels. The first was a within-subjects variable measured twice, once at pre-test and again at post-test. This independent variable represented the effect of the intervention, regardless of the interventionist. The second independent variable was a between-subjects variable that represented whether a student received intervention from a graduate student or a high school student.

Procedure

This study received approval from the institutional review board of the first author’s university. Informed consent, including consent to use de-identified and aggregated data, was obtained from the guardians of all 30 elementary students and all five high school students. Additionally, the first author conducted all training sessions with the high school students using the intervention protocol in Appendix B. A research team composed of school psychology graduate students administered both pre- and post-CBM assessments to all students and delivered the intervention to the other half of the elementary students. All intervention occurred across a 16-school-day period.

Upon being identified through the school-wide academic screening process, elementary students were randomly assigned to work with either a high school student or a graduate student. During the pre-test, each elementary student was individually administered the pre-test curriculum-based measurement assessment. Following the pre-test, intervention probes were selected based on mastery criteria described in Poncy and Duhon’s (2017) Facts on Fire intervention protocol. During intervention, no elementary student dropped below 90% accuracy. Additionally, the primary investigator worked with the high school students in the morning during their internship class for 3 consecutive days to train them on the procedures and protocols. Intervention began immediately following completion of intervention administration training and elementary student pre-tests. All interventions were administered individually in a small classroom. Students were pulled individually from their classrooms, and instructions were read during each intervention session. Once the student was seated, the interventionist would read the protocol and provide the probes to the elementary student. Each child received 3 min of explicit timing per day. The graduate student or high school student would then record the elementary student’s data on a sheet to track progress and growth over time. The elementary student would then be sent back to class, and another student would arrive. The process continued until all students completed their interventions. The intervention was constructed in two phases. The first phase followed intervention procedures encouraging the student to work as quickly as they could to complete the designated math probe. Phase two focused on goal setting and providing reinforcement for reaching the goal.

Fidelity of Implementation

To assess fidelity of implementation of study procedures by the high school students, research assistants trained on the study procedures used fidelity checklists, which are provided in Appendix B. Each day, depending upon research assistant availability, one to three high school students were randomly selected to have fidelity assessed. The high school student would be observed during all intervention sessions that day. Research assistants would then provide feedback to improve delivery methods following the intervention session. If observed fidelity of implementation by any high school student fell below 90%, additional training was provided by the second author.
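The 90% decision rule can be sketched as a simple checklist score. The names below are hypothetical; the actual checklist items appear in Appendix B.

```python
def fidelity_percent(steps_observed):
    """Percent of protocol steps completed correctly in one
    observed session (steps_observed is a list of booleans)."""
    return 100.0 * sum(steps_observed) / len(steps_observed)

def needs_retraining(session_percents, threshold=90.0):
    """Flag an interventionist for additional training if any
    observed session falls below the fidelity threshold."""
    return any(p < threshold for p in session_percents)
```

For example, a session with 9 of 10 steps completed scores exactly 90% and would not trigger retraining under a strict below-90% rule.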

Results

Procedural Fidelity

Fidelity data for intervention implementation by the high school students were collected. Three of the five high school students had implementation fidelity ranging from 91 to 100%, and the other two ranged from 82 to 100%. The two students who dropped below 90% received immediate feedback from the graduate students and were re-trained before they worked with another student. These data suggest that once high school students are trained in the intervention procedure, they can maintain high rates of fidelity over time.

Analysis

Descriptive statistics are available in Table 1. To analyze these data, we conducted a 2 × 2 mixed factorial analysis of variance. The assumption of homogeneity of variance was assessed via Levene’s test and was met, F(1, 28) = 0.525, p = 0.475. A post hoc power analysis conducted using G*Power 3.1.9.7 indicated acceptable statistical power (1 − β = 0.92). We found a medium (η² = 0.092) main effect of the within-subjects variable, F(1, 28) = 72.584, p < 0.001, but no main effect for the between-subjects variable, F(1, 28) = 0.828, p = 0.371. There was no interaction effect, F(1, 28) = 0.004, p = 0.952. Mean differences are graphed in Fig. 1.
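Because both factors have only two levels, each effect in a 2 × 2 mixed ANOVA reduces to a t test with F = t². The sketch below uses synthetic, illustrative data (not the study’s) to show how the three F values and Levene’s test could be computed with SciPy.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 15  # hypothetical group size
# Synthetic DCPM composites for illustration only
pre_hs = rng.normal(40, 8, n); post_hs = pre_hs + rng.normal(10, 3, n)
pre_gs = rng.normal(45, 8, n); post_gs = pre_gs + rng.normal(10, 3, n)

# Homogeneity of variance across groups on pre-test scores
lev_W, lev_p = stats.levene(pre_hs, pre_gs)

# Within-subjects effect (pre vs. post), collapsing across groups
t_within, p_within = stats.ttest_rel(
    np.concatenate([pre_hs, pre_gs]), np.concatenate([post_hs, post_gs]))
# Between-subjects effect (interventionist), on each student's mean score
t_between, p_between = stats.ttest_ind(
    (pre_hs + post_hs) / 2, (pre_gs + post_gs) / 2)
# Interaction effect, on gain scores between groups
t_inter, p_inter = stats.ttest_ind(post_hs - pre_hs, post_gs - pre_gs)

F_within, F_between, F_inter = t_within**2, t_between**2, t_inter**2
```

With two-level factors, these three t tests recover the same F values a dedicated mixed-ANOVA routine would report.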

Table 1 Descriptive statistics across pre-/post-test and interventionist education
Fig. 1

Mean differences across pre-/post-test and interventionist education

Discussion

The results of this study further support an expansive literature base indicative of ET’s effectiveness in increasing math facts fluency (e.g., Duhon et al., 2012; Hawkins et al., 2009; Powell et al., 2021; Van Houten & Thompson, 1976). Improving students’ math facts fluency generally improves their access to more complicated math problems (Nelson et al., 2016), enables them to meet assignment time limits (Bliss et al., 2010; Skinner, 2002), supports generalization to new math concepts (Binder, 1996; Brady & Kubina, 2010; Johnson & Street, 2013; Kubina & Wolfe, 2005), and helps them retain the accuracy of their responses (Schutte et al., 2015). Additionally, strong fluency skills can serve as a protective factor against a number of downstream academic difficulties (Geary, 2004).

We found no significant differences in elementary student DCPM growth across graduate and high school student interventionists, despite a disparity in starting position between the two groups. This preliminary finding suggests that highly trained academic interventionists (e.g., teachers, school psychologists, intervention specialists) may be able to reach more students in need by training high school students to administer interventions and monitoring procedural fidelity for a subset of intervention sessions. This finding may be especially encouraging for practitioners who work in rural schools with low numbers of staff who must fill multiple roles, sometimes across buildings (Barrio & Hollingshead, 2017).

While the mentorship literature is sparse for examples of high school students implementing interventions with elementary students, we believe that these findings can add to the mentorship literature in two ways. First, we add to the evidence that peer-assisted intervention procedures can be effective (Fuchs et al., 1995). Second, while we did not purposefully follow the procedures described by Besnoy and McDaniel (2016), we implemented a similar strategy and procedural timeline (i.e., plan development, recruitment of mentors and mentees, mentor training, and evaluation). Though the purpose of the mentoring relationships forged by the present study was not to improve overall school functioning across academic and social behaviors, we believe our findings to be generally supportive of the model proposed by the authors.

Limitations and Directions for Future Research

Additional research is needed to address several limitations of this exploratory study. Though these findings are encouraging, our relatively small sample size may limit the degree to which these effects generalize to other groups of students. Future studies should replicate our methods with a larger sample size and across other intervention modalities (e.g., flashcard drill, cover/copy/compare) and subjects (e.g., reading, writing), ideally with a waitlist control, in order to support or refute these preliminary findings. In addition, future studies could apply this methodology for the collection of local norm data across reading and mathematics CBM. Further work should consider whether these findings generalize to 6th through 8th grade students administering intervention, as there may be more availability of middle school students in the case of a K-8 building. Additionally, we recommend that this work be replicated with adult volunteers from the community. In future work, we recommend that researchers collect social validity data from student interventionists, elementary students, and elementary teachers to determine the general acceptability of this work.

High school students who delivered intervention in this study were nominated by their school counselor and were participating in an internship class. Participation in the study and delivery of the intervention were voluntary; therefore, attendance was occasionally a problem. Elementary students received interventions from the high school group for 8–14 (M = 11) days out of the 16 total days the study took place. Some of the missed days were due to regularly scheduled field trips, school programming, and student absences. The findings of the current study support existing literature (Powell et al., 2021; Schutte et al., 2015), indicating the near-immediate effectiveness of relatively low doses of explicit timing on math facts fluency, as the children received intervention for between 24 and 42 (M = 33) minutes over approximately 3 school weeks.

In future studies and practice, the benefits to the high school students for participation should be carefully considered. In the case of the current study, high school students received service learning credit for their participation, which may not be an immediately available option for all rural districts. Though this is not possible in all settings, we recommend that the arrangement be beneficial for all involved: while participation may be inherently beneficial under specific circumstances (as in the case of a student who wishes to pursue a teaching career), high school students should receive some tangible benefit (e.g., course credit) for participation. A consideration when replicating these methods is to monitor the student interventionist for both fidelity and appropriate behavior. Though no inappropriate behaviors were noted during the course of this study, we recommend interventions take place in group settings and under adult supervision for safety.

Conclusion

Previous studies have examined the effectiveness of explicit timing interventions by teachers (Powell et al., 2021) and paraprofessionals (Musti-Rao & Cartledge, 2007), but we are not aware of any studies examining the impact of explicit timing administered by high school students. Though more research is warranted, explicit timing administered with high procedural fidelity by high school students produced effects comparable to those of highly trained graduate students in school psychology in our sample of elementary students. In practice, we believe these findings may be applied to partially alleviate the resource needs of rural schools.