Research on how children learn to read and why some young learners have difficulty developing as proficient readers has informed the development of evidence-based instructional approaches and interventions to promote reading proficiency. This vast, interdisciplinary body of knowledge is known as the science of reading (The Reading League, 2022). The science of reading emphasizes the importance of explicit, systematic, and sequential instruction targeting the five essential components of reading: phonemic awareness, phonics, fluency, vocabulary, and comprehension (National Reading Panel, 2000; Spear-Swerling, 2018).

Effective instruction based on the science of reading is beneficial for the approximately 40% of students for whom learning to read with general instruction develops fairly easily, but essential for the approximately 60% of students for whom learning to read is more difficult (Foorman et al., 2016; National Reading Panel, 2000). The majority of these emergent readers do not have a learning disability, but simply require explicit, systematic, and sequential instruction to become proficient readers. The term “instructional casualties” has been used to refer to struggling readers who did not receive adequate, scientifically based reading instruction and consequently risked being misidentified as students with learning disabilities (Lyon et al., 2001). The impact on the lives of these young students is devastating. Students who do not develop reading proficiency are more likely to be retained in grade, drop out of high school, and enter the juvenile justice system (Fien et al., 2021; Reynolds et al., 2002).

Decades of dismal reading outcomes indicate that the national educational landscape is strewn with instructional casualties. Only 33% of fourth-grade students performed at or above the proficient level in reading in 2022 (National Assessment of Educational Progress [NAEP], 2022). This reading crisis is particularly dire among students who are educationally marginalized, including students from diverse racial and ethnic groups, economically disadvantaged students, English Language Learners, and students with disabilities. Only 19% of fourth-grade students eligible for the National School Lunch Program scored at or above the proficient level in reading in 2022 (NAEP, 2022). Likewise, only 17% of Black fourth-grade students and 21% of Hispanic fourth-grade students performed at or above the proficient level in reading (NAEP, 2022).

Evidence-Based Interventions within a Multi-Tiered System of Supports Framework

An MTSS framework is a systems-level approach to proactively providing supports to all students based on academic or behavioral need. The core features of an MTSS framework are: (1) universal screening; (2) data-based decision making and problem solving; (3) continuous progress monitoring; (4) a continuum of evidence-based practices across tiers (core instruction with universal support, targeted intervention, and intensive intervention); and (5) a focus on fidelity of implementation (McIntosh & Goodman, 2016).

An MTSS framework aims to meet the needs of all students by considering the influence of ecological variables such as instruction, curriculum, and classroom environment on student outcomes. Through the prevention-focused early intervention approach, with an emphasis on effective core instruction and instructional supports through data-based decision making, MTSS facilitates meeting the needs of marginalized students (Albritton et al., 2016). Research indicates that an MTSS framework, when implemented with fidelity, holds promise for addressing several key outcomes including (1) increasing student achievement (VanDerHeyden et al., 2007); (2) detecting and intervening early with students at risk for academic problems (Al Otaiba & Torgeson, 2007); (3) improving the means by which students are determined to be eligible for special education services (Barrett, 2023); (4) reducing disproportional representation of minoritized students receiving special education (VanDerHeyden & Witt, 2005); and (5) meeting the needs of struggling readers cost-effectively (Morrison et al., 2020).

MTSS Reading Implementation Fidelity Challenges

The core components of MTSS need to be implemented fully, in an integrated and consistent manner, for the positive outcomes to be attained (Keller-Margulis, 2012; Noell & Gansle, 2006; Sanetti & Luh, 2019). Yet implementation fidelity can be difficult to achieve because it requires educators to fundamentally change how they support learners: not only by advancing their knowledge and skills in scientifically based reading approaches (Kilpatrick, 2015), but also by developing their competencies in the timely use of data to inform instructional decision making (Daly et al., 2007; Kratochwill et al., 2007), and by adopting an ecological view of learning that seeks to “eliminate contextual variables as a viable explanation for academic failure” (Vaughn & Fuchs, 2003, p. 142).

Many challenges to MTSS implementation have been identified in the research literature (see Table 1). The failure to implement evidence-based reading interventions with fidelity is the most frequently recognized limitation (Sanetti & Luh, 2019). Reading intervention researchers contend that the evidence-based interventions that produce significant word-reading outcomes (Torgesen et al., 2001; Vellutino et al., 1996) risk being diluted or dropped within an MTSS framework (Vellutino et al., 2008). Furthermore, when MTSS implementation efforts fail to address weaknesses in core instruction, a high proportion of students may meet the criteria for more intensive intervention, resulting in more school resources being dedicated to intensive intervention when those resources may have been more effectively used to improve core instruction (Fuchs & Fuchs, 2017).

Table 1 Challenges to MTSS Implementation Fidelity

Increasing MTSS Implementation Fidelity with Training and Coaching

Building teachers’ capacity to provide effective instruction and intervention based on the science of reading within an MTSS framework involves a fundamental shift in how teachers provide reading instruction (Spear-Swerling, 2018) and gather, monitor, and respond to reading assessment data (Kratochwill et al., 2007). Training alone is insufficient. Research on teacher professional learning has shown that newly learned practices are crude compared to the performance by a master practitioner, fragile in the face of reactions from others, and incomplete when translated to the school setting (Joyce & Showers, 2002). As a result, training coupled with ongoing and embedded coaching is essential for promoting competent usage of evidence-based instructional practices (Joyce & Showers, 2002). Training and coaching function as critical drivers of implementation (Blase et al., 2013; Fixsen et al., 2013), but more research is needed on models for training and coaching to promote sustained use of evidence-based reading instruction and intervention within an MTSS framework.

Purpose of the Study

Evidence-based instruction and interventions emerging from the science of reading are key to addressing the needs of struggling readers and closing the opportunity gap for educationally marginalized learners (Fletcher et al., 2004). Yet, weak reading intervention fidelity and inconsistent fidelity of implementation of the core components of MTSS have resulted in the proliferation of initiatives that feature only the surface manifestations of an MTSS framework (e.g., sorting students into tiers based on universal screening data) grafted on top of traditional practices and routines not aligned with the science of reading (Hall, 2018; Kilpatrick, 2015; Sabnis et al., 2020). The purpose of this study was to evaluate the effectiveness of an MTSS initiative designed to build the capacity of teachers to provide effective instruction and intervention based on the science of reading. In particular, process evaluation examined the implementation of a novel problem-solving approach for analyzing literacy practices across an MTSS framework (i.e., core instruction and strategic intervention). The Promoting Achievement in Reading Through Needs-driven Evidence-based Reading Structures (PARTNERS) Project aimed to provide comprehensive professional learning (i.e., training and coaching) in a problem-solving approach to examining the tiers of literacy instruction to improve early literacy outcomes for students in kindergarten through second grade. This process evaluation examined teacher team outcomes during the project’s third year of implementation, which focused on strengthening the core reading curriculum and instruction (Tier 1) and strategic intervention (Tier 2). The study addressed the following process evaluation question: To what extent did the PARTNERS Project increase teacher teams’ capacity to analyze and improve the core curriculum and instruction and intervention supports?

Method

The methods used to evaluate the effectiveness of the PARTNERS Project were reviewed by the Institutional Review Board. Data from the LAP-G presented in Figures 1, 2, and 3 are not publicly available in order to protect teacher participants’ privacy. The data are available from the corresponding author upon reasonable request. The PARTNERS Project and its evaluation were funded as a Model Demonstration Project by the U.S. Department of Education, Office of Special Education Programs.

Fig. 1 Results of the Annual Administration of the LAP-G Tier 1 and Tier 2 at Loweland School: K-2 Teacher Team

Fig. 2 Results of the Annual Administration of the LAP-G Tier 1 and Tier 2 at St. Mark School: K-1 Teacher Team

Fig. 3 Results of the Annual Administration of the LAP-G Tier 1 and Tier 2 at St. Mark School: Grades 2-3 Teacher Team

Participants and Setting

Loweland School (a pseudonym) is an elementary school in a public school district classified as Urban-High Student Poverty and Average Student Population according to the state department of education. Loweland School had an enrollment of 278 students in kindergarten through Grade 6, of which 100% were classified as economically disadvantaged. In the year prior to the PARTNERS Project, only 21.2% of the third-grade students scored at or above the proficient level on the state-mandated achievement test.

St. Mark School (a pseudonym) is a non-public, Catholic school serving an urban, predominantly Hispanic community. St. Mark School had an enrollment of 215 students in preschool through Grade 8, of which 94% were economically disadvantaged. State-mandated achievement test data were not available for non-public schools. However, Acadience screening data collected at the onset of the study indicated that only 25% of the third-grade students were at benchmark for reading at the middle-of-the-year checkpoint. Demographic information for each school’s student population is provided in Table 2.

Table 2 Demographic Data for the Student Population at Each School

Loweland School formed a K-2 teacher team that consisted of four teachers, an intervention teacher, and the principal. St. Mark School formed two teams: a K-1 teacher team with eight members, and a Grades 2-3 teacher team with six members. At St. Mark School, the principal, English Language Learner (ELL) teacher, reading intervention teacher, and Title I teacher all served on both the K-1 team and the Grades 2-3 team. Table 3 provides information regarding the gender, race/ethnicity, years of teaching experience at the start of the PARTNERS Project, and any relevant training each participant had received prior to the start of the PARTNERS Project. Given the opportunity to participate in the PARTNERS Project, all of the participants expressed a willingness to be involved. The teachers were encouraged, but not required, to participate in the project.

Table 3 Demographic Data for the Teacher Team Participants

The PARTNERS Project

The PARTNERS Project provided comprehensive professional learning (i.e., training and coaching) to teams of teachers in a problem-solving, data-driven process whereby they evaluated their own instructional program and identified areas of needed improvement to align with the science of reading. The problem-solving approach was operationalized in the Literacy Analysis and Planning Guide (LAP-G). A description of the LAP-G and the professional learning supports are provided in this section.

Literacy Analysis and Planning Guide Tool

The Literacy Analysis and Planning Guide (LAP-G) provided for an analysis of evidence-based literacy practices across each tier of an MTSS framework (i.e., core instruction, strategic intervention, and intensive intervention). The core components of evidence-based literacy instruction analyzed by the LAP-G process at Tier 1 were (1) Screening; (2) Instructional Materials by Essential Component—Core and Supplemental; (3) Implementation of Tier 1 Instruction; and (4) Differentiated Instruction. The reliability, validity, and comprehensiveness of the school’s screening system were operationalized by 11 items on the LAP-G (see Table 4). The second core component pertained to the instructional materials (the core curriculum and supplemental materials) used to address each of the essential components of reading: phonological awareness, phonics, reading fluency, vocabulary, comprehension, and writing (see Table 5). Implementation of Tier 1 instruction was the third core component. Six items on the LAP-G operationalized how implementation was assessed through multiple measures (e.g., permanent product review, direct observation) and multiple data sources (see Table 6). The fourth core component was differentiated instruction. Seven items on the LAP-G operationalized differentiated instruction (see Table 7).

Table 4 Core Component 1: Screening
Table 5 Core Component 2: Instructional Materials by Essential Component—Core and Supplemental
Table 6 Core Component 3: Implementation of Tier 1 Instruction
Table 7 Core Component 4: Differentiated Instruction
Table 8 PARTNERS Project’s Training and Coaching Supports

The core components of evidence-based literacy intervention analyzed by the LAP-G process at Tier 2 were (1) Assessments—Intervention-based Diagnostics and Progress monitoring; (2) Instructional Materials for Each Intervention Program; and (3) General Considerations: Effective Intervention Design, Professional Development, Implementation Checks. The core components of evidence-based literacy intervention at Tier 3 were (1) Assessment; (2) Designing Tier 3 Supports—Collaborative Problem Solving, Intervention Components; (3) Effective Implementation of Tier 3 Interventions—Effective Implementation, Appropriate Placement in Tier 3, Professional Learning for Tier 3. A copy of the LAP-G with the core components of evidence-based literacy interventions at Tier 2 and 3 is available upon request to the corresponding author.

The LAP-G engaged a team of educators in a problem-solving process whereby the effectiveness of each tier of instruction/intervention was evaluated. The tool used a five-step collaborative problem-solving process: (1) Define and Analyze Needs: Collect Initial Information; (2) Define and Analyze Needs: Summarize, Analyze, and Prioritize; (3) Plan Support; (4) Implement Plan; and (5) Evaluate. Teams began by documenting and reviewing student data to identify areas of concern with the support of their PARTNERS Project consultant, who served as a facilitator. Priority areas of opportunity were then identified, and action plans were developed to improve upon current practices (plan development and implementation). The LAP-G process was completed annually in April/May to evaluate progress and determine next steps.

Professional Learning: Training and Coaching

An overview of the scope of the professional learning provided through the PARTNERS Project is presented in Table 8. At Loweland Elementary, the need for changes in the core curriculum and instructional practices was identified in the first year of PARTNERS Project implementation, with its focus on Tier 1 instruction. Teachers participated in training in Language Essentials for Teachers of Reading and Spelling (LETRS) to build foundational understanding of the science of reading. LETRS has been shown by research to be an effective professional development program for reading teachers (Garet et al., 2008). Kindergarten and first-grade teachers at Loweland identified a need to focus on supplemental phonics instruction in Tier 1. A review of curricula aligned with the science of reading and matched to the instructional need resulted in the selection of Superkids. The Superkids Reading Program meets the criteria established by the federal Every Student Succeeds Act for programs to be called evidence-based (https://media.zaner-bloser.com/reading/superkids-reading-program/pdfs/R1727_SK_EvidenceforESSA.pdf). Teachers at Loweland were trained in effective literacy instruction and high-fidelity implementation of Superkids at the kindergarten, first-, and second-grade levels. Training also included a two-day Summer Institute focused on Superkids and data-based decision making prior to implementation. In response to teacher requests during a monthly teacher team PARTNERS meeting, the Superkids consultant was brought back for a day of modeling lessons and program-specific coaching the following spring.

At St. Mark School, strengthening the core instructional program and the use of differentiated instruction were identified as priorities at the kindergarten and first-grade levels. Teachers participated in training in Language Essentials for Teachers of Reading and Spelling (LETRS) to build foundational understanding of the science of reading. Teachers at St. Mark School identified a need to focus on supplemental phonics instruction in Tier 1. Superkids had been adopted as the curriculum prior to the start of the PARTNERS Project, but there was a recognized need for additional professional learning. St. Mark School teachers participated alongside Loweland Elementary teachers in the two-day Summer Institute focused on Superkids and data-based decision making. Kindergarten and first-grade teachers at St. Mark School also participated in ongoing training in the valid and reliable use of Acadience measures beginning in their first year of engagement with the PARTNERS Project. In Year 2, the LAP-G process showed gaps in the phonics instruction with Superkids. The teacher teams reviewed alternative curricula and selected the 95% Group Phonics Lesson Library. In addition to training in this new curriculum, the teams at St. Mark School focused on writing skills in Years 2-3 and reading comprehension in Year 3. Given that the student population was predominantly Hispanic, the PARTNERS Project professional learning included a year-long book study on supporting the early literacy skills of English Language Learners.

At both schools, PARTNERS Project consultants observed teacher instruction and assessed implementation fidelity using a fidelity checklist co-designed with the teachers. Observations were conducted weekly in Years 2 and 3, such that each teacher was observed at least once a month. The collaborative coaching model included individualized performance feedback shared in person or via email following each observation.

Measure and Analysis

The LAP-G tool was developed for use in the PARTNERS Project based on a prototype developed by the first author and colleagues. The development of the LAP-G was informed by an extensive review of the science of reading research and the literature on MTSS. Preliminary content validation was established through expert review. A panel of nine individuals with expertise in the science of reading was asked to review the LAP-G overall and by tier on three dimensions: quality, relevance, and usefulness (QRU). Definitions for QRU were based on the U.S. Department of Education, Office of Special Education Programs’ Government Performance and Results Act (GPRA) measures (Moore & Lammert, 2019). Quality was defined as the degree to which the tool is grounded in current research or policy. On a 10-point scale where 10 represented the highest level of quality, the mean ratings were high for the LAP-G overall (M = 8.8, SD = 1.28), Tier 1 (M = 9.0, SD = 1.41), Tier 2 (M = 9.2, SD = 1.30), and Tier 3 (M = 9.1, SD = 1.10). Relevance was defined as the degree to which the tool addresses current educational problems or issues. On a 10-point scale where 10 represented the highest level of relevance, the mean ratings were high for the LAP-G overall (M = 9.1, SD = 1.36), Tier 1 (M = 9.3, SD = 1.66), Tier 2 (M = 9.2, SD = 1.64), and Tier 3 (M = 9.7, SD = 0.70). Usefulness was defined as the degree to which the tool could be readily and successfully used by practitioners (i.e., ease of use, suitability). On a 10-point scale where 10 represented the highest level of usefulness, the mean ratings were high for the LAP-G overall (M = 9.1, SD = 1.36), Tier 1 (M = 8.9, SD = 1.62), Tier 2 (M = 8.9, SD = 1.62), and Tier 3 (M = 9.1, SD = 1.10). Teacher team members were also asked to complete a survey to assess their perceptions of the quality, relevance, and usefulness of the LAP-G and to solicit qualitative data regarding the acceptability of the tool.

In using the LAP-G, teacher teams completed a problem-solving process led by their PARTNERS Project consultant, who had a primary role in the development of the tool and served as a coach in this project. The problem-solving process involved an examination of multiple sources of information, including Acadience screening data, assessment schedules, and decision rules for the screening section; sample lesson plans for the Tier 1 section; weekly schedules to examine time allotted for instruction and classroom observation data for the classroom environment section; and sample intervention programs for the Tier 2 section. After reviewing the student screening data and permanent products, each item within the core components was scored on a 3-point scale, where 3 represented “Strong evidence/No need to problem solve,” 2 represented “Mixed or inconsistent evidence/Possible area for problem solving,” and 1 represented “No evidence/An area in need of problem solving.” Many of the LAP-G items included a checklist of essential elements that were required to be evident for a score of 3. The number of possible points for each component was based on the number of items, not the number of essential elements. In the Tier 1 section, the number of possible points for each core component was: Screening (33 points), Instructional Materials by Essential Component—Core and Supplemental (18 points), Implementation of Tier 1 Instruction (18 points), and Differentiated Instruction (21 points). The number of possible points for the Tier 2 section by core component was as follows: Assessments—Intervention-based Diagnostics and Progress Monitoring (30 points), Instructional Materials for Each Intervention Program (12 points), and General Considerations: Effective Intervention Design, Professional Development, Implementation Checks (64 points). Only the Tier 1 and Tier 2 sections had been completed by the teacher teams during the period reported on in this study.

Design and Procedures

The evaluation of the PARTNERS Project used descriptive research methods. The LAP-G was completed annually in April/May during the first three years of the project. The results were calculated as the percentage of possible points earned for each of the core components in the Tier 1 and Tier 2 sections.
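The scoring arithmetic can be sketched as follows. Note that the item ratings shown here are hypothetical examples for illustration only, not data from the study; only the 3-point rating scale and the possible-point totals are taken from the LAP-G description above.

```python
# Illustrative sketch of the LAP-G scoring computation.
# Possible-point totals for the Tier 1 core components are from the
# LAP-G description; the item ratings below are hypothetical.

TIER1_POSSIBLE_POINTS = {
    "Screening": 33,
    "Instructional Materials by Essential Component": 18,
    "Implementation of Tier 1 Instruction": 18,
    "Differentiated Instruction": 21,
}

def component_percentage(item_scores, possible_points):
    """Percentage of possible points earned for one core component.

    item_scores: per-item ratings on the 3-point scale
    (1 = no evidence, 2 = mixed/inconsistent evidence, 3 = strong evidence).
    """
    earned = sum(item_scores)
    return round(100 * earned / possible_points, 1)

# Hypothetical Screening component: 11 items rated on the 3-point scale
# (11 items x 3 points = 33 possible points).
screening_items = [3, 3, 2, 3, 2, 2, 3, 3, 3, 2, 1]
print(component_percentage(screening_items,
                           TIER1_POSSIBLE_POINTS["Screening"]))  # -> 81.8
```

Because the denominator counts items rather than essential elements, an item with a multi-element checklist still contributes at most 3 points, consistent with the scoring rules described above.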

Results

The results of this process evaluation indicate that the PARTNERS Project increased teachers’ capacity to implement key MTSS practices pertaining to reading. Over the course of the first three years of the PARTNERS Project, implementation gains were made at Tier 1 and Tier 2, as measured by the LAP-G.

At Loweland School, Tier 1: Screening remained stable from Year 1 to Year 2 at 81.8% and increased to 87.9% in Year 3. Instructional Materials by Essential Component increased from 63.9% in Year 1 to 83.3% in Years 2 and 3. Implementation of Tier 1 Instruction was more variable, with a decrease from 94.4% in Year 1 to 77.8% in Year 2 before increasing to 88.9% in Year 3. This dip is attributed to greater teacher understanding of the components of the science of reading, leading to more accurate rating of practices in Year 2 relative to Year 1, as well as a possible implementation dip while learning the newly adopted phonics program. Differentiated Instruction saw a steady increase from 42.9% in Year 1 to 61.9% in Year 2 and 69.0% in Year 3.

The Tier 2 section was completed in Years 2 and 3 of the PARTNERS Project at Loweland. Tier 2 Assessments increased from 70.4% in Year 2 to 92.6% in Year 3. Instructional Materials for Each Intervention Program remained stable at 83.3% both years. General Considerations/Implementation increased from 59.5% to 72.6%. Loweland School’s results on the annual administration of the LAP-G are presented graphically in Figure 1.

The K-1 teacher team at St. Mark School demonstrated marked gains in Tier 1. Tier 1 Screening increased sharply from 39.4% in Year 1 to 100% in Years 2 and 3. Instructional Materials by Essential Component increased from 33.3% in Year 1 to 86.1% in Year 2 to 100% in Year 3. Implementation of Tier 1 Instruction increased sharply from 44.4% in Year 1 to 100% in Years 2 and 3. Differentiated Instruction increased from 61.9% in Year 1 to 100% in Years 2 and 3.

The K-1 teacher team at St. Mark School also demonstrated marked gains in Tier 2 evidence-based literacy practices. Tier 2 Assessments increased sharply from 33.3% in Year 1 to 100% in Years 2 and 3. Instructional Materials for Each Intervention Program increased from 33.3% in Year 1 to 91.7% in Year 2 to 100% in Year 3. General Considerations/Implementation increased sharply from 21.9% in Year 1 to 96.9% in Year 2 to 100% in Year 3. Through this process, the K-1 team focused on increasing coordination among the interventionists to deliver a seamless continuum of supports. St. Mark School’s K-1 teacher team results are presented graphically in Figure 2.

The Grades 2-3 teacher team at St. Mark School demonstrated similarly positive gains on the annual administration of the LAP-G at Tier 1. Tier 1 Screening increased sharply from 39.4% in Year 1 to 100% in Years 2 and 3. Instructional Materials by Essential Component increased from 33.3% in Year 1 to 72.2% in Years 2 and 3. Implementation of Tier 1 Instruction increased from 50.0% in Year 1 to 72.2% in Year 2 to 83.3% in Year 3. Differentiated Instruction increased sharply from 52.4% in Year 1 to 100% in Years 2 and 3.

The Grades 2-3 teacher team at St. Mark School also demonstrated marked gains in Tier 2 evidence-based literacy practices. Tier 2 Assessments increased sharply from 33.3% in Year 1 to 100% in Years 2 and 3. Instructional Materials for Each Intervention Program increased from 33.3% in Year 1 to 91.7% in Year 2 to 100% in Year 3. General Considerations/Implementation increased sharply from 21.9% in Year 1 to 96.9% in Year 2 to 100% in Year 3. St. Mark School’s Grades 2-3 teacher team results are presented graphically in Figure 3.

Through this process at St. Mark School, a screening system was installed; the core instructional program at K-1 was determined to be focused on skills that were too advanced given the students’ actual instructional levels; LETRS training was initiated to increase teachers’ knowledge and skills in the science of reading; and both teams focused on small-group differentiated instruction. Mid-year changes were made in Year 1 to the supplemental instructional program at K-1 when the Acadience benchmark data indicated that a stronger focus on phonics was needed.

Qualitative data regarding teachers’ perceptions of the acceptability of the PARTNERS Project and the LAP-G process were gathered from 100% of the teachers on St. Mark School’s K-1 and Grades 2-3 teams at the end of their first year of implementation. Teachers’ comments provide additional evidence of the contribution of the PARTNERS Project to increasing teachers’ capacity to implement key MTSS practices pertaining to reading:

  • I have high hopes that the LAP-G will do everything that is listed above. I am excited about the progress that has been made this year with the Superkids curriculum and am looking forward to more progress next year!

  • It provided a systematic way to look at strengths and needs.

  • Still learning about the LAP-G, but I think it's a wonderful system and learning tool to help promote early literacy inside the classroom and provide the teachers with extra support to see areas of needs.

The teachers’ comments provided at the end of the first year of implementation also highlight the teachers’ recognition that they were still developing their skill fluency and required the scaffolded support of their PARTNERS Project consultant and the sustained investment of their school’s administration.

  • I believe the LAP-G does a great job at pin-pointing areas of need, which makes it easier to plan ahead and supplement those areas for improvement. However, without the guidance of [PARTNERS Project consultant], I'm not sure how successful I'd be filling it out on my own.

  • I hope the administration will be able to work with the PARTNERS work to provide instructional materials & training needed to continue the positive direction PARTNERS & our teachers have taken this year!

Discussion

One only needs to consider the decades of substandard reading outcomes to recognize that we have a national reading crisis. Given that research shows upwards of 95% of all students are capable of becoming proficient readers (Foorman et al., 2003; Simos et al., 2002; Torgesen, 2007), low rates of reading proficiency are indicative of an inadequate core curriculum, instruction, and tiered supports to target student needs. The travesty of instructional casualties is particularly devastating among educationally marginalized students. Whereas children from affluent families can pursue private tutoring to compensate for weaknesses in early literacy instruction, families with fewer financial resources often find their children falling farther behind due to unmet instructional needs.

Advances in the science of reading have produced a robust evidence base for effective curriculum and instruction. MTSS provides a framework within which literacy instruction aligned with the science of reading can be provided to address the needs of students proactively. Fidelity of implementation is essential to realizing the potential of evidence-based instruction delivered within an MTSS framework. However, schools often lack the capacity to implement research-based practices with fidelity. The PARTNERS Project was designed to build the capacity of teachers to strengthen the core reading curriculum, instruction, and tiered intervention supports for students in kindergarten through second grade.

As presented in Table 8, the PARTNERS Project’s training and coaching supports varied from school to school to meet the unique needs, opportunities, and challenges at each school. A highly engaged and effective principal at St. Mark School committed time and energy to drive PARTNERS Project implementation. As a result, the K-1 teacher team was able to strengthen phonics instruction at Tier 1 and Tier 2 in the first year of implementation, which enabled them to focus on writing instruction, comprehension, and Tier 3 intervention in the second year of implementation. In contrast, the lack of continuity of effective, invested leadership at Loweland Elementary created challenges in fostering teacher engagement in the PARTNERS Project. These varied experiences highlight the importance of leadership as a driver of meaningful systems change.

The results of this study provide evidence that a problem-solving process focused on evaluating the core curriculum, instruction, and intervention supports based on the science of reading and delivered within an MTSS framework can increase the capacity of teacher teams to implement evidence-based literacy practices. Evidence of the effectiveness of the LAP-G problem-solving process was demonstrated at two elementary schools serving educationally marginalized students. Anecdotal evidence based on the observations of the PARTNERS consultants indicates that teachers gained the knowledge and skills to analyze and improve their reading instruction. Classroom observations showed teachers using the targeted instructional practices: replacing word walls with sound walls, directing students to sound out unfamiliar words instead of guessing, and prompting students to retrieve previously learned skills and knowledge, such as why a given word (e.g., white) has a long i sound (recalling the “silent e” or “magic e” rule: e makes the vowel say its name). Teachers also demonstrated increased knowledge and intentionality in the questions they asked, such as why a certain word appeared in the curriculum before its sounds had been taught. Finally, the principal of St. Mark School commented that teachers now quickly provide the needed instructional support as soon as a learner shows signs of struggling, whereas before the PARTNERS Project, special education referral and grade retention had seemed like the only ways of responding to students who were falling behind.

The findings from this process evaluation are consistent with advances in implementation science, which emphasizes training and coaching as crucial for supporting high-fidelity practices, in sharp contrast to the flawed “train and hope” (Stokes & Baer, 1977) approach to professional learning. This study extends the research literature by describing the problem-solving process (focused on evaluating weaknesses in the core curriculum and instruction) and the professional learning supports needed to build teachers’ capacity to strengthen Tier 1 instruction, improve student outcomes, and prevent instructional casualties.

Limitations of the Study

Several limitations need to be considered when interpreting the results of this study. First, this evaluation employed descriptive research methods to show changes in teachers’ capacity to provide scientifically based reading instruction and intervention based on only one measure, the LAP-G, used with teachers at two schools. Additional research is needed to triangulate these findings and validate the impact of the PARTNERS Project on teachers’ literacy instructional practices and the impact on student reading outcomes across a larger sample of schools.

A second limitation of the evaluative study was the lack of a research-validated instrument for conducting classroom observations of teachers’ instructional practices. PARTNERS Project consultants observed teacher instruction and assessed implementation fidelity using a checklist developed collaboratively with the teachers, based on the targeted instructional practices identified through the LAP-G problem-solving process and addressed in the coaching cycle. In addition, interobserver agreement was not measured, so the reliability of the observation data collection could not be determined. The lack of evidence regarding the reliability and validity of the classroom observations is a limitation of this study.

As a third limitation, in the 3rd year of implementation the PARTNERS Project focused only on strengthening the core reading curriculum and instruction (Tier 1) and targeted intervention (Tier 2). In the 4th year of the project, the focus will extend to intensive intervention (Tier 3). Thus, the outcomes reported represent changes that focused on Tiers 1 and 2 only, an incomplete application of the PARTNERS Project.

A final limitation of the study was the threat to validity posed by history, namely the COVID-19 pandemic. The first 2 full years of PARTNERS Project implementation (2020–2021 and 2021–2022) coincided with significant disruptions to teaching and learning created by the public health crisis. The demands of physical distancing, health insecurity, financial hardship, and dramatically reduced access for families to school-based instruction, specialized instruction and behavioral health services, and social supports (e.g., school lunch, after-school care) created unprecedented challenges for school communities (Schaffer et al., 2021). The positive outcomes attained in 2021–2022 through the PARTNERS Project’s focus on Tier 1 instruction are all the more noteworthy for having been achieved during the latter part of the pandemic. Future research should examine the effectiveness of the PARTNERS Project, or a similar initiative, in improving reading instruction under postpandemic conditions.

Implications of the Study

The PARTNERS Project does not assert that any single curriculum or instructional program, once selected, will meet the needs of all learners. Just as a functional assessment is needed to develop a hypothesis regarding how and why an individual student is struggling to read (Daly et al., 2005, 2006), an analysis of the instructional environment and resulting learning outcomes is needed to determine the specific evidence-based literacy practices to be targeted for improvement. In essence, the LAP-G problem-solving process serves as an autopsy, examining the fatal flaws in the core curriculum, instruction, and tiered system of interventions that result in mass instructional casualties.

The PARTNERS Project engages teams of teachers in a problem-solving, data-driven process whereby they evaluate their own instructional program and identify areas of needed improvement to align with the science of reading. With considerable training and coaching, teacher teams can be equipped to shift their efforts to correcting deficits in instruction (rather than exclusively focusing on remediating academic skill deficits in students). The results of this study have significant implications for school districts and state departments of education urgently seeking to align the core curriculum and instruction with the science of reading and prevent instructional casualties among our most educationally marginalized students.