Introduction

Children’s holistic well-being is at the forefront of conversations and policy development in the New Zealand education and political landscape. ‘Learning and developing’ is one of the core principles of the world-leading Child Wellbeing Strategy, and includes the aspiration that children are “positively engaged with, and, progressing and achieving in education” (Department of the Prime Minister and Cabinet, 2019). However, data on children’s educational achievement in New Zealand, particularly in literacy, suggest there are significant challenges to achieving this aspiration; a shift in practice needs to occur to meet the learning needs of all learners, including Māori and Pasifika children, children from low socioeconomic areas, children with disabilities and boys (PIRLS, Ministry of Education, 2017; McNaughton, 2020).

A recent study attempted to address the needs of all learners through an integrated classroom-based literacy intervention, delivered to a cohort of children from low socioeconomic communities who had low levels of oral language and phonological awareness development at school entry (both significant predictors of literacy outcomes; Gillon, 2017). The Better Start Literacy Approach (BSLA) is a 10-week, Tier 1 classroom literacy approach that integrates activities focused on letter-sound knowledge, phonological awareness and vocabulary development within the context of classroom-based literacy instruction (Gillon et al., 2019). Results indicated this intervention was effective at accelerating children’s development (including Māori and Pasifika children, and boys) in these key literacy skills. The context for this study was one of high need, and the approach was at a pilot phase of its development, so a high level of professional development and researcher support was required (on average, 12 h per teacher over the 10-week intervention). This support included professional development workshops, online learning support, and in-class modelling and coaching of the intervention lessons. In addition, all assessment of children in the research cohort was completed by the research team, time not captured in the 12 h per teacher reported by the study. Further exploration is required to determine whether similar levels of change can occur in communities with lower levels of need and less intensive support from researchers. To establish fundamental, integrated change in classroom practice, a sustainable model of teacher-led assessment and intervention implementation warrants attention.

Linguistically Diverse Learners

Ethnic (and therefore linguistic) diversity in New Zealand schools is ever increasing (ero.govt.nz, 2018). This diversity adds immeasurably to the richness of a school; however, it also presents a challenge to teachers attempting to implement effective teaching methods in mainstream English classrooms for children with a wide range of English language skills. In particular, teaching English literacy to children with limited English language is a specialised skill more appropriately aligned with the expertise of trained English as a Second or Other Language (ESOL) teachers, but one that often falls to classroom teachers with less specialist knowledge in this area. Further, the importance of maintaining connection to one’s home culture, and of continuing to strengthen children’s first languages, must not be overlooked in the focus on attaining English literacy.

The New Zealand Ministry of Education (MoE) presents seven principles to support academic and language progress in all curriculum areas for linguistically diverse students (Ministry of Education, n.d.). Table 1 provides an overview of the seven principles, prompt questions for each, and how the BSLA integrates these principles into its implementation.

Table 1 Principles for supporting ESOL learners as they relate to the BSLA

Existing research into the BSLA demonstrates an equally positive impact for Māori and Pasifika learners (Gillon et al., 2019), but the impact of this classroom-based literacy intervention for ESOL learners has not yet been examined in detail. Given the clear alignment of the approach with the principles for teaching these learners, such an exploration is warranted.

The Current Study

The current study considers how teacher-led implementation of an evidence-based approach to classroom literacy teaching may be facilitated for a range of different learners, with a particular focus on those who are linguistically diverse. The intervention (Gillon et al., 2019) has been shown to be effective at accelerating children’s early literacy skills when implementation was strongly driven by the research team. The current study aims to further explore the impact of this evidence-based approach, including a focus on learners from diverse linguistic backgrounds.

The following research questions will be investigated:

Is a teacher-led implementation of an evidence-based approach to early literacy instruction more effective than the standard classroom practice experienced by a control group?

What is the impact of the intervention for learners with diverse linguistic backgrounds?

How effective is a teacher-led implementation of the approach compared to a previous, researcher-led implementation of the same intervention?

Method

Participants

Two schools in Christchurch, New Zealand, self-selected to participate in the study, following the success of the earlier controlled intervention study (Gillon et al., 2019). Across the two schools, a total of 93 children across 8 classrooms participated in the study (School A, n = 32; School B, n = 61). The children were all in their first year of school (mean age: 64.1 months, SD = 3.4 months; 40 males, 53 females). Of these 93 students, one was described as having high learning needs. The ethnicity of this cohort was New Zealand European (43%), Māori (8%), Pasifika (2%), Asian (26%) and Other (19%). There were 19 children (i.e., 20%) who had English as a second language.

A non-matched control group was employed, to establish the impact of the intervention compared to normal classroom practice. The control group consisted of 73 children in their first year of school (mean age: 64.3 months, SD: 3.0 months; 37 males, 36 females), who were randomly allocated to receive the intervention following a term of monitoring the impact of normal classroom practice on their development of early literacy skills. Of these 73 students, 42% were NZ European, 27% were Māori, 12% were Pasifika, 9% were of Other ethnicity, and ethnicity information was not recorded for 10% of the cohort. There were 11 students (i.e., 15%) who had English as a second language.

In New Zealand, an accepted way of characterising socioeconomic profiles is the New Zealand Index of Deprivation (NZDep; Atkinson et al., 2019). This measure is based on nine census variables and assigns levels of deprivation to small geographic areas, expressed as deciles. A decile of 1 represents areas with the least deprivation, whereas a decile of 10 represents areas with the most deprivation. The NZDep index of one school in the research cohort was 1 and the other was 6, indicating a spread of deprivation levels between the two schools. The control cohort was drawn from a lower socioeconomic community, with students attending schools ranked between 8 and 10 on the NZDep index.

Procedure

A pre-test post-test research design with non-matched control group was used to evaluate the effectiveness of the intervention for children’s early literacy and oral language development.

Assessment Tasks

The following assessment tasks were used to measure progress in children’s early literacy and oral language development:

1. Phonological awareness and letter-sound knowledge were evaluated using the Computer Based Phonological Awareness Assessment Tool (CBPAT; Carson et al., 2011). This tool has previously been used to evaluate phonological awareness development in children aged 4–6 years. The CBPAT has good psychometric properties, with test–retest reliability coefficients of 0.70 or above for all subtests (Carson et al., 2011). Children completed three subtests suitable for evaluating phonological awareness in this age group, plus a letter-sound knowledge subtest. Raw scores were collected for each subtest, and the three phonological awareness subtests were amalgamated into a combined phonological awareness score. The subtests were as follows:

   a. Initial phoneme identity: Children were asked to identify which of the three images starts with the target sound, e.g., “Dog likes words that start with the /d/ sound. Let’s see what words Dog likes. Moon, duck, whale”. The total possible score was 10.

   b. Phoneme segmentation: Children were asked to identify the number of sounds in the target word, e.g., “How many sounds do you hear in the word ‘flush’?” The total possible score was 18, with a discontinuation rule applied after four consecutive errors.

   c. Phoneme blending: Children were asked to blend sounds together to form a word and match it with the corresponding image, e.g., “This is a cake, a cape, and a ring; which word am I saying: c-ae-k?” The total possible score was 15, with a discontinuation rule applied after four consecutive errors.

   d. Letter-sound knowledge: Children were asked to identify which letter out of six options matched the target sound, e.g., “Which letter makes the /m/ sound?” The total possible score was 18, with a discontinuation rule applied after four consecutive errors.

2. To evaluate non-word reading skills, children were asked to read 10 non-words (e.g., vab, sim, dup) (Calder, 1992). The total numbers of graphemes (out of 30) and words (out of 10) read correctly were recorded.

3. To explore children’s response to the vocabulary teaching within the intervention, expressive vocabulary probes were utilised. The probes consisted of 20 ‘tier 2’ words taken from the story books utilised within the intervention. Ten of the words were systematically elaborated during the shared reading (i.e., teachers provided a definition of the word and reinforced the use of the word in the context of the book). The remaining ten items were unelaborated (i.e., teachers read the words during the shared book reading, but did not provide any additional elaboration of the words). The probe asked children to “tell me what [item] means”, with further prompting provided by “tell me anything else about [item]”. Responses for the probe were recorded verbatim and later scored by a trained research assistant using the original protocol developed by Justice et al. (2005). A score of 2 was awarded for complete knowledge of the word, a score of 1 was awarded for incomplete knowledge of the word, and a score of 0 was awarded for no knowledge. The total possible score for this task was 40.

4. Children’s oral language was evaluated via a story retell task (Westerveld & Gillon, 2010). Children first listened to an audio recording of a story (Alice and the Suitcase), accompanied by related illustrations. Following the presentation of the story, children were prompted to retell the story in their own words, using the pictures as prompts. Retells were then transcribed by research assistants, coded and evaluated using the Systematic Analysis of Language Transcripts (SALT) software (Miller et al., 2012) for a number of language quality and quantity measures. The measures included were as follows (a minimal computation sketch follows this list):

   a. Number of utterances: the total number of utterances produced in the story retell task.

   b. Number of different words: the total number of different/unique words produced.

   c. Number of total words: the total number of words produced.

   d. Mean length of utterance—words: the average length of a child’s utterance, in words.
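
As a hedged illustration of how these four measures relate to a transcript, the sketch below computes them from a list of utterance strings using simple whitespace tokenisation. It is an approximation only: the SALT software applies detailed transcription and segmentation conventions (e.g., maze exclusion, bound morphemes) that are not modelled here, and the example retell is hypothetical.

```python
from typing import Dict, List

def oral_language_measures(utterances: List[str]) -> Dict[str, float]:
    """Compute number of utterances, number of different words, number of
    total words, and mean length of utterance in words (MLU-words).

    Illustrative approximation only; not a re-implementation of SALT.
    """
    tokens = [u.lower().split() for u in utterances]
    n_utterances = len(tokens)
    total_words = sum(len(t) for t in tokens)
    different_words = len({word for t in tokens for word in t})
    return {
        "number_of_utterances": n_utterances,
        "number_of_different_words": different_words,
        "number_of_total_words": total_words,
        "mlu_words": total_words / n_utterances if n_utterances else 0.0,
    }

# Hypothetical two-utterance retell, for illustration only:
print(oral_language_measures(["Alice packed her suitcase",
                              "then she went on the plane"]))
```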

Professional Learning for Teachers and Intervention Content

The researcher support for this study comprised two professional development workshops, one focused on assessment and one focused on intervention. In addition to the professional development workshops, modelling and/or observation of a teaching session was offered within the first 4 weeks of the programme. Following 6 h of professional development, teachers implemented both the assessments and the intervention, with less than 1.5 h of support each over the 10-week intervention period.

Assessment workshop: The lead researcher conducted a 3-h assessment workshop with all teachers and support staff (e.g., Teacher Aides/Learning Assistants) at each school participating in the research. The workshop started with an introduction to the evidence-based approach to literacy instruction, which focuses on systematic phonological awareness instruction and vocabulary elaboration during shared reading of high-quality children’s books. The remaining workshop time (2.5 h) focused on the four assessment tasks each school would complete with participating children.

During the assessment workshop, teachers and support staff were introduced to each assessment task, provided with examples, and given the opportunity to practice the tasks and familiarise themselves with the required resources. A comprehensive assessment manual was also provided, to supplement and reinforce the workshop content. Following the assessment workshop, schools carried out pre-intervention assessment of all participating children. The approach to assessment varied between the two schools. One school provided release time to teachers to complete assessment of children; the other school used Learning Assistants to complete assessment. All tasks requiring manual scoring (e.g., expressive vocabulary and non-word reading) were scored by the research team.

For children in the control group, all assessment was completed by trained Speech-Language Therapists employed by the research team.

Intervention workshop: The week prior to beginning the classroom-based intervention, all participating teachers attended a 3-h professional development workshop focused on the teaching component of the study—the Better Start Literacy Approach (BSLA; Gillon et al., 2019). The BSLA is a 10-week classroom-based intervention targeting phonological awareness, letter-sound knowledge and vocabulary development within a child’s first year at school. The intervention consisted of four 30-min sessions per week for 10 weeks and replaced the classroom’s typical literacy programme for this time. The structure of a lesson following the BSLA is as follows:

1. Shared reading of a high-quality children’s book, using systematic vocabulary elaboration (approximately 6–8 min in duration). The book was read in totality twice a week and summarised twice a week. Elaboration of target vocabulary was integrated into each reading or summary, with a minimum exposure to each target word of four times per week. Elaboration consisted of a child-friendly definition of the target word, followed by use of the word in the context of the story. For example, ‘shelter is something that protects you. As in, the kiwi were safe in their shelter from the sun’.

2. Skill-building phonological awareness activities focused at the phoneme level, with integrated letter-sound knowledge teaching. These activities consisted of three short games targeting initial phoneme identity, phoneme segmentation and blending, and phoneme manipulation skills (approximately 10–15 min in duration).

3. The final activity in the lesson focused on linking phonological awareness knowledge to a reading and/or spelling activity, for example reading or writing short sentences (approximately 5–8 min in duration).

During the intervention workshop, teachers were introduced to the lesson structure, activities and resources, and given the opportunity to practice activities with their peers. Teachers were also shown demonstration videos and the online self-directed learning modules which complemented the workshop content. A comprehensive intervention manual was provided, outlining each lesson and reinforcing the learning from the workshop.

To support teacher learning and the integration of the systematic phonological awareness and vocabulary instruction into the classroom literacy programme, lesson planning support was scaffolded throughout the 10 weeks of teaching. Initially, teachers were provided with detailed lesson plans, books and activity resources (for weeks 1–8). For weeks 2–8, teachers were required to plan the fourth lesson of each week themselves, following the structure of the lesson and the target words and phonemes for each week. During this self-planned lesson, teachers could choose to repeat preferred activities from the week or develop their own activities that integrated the week’s targets. The final 2 weeks of teaching were planned entirely by teachers.

Within the first 4 weeks of BSLA teaching, all teachers were offered the opportunity to observe modelling of a lesson within their own classroom by the lead researcher and/or receive feedback following observation of their own teaching of a lesson. Of the eight teachers involved, 50% took up the offer of observing a modelled lesson.

Results

To complete baseline comparisons between research and control groups, one-way analyses of variance (ANOVAs) were run for each of the 10 dependent variables. Table 2 presents these results, showing significant differences between the research and control groups at baseline on some measures of early literacy and language skills. The research group scored significantly higher on phonological awareness, letter-sound knowledge, both non-word reading variables (graphemes and words), and elaborated vocabulary. These differences were accounted for in subsequent analyses. There were no differences between the groups at baseline on unelaborated vocabulary or any of the oral language measures.
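
For readers wishing to reproduce this style of baseline check, a minimal sketch of a one-way ANOVA for a single dependent variable is shown below. The data, column names (`group`, `pa_pre`) and the pandas/SciPy toolchain are illustrative assumptions, not the study’s actual analysis scripts.

```python
import pandas as pd
from scipy import stats

# Hypothetical baseline data: one row per child, with group membership and a
# pre-test phonological awareness score (values and column names illustrative).
baseline = pd.DataFrame({
    "group":  ["research"] * 4 + ["control"] * 4,
    "pa_pre": [18, 22, 15, 20, 12, 14, 10, 16],
})

research = baseline.loc[baseline["group"] == "research", "pa_pre"]
control = baseline.loc[baseline["group"] == "control", "pa_pre"]

# One-way ANOVA comparing the two groups at baseline; with two groups this is
# equivalent to an independent-samples t-test, but mirrors the reported analysis.
f_stat, p_value = stats.f_oneway(research, control)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```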

Table 2 Baseline comparisons between research and control groups

Effectiveness of Evidence-Based Early Literacy Instruction

The first research question aimed to determine whether an evidence-based approach to early literacy instruction is more effective than standard classroom literacy practice. To do this, repeated measures general linear models (GLMs) were used to compare pre- and post-test scores between the research and control groups. We also examined whether the pattern of results differed for children who were linguistically diverse compared to children who were not linguistically diverse.

Using a repeated measures design provided the ability to test whether the rate of change differed between the two groups (i.e., whether one group showed greater growth than the other), so it is not of particular concern that the groups started out at different levels at pre-test (Tabachnick & Fidell, 2001). Further, because the research and control groups attended schools from different socioeconomic communities, socioeconomic deprivation scores were included as a covariate in all models.

Repeated measures models were run for all 10 dependent variables: phonological awareness, letter-sound knowledge, non-word reading (words correct and graphemes correct), vocabulary (elaborated and unelaborated), and oral language (total number of utterances, number of different words, number of total words, and mean length of utterance). All models included the within-subjects factor of time (pre-test, post-test), the between-subjects factors of group (research, control) and linguistic diversity (linguistically diverse, not linguistically diverse), and socioeconomic deprivation as a covariate. The first results presented focus on understanding the differences in growth over time between the research and control groups.
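
A hedged sketch of one way such a model can be specified is given below. It approximates the repeated measures GLM with a linear mixed model fitted to long-format data (random intercept per child), which is not necessarily the exact procedure used in this study; the simulated data, column names and statsmodels toolchain are assumptions for illustration only.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Simulated long-format data: 60 hypothetical children, two time points each,
# with group, linguistic diversity and NZDep columns (all names illustrative).
n_children = 60
children = pd.DataFrame({
    "child_id": np.arange(n_children),
    "group": rng.choice(["research", "control"], size=n_children),
    "ling_div": rng.choice(["yes", "no"], size=n_children, p=[0.2, 0.8]),
    "nzdep": rng.integers(1, 11, size=n_children),
})
long_df = children.loc[children.index.repeat(2)].reset_index(drop=True)
long_df["time"] = ["pre", "post"] * n_children

# Fake scores with more post-test growth for the research group.
growth = np.where(long_df["time"] == "post",
                  np.where(long_df["group"] == "research", 12, 4), 0)
long_df["score"] = 15 + growth + rng.normal(0, 3, size=len(long_df))

# Time x Group x LinguisticDiversity factorial with NZDep as a covariate;
# the random intercept per child accounts for the repeated measurements.
model = smf.mixedlm("score ~ time * group * ling_div + nzdep",
                    data=long_df, groups=long_df["child_id"])
result = model.fit()
print(result.summary())  # the time:group row corresponds to the Time*Group interaction
```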

Table 3 provides the tests of Time*Group interaction terms from each of the repeated measures analyses. A significant interaction indicates that the pattern of growth between pre-test and post-test differed significantly between the research and control groups.

Table 3 Results for Time*Group interactions

The results in Table 3 indicate a significantly different pattern of growth for the research and control groups on a number of measures: phonological awareness, non-word reading (both graphemes and words), vocabulary (both elaborated and unelaborated), and the total number of utterances and total number of words in the oral language task. The two groups did not show significantly different growth in letter-sound knowledge or the other three oral language variables (number of different words, mean length of utterance, and intelligibility).

Inspection of the estimated marginal means indicated that all significant interactions resulted from significantly greater growth by the research group than the control group (see Table 4).

Table 4 Mean change in scores from pre-test to post-test for research and control groups

Figures 1 and 2 show the growth in phonological awareness and non-word reading (graphemes), respectively, for the research and control groups. The steeper slopes for the research group as compared to the control group indicate accelerated growth in children who received the intervention.

Fig. 1 Growth in phonological awareness for research group and control group

Fig. 2 Growth in non-word reading (graphemes) for research group and control group

Impact of Teaching on Linguistically Diverse Learners

The second research question explored whether response to the evidence-based classroom literacy instruction would differ for children who were linguistically diverse learners when compared to their non-linguistically diverse peers. Within the research group, 18 children were identified as linguistically diverse (19%) compared to 11 children in the control group (15%).

First, the main effect of linguistic diversity in the repeated measures models described above was examined. Table 5 provides the estimated marginal means as well as the test of the main effect.

Table 5 Results for main effect of linguistic diversity

Table 5 shows that linguistically diverse children differed on a number of measures as compared to the other children in the sample. Specifically, they scored significantly lower on measures of vocabulary (both elaborated and unelaborated) and all measures of oral language except for intelligibility.

To understand how linguistically diverse children responded to the teacher-led intervention, as compared to other children, we examined the Time*Group*LinguisticallyDiverse interaction term in our repeated measures models. This effect would be significant if the pattern of growth in the research and control groups differed based on whether or not the child was linguistically diverse. Tests of the interaction effect are presented in Table 6.

Table 6 Results for the Time*Group*LinguisticallyDiverse interactions

As shown in Table 6, there were no significant differences in the growth patterns for the research and control groups based on whether or not children were linguistically diverse. This means that the intervention was equally effective for children who were linguistically diverse as for other children in the sample. Figure 3 demonstrates the growth pattern for elaborated vocabulary by group and by linguistic diversity. It shows that although linguistically diverse children start out with lower scores, they show similar rates of growth (slopes) to non-linguistically diverse children in both the research and control groups, with children in the research group showing accelerated growth.

Fig. 3 Growth in elaborated vocabulary by research group and linguistic diversity

Effectiveness of Teacher-Led Implementation—Comparison to Gillon et al. (2019)

The third research question compared the effectiveness of the current implementation model, which involved reduced researcher support, with a previous implementation of the same intervention that included more substantial researcher support. To answer this question, effect sizes were compared between the current sample and a published report of the previous implementation (Gillon et al., 2019). Gillon et al. report Cohen’s d values comparing their research and control groups at three separate time points; the highest effect size reported for each measure was selected for comparison. Due to the pre-existing baseline differences in the present sample, the partial eta-squared (ηp²) for the Time*Group interaction terms was used and converted to Cohen’s d (Cohen, 1988). This ensured the effect size values for the present sample were not inflated by pre-existing differences between the research and control groups.
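
Assuming the standard two-group conversion, ηp² is first transformed to Cohen’s f and then to d via f = √(ηp² / (1 − ηp²)) and d = 2f (Cohen, 1988). A minimal sketch of this conversion, using an illustrative input value rather than a statistic from Table 7, is given below.

```python
import math

def partial_eta_sq_to_cohens_d(eta_p_sq: float) -> float:
    """Convert partial eta-squared to Cohen's d via Cohen's f (Cohen, 1988):
    f = sqrt(eta_p^2 / (1 - eta_p^2)) and, for two groups, d = 2 * f."""
    f = math.sqrt(eta_p_sq / (1.0 - eta_p_sq))
    return 2.0 * f

# Illustrative value only, not a figure from Table 7:
print(round(partial_eta_sq_to_cohens_d(0.06), 2))  # ~0.51, conventionally a "medium" effect
```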

A comparison could then be made against the Cohen’s d effects reported by Gillon et al. (2019) for phonological awareness, non-word reading (graphemes only), and elaborated and unelaborated vocabulary. The oral language measures were not included in the Gillon et al. publication and are therefore not included in this comparison. Effect sizes from both samples are presented in Table 7.

Table 7 Comparison of effect sizes between implementation approaches

The findings show effect sizes were comparable across the two samples, with slightly higher effect sizes in the current sample for phonological awareness, non-word reading (graphemes), and unelaborated vocabulary.

Gillon et al. (2019) also report the mean changes over time for phonological awareness, letter-sound knowledge, elaborated vocabulary, and unelaborated vocabulary. Table 8 compares the Time 1 to Time 2 changes from the Gillon et al. report with the mean changes in the current sample.

Table 8 Comparison of mean changes between implementation approaches

There was a similar pattern of change between the current research sample and the Gillon et al. sample, although slightly more growth was seen in this study’s research sample for measures other than letter-sound knowledge.

Discussion

This study investigated the impact of an evidence-based, class-level literacy intervention for beginner readers on the development of key foundational learning skills. It proposed three research questions, which are addressed in turn below.

Children’s Response to The Classroom Literacy Intervention

Analysis using repeated measures general linear models showed the Better Start Literacy Approach was effective at improving children’s phonological awareness, non-word reading skills at the grapheme and word level, vocabulary, and the total number of utterances and total words produced in the oral narrative measure. No significant difference was found in letter-sound knowledge or the remaining measures of oral language, namely MLU-words, number of different words and intelligibility. The finding that there was no effect on letter-sound knowledge in response to the implementation of the BSLA replicates the Gillon et al. (2019) study. This reinforces the finding that the ‘typical classroom practice’ the control group engaged in, which in this case included some phonics instruction, is effective at growing children’s knowledge of letters and sounds, but does not carry over into more complex phonological awareness activities. With phonological awareness being one of the strongest predictors of later literacy success (Russell et al., 2018), classroom interventions that effectively develop these skills are of paramount importance to addressing literacy inequalities.

The growth in non-word reading for the research group is particularly noteworthy. On this task, children in the control group, engaging in ‘typical classroom practice’, made almost no growth during the 10-week period. This suggests that despite making good gains in letter-sound knowledge, and some growth in phonological awareness, they had not yet learnt to transfer these skills to the reading context. This is in stark contrast to the research group, whose slope and mean scores indicate considerable growth in their ability to apply their phonological awareness and letter-sound knowledge to reading non-words.

The positive impact of the classroom intervention on children’s critical early literacy skills is promising. Much of the older literature in this area has focused on the effectiveness of such interventions at a small-group or individual level, outside of the context of the classroom environment (Ehri et al., 2001; Gillon, 2000), and there is consensus that phonological awareness training delivered in this manner is efficacious for early literacy development. There is some disagreement within the literature as to whether ‘pull-out’ models are best for children with increased literacy learning difficulties: some suggest the evidence is “inadequate” (Cirrin et al., 2010), while others report a benefit of collaborative classroom-based services over a pull-out model for some early literacy skills (Archibald, 2017; McGinty & Justice, 2006). Regardless, a significant, systemic change to children’s literacy achievement, as called for by McNaughton (2020), requires a shift in classroom-based or Tier 1 literacy teaching.

An implementation science view of interventions requires the transfer of known, efficacious interventions into ‘real-world’ environments, such as the classroom, with minimal researcher input. While the Better Start Literacy Approach had been successfully piloted, the conditions under which it was successful (e.g., a high level of researcher support) are not sustainable in the long term. When interventions are similarly or equally effective when implemented in a ‘real-world’ setting, further confidence can be placed in the robustness of the approach for supporting children’s learning. This study provides additional support for the effectiveness of classroom-level teaching of foundational, critical early literacy skills (Carson et al., 2013; Gillon et al., 2019, 2020).

One limitation of this research is the control group that was used. Children in the control group came from schools within low socioeconomic communities while children in the research group were drawn from higher socioeconomic communities. This is likely the reason that baseline differences were found on a number of literacy and language measures, with children in the research group starting out with higher scores (Lee & Otaiba, 2015). However, the statistical analysis focused on the slopes, or rates of change, on these measures, which means that pre-existing differences are not of particular concern. Even though the research group started out with higher scores, they still showed significantly greater growth following the intervention than the control group did over the same time period. Given the control group had much greater scope for growth, the significantly greater growth shown by the research group is all the more notable. Nonetheless, further research will need to explore the impact of teacher-led implementation in communities with higher levels of challenge.

Impact for Linguistically Diverse Learners

A further factor explored by the current study is the impact of the BSLA for children from linguistically diverse backgrounds. Anecdotal comments from teachers during the project suggested that ESOL students benefited from the explicit, deliberate teaching of the English alphabet system, along with the opportunity for systematic, repeated vocabulary learning integrated into the teaching. In addition, the clear alignment of the key elements of the BSLA with principles of teaching ESOL learners (Ministry of Education, n.d.) suggested a need to systematically explore the impact for this group. Analysis of this subset of linguistically diverse learners indicated the Better Start Literacy Approach was equally effective for linguistically diverse children as for all other children in the sample; that is, the pattern of accelerated growth following intervention did not differ significantly between children based on linguistic diversity.

Given their status as ESOL students, it is unsurprising that the linguistically diverse cohort had lower scores on vocabulary knowledge and oral narrative measures than their non-linguistically diverse peers at the baseline assessment point. However, their comparable scores on letter-sound knowledge, phonological awareness and non-word reading suggest that they had sufficient understanding of English to engage with these assessment tasks (and therefore, presumably, the classroom teaching), and reinforce the knowledge that phonological awareness and reading skills transfer between languages. Monitoring the growth of linguistically diverse students in both phonological awareness and oral language is important, as these skills in language minority children have been shown to predict their future reading performance (Lindsey et al., 2003). It is positive to see that despite these differences at baseline, linguistically diverse students responded similarly to the BSLA teaching.

While these results provide some support for effectiveness with linguistically diverse children, it is also important to note the small number of linguistically diverse children in the sample (17%, or 29 children) and further research with a more robust sample is required.

Comparison with Gillon et al. (2019)

The third research question compared the impact of BSLA teaching in the current study with an existing published pilot study (Gillon et al., 2019). It is worth noting first that the children in the current study were both qualitatively and quantitatively different from those in the original Gillon et al. (2019) study. The context of the original research was significantly more challenging: it was undertaken in post-earthquake Christchurch, in communities most heavily impacted by the lasting effects of this devastating event. The impact of this on children, whānau (family) and teachers working in these communities should not be underestimated, and the challenges of that cohort are described extensively within the Gillon et al. (2019) article. The cohort of the current project comprised children from higher socioeconomic backgrounds who were predominantly less impacted by the earthquakes, and this must be taken into consideration when comparing effect sizes.

For three of the four variables compared, slightly higher effect sizes were found in the present sample than in the previously published implementation of the same approach that included more substantial support from researchers (Gillon et al., 2019). While this could indicate that a reduced-support model of implementation is more effective, it is also important to note that the previous implementation included only children who entered school with lower levels of oral language ability. Thus, the current implementation approach could have appeared more effective because the children in the research group were not selected on the basis of oral language abilities. The research cohort in the current study also displayed similar growth in letter-sound knowledge to the Gillon et al. comparison cohort, with no additional benefit observed as a result of the BSLA teaching when compared to typical classroom practice. This adds further strength to the argument that the BSLA, even with reduced levels of implementation support, is as effective at teaching letter-sound knowledge as other classroom-based approaches, but provides additional benefits for more complex skills such as phoneme awareness and non-word reading.

The results of this comparison provide evidence that the Better Start Literacy Approach is effective for a wide range of children, not just those starting school with low levels of oral language. Further research should look to explore the impact of this teaching on diverse populations of children from a range of socioeconomic backgrounds, ethnicities and language competency.

Conclusion

This study adds to existing evidence that classroom-based literacy instruction with a focus on quality phonological awareness and vocabulary teaching positively impacts children’s development of critical foundational literacy skills (Carson et al., 2013; Gillon et al., 2019, 2020). The findings indicate that effective implementation of this approach can be achieved with minimal on-the-ground support from researchers, further progressing research into the Better Start Literacy Approach along the implementation pathway. Future research would benefit from exploring the implementation of this effective intervention on a wider scale, with the combined resources of online professional learning and development, face-to-face workshops, and within-community coaching and mentoring support for teachers.