Introduction

Continuous advancements in digital technologies alongside a growing range of complex societal problems (e.g. climate change and growing inequality) are predicted to cause rapid and unprecedented changes in our lifetime to both the nature of work and how we live (van der Vlies, 2020). In response, more than half of the Organisation for Economic Co-operation and Development (OECD) countries have developed specific digital education goals (van der Vlies, 2020), with many nations attempting to achieve these through the introduction of Digital Technologies learning into their curricula (OECD, 2018). In 2017, New Zealand followed suit, introducing the ‘Hangarau Matihiko’ (HM) curriculum content to Te Marautanga o Aotearoa (Indigenous Māori medium) and the ‘Digital Technologies’ (DT) curriculum content to the New Zealand Curriculum (English medium) (Ministry of Education (MOE), 2017). These areas cover all school-aged students in Years 1–13 (students aged 5–17 years).

Falling under the broader Computer Science (CS) umbrella, DT aims to teach students the CS principles and programmes that make up the design of Information and Communications Technologies (ICT), enabling them to design and produce digital solutions in authentic ways (MOE, 2018). Computational thinking (CT) is a crucial part of DT, encompassing a broad set of skills and concepts. It is generally understood to be a ‘problem-solving process that includes formulating problems, logically organising data, representing data through abstractions, automating solutions, reflecting on the efficiencies of possible solutions and generalising and transferring this process to a variety of problems’ (Dong et al., 2019, p. 907). To ensure students develop DT aptitudes, teachers are expected to plan for twenty-first-century pedagogies such as digital literacy and problem-solving, and adopt and model these practices themselves (Battelle for Kids, 2019).

Yet, globally, implementation of these curricula has been low and inconsistent (Larke, 2019; Mertala, 2021; Roche, 2019). Researchers have found a multitude of barriers to teachers’ DT implementation, such as teachers’ limited experience with DT (Vivian et al., 2020), low levels of teacher and student digital literacy (van der Vlies, 2020), challenges adopting effective DT pedagogical approaches, teacher and student misconceptions in the learning area (Reinsfield, 2016) and a lack of agreement on basic concepts in the learning area (Garvin et al., 2019).

To boost teachers’ knowledge and implementation of DT curricula, a range of supports have been utilised, e.g. DT licences, certifications and professional learning and development (PLD), with PLD being used most extensively to support teachers’ knowledge and skill growth (Lim et al., 2020; Pargman et al., 2020). Yet the unique challenges teachers face implementing DT curricula mean the demands on DT PLD are different from PLD for other learning areas where the focus is mainly on knowledge growth. This research provides an empirical view of the impact of HM|DT PLD on teachers’ understanding and efficacy beliefs regarding the HM|DT curricula, aiding decision-making at both a macro (e.g. government and technology-related subject agencies) and micro (school leaders and teachers) level to support teachers’ implementation.

Literature review

The benefits of DT education are vast and seen at various levels. Students are provided with increased opportunities to develop problem-solving and communication skills by designing and producing solutions to issues that are meaningful to them (Reinsfield & Fox-Turnbull, 2020), resulting in heightened student engagement, motivation and attitudes (Mason & Rich, 2019). This has a flow-on effect on society, as these students have the skill set that our future workforce requires, as well as the resilience and ability to learn new knowledge and skills as demands in the labour market change (European Commission/EACEA/Eurydice, 2019). Society gains further benefits as these citizens, using their problem-solving skills and DT knowledge, can develop effective and innovative technology-based solutions to complex ethical, environmental and economic issues (OECD, 2019). Finally, economies are improved through the development of an innovative and creative workforce that can take advantage of economic and global opportunities and drive change (Heintz & Mannila, 2018).

Professional learning and development

The OECD defines professional development as any ‘activities that develop an individual’s skills, knowledge, expertise and other characteristics as a teacher’ (OECD, 2009, p. 49). Facilitation of DT PLD continues to be a focus in many countries given that teachers lack foundational knowledge and experience with DT content (Kong et al., 2020); it can be used to support teachers to upskill in relevant pedagogical approaches (Lim et al., 2020); it is believed by teachers to be the most significant influence on their attitude towards implementation (Zha et al., 2020), and there is a range of misconceptions by teachers about DT that require clarification (Reinsfield, 2016).

Effective DT PLD has been shown to involve active participants, provide time for discussion, reflection and connection to teachers’ contexts (Sentance & Humphreys, 2018), link to relevant pedagogical approaches (Kong et al., 2020) and provide opportunities for teachers to concurrently teach DT with their students (Rich et al., 2021; Saxena & Chiu, 2022). Claims have been made that due to the unique challenges teachers face implementing DT (particularly requirements for teachers to be digitally literate and adopt twenty-first-century pedagogies), mandated one-size-fits-all PLD models do not meet teachers’ individual learning needs (Redmond et al., 2021) or self-efficacy levels (Mannila et al., 2018). Instead, new kinds of PLD programmes should be developed that are individualised to meet teachers’ specific DT goals (Celepkolu et al., 2020), build upon DT concepts and skills they already feel confident teaching (Bartholomew et al., 2022; Vivian et al., 2020), focus on areas in DT that teachers find difficult (Rich et al., 2021) and develop teachers’ ability to support other teachers’ DT implementation in their school (Heintz & Mannila, 2018).

Time to plan, upskill and teach DT

While the impact a lack of time has on teachers’ response to any curriculum change is not a new finding (e.g. Lindberg et al., 2017; Pak et al., 2020), it is of particular concern in the DT field given the need for teachers to familiarise themselves with the new technical learning area before they can plan and teach it. Lindberg et al. (2017) categorise these time pressures into limited time to plan and prepare resources to teach DT, limited time to upskill in the curriculum and DT knowledge, and limited time to teach DT within an already crowded curriculum.

Efficacy beliefs

Teachers’ beliefs about a given learning area are a well-researched subject given the influence beliefs have on motivation, behaviour, perseverance, resilience (Mannila et al., 2018), commitment to teaching and beliefs about their ability to affect student achievement (Rich et al., 2020). Many factors have been shown to impact efficacy beliefs, with time to upskill, plan and teach being raised in many studies as a major influence (Pak et al., 2020; Rich et al., 2021). The school environment and level of support teachers perceive they have from their school community have also been shown to have a large effect on DT efficacy beliefs, with resourcing in terms of ICT, PLD and time to upskill (Saxena & Chiu, 2022) and promotion of school DT visions (Brown, 2021) seen to contribute greatly to teachers’ efficacy beliefs and implementation. Recognising a gap in tools to quantify and measure practising primary and intermediate teachers’ efficacy towards CS and CT, Rich et al. (2020) developed the Teachers’ Beliefs about Computing and Computational Thinking (TBaCCT) instrument. The TBaCCT considers efficacy as three separate constructs: value beliefs, self-efficacy beliefs and teaching efficacy beliefs.

Value beliefs

Seen to impact teachers’ implementation, value beliefs measure the importance teachers place on a given area by asking them to state their level of agreement with a range of statements about the learning area (Rich et al., 2020). Value beliefs have been shown to be an important component to consider when designing PLD, both to help teachers overcome misconceptions and to instil the motivation to implement DT (Duncan et al., 2017).

Self-efficacy beliefs

Grounded in Bandura’s (1991) social cognitive theory, self-efficacy describes ‘one’s beliefs in his/her capabilities’ (Christensen & Knezek, 2020, p. 491). Positive experiences in teaching DT and support from parents, students, school leadership teams and colleagues have been shown to positively influence teachers’ self-efficacy (Bower et al., 2017). Determined through self-report measures, self-efficacy is important to consider when evaluating the impact of PLD because of its correlations with teacher knowledge, confidence and beliefs about the learning area (Rich et al., 2021).

Teaching efficacy beliefs

Teaching efficacy beliefs, in turn, measure one’s beliefs in one’s capability to teach a given learning area or concept (Rich et al., 2020). Also a useful measure for PLD providers, teaching efficacy beliefs include teachers’ perceptions of their content knowledge, pedagogical capabilities and planning skills to bring about desired learning outcomes (Rich et al., 2020).

Rich et al.’s (2020) efficacy beliefs model is a useful lens for CS research due to its ability to measure and consider each of the separate components of teachers’ efficacy that are seen to impact their implementation.

Digital technologies in New Zealand

In New Zealand, students in Years 1–10 (students aged 5–14 years) are exposed to HM|DT through cross-curricular learning experiences where HM|DT is taught within other curriculum areas, whereas students in Years 11–13 (students aged 15–17 years) are given the opportunity to specialise in HM|DT (Kellow, 2018). Both the HM (written in te reo Māori) and DT (written in English) curricula are framed in general terms to allow schools to develop their own unique implementation of the content and to empower teachers as decision-makers (Crow et al., 2019). From 2017, schools were able to apply for funding for the delivery of personalised, on-site training by accredited facilitators through either the ‘Digital Fluency for Schools’ PLD (aimed at building teachers’ confidence and knowledge of how to use ICT to enhance learning) or the ‘HM|DT’ PLD (designed to build teachers’ knowledge of the new curriculum areas) (MOE, 2023). The MOE support package also included the Kia Takatū ā-Matihiko/Digital Readiness Programme, which consisted of online lessons and online or in-person workshops designed to upskill teachers to embed HM|DT in their teaching (MOE, 2023).

Current study

The literature presented shows the multifaceted benefits of DT curricula, yet there are still many areas surrounding teachers’ experiences with DT curricula that require further investigation to support faster adoption. The following research gaps have been identified: CT research with a focus on K-12 teachers (Dong et al., 2023), CS research that utilises Bandura’s (1991) social cognitive theory (Szabo & Sheard, 2022) and studies that investigate the long-term effect of teachers’ CT self-efficacy or compare in-person and online PLD (Love et al., 2022).

This study was designed to fill these identified research gaps by measuring the impact of three different PLD models on New Zealand primary and intermediate teachers’ (teachers of students aged between 5 and 12 years) HM|DT understanding and efficacy beliefs over a 6-month period. Specifically, it analysed three types of PLD model (in-person, online-facilitated and online self-led), collected data at three points (pre-PLD, post-PLD and 6 months post-PLD) and utilised Bandura’s (1991) social cognitive theory framed around the components of Rich et al.’s (2020) validated TBaCCT instrument. The research questions guiding this study were as follows:

  1. How does participation in professional learning and development models influence primary teachers’ understanding of the Hangarau Matihiko and Digital Technologies curricula?

     a. How do different professional learning and development models influence teachers’ understanding of the HM|DT curricula?

  2. How does participation in professional learning and development models influence primary teachers’ Hangarau Matihiko and Digital Technologies value beliefs?

     a. How do different professional learning and development models influence teachers’ Hangarau Matihiko and Digital Technologies value beliefs?

  3. How does participation in professional learning and development models influence primary teachers’ Hangarau Matihiko and Digital Technologies self-efficacy beliefs?

     a. How do different professional learning and development models influence teachers’ Hangarau Matihiko and Digital Technologies self-efficacy beliefs?

  4. How does participation in professional learning and development models influence primary teachers’ Hangarau Matihiko and Digital Technologies teaching efficacy beliefs?

     a. How do different professional learning and development models influence teachers’ Hangarau Matihiko and Digital Technologies teaching efficacy beliefs?

The belief that teachers have the largest impact on student achievement means PLD is commonly used in education to upskill educators and support them through periods of change (Meissel et al., 2016). The introduction of the DT curriculum has been no different, with many countries investing in DT PLD (Bocconi et al., 2018). Yet the distinct differences between DT curricula and other learning areas and the multitude of internal and external barriers that teachers face when implementing DT mean the demands on DT PLD are quite different (Celepkolu et al., 2020). Through this research, the impact of PLD on teachers’ behaviours, attitudes, efficacy beliefs and challenges regarding the DT curricula will be better understood, aiding decisions to support teachers’ implementation.

Methodology

In designing this convergent parallel mixed-methods study, the researchers adopted two frameworks. Firstly, the Raranga Matihiko Kaiako framework (RMKF) (Raranga Matihiko, 2018) (Fig. 1), based on Howell’s (1982) conscious-competence model, was used to illustrate the four stages that teachers progress through as they learn and master new HM|DT skills and knowledge.

Fig. 1 Raranga Matihiko Kaiako framework, by Raranga Matihiko (2018, p. 3). Copyright 2018 by Learning Media Limited

Secondly, Bandura’s (1991) social cognitive theory was applied by measuring the impact of the PLD on teachers’ self-efficacy. Value beliefs, self-efficacy beliefs and teaching efficacy beliefs were considered by adapting the TBaCCT instrument (Rich et al., 2020) to the DT focus of this research. More information on how this instrument was used is provided in the ‘Data collection and questionnaire’ section.

Professional learning and development models

While the focus and content were similar between PLD models, each option had a distinctly different mode of delivery; hence, participants self-selected their PLD group.

In-person facilitated workshops

Designed and delivered by an MOE-accredited facilitator, this PLD option consisted of two 1-hour workshops delivered at a small primary school in New Plymouth, New Zealand. The focus of this PLD was on raising teachers’ knowledge and efficacy beliefs about the DT curriculum, yet the concepts were also transferable to teachers implementing the HM curriculum. The first session began by clarifying teacher misconceptions about DT, digital literacy and digital fluency in order to raise teachers’ value beliefs. To raise their DT self-efficacy beliefs, participants were then introduced to the two strands of the DT curricula (CT and Designing and Developing Digital Outcomes), with the facilitator unpacking significant terminology and answering any questions participants had. Following this, teachers undertook a range of digital and non-digital activities alongside the facilitator to practise skills such as offline coding, computational thinking, sorting, decomposition and the design thinking process. This approach of using activities that participants could then replicate in their classrooms was designed to raise both their self-efficacy and teaching efficacy beliefs. The second session followed much the same hands-on format designed to raise teachers’ efficacy beliefs, but focused on the progress outcomes listed in each strand related to student skill level and on connecting DT to other learning areas. Additionally, each session allowed time for reflection and discussion between participants. Teachers were given online access to a range of CT and Designing and Developing Digital Outcomes activities developed by the facilitator to utilise and adapt in their own teaching programmes. No collaboration between participants was initiated by the facilitator.

Online-facilitated workshops

Delivered by the same accredited facilitator as the in-person workshops, these two 1-hour workshops were delivered online through Google Classroom. They introduced participants to the same information and followed the same format as the in-person sessions. Teachers were asked to gather a range of objects prior to the session so that they could complete each of the hands-on digital and non-digital activities as the facilitator introduced them. These teachers were given online access to the same activities as the in-person group to utilise and adapt in their own teaching programmes. No collaboration between participants was initiated by the facilitator.

Kia Takatū ā-Matihiko/digital readiness programme

From 2018 to 2020, the MOE funded the Kia Takatū ā-Matihiko/Digital Readiness Programme. It was designed and delivered by education specialists Tātai Aho Rau – CORE Education to build teachers’ and school leaders’ confidence in implementing HM|DT. In its original format, the programme consisted of self-directed online pīkau (short online lessons) and online and in-person workshops designed to increase teachers’ efficacy beliefs and ability to embed HM|DT in their teaching. Both an HM and a DT course were developed, with each pīkau built around a different aspect of the curriculum. In 2020, funding for this programme ceased, yet many of the online resources were transferred to an open platform for educators to access. The pīkau consisted of explanations, videos and lesson ideas on aspects of DT designed to increase teachers’ self-efficacy and teaching efficacy beliefs while showcasing the benefits of DT to their students. Participants who selected this PLD model as part of this research project were able to work independently through the short online lessons, focusing on the areas of HM or DT that they wanted to learn more about.

Data collection and questionnaire

A descriptive longitudinal survey design collecting both qualitative and quantitative data was created based on a range of previously validated instruments from education researchers Dong et al. (2023), Rich et al. (2020), Sentance and Csizmadia (2017) and Shin et al. (2021).

The questionnaire (shown in Online Appendix A) started by collecting background information (e.g. year level taught, previous HM|DT PLD undertaken). To gauge participants’ understanding of the HM|DT curricula, the four stages of the RMKF were modified to first-person statements, and participants were asked to select the category (unaware, developing understandings, integrating, embedded) to which they belonged. To measure teachers’ efficacy beliefs, the researchers created the Teachers’ Beliefs about Digital Technologies (TBaDT) instrument (see Online Appendix A), following the approach originally used by Rich et al. (2020) when designing the TBaCCT instrument. A list of key skills was created using the primary/intermediate level progress outcomes from the HM|DT curricula, which was then compared with the questions in the TBaCCT. The TBaDT was created by adapting relevant original questions from a CT focus to an HM|DT one, removing questions that were not applicable to HM|DT and adding similarly structured questions to fill the gaps between the TBaCCT and the list of HM|DT key skills. Teachers responded to five questions per construct, using Likert-type responses. Each construct showed good or acceptable reliability against Nunnally’s (1978) widely used Cronbach’s alpha thresholds: value beliefs (α = .72), self-efficacy beliefs (α = .86) and teaching efficacy beliefs (α = .80). Open-ended questions about participants’ confidence, goals and challenges were included to gather detailed information to support the quantitative questions. After pre-testing with a group of educators, the self-administered, confidential online questionnaire was administered through the secure Qualtrics platform.
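The construct reliabilities reported above can be computed directly from item-level responses. The sketch below, using hypothetical response data rather than the study’s, implements the standard Cronbach’s alpha formula for a five-item, 6-point construct:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 6-point Likert responses (8 teachers x 5 items) for one construct
scores = np.array([
    [5, 6, 5, 4, 5],
    [4, 4, 5, 4, 4],
    [6, 6, 6, 5, 6],
    [3, 2, 3, 3, 2],
    [5, 5, 4, 5, 5],
    [2, 3, 2, 2, 3],
    [4, 5, 4, 4, 4],
    [6, 5, 6, 6, 5],
])
print(round(cronbach_alpha(scores), 2))
```

With the study’s data, the same function would be applied to each construct’s five-item response matrix to reproduce the reported α values.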

The questionnaire was employed at three stages throughout the research period. Participants were invited to complete the questionnaire two weeks prior to beginning their chosen PLD model (July/August 2022), then within two weeks of finishing the PLD (September/October 2022) and, finally, approximately 6 months after completing the PLD (February/March 2023).

Sampling and participants

Due to the mix of online and in-person PLD models, a mixture of convenience and clustered sampling practices was used to invite potential participants to partake in the study. In July 2022, technology subject associations, closed social media groups and all school principals of intermediate, composite, contributing and full primary schools listed in the Education Counts school directory were provided with the details of the research available in their region and invited to forward the research information to their members/teaching staff. To retain participants throughout the research project, participants were provided with a $20 gift voucher for each of the questionnaires they completed.

Due to the longitudinal approach of this research, only data from the 33 participants who completed all three questionnaires were used in the analysis. These participants came from 21 different schools across New Zealand. The background data of these participants and their schools are shown in Online Appendix B. An a priori power analysis was conducted using G*Power version 3.1.9.7 (Faul et al., 2007) to determine the minimum sample size required to detect a significant interaction between the time variable and the PLD model variable. Results indicated that the required sample size to achieve 80% power for detecting a medium effect, at a significance criterion of α = .05, was N = 36 for Analysis of Variance (ANOVA) statistical analysis, or 12 participants in each of the three groups. Thus, the obtained sample size of N = 33 falls slightly short of this requirement.
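For readers wanting to check or adapt the power analysis, the sketch below approximates the G*Power-style calculation for a repeated-measures within-between interaction using the noncentral F distribution. The correlation among repeated measures (ρ = .5) and sphericity (ε = 1) are assumed defaults rather than values reported in the paper, so the N it returns only approximates the reported N = 36:

```python
from scipy.stats import f as f_dist, ncf

def rm_interaction_power(n_total, k=3, m=3, f=0.25, alpha=0.05, rho=0.5):
    """Approximate power of the group x time interaction in a repeated-measures
    ANOVA: k groups, m measurements, Cohen's f, assumed correlation rho."""
    lam = f ** 2 * n_total * m / (1 - rho)    # noncentrality parameter
    df1 = (k - 1) * (m - 1)
    df2 = (n_total - k) * (m - 1)
    f_crit = f_dist.ppf(1 - alpha, df1, df2)  # critical F under the null
    return 1 - ncf.cdf(f_crit, df1, df2, lam)

# Smallest total N reaching 80% power under these assumed parameters
n = 6
while rm_interaction_power(n) < 0.80:
    n += 1
print(n, round(rm_interaction_power(n), 3))
```

Varying ρ or the effect-size convention shifts the required N, which is why dedicated tools such as G*Power are typically used for the definitive figure.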

Analysis

Quantitative analysis

Numerical values were given to participants’ RMKF categorisation with a value of 1 for the lowest level of understanding of the HM|DT curriculum through to a 4 for the highest level of understanding. Each of the value beliefs, self-efficacy beliefs and teaching efficacy beliefs constructs was made up of five 6-point Likert questions where participants were asked to rate their level of agreement (strongly disagree, disagree, somewhat disagree, somewhat agree, agree, strongly agree). Following Rich et al.’s (2020) approach, numerical values from 1 (strongly disagree) through to 6 (strongly agree) were given to each response, negatively worded questions were reverse-coded, and values in each construct were summed. This resulted in a total for each of the four constructs (RMKF, value beliefs, self-efficacy beliefs and teaching efficacy beliefs) that could be used to track changes between each data collection period.
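The scoring procedure above can be sketched as follows. The study does not report which items were negatively worded, so the reversed item index here is purely illustrative:

```python
# Scoring one construct: five 6-point Likert items summed to a total.
LEVELS = ["strongly disagree", "disagree", "somewhat disagree",
          "somewhat agree", "agree", "strongly agree"]
SCORE = {label: i + 1 for i, label in enumerate(LEVELS)}   # values 1..6
REVERSED_ITEMS = {2}   # 0-based index of a hypothetical negatively worded item

def construct_total(responses):
    """Sum one participant's five item responses into a construct score."""
    total = 0
    for i, label in enumerate(responses):
        value = SCORE[label]
        if i in REVERSED_ITEMS:
            value = 7 - value         # reverse-code negatively worded item
        total += value
    return total

answers = ["agree", "strongly agree", "strongly disagree", "agree", "somewhat agree"]
print(construct_total(answers))   # prints 26
```

Summing five items scored 1 to 6 gives construct totals between 5 and 30, the range used to track change across the three data collection periods.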

After considering the approaches of similar studies (Love et al., 2022; Rich et al., 2021), the researchers used the dataset to assess the assumptions of two-way repeated-measures ANOVA. There was one outlier in the self-efficacy beliefs variable, which had a studentised residual value of − 3.24. A comparison ANOVA was run, and, as there were no changes in statistical conditions after removing the outlier, the decision was made to include the outlier in the analysis reported. The assumption of normality was tested using Shapiro–Wilk’s test as well as z-tests on skewness and kurtosis (|z| > 1.96 indicating violation) (Kim, 2013). Where this assumption was violated (RMKF, self-efficacy beliefs and teaching efficacy beliefs), transformations (square root, reflect and square root, logarithmic, reflect and logarithmic, inverse, reflect and inverse) were applied unsuccessfully to bring the variables within the normal range. A reflect and logarithmic transformation of the teaching efficacy beliefs variable resulted in Shapiro–Wilk’s test being met (p > .05), but, by the skewness and kurtosis method, the assumption of normality was still violated. Comparison testing was run using the original and transformed data, and, as there were no changes in statistical conditions, the decision was made to report the analysis using the original data. The homogeneity of variances assumption was not met at the post-period for RMKF (p = .010) and self-efficacy beliefs (p = .020), and none of the transformations enabled the data to meet this assumption. Mauchly’s test of sphericity indicated that, for teaching efficacy, the assumption of sphericity was violated for the two-way interaction (χ2(2) = 5.974, p = .050). The Greenhouse–Geisser method was used to correct for this violation.
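The normality screening combining Shapiro–Wilk with z-tests on skewness and kurtosis (Kim, 2013) can be sketched with SciPy; the data below are simulated rather than the study’s:

```python
import numpy as np
from scipy import stats

def normality_screen(x, alpha=0.05):
    """Shapiro-Wilk test plus z-tests on skewness and excess kurtosis
    (|z| > 1.96 flags a violation, per Kim, 2013)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    _, p_sw = stats.shapiro(x)
    # Standard errors of sample skewness and kurtosis
    se_skew = np.sqrt(6.0 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))
    se_kurt = 2.0 * se_skew * np.sqrt((n ** 2 - 1) / ((n - 3) * (n + 5)))
    z_skew = stats.skew(x) / se_skew
    z_kurt = stats.kurtosis(x) / se_kurt
    normal = bool(p_sw > alpha and abs(z_skew) <= 1.96 and abs(z_kurt) <= 1.96)
    return p_sw, z_skew, z_kurt, normal

rng = np.random.default_rng(42)
skewed = rng.exponential(scale=1.0, size=100)   # clearly non-normal sample
print(normality_screen(skewed))
```

Applied to a positively skewed sample like the one above, both criteria flag a violation, mirroring the dual-criterion screening the study describes.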

Given the robustness of ANOVA to a small degree of non-normality (provided each of the groups is similarly skewed) and the view that homogeneity of variance is of limited concern with small samples such as this, two-way repeated-measures ANOVA was determined to be the most appropriate statistical analysis (Field, 2009). This allowed comparisons of the dependent variables (RMKF, value beliefs, self-efficacy beliefs and teaching efficacy beliefs) across the between-subjects factor (PLD model: in-person, online-facilitated and online self-led) and the within-subjects factor (time: pre-, mid- and post-). The SPSS software package was used to conduct this analysis.
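The study ran its analysis in SPSS; a minimal Python sketch of the equivalent split-plot (two-way mixed) ANOVA interaction test, assuming a balanced design and sphericity (no Greenhouse–Geisser correction) and using toy data rather than the study’s dataset, might look like this:

```python
import numpy as np
from scipy.stats import f as f_dist

def mixed_anova_interaction(data):
    """Group x time interaction F-test for a balanced split-plot design.
    `data` has shape (k_groups, n_subjects_per_group, m_time_points)."""
    k, n, m = data.shape
    N = k * n
    grand = data.mean()
    ss_total = ((data - grand) ** 2).sum()
    subj_means = data.mean(axis=2)                       # one mean per subject
    ss_between_subj = m * ((subj_means - grand) ** 2).sum()
    group_means = data.mean(axis=(1, 2))
    ss_group = n * m * ((group_means - grand) ** 2).sum()
    time_means = data.mean(axis=(0, 1))
    ss_time = N * ((time_means - grand) ** 2).sum()
    cell_means = data.mean(axis=1)                       # group x time cell means
    ss_inter = n * ((cell_means - grand) ** 2).sum() - ss_group - ss_time
    ss_error = ss_total - ss_between_subj - ss_time - ss_inter  # within error
    df_inter = (k - 1) * (m - 1)
    df_error = (N - k) * (m - 1)
    F = (ss_inter / df_inter) / (ss_error / df_error)
    p = f_dist.sf(F, df_inter, df_error)
    return F, p, (df_inter, df_error)

# Toy data: 3 PLD groups x 4 teachers x 3 time points (simulated)
rng = np.random.default_rng(0)
data = rng.normal(loc=4.0, scale=1.0, size=(3, 4, 3))
print(mixed_anova_interaction(data))
```

Dedicated routines (SPSS GLM, or pingouin’s `mixed_anova` in Python) additionally handle unbalanced groups and sphericity corrections, which this sketch omits.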

Qualitative analysis

Qualitative data were collected from the open-ended questions regarding areas participants felt confident in, areas they were less confident in, goals they had set themselves and any challenges they and their schools faced in implementing DT. To identify patterns in the dataset and answer the study’s research questions, the qualitative data were thematically analysed using Braun and Clarke’s (2022) reflexive approach. The dataset exceeded Braun and Clarke’s (n.d.) recommended participant group size of 30 for large research projects such as this. Following the assumption of reflexive thematic analysis that the meanings of qualitative data are personal and subjective (Braun & Clarke, 2022), the coding was first conducted by the main researcher before meeting with the remaining researchers to deepen the developed themes through discussion and reflection. The inductive thematic analysis found five themes in the qualitative data, centred on implementation, support, confidence, goals and time. These are discussed in the ‘Results’ section.

Triangulation

To validate the two analyses, the initial results from the quantitative and qualitative analyses were integrated and further analysed following a concurrent convergence triangulation model (Plano Clark & Creswell, 2018).

Results

This section begins with a focus on the number of HM|DT PLD hours participants completed and their intentions for future PLD before presenting the triangulated results in relation to each of the TBaDT constructs. Quotes from participants are provided alongside information about the PLD model they undertook and which data collection period the quote was taken from.

PLD hours

The number of hours of HM|DT PLD that participants completed is shown in Fig. 2. Both the in-person PLD and online-facilitated PLD groups consisted of two 1-hour workshops, yet some participants undertook additional learning using the resources and recordings provided by the facilitator.

Fig. 2 Hours of professional learning and development completed as part of the research

At the mid-point, 78% of participants intended to continue with some form of additional HM|DT PLD, dropping to 55% at the close of the research. Of the 18 teachers who intended to continue with more PLD at the close of the research, 17 reported high levels of understanding of the HM|DT curricula (RMKF rankings of 3 and 4) and, on average, they reported higher efficacy beliefs than participants who did not intend to continue with PLD. The drop in the number of participants intending to complete further PLD at the final data collection point could be attributed to the qualitative theme of a lack of time to complete PLD and conflicting school priorities such as other PLD and school events. Additionally, after completing the PLD, participants’ goals were directed more towards building confidence through implementation and resource development than towards undertaking further PLD.

Raranga Matihiko Kaiako framework to measure teachers’ understanding of the Hangarau Matihiko and digital technologies curricula

The mean and standard deviations for teachers’ RMKF for each PLD group over the three data collection periods are shown in Table 1.

Table 1 Descriptive statistics for participants’ Raranga Matihiko Kaiako framework rating representing teachers’ understanding of the Hangarau Matihiko and digital technologies curricula

Participants in the online self-led group initially reported higher levels of awareness of the HM|DT curriculum than the other two PLD groups (2.56/4), yet their growth over the research period was smaller than the other two groups.

There was a statistically significant interaction between the PLD model and time on RMKF ranking, F(4, 60) = 3.25, p = .018, partial η2 = .18. At the pre-PLD data collection point, there was a statistically significant difference in RMKF rating between PLD groups F(2,40) = 4.36, p = .019, partial η2 = .18. RMKF ratings were found to be statistically significantly lower in the in-person group (− 0.88 ± 0.3, p = .015) compared to the online self-led group. No statistical significance was found between any other combination of groups or at the mid- or post-points.
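As a quick consistency check on the reported effect sizes, partial eta squared can be recovered from an F statistic and its degrees of freedom via η²p = F·df1 / (F·df1 + df2):

```python
def partial_eta_squared(F, df_effect, df_error):
    """Recover partial eta squared from a reported F statistic."""
    return (F * df_effect) / (F * df_effect + df_error)

# Interaction reported above: F(4, 60) = 3.25, partial eta squared = .18
print(round(partial_eta_squared(3.25, 4, 60), 2))  # prints 0.18
```

Applying the same formula to the pre-PLD group difference, F(2, 40) = 4.36, likewise yields the reported partial η² of .18.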

For the in-person PLD group, there was a statistically significant effect of time on RMKF from pre- to post- and from pre- to mid-. Taking longer to show a change, the online-facilitated group’s RMKF rankings increased statistically significantly from pre- to post- only. No statistically significant effect of time on RMKF was found for the online self-led PLD group, which could be attributed to a ceiling effect: these participants began the PLD with relatively high RMKF levels, leaving less scope for improvement over the research period. Further simple main effects are reported in Online Appendix C.

As a measure of teachers’ proficiency, the RMKF examines how the HM|DT content is implemented: not at all, as an isolated subject, integrated with other learning areas or embedded naturally throughout teaching. Relating to the implementation theme of the qualitative analysis, many participants stated that the PLD had increased their awareness of how HM|DT can be embedded throughout other curriculum areas.

My confidence in the content I deliver has grown [since the PLD], and I am able to integrate DT through more curriculum areas. (Questionnaire 2, Online-facilitated PLD participant)

These results, alongside the finding that over 70% of participants reported an increase in RMKF rankings from pre- to post-PLD, show that, in general, the PLD had a positive effect on teachers’ understanding and implementation of the HM|DT curricula.

Value beliefs

Values from 1 (strongly disagree) to 6 (strongly agree) were given to each of the five statements for this construct before they were summed, meaning the highest possible cumulative score a teacher could have was 30. The mean and standard deviations for teachers’ value beliefs for each PLD group over the three data collection periods are shown in Table 2.
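As an illustrative sketch of the scoring scheme described above (the item wordings themselves are not reproduced here), each construct score is the sum of five Likert ratings from 1 to 6, giving a possible range of 5 to 30:

```python
def construct_score(item_ratings):
    """Sum five Likert items rated 1 (strongly disagree) to 6 (strongly agree)."""
    assert len(item_ratings) == 5, "each construct comprises exactly five statements"
    assert all(1 <= r <= 6 for r in item_ratings), "ratings use a 1-6 scale"
    return sum(item_ratings)

print(construct_score([6, 6, 6, 6, 6]))  # 30, the highest possible cumulative score
print(construct_score([4, 5, 3, 6, 4]))  # 22
```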

Table 2 Descriptive statistics for value beliefs

Nine participants showed a small drop in value beliefs of 1 or 2 points from pre- to post-PLD. Their qualitative responses did not provide any further insight into why their value beliefs may have dropped slightly, although it could again be attributed to a ceiling effect: these teachers began the research with relatively high value beliefs, providing less room for growth in their views about HM|DT. One in-person PLD participant experienced a 5-point rise in value beliefs immediately after the PLD and then a subsequent 5-point drop from mid- to post-, raising questions about the short-term nature of the PLD and the lack of follow-up support.

No statistically significant interaction between the PLD model and time on value beliefs was found, F(4,60) = 1.81, p = .139, partial η2 = .11. The main effect of time showed a statistically significant difference in value beliefs from pre- to post-, but not between any other data collection periods. No statistically significant difference in value beliefs between the PLD groups was found. Further main effects are reported in Online Appendix C.

Qualitative comments connected to value beliefs related to the willingness of teachers to engage with HM|DT, participants’ attitudes towards screen time and how the PLD had impacted their views about student learning.

We have a small number of teachers who are flexible, open learners who are confident to take on the technology and learn. We also have teachers who are limited in their knowledge and reluctant to engage and implement DT. (Questionnaire 3, Online self-led PLD participant)

I’m teaching in a junior classroom with an Apple TV and my MacBook as the only two screens. This is a deliberate choice as most tamariki [Māori, ‘children’] have too much screen time. We occasionally use iPads to present/publish work. (Questionnaire 3, Online-facilitated PLD participant)

Although this last participant had the lowest value beliefs towards HM|DT pre- and post-PLD and was still reluctant to introduce ICT in their teaching post-PLD, they did comment that the PLD had increased their knowledge of utilising unplugged learning experiences to teach HM|DT concepts and had set themselves goals to trial this approach.

While no relationship between the PLD model and value beliefs was found, the results still show that, in general, PLD has a positive effect on the importance teachers place on HM|DT.

Self-efficacy beliefs

Values from 1 (strongly disagree) to 6 (strongly agree) were given to each of the five statements in this construct before they were summed, meaning the highest possible cumulative score a teacher could have was 30. Mean and standard deviations for teachers’ self-efficacy beliefs for each PLD group over the three data collection periods are shown in Table 3.

Table 3 Descriptive statistics for teacher self-efficacy beliefs

All groups showed a minimum of a 2-point increase in self-efficacy from pre- to post-PLD, highlighting the positive impact each PLD model had on teachers’ beliefs in their HM|DT knowledge and abilities. Yet, with an average post-PLD score of only 22.8/30, there is still significant room for teachers to develop their HM|DT knowledge. Participants in the online-facilitated group reported higher levels of self-efficacy throughout the research period, yet the in-person PLD model had the highest jump of almost 7 points from pre-PLD to six months post-PLD. It is important to note that three of the teachers in the in-person group came from the same school where the PLD was conducted; hence, there may have been additional factors contributing to this rise.

No statistically significant interaction between the PLD model and time on self-efficacy beliefs was found, F(4, 60) = 1.94, p = .120, partial η2 = .11. The main effect of time showed a statistically significant difference in self-efficacy beliefs between the pre- and mid-, and pre- and post-data collection periods, but not between the mid- and post-data collection periods. No statistically significant difference in mean self-efficacy beliefs between PLD groups was found. Further main effects are reported in Online Appendix C.

Pre-PLD, participants’ comments related to the goal theme focused on hopes the PLD would increase their understanding of the curriculum, with two participants stating they needed to know more before they could begin implementing it.

My goal is to develop my content knowledge. Once I have content knowledge, I will be able to confidently deliver. (Questionnaire 1, Online-facilitated PLD participant)

Post-PLD, the comments in this theme related to how the PLD had increased their self-efficacy and given them the boost to keep learning.

I am really enjoying trying new things and investigating how I can improve. The more confidence I have, the more I am exploring. (Questionnaire 3, In-person PLD participant)

While no relationship between the PLD model and self-efficacy beliefs was found over the research period, the PLD was shown to have a positive effect on teachers’ confidence in their HM|DT competence.

Teaching efficacy beliefs

Values from 1 (strongly disagree) to 6 (strongly agree) were given to each of the five statements in this construct before they were summed, meaning the highest possible cumulative score a teacher could have was 30. The mean and standard deviations for participants’ teacher beliefs for each PLD group over the three data collection periods are shown in Table 4.

Table 4 Descriptive statistics for teaching efficacy beliefs

Highlighting the significant impact that the PLD had on teachers’ beliefs in their ability to teach HM|DT to their students, all groups showed a minimum 4-point increase in teaching efficacy from pre- to post-PLD. This is more than double the rise seen in self-efficacy; yet, with an average post-PLD score of only 22.2, there is still significant room for growth in teachers’ confidence to teach HM|DT. While the qualitative data support the quantitative analysis in attributing the rise in teaching efficacy to the PLD, some participants commented that they did not realise how much of the HM|DT curricula they had already been teaching until they unpacked the curriculum in the PLD sessions, meaning some participants may have initially underreported their skills in the pre-PLD questionnaire.

Pre- to post-PLD, only two participants reported no change or a drop in teaching efficacy. These teachers started with relatively high teaching efficacy beliefs (22 and 21 points), and each completed more than eight hours of PLD and intended to continue with more HM|DT PLD. One of the in-person PLD participants recorded a 15-point increase in teaching efficacy and, post-PLD, was energised to continue their own learning and support colleagues’ HM|DT growth.

I am now keen to look for ways I can open teachers’ minds further to what is available and how they can implement the DT curriculum. My mind shift is due to our PLD from the facilitator and their expertise. (Questionnaire 2, In-person PLD participant)

Mauchly’s test of sphericity indicated that the assumption of sphericity was violated for the two-way interaction (χ2(2) = 5.97, p = .050); hence, the Greenhouse–Geisser method was used to correct for this violation. No statistically significant interaction between the PLD model and time on teaching efficacy beliefs was found, F(3.37, 50.58) = 0.43, p = .754, partial η2 = .03. The main effect of time showed a statistically significant difference in mean teaching efficacy beliefs between the pre- and mid-, and pre- and post-data collection periods, but not between the mid- and post-periods. No statistically significant difference in mean teaching efficacy beliefs between PLD groups was found. Further main effects are reported in Online Appendix C.
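The Greenhouse–Geisser correction works by multiplying both degrees of freedom by an epsilon estimated from the covariance of the repeated measures (here, the corrected F(3.37, 50.58) implies ε ≈ 3.37/4 ≈ 0.84). A minimal sketch of how that epsilon is computed, using hypothetical data rather than the study’s dataset, is:

```python
import numpy as np

def greenhouse_geisser_epsilon(scores):
    """Greenhouse-Geisser epsilon for an (n subjects x k time points) matrix.

    epsilon = trace(S*)^2 / ((k - 1) * sum(S*^2)), where S* is the
    double-centred covariance matrix of the repeated measures.
    """
    k = scores.shape[1]
    s = np.cov(scores, rowvar=False)  # k x k covariance across time points
    s_star = s - s.mean(axis=0, keepdims=True) - s.mean(axis=1, keepdims=True) + s.mean()
    return np.trace(s_star) ** 2 / ((k - 1) * np.sum(s_star ** 2))

# Hypothetical scores: 16 teachers measured at pre-, mid- and post-PLD (k = 3).
rng = np.random.default_rng(0)
data = rng.normal(loc=[18.0, 21.0, 22.0], scale=2.0, size=(16, 3))
eps = greenhouse_geisser_epsilon(data)
# epsilon is bounded below by 1/(k - 1) = 0.5 (maximal violation) and above by 1 (sphericity holds)
print(0.5 <= eps <= 1.0)
```

The corrected degrees of freedom are then the uncorrected values scaled by ε, which is why the interaction above is tested on F(3.37, 50.58) rather than F(4, 60).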

Prior to the PLD, participants commented on how they hoped the PLD would increase their knowledge and support their implementation.

I just don’t know where to start. Some PD will be vital to help me know which direction to head and how best to support the junior teachers in my team. (Questionnaire 1, In-person PLD participant)

Post-PLD, comments related to how the PLD had increased their confidence to teach the HM|DT curricula.

I have more confidence in the content that I am teaching and have been able to explain and answer students’ questions better now that I understand the purpose of digital technologies better. (Questionnaire 2, Online-facilitated PLD participant)

Participants described a range of barriers coded to the support theme that impacted their teaching efficacy beliefs. Both positive and negative comments about school leaders were reported, with mismatched visions for HM|DT shown to be a challenge for some of the participants.

The School Leadership have been incredibly responsive. (Questionnaire 2, In-person PLD participant)

Leadership are often not willing to listen to the teachers on the ground with the kids when it comes to DT. (Questionnaire 3, Online-facilitated PLD participant)

Teachers found the varied behaviour and learning needs of students, alongside their limited ICT skills, made it difficult to plan and implement HM|DT learning that was suitable for all students.

We have many students with high and complex and varied needs—this sometimes can be an issue when teaching ANY curriculum—but especially one I am not completely 100% confident in/knowledgeable about. (Questionnaire 2, Online self-led PLD participant)

I think students’ ability to break down elements of a programme to create code for them is a real weak point—they are excellent at following tutorials/instructions but not so great and figuring things out themselves! (Questionnaire 1, Online-facilitated PLD participant)

A significant theme that impacted participants’ teaching efficacy was a lack of time. This was categorised as either a lack of time to teach the HM|DT content, limited time to upskill themselves or a lack of time to plan for meaningful learning experiences.

Moving forward, I need more time to cement these ideas and continue to ‘have a go’ to build my skills and confidence. (Questionnaire 2, In-person PLD participant)

There are some activities that need to be properly prepared and developed before the lesson can begin, and, with the number of tasks teachers need to complete, it is competing against other curriculum responsibilities. (Questionnaire 3, Online-facilitated PLD participant)

Participants described how the resources introduced in the PLD helped their teaching efficacy by providing them with something familiar to trial in the classroom and provided a basis to continue to build their knowledge. Two teachers commented on how they wanted to improve their teaching of HM|DT for students with diverse learning needs and abilities.

While no relationship between the PLD model and teaching efficacy beliefs was found over the research period, the PLD was seen to increase teachers’ beliefs in their ability to teach HM|DT and provide a basis for future teaching efficacy growth.

Discussion

Statistically significant improvements in RMKF, value beliefs, self-efficacy beliefs and teaching efficacy beliefs were found pre- to post-PLD. The analysis did not find any statistically significant difference in teachers’ efficacy beliefs between PLD models (the focus of the sub-research questions), yet a statistically significant difference in RMKF was found pre-PLD, with the in-person group displaying lower RMKF ratings than the online self-led group. Alongside the qualitative dataset, this finding leads us to believe that the statistically significant rise in teachers’ efficacy beliefs can be attributed to PLD in general and to the similar content and hands-on format of the PLD models. Four findings related to DT understanding and efficacy beliefs emerged.

Finding one: PLD has a positive effect on teachers’ Hangarau Matihiko and digital technologies understanding and efficacy beliefs

Statistically significant improvements in RMKF rankings and all three efficacy beliefs were found pre- to post-PLD, reiterating the findings of Love et al. (2022) and Rich et al. (2021): the PLD increased the value teachers placed on HM|DT, raised their HM|DT self-efficacy and increased their confidence in their ability to teach HM|DT. As in Love et al. (2022), participants showed a larger increase in teaching efficacy than in any other construct. Through completing these short HM|DT PLD models, teachers felt more confident explaining HM|DT concepts and answering students’ questions using appropriate terminology. Having a bank of familiar lesson ideas from the PLD gave participants the resources they needed to trial implementation, which, as for participants in Rich et al.’s (2021) research, enabled teachers to continue developing their teaching efficacy while engaging with students.

Like participants in Celepkolu et al.’s (2020) and Pargman et al.’s (2020) research, pre-PLD internal factors such as lack of exposure to the HM|DT curricula were seen to have a large impact on teachers’ efficacy beliefs. Yet, at the close of this research period, external barriers such as limited leadership support and competing school commitments were seen to have the largest effect on efficacy beliefs. Reiterating the findings of Saxena and Chiu (2022), teachers with higher value beliefs were found to be more likely to set themselves goals to undertake further HM|DT PLD. These findings highlight the importance of PLD showcasing the value of HM|DT to students in order to encourage teachers to undertake further learning.

Finding two: school environments influence teaching efficacy

The school environment was seen to play a critical role in participants’ self-efficacy and teaching efficacy beliefs, with participants citing the importance of school leadership teams making HM|DT a school priority, communicating a shared vision for this learning area and providing the resources they need to deliver the curriculum effectively (e.g. ICT, PLD, time to upskill). This finding reinforced those of Saxena and Chiu (2022), who found that teachers who came from well-resourced schools with a culture of teamwork reported high levels of CT self-efficacy and teaching efficacy. The crucial role school administrators were seen to play in raising teachers' HM|DT teaching efficacy supports Brown’s (2021) claim that a supportive school structure provides teachers with the flexibility and confidence to take risks and experiment with implementing CS curriculum alongside their students.

Finding three: teachers require support to adopt twenty-first-century pedagogies and plan for Hangarau Matihiko and digital technologies implementation

The gradual rise in value beliefs (statistically significant only between the pre- and post-data collection points) could be attributed to comments participants made about the benefits of implementing HM|DT in their classrooms and seeing first-hand the value the learning had for students. Like teachers in Rich et al.’s (2021) and Saxena and Chiu’s (2022) research projects, participants claimed that engaging with students in HM|DT experiences cemented their knowledge and motivated them to keep learning. This relationship highlights a crucial aspect of DT PLD: supporting teachers’ adoption of twenty-first-century pedagogies, such as a willingness to learn alongside their students.

Additionally, teachers claimed that the PLD addressed the barrier of accessing basic lesson ideas and resources, but post-PLD they still reported a lack of confidence in their ability to develop lessons that targeted students with diverse needs, varied skills and behavioural difficulties. Echoing the findings of Celepkolu et al. (2020) and Saxena and Chiu (2022), this shows the importance of PLD providing teachers with the skills to create their own resources and plans, based on the HM|DT curricula, that are suited to their unique school situation and group of students.

Therein lie the challenges of DT-based PLD: going beyond simply increasing teachers’ knowledge of a subject to instil high DT value beliefs, provide the tools to transition towards twenty-first-century teaching practices and develop the skills to plan for each teacher’s unique situation using the curriculum.

Finding four: teachers require additional time to plan, upskill and teach

The time constraints participants described throughout this research related to those recognised by Lindberg et al. (2017): limited time to plan for classroom lessons, limited time to upskill and a lack of available classroom teaching time.

Participants found that demands from other curriculum areas, additional school activities and students’ limited ICT skills impacted the classroom time they had to teach HM|DT. While Love et al. (2022) and Redmond et al. (2021) suggest integrating DT into other learning areas as a solution to the barrier of limited classroom time, participants in this research who were following an integrated approach still struggled to find adequate time to implement HM|DT. Teachers also described how important adequate planning time was to their teaching efficacy, enabling them to develop lessons and resources that were individualised to their students and that they felt confident to teach. Alongside this, teachers lacked the time to upskill themselves in HM|DT content, appropriate pedagogical approaches and technological knowledge. The finding that time impacts teachers’ ability to implement new curricula is not new (e.g. Lindberg et al., 2017; Pak et al., 2020), yet the number of participants who mentioned time as a constraint on their DT self-efficacy and teaching efficacy is concerning.

Limitations

Study limitations impacting the generalisability of the findings include the absence of a control group, the loss of participants throughout the research period, the non-randomisation of PLD groups and the self-selection bias of participants volunteering to take part in the research. The in-person and online-facilitated models were delivered by the same facilitator, and seven of the eleven in-person PLD participants were from the same school. The self-directed online model was no longer available in its full format and consisted solely of legacy resources. The external validity of the research was affected by the small sample size, the focus on New Zealand primary teachers, a slight underrepresentation of male teachers and the reliance on recruiting participants through electronic means (email and social media). Attempts to reduce these limitations were made by offering participants incentives for each questionnaire they completed.

Conclusions and future work

This research investigated the impact of three different PLD models on primary teachers’ understanding of, and efficacy beliefs towards implementing, the HM|DT curricula. PLD models were chosen with distinctly different delivery modes, yet each model used similar established practices and similar content. While no one PLD model was found to be more effective, the analysis suggested that PLD such as the three models used in this research, which introduced teachers to the HM and DT curricula through exploring key terminology and undertaking hands-on activities, had a positive long-term impact on teachers’ HM|DT understanding and efficacy beliefs.

The crucial role school environments play in both participants’ self-efficacy and teaching efficacy was unpacked, with misalignments between leadership teams and teachers suggesting a need for initiatives that target school leaders and showcase the ways in which they can support school-wide adoption of HM|DT. The study emphasised the multifaceted approach required of effective HM|DT PLD: promoting teachers’ adoption of twenty-first-century teaching practices and giving them the skills to plan HM|DT lessons that cater to the learning and behavioural needs of their individual group of students. The findings also showed that HM|DT PLD should focus on raising teachers’ value beliefs to encourage further learning. Teachers believed that, to have the greatest chance of successfully implementing HM|DT as intended, they required additional time to plan and familiarise themselves with HM|DT lessons, time to upskill on an ongoing basis and more time in the classroom to implement HM|DT.

The study highlighted multiple areas where further research is required. Of particular interest would be larger-scale investigations into the impact different DT PLD models (e.g. content, delivery methods and ongoing support) have on teachers’ DT implementation. A final research gap previously identified by Shin et al. (2021) and briefly raised in the findings of this study is research into the dynamic relationship between school leaders and teachers’ DT implementation.

The differences in DT compared to other subject areas (e.g. need to be confident utilising ICT, need to continually upskill) and the many factors influencing teachers’ adoption (e.g. stereotypes, technical terminology, lack of resourcing) have led to a slow (Larke, 2019) and inconsistent (Mertala, 2021) uptake of the DT curricula. The findings from this research help to guide decision-making to support teachers’ uptake of HM|DT curricula by reinforcing the conclusions of other CS research and adding new knowledge about the long-term effect of PLD on teachers’ understanding and self-efficacy. A second article based on this research project is currently in publication, focusing on how PLD impacts teachers’ implementation of the HM|DT curricula.