The ongoing process of changing and augmenting traditional teaching and learning activities by replacing or complementing them with digital alternatives has been, and still is, daunting. Digital learning materials are rapidly being implemented in schools on a global scale. They consist of curriculum-based applications containing learning activities involving texts, pictures, video and audio materials, and exercises, bringing many affordances that can potentially leverage more effective methods for teaching and learning (Yeung et al., 2021). For example, a video of a curriculum subject that is made available to students in their mother tongue and with subtitles in the host language affords a higher degree of user control: the possibility of watching the video several times, pausing and rewinding when difficulties arise, and even removing the second-language text or audio when feeling more comfortable. Revisiting the material affords repetition, while both text and audio afford multisensory memorization. Digital data generated by digital learning materials can be analyzed through learning analytics to report on the learning progress of individuals or groups of learners. Learning analytics can further aid teachers’ pedagogical decisions through visualization in learning analytics dashboards. However, evidence of what digital learning materials and learning analytics dashboards actually bring to the classroom for teachers and students is still modest (Viberg & Grönlund, 2021), and their investigation is not simple. According to the Technological Pedagogical Content Knowledge (TPACK) framework of technology integration (Koehler & Mishra, 2009), the interplay between knowledge about technology, pedagogy, and content is necessary for a teacher to successfully integrate technology use into teaching and learning activities.
Therefore, the introduction to, and effective use of, digital learning materials requires development of digital skills and competencies (Vuorikari et al., 2022) through a systematic approach that can contribute both to teaching practice and research evidence.

Our research program is dedicated to exploring the impact of systematically scaffolding and preparing educators and educational institutions for the integration of digital learning materials and learning analytics dashboards. This preparation, which constitutes our core intervention, aims to facilitate not only the effective and sustained application of these technological innovations but also to enhance teaching practices and student performance outcomes. Guided by the five Active Implementation Frameworks (AIF): Usable Innovations, Implementation Teams, Implementation Drivers, Implementation Stages, and Improvement Cycles (Fixsen et al., 2019), our study design and interventions are crafted to provide deeper insights into how educators can effectively utilize digital learning resources and educational technologies (EdTech). This includes a focus on advancing digital competence and optimizing the use of learning analytics dashboards. Additionally, our research endeavors to establish a foundation for these technologies’ innovative and validated use, underscored by sustainable implementation strategies, evidence-based evaluations, and the continuous re-design and improvement of existing digital learning materials. Through this comprehensive approach, we aim to contribute significantly to the knowledge base surrounding the effective integration of digital innovations in educational settings, thereby enhancing both teaching practices and student learning experiences (Masiello et al., 2023).

The full-scale program will run with an estimated 200+ Swedish schools and 13,000 students in kindergarten (ages 5–6) to 12th grade (ages 18–19). This paper presents findings from a pilot project to inform both the implementation plan for the full-scale program and the development of data-collection methods and a data analysis plan. The pilot project involved five intermediate schools from four different municipalities in southeast Sweden. Two years into the pilot project, this article presents and discusses the experiences and findings thus far, and outlines the road ahead in light of these.

Methods

Implementation science involves researching and analyzing methods and strategies that help integrate evidence-based practices and research findings into regular use by professionals and decision-makers, and it has provided promising results for sustained and deliberate improvements in schools (Fixsen et al., 2005). This is evident in the vision of our research program: educational technology is used as an obvious and self-evident part of all teachers’ everyday practice. Consequently, the research program and the pilot project discussed in this article have been designed to make use of the advantages of implementation science and of the four longitudinal and interrelated stages of the AIF shown in Fig. 1. These are: Exploration (assessing needs, creating readiness, etc.), Installation (selecting and training practitioners/participants, etc.), Initial Implementation (introducing changes, improvement cycles, etc.), and Full Implementation (new practices adopted, etc.) (Fixsen et al., 2019). Moreover, the project makes use of common components of successfully implemented programs, i.e., the implementation drivers: competency, organization, and leadership (Fixsen et al., 2019). Both stages and drivers are derived from analyses of long-lasting successful programs and reviews of the implementation research literature (Fixsen et al., 2005). The process of running the four stages can span 2–4 years.

Fig. 1

The Four Interrelated AIF Stages. Redrawn from the implementation stages of the National Implementation Research Network; https://nirn.fpg.unc.edu/

It is important to note that the implementation stages do not constitute a linear process. Several stages can run in parallel, and activities from different stages can occur simultaneously. Some activities might even need to be revised and/or rebooted in the light of assessment.

The group of researchers who initiated the program is multidisciplinary, with extensive knowledge of educational technology, pedagogy, computer and data science, media technology, psychology, implementation science, and special education. An intense work period of about ten months was initially necessary to assemble the core research group, design the overall project structure, and apply for funding. Those activities were closely followed by mapping the project design to the AIF, recruiting external project partners, setting up contracts, and preparing for the formal start of the project. The initial efforts were rendered more complex and challenging by the effects of the COVID-19 pandemic, and complications due to the lockdown periods. This resulted in a down-scaling of the pilot project, with a limited number of year groups and subjects involved, and a development focus on data-collection methods and a data analysis plan. These are described in the following sections along with an account of the main findings to date. We discuss our findings and experiences in terms of the Technological Pedagogical Content Knowledge (TPACK) framework (Koehler & Mishra, 2009) to help identify the knowledge teachers need to teach effectively with technology.

Piloting Implementation

Exploration Stage Activities

As mentioned earlier, we used the AIF as the basis for our work (Fixsen et al., 2019). The activities related to the various stages (see Fig. 1) sometimes overlapped due to the individual progression of each school. During the exploration stage, we thoroughly mapped the four AIF stages, actions, and interventions to match the design of our research program (a detailed mapping of the AIF stages can be found in the link provided in Footnote 1). The mapping was carried out through an extensive series of research group meetings and interviews, using the online collaborative ideation and planning tool Miro (www.miro.com). Through this process, the project’s overall implementation drivers, implementation activities, and tasks were defined according to the AIF (see the project study protocol; reference omitted during the review process).

The main aim of the exploration stage is to consider the extent to which a potential innovation or approach meets the needs of the community, and whether implementation is feasible (Fixsen et al., 2019); in other words, to examine needs and assets, and to prepare the organizations for implementation readiness through the efforts of school teams. To fulfill this aim, the following project tasks (Table 1) were identified within the broader implementation activities:

Table 1 Implementation activities and Respective Project tasks of the Exploration Stage

Considering Overall Implementation Drivers

Consideration of the implementation drivers related to competency, organization, and leadership took place during the first year of the pilot project. At this time, we drafted and signed contracts with municipalities and EdTech developers, determined data-collection methods, submitted a research ethics application, designed and produced project information materials, assessed, fine-tuned, and sent out survey instruments, and planned agendas and held meetings on upcoming data sharing and storage issues. We also formed the school teams, scheduled workshops and meetings for them and for other stakeholders, and completed some initial individual interviews and focus-group meetings. This ‘chaotic’ start can be partially explained by several factors: one being the pandemic and the aggravating circumstances that followed for all parties involved, another being the moderate size of our research group combined with the scale and complexity of the project. We were breaking new ground in Sweden, pioneering a unique approach, with a partnership-based collaboration that included researchers across different disciplines, schoolteachers and managers, and EdTech developers. Additionally, we incorporated implementation science to systematically develop and enhance digital competencies and learning materials within existing educational settings.

Because of limitations following the pandemic, we restricted the pilot project to one school team per municipality, not one per school, and the intervention was directed towards pupils in grades 4–6, and their respective teachers in three specific subjects: Mathematics, Swedish, and Swedish as a second language. Limiting the number of subject areas was not altogether a negative approach since it allowed for a more detailed understanding of the various factors that contribute to effective implementation. The participating schools had different characteristics regarding the number of students and the range of their backgrounds, financial and infrastructural resources, and time schedules for teacher training. An outline of the participants involved can be found in Table 2.

Table 2 Municipality and school participants

At the outset, in accordance with the AIF mapping, we launched the planned project tasks by setting up the school teams to address the competency aspects related to the implementation drivers. For each municipality, the school team consisted of two researchers and one or two specially appointed teachers, one of whom might also be an EdTech specialist and/or the principal. In addition, each team contained a representative of the EdTech development companies, who took part only when necessary or requested (see Table 2).

Assessing Needs

The needs of teachers and principals relating to their use of digital technology in the classroom and school were assessed through a series of interviews and surveys. Focus-group interviews with participating teachers and individual interviews with participating principals were conducted at the start of the project and at follow-up. (Not all teachers were interviewed at the very start, since some only became involved later in the project.) Additionally, interviews – individually and in a group – have been carried out with the developers of the digital learning materials. Raw data from all interviews have been analyzed with qualitative content analysis (Schreier, 2023). Outcomes from the follow-up interviews and the interviews with EdTech developers are presented in the section Installation Stage Activities.

The interviews with teachers at the start of the project took place in the first semester of 2022 and consisted of six one-hour focus-group interviews with thirty-four teachers of mathematics, Swedish, and Swedish as a second language from three municipalities. The findings indicated that:

  1) initiatives for introducing educational technology in teaching and school environments often have a top-down approach,

  2) overall, hardware and digital infrastructure are well functioning, yet some schools still have problems with a stable internet connection,

  3) digital transformation is experienced as exciting and necessary among teachers, but its implementation is rarely systematic,

  4) the respondents expressed a wish for professional development grounded in practical work in smaller groups, and with follow-up workshops,

  5) the digital transformation strategies for municipalities and schools are rarely known, or only used by teachers for support issues.

Additional findings indicated that tablets and/or laptops needed constant updating and charging, and that fetching equipment from storage stole lesson time. Finally, it was found that many teachers had little or no previous experience with learning analytics, while a few already worked actively with simple learning analytics and believed in its potential.

The one-hour interviews with three principals were conducted in the first semester of 2022. The preliminary indications revealed that the school principals had a more visionary perspective and understanding of the implementation of digital learning materials and learning analytics dashboards in education, compared to the teachers. The principals were also fully aware of the increasing role of digital tools in enhancing and facilitating education, with the pandemic being a notable accelerator. They also recognized the significance of ongoing professional development, training, and collaboration for educators in the digital realm. Despite expressing concern about the over-reliance on, and the challenges posed by, digital tools in educational settings, they highlighted a vision for the future where digital tools are seamlessly integrated into education to improve student outcomes. They emphasized collaboration and strategic planning at various levels (municipal, community, and school) to ensure the effective implementation of digital tools and they expressed a holistic view of how the role of technology in education has evolved.

The SELFIE for Teachers (SFT) questionnaire (Footnote 2) was used for assessing teachers’ own views of their digital competencies. SFT is partly based on the DigCompEdu Framework (Footnote 3) and is the EU instrument designed for the self-reflection and systematic estimation of teachers’ digital capacity and maturity. Figure 2 shows the 22 digital competencies of the DigCompEdu Framework, organized into six areas: (1) the broader professional environment; (2) the creation and use of digital resources for learning; (3) the management and orchestration of digital technologies in teaching and learning; (4) the use of digital strategies for assessment; (5) learner-centered teaching and learning strategies using digital technologies; and (6) specific pedagogical competencies related to facilitating student digital competence. The framework also includes a progression model with six levels to help educators assess and improve their digital competence: Newcomer (A1), Explorer (A2), Integrator (B1), Expert (B2), Leader (C1), and Pioneer (C2). These levels are also used for the self-reflection and systematic assessment in the SFT questionnaire. For this project, we focused on the development of the educators’ pedagogic competencies in DigCompEdu areas 2–5, since we believed that those areas were most pertinent to the teaching and learning activities of everyday classroom practice. We aimed to bring all participating teachers to at least the Expert (B2) level.

Fig. 2

The 22 Digital Competencies of the DigCompEdu Framework. https://joint-research-centre.ec.europa.eu/digcompedu_en

The SFT questionnaire was initially distributed to all participating teachers during the spring of 2022, and it will be repeated yearly throughout the project. The procedure involved the teachers making a self-estimation of their digital competence levels, then completing a systematic questionnaire, followed by a second self-estimation. At the end of this process, the teachers could compare their self-assessed levels to the level calculated by the SFT system based on the questionnaire. The SFTs were answered by twenty teachers from four participating schools, including local school-level team leaders (who are also teachers). The results indicated that the majority showed a realistic self-estimation of their current level of digital competence. Thus, most teachers’ self-estimations were in line with the assessment provided by the SFT evaluation; however, there were a few deviations.

Of the twenty respondents, only one received an overall higher level of competence from the SFT evaluation than in the two self-estimations. Before the questionnaire, eleven respondents had self-estimated a competence level higher than the SFT evaluation; after the questionnaire, only five did. Thus, fewer teachers overestimated their digital competence level, relative to the SFT evaluation, after completing the questionnaire. The self-assessed level of competence matched the SFT evaluation level in seven cases, and the difference between the self-assessment made after completing the questionnaire and the SFT system result was smaller than the difference before. Even though the initial measurements and indications are so far too few to identify trends, they indicate that the competence levels measured thus far are modest. Only one respondent achieved the Expert (B2) level; 50% of the respondents were at the Integrator (B1) level, in the middle of the scale, and 45% were at the Explorer (A2) level. Thus, there is a need for further competence development among teachers to reach the goal of bringing all teachers to the target Expert (B2) level.
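The comparison described above, between self-estimated and SFT-evaluated levels on the ordinal A1–C2 scale, can be sketched as follows. The respondent data, field names, and function names here are illustrative assumptions, not actual project data or the SFT system's logic.

```python
# Minimal sketch: classify each respondent as over-, under-, or
# correctly estimating their DigCompEdu level relative to the SFT result.
LEVELS = ["A1", "A2", "B1", "B2", "C1", "C2"]  # Newcomer .. Pioneer
RANK = {level: i for i, level in enumerate(LEVELS)}

def compare(self_estimate: str, sft_result: str) -> str:
    """Return 'over', 'under', or 'match' for one respondent."""
    diff = RANK[self_estimate] - RANK[sft_result]
    if diff > 0:
        return "over"   # self-estimation above the SFT evaluation
    if diff < 0:
        return "under"  # self-estimation below the SFT evaluation
    return "match"

# Hypothetical respondents: (self-estimate, SFT system result)
respondents = [("B1", "A2"), ("B1", "B1"), ("A2", "B1"), ("B2", "B1")]
counts = {"over": 0, "under": 0, "match": 0}
for self_est, sft in respondents:
    counts[compare(self_est, sft)] += 1
```

Run once with the pre-questionnaire self-estimations and once with the post-questionnaire ones, the `over` count falling between runs would correspond to the reduction in overestimation reported above.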

Examining Intervention Components

During the entire year of 2022, exploratory project meetings, steering group meetings, and workshops with researchers and EdTech developers served as a means of training the school teams in the overall AIF setup and the implementation plan, enabling them to start collaborating with their respective teams of teachers. Each of the four school teams, in which two research group representatives were present, met separately every week throughout the school year, holidays excluded. The school teams assembled online to discuss the ongoing work at each school and any issues that may have come up, and to share news from the research group. In addition, the local school teams also met with participating teachers according to a schedule set up by each school. Researchers and EdTech developers attended when required. So far, the local meetings have been held weekly, bi-weekly, or every three weeks, either as discussions or as workshops on specific strategies and classroom activities. One school team arranged their workshops in a three-week cycle, the first session being for brainstorming ideas and overall planning, the second for final planning and starting activities, and the last for follow-up, reflection, and logbook reports.

When the project scales up to the full program, with more participating municipalities, schools, and subjects, more than one school team will be needed per municipality. The aim is for the members of the school teams originally involved in the pilot project to train the new trainees, thus extending the overall scope of the program.

The learning analytics dashboards are crucial intervention components since they are products designed to support teachers in making data-informed pedagogical decisions. It follows that the design of the dashboards requires careful attention to ethical, security, and technical issues, and compliance with the General Data Protection Regulation (GDPR) (Voigt & Von Dem Bussche, 2017). However, the issue of data collection has so far been the most time-consuming part of the project. The GDPR (Voigt & Von Dem Bussche, 2017) gives a clear roadmap for working with data related to the personal information of children, but since this project is pioneering these issues, it has been challenging to sort them out in a timely fashion. Hence, the learning analytics dashboards are now being developed in close collaboration with the users in the initial implementation stage. Although we do not have preliminary results, we have the technical infrastructure in place. Data from the digital learning materials of the EdTech developers (timestamps, quizzes, test answers, and behavioral information regarding how the students use them) and data from the schools (attendance, grades, schedules for students and teachers, and various school questionnaires) will be combined through a secure pipeline and data server. The two datasets will then be matched and analyzed through data mining and machine learning techniques. We would like to point out that the researchers work only with pseudonymized data.
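The matching step described above can be sketched as an inner join of the two pseudonymized datasets on a shared pseudonym key, so that researchers never handle directly identifying information. All field names and values below are hypothetical illustrations, not the project's actual schema.

```python
# Hypothetical records after pseudonymization: one set from the EdTech
# platforms, one from the schools, linked only by a shared pseudonym.
edtech_events = [
    {"pseudonym": "p-017", "timestamp": "2023-03-01T09:15", "quiz_score": 0.8},
    {"pseudonym": "p-042", "timestamp": "2023-03-01T09:20", "quiz_score": 0.6},
]
school_records = [
    {"pseudonym": "p-017", "attendance_pct": 96, "grade": "B"},
    {"pseudonym": "p-042", "attendance_pct": 88, "grade": "C"},
]

def join_on_pseudonym(events, records):
    """Inner-join the two datasets on the pseudonym key."""
    by_key = {r["pseudonym"]: r for r in records}
    return [
        {**event, **by_key[event["pseudonym"]]}
        for event in events
        if event["pseudonym"] in by_key
    ]

combined = join_on_pseudonym(edtech_events, school_records)
```

The combined rows (platform behavior alongside attendance and grades) would then be the input to the data mining and machine learning analyses mentioned above.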

Determining Core Components

The practice profile, referred to as innovation configurations by Hall and Hord (2019), describes the essential activities that permit the innovation to be teachable, learnable, and doable in typical service settings. In this setting, classroom observations of the teachers’ use of the digital learning materials provide fidelity measures for the core components. Hence, for the practice profile created for this program, we identified behaviors and practices that reflect the program principles and the activities associated with the core components. Descriptions of these behaviors were entered into our practice profile document, drawing on the design of the SFT and of the DigCompEdu Framework. These areas describe the competencies around: Digital Resources, which comprises selecting, creating and modifying, managing, protecting, and sharing digital equipment and software; Teaching and Learning, which entails teaching practices, guidance, collaborative learning, and self-regulated learning; Assessment, which concerns assessment strategies, analyzing evidence, and feedback and planning; and, finally, Empowering Learners, which covers accessibility and inclusion, differentiation and personalization, and actively engaging learners (Fig. 2). Our assessments of the application of the practice profile across teachers and digital learning materials were practically based and could be done routinely, through classroom observations and analysis of logbooks.

Assessing Fit

The research group held regular group and individual meetings with all local school teams to discuss their tasks and expectations, and to oversee training progress. The steering group meetings were held online once a month (except during summer holidays) to discuss project progression and general issues, and to follow up on activities, adapt, and plan using Plan-Do-Study-Act (PDSA) cycles (Deming, 1994). All members of the steering group have an equal opportunity to assess and determine the program fit, and to propose re-design if necessary. The research group recorded notes during the steering group meetings. Due to the COVID-19 restrictions, physical meetings were not possible during the project startup, but the whole project team has been able to meet subsequently on two occasions.

Installation Stage Activities

The installation stage is when the organizational and personal competencies needed to ensure the successful implementation of the selected innovation are established (Fixsen et al., 2019). In our case, the innovation is the implementation of digital learning materials and learning analytics dashboards. To fulfill the objectives of the installation stage, the following project tasks (Table 3) were identified within the broader implementation activities:

Table 3 Implementation activities and Respective Project tasks of the installation stage

Preparing Implementation Drivers

During the initial part of the installation stage, the respective school teams participated in a range of activities, including distributing informed consent forms, training the teachers, and collecting information about all participants (except students) involved in the program.

Building Implementation Capacity by Preparing Participants

In the respective school teams - one for each municipality in the pilot project - we built capacity for implementing and scaling up the pilot into the full program. To build capacity, the school teams were tasked with developing continuing professional development workshops for the participating teachers. These workshops were held physically and online and followed different formats dictated by the conditions of each school and the teachers’ needs. The planning of the workshops was done from an overall project-management point of view in the researcher group, and in greater detail by each local school team. The overall workshop plan was set semester by semester in a Miro online workboard, accessible to all participants, and connected to the overall project aims and objectives and to the content and tasks of the various stages. The workshops ran for the duration of the stage, with researchers on a standing invitation and EdTech developers participating when needed or wished for by the teachers; workshops will continue to run throughout the project. One important finding to emerge from our experience in this stage is that regular workshops at short intervals are imperative. One of the school teams began by holding meetings only every three weeks, which the teachers thought was too far apart, making them feel less involved and less engaged in the overall project aim and vision. Subsequently, the local school team leader and principal agreed on a tighter schedule, with bi-weekly meetings instead.

The continuing professional development workshops covered specific project-related content or allowed for reflection on a subject that the teachers wanted to discuss, be it technology, pedagogy, or something else. Many of the EdTech developers’ workshops were hands-on, addressing specific areas of the respective digital learning material so that teachers could get help and support in fitting the digital learning material into their classroom activities. PDSA cycles allowed the researchers and local school team representatives to influence and adapt the content of the continuing professional development workshops to the specific needs of the teachers. This design also meant that the pace, the collection of research data, and the achievements of the research and development work differed for each participating subject and school.

The researchers held several meetings with the municipalities’ system administrators, who are in charge of the student data, with system administrators at the EdTech developers, and with the municipality data protection officers, to learn about the data that was available and the aspects of data collection that were of most concern. These meetings also covered the ethical aspects of handling students’ data and explored everyone’s goals and expectations about their participation and collaboration. We agreed to aid the municipalities’ system administrators with several scripts to automate the data uploading, as this competency was limited. This work is still in progress and has so far been one of the lengthiest of the various project tasks, but it is a critical step that needs to be done properly.
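A minimal sketch of the kind of helper script provided to the municipalities' system administrators: it validates a CSV export and strips directly identifying columns before upload. The column names and file layout are illustrative assumptions, not the project's actual export format.

```python
# Sketch: prepare a school data export for upload by removing
# identifying columns and checking that required columns are present.
import csv
import io

IDENTIFYING = {"name", "personal_number"}   # columns to strip before upload
REQUIRED = {"pseudonym", "attendance_pct"}  # columns the pipeline expects

def prepare_upload(raw_csv: str) -> str:
    """Return a CSV string with identifying columns removed.

    Raises ValueError if a required column is missing from the export.
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    fields = [f for f in reader.fieldnames if f not in IDENTIFYING]
    missing = REQUIRED - set(fields)
    if missing:
        raise ValueError(f"missing required columns: {sorted(missing)}")
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields)
    writer.writeheader()
    for row in reader:
        writer.writerow({k: row[k] for k in fields})
    return out.getvalue()

raw = "pseudonym,name,attendance_pct\np-017,Alice,96\n"
cleaned = prepare_upload(raw)
```

Automating this step keeps the cleaning consistent across municipalities, rather than depending on each administrator's manual routine.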

Acquiring Resources

An important task was to obtain ethical approval for conducting the research and to collect signed informed consent. Though the focus of the program was on teachers, we were working with data about individual students, which required several ethical considerations. An application to the Swedish Ethical Advisory Board was made and accepted before any data collection or student contact was initiated. We acquired informed consent orally from all participating teachers, since teachers as research participants are not subject to ethical regulations as strict as those for students. However, signed informed consent from all guardians of participating students is obligatory by law when researchers handle personal data, according to the Swedish Government Act 460 (2003) concerning the ethical review of research involving humans. Consent information to guardians included the purposes of the study, how the intervention is carried out, what data is collected, how data is stored and managed according to the General Data Protection Regulation (Voigt & Von Dem Bussche, 2017), who has access to the data collected, and how the results are communicated. We asked the guardians to inform their children about the research and their rights as participants, and to confirm whether they wanted to participate.

The information was shared with the guardians in writing, attached to the informed consent form to be signed, and was also communicated orally to the guardians by the students’ teachers on several occasions, for example during parents’ assemblies or follow-ups of students’ progress. Oral consent from the students was then collected and documented in the classroom by their respective class teachers, who went over the objectives of the research and its implications. Only when both guardians and students agreed to participate could the researchers start collecting and analyzing digital data related to the students. Since a significant number of the students had different language backgrounds, ten language versions of the consent forms and all information materials were needed (currently available in Arabic, Dari, English, Farsi, Kurdish, Tigrinya, Somali, Swedish, Ukrainian, and Vietnamese).

Building Implementation Capacity by Preparing the Organization

Meetings with municipality representatives, principals, and EdTech developers were held regularly throughout the exploration and installation stages, to discuss digital transformation on a management level, to strategize and build implementation capacity for the scaling-up of the project. There was also a series of meetings concerning data collection, sharing, and storage which will continue during the initial implementation stage.

Collecting Research Data

Follow-up interviews with participating teachers took place a year after the first workshop and were aimed at understanding how the integration of digital learning materials into teaching and learning activities had developed. This time, nineteen teachers from all participating schools took part in three one-hour group interviews between May and June 2023. The drop in the number of teachers interviewed was due to busy schedules. The findings indicated that:

  1) digital tools in education were seen by the teachers not as replacements but as valuable complements to traditional teaching methods,

  2) teachers needed to be comfortable with technology, achieved through both self-guided trial and error and learning from peers, for successful implementation,

  3) the content, and tools used, needed to be intuitively designed and easily accessible,

  4) the process often demanded dedicated expert support to address the challenges teachers faced,

  5) students’ engagement with the digital learning materials was vital, and their interactions with the digital learning materials significantly dictated the effectiveness of digital education,

  6) teachers saw many benefits of integrating digital tools, including enhanced engagement and personalized learning,

  7) collaborative learning, through sharing experiences among educators, can be a foundation for competence development,

  8) challenges arose, relating to training gaps and the suitability of digital learning material content,

  9) conversations around education develop and evolve, reflecting the changes driven by technology,

  10) teachers expressed concerns about integrating technology, the importance of verifying digital sources, the potential of future classrooms driven by Artificial Intelligence (AI), and the broader societal impacts of a digitally driven education system.

The teachers also emphasized knowledge-sharing, support structures, and the potential of data analytics to enhance individualized learning, demonstrating increased awareness compared to the views expressed in previous interviews. They reported being pleased with the collegial approach the project allowed for, and with the possibility of close collaboration with their respective information and communications technology (ICT) pedagogues. They also found the sessions contributed by the different EdTech developers, on how to integrate digital learning materials into learning activities, very useful. The democratic design of the project – the three-constituent partnership between the municipalities, researchers, and EdTech developers, with the teachers at the absolute center of attention – was greatly appreciated. It gave teachers a voice and the opportunity to participate in and influence both digital learning material content and development, as well as future municipal decisions about the procurement of educational technology and resources.

At the beginning of August 2023, we also conducted the first set of one-hour interviews with the EdTech developers of the digital learning materials. The participants were six representatives from three different EdTech providers. The overall topic of the interviews was the implementation of their digital learning material application in the respective schools, and the teachers’ experiences from working with the application. The outcomes suggested that:

1) they perceived that teachers appreciated the benefits of digital tools, but there was resistance from those unfamiliar with the technology,

2) teachers’ feedback was critical for immediate tool enhancement,

3) teachers often resisted the new digital learning tools, and proper training could facilitate adoption,

4) mutual benefits arose from partnerships between teachers and EdTech developers,

5) due to lack of time, teachers needed structured training sessions to fully grasp the benefits provided by the digital learning materials, and

6) evidence of effective practices emerged from teacher-researcher collaboration.

Measuring implementation fidelity was also an important data collection task. To achieve an ongoing assessment of implementation fidelity, we introduced measuring instruments to our respective school teams and other project members. For the participating teachers, we designed online digital logbooks and conducted a first series of classroom observation sessions. We also set up an implementation fidelity questionnaire for teachers and all participating school and municipality staff.

The digital logbooks were developed to report on teachers’ use of digital learning materials. In our practice profile we identified a range of behaviors and practices connected to digital competencies and the project’s core components that show the development of the teachers’ educational practices. The participating teachers completed reports at intervals ranging from two to three weeks, aligned with their planning, teaching, and evaluation cycle, detailing their views and reflections on the use of the digital learning materials. Table 4 illustrates the number of times, and for what purpose, the materials were used during spring term 2023, and Table 5 provides information about specifically planned pedagogical purposes for using the digital learning materials, and in which setting they were deployed (individually, in groups, or both).

Table 4 Digital Learning Materials Use in Spring Semester 2023
Table 5 Specific Digital Learning Materials Uses During Spring Semester 2023

The digital logbook was complemented by objective, researcher-driven classroom observations, using an observation protocol based on the practice profile. The classroom observations were of two types and therefore served multiple purposes. The first type aimed to observe and understand the overall context and use of digital learning materials, and was followed by a short follow-up discussion with the teachers involved. The second type aimed to assess the development of the teachers’ educational practice, adding to the level of evidence on the effects of the program.

Thus far, the first type of observation session has been conducted in all participating schools on one occasion, and the data collected have been transcribed and analyzed. The outcomes showed that teachers used digital teaching aids in a wide variety of ways, depending on their different teaching styles and experiences, but also on their respective digital competence levels. Many used digital technologies simply as an alternative way of presenting information for tasks that were then done on paper, while other teachers used digital opportunities for the presentation of information, for carrying out the tasks, and for assessments and communication. A few teachers also found new ways to link different digital materials and tools so that students could work independently and efficiently. Regardless of the approach used, teachers often expressed great frustration that they were rarely able to influence the procurement of different systems, due to centralized purchasing policies.

Observations revealed several examples of good practice, but there is still a long way to go, particularly in terms of creating and providing digital learning materials that can be used with the ease and flexibility that teachers require, while also maintaining quality standards. In some cases, the more digitally competent teachers were seen to lead the way in finding diverse ways to circumvent the limitations of existing materials, while also supporting less experienced colleagues. From a TPACK perspective (Koehler & Mishra, 2009), digitally experienced teachers used the digital learning materials seamlessly: Technology, Pedagogy, and Content Knowledge were balanced and synthesized without causing distractions. In contrast, the use of digital learning materials by less experienced teachers often ended up overshadowing the teaching, so that the T in TPACK dominated the lessons at the expense of the PACK.

The S-NoMAD, the Swedish version of the Normalization Process Theory Measure (NoMAD) instrument (Elf et al., 2018), was used to monitor the fidelity of the program and to provide information about the participants’ experience of the project implementation process. (For a detailed description of the NoMAD instrument, see our study protocol article; reference not showing during review process.) The first batch of S-NoMAD data was collected in May 2023 from twenty-six participants out of thirty-three, including the local school-team leaders. The S-NoMAD data collection will be repeated twice more to follow the overall implementation and fidelity of the project. The overall outcomes were as follows:

  • It was difficult to organize the use of educational technology in practice.

  • There was a lack of adequate training and other resources to support the use of educational technology in general, and the respective digital learning materials in particular.

  • There was not a clear and common school policy or purpose for using digital learning materials.

  • There was a lack of consensus among staff as to whether it was worth investing in implementing digital learning materials.

  • The use of digital learning materials was partly experienced as an obvious part of staff work practice. There were, however, large variations within each school.

  • Most staff agreed that active participation in the decisions about, and use of, digital learning materials was a relevant part of their profession. There were, however, large variations in attitudes towards different teaching materials.

  • Most staff agreed that they will continue to support the use of digital learning materials.

Finally, we wanted to learn more about the content of the digital learning materials and their overall quality, and used the steering group meetings and the weekly school-team meetings to inquire about these issues. In general, teachers perceived that they were getting the support they initially needed and wished for. The workshops met their need to talk to each other for collegial support, and to get guidance from the EdTech developers on how best to integrate and structure the competence development activities. Challenges remain regarding software quality and functionality, difficulty in organizing the use of the materials, lack of training resources, and the absence of a clear purpose or consensus, but overall our experiences so far indicate a recognition of the potential benefits of digital learning materials and a commitment to their continued use and integration into educational practices.

Challenges and Workarounds

In projects of this complexity, situations often arise in which there are no quick fixes or simple solutions. The following sections focus on two of the most challenging issues so far, and how we have tried to work around them.

Collecting Individual Student Data

Hitherto, we have collected project data from interviews, surveys, logbooks, observations, workshops, team meetings, and focus groups. We are, however, still anticipating being granted full access to student user data from some of the municipalities. An extensive series of meetings with data protection officers, municipality staff, and systems administrators of EdTech developers was conducted during the exploration and installation stages, with the question of sharing and storing student data as the overarching theme. One important finding was that municipality confidence in, and knowledge about, technical processes were low, resulting in many iterations and consultations, with ‘trust’ as the main topic. A contract of confidentiality and shared responsibility for sharing and storing personal data according to Swedish law was established early between all stakeholders, but municipality representatives remain tentative, which in turn hinders overall project progression and development. This complication is well known to the participating EdTech developers, who report the same experience regardless of municipality size. As a result, finalizing the design and development of the learning analytics dashboard has been delayed and relocated from the last part of the installation stage to the implementation stage. On the positive side, the work conducted for this pilot project will establish norms for sharing educational and student data inter-organizationally and, more importantly, create the conditions to contribute to educational data standards in Sweden and elsewhere.

Informed Consent Form

The text of the informed consent form, which complied with the regulatory standards of the Swedish Ethical Advisory BoardFootnote 4, turned out to be far too extensive and complex for parents and guardians to deal with. We therefore contacted the National Ethical Board and asked whether we could use an abridged version of the consent form containing only the absolute explanatory necessities, while offering easy access to the full information. We also proposed adding a checkbox to the guardian consent form, allowing guardians to indicate whether their child agreed to participate. The board graciously granted our request, which allowed the local school teams to disseminate the form both electronically and in paper format. We also received permission to collect digital signatures from guardians, which allowed us to collect informed consent from both guardians and children more efficiently.

Next Steps: Initial Implementation and Full Implementation

At the start of the initial implementation stage, during the steering group meeting in August 2023 with forty-seven participants, the critical role of teachers and the need to utilize digital technology in education were again emphasized. The participants discussed challenges and progress, including how digital educational materials can be tailored to support teaching. Common barriers and the need for further training were also discussed, with an emphasis on evaluating the impact of digital learning materials and dashboards. The project continues to increase teachers’ competence levels and to create a more digitally informed learning environment.

Once we have analyzed all data available from the next stages of data collection and co-designed a learning analytics dashboard, we can create data literacy workshops to train the teachers during the initial implementation stage. The benefits of data-informed practices using learning analytics dashboards must be visualized, demonstrated, and explained to teachers for them to perceive the opportunities and advantages of dashboard support. The role of the school teams in supporting teachers during the initial implementation stage is therefore crucial to success, both in helping teachers understand how to translate and apply this knowledge in pedagogical interventions, and in supporting them in making effective use of the learning analytics dashboards.

In the full implementation stage, the school teams will work on implementation and sustainability: creating a professional learning community, providing ongoing training and collaborative learning by introducing updated content, and extending use to other teachers and subjects. The long-term goal is to adjust and scale up the pilot project into a full program, in which we will continue to monitor changes by analyzing aggregated data from experimental and comparison schools, monitor fidelity with the new S-NoMAD instruments, and measure the development of teachers’ digital competencies with the SFT instrument.

Discussion

The results of the exploration and installation stages so far demonstrate that: (a) teachers’ digital competence levels are still low in general, but – more importantly – the potential and motivation for further competence development is high; (b) digital learning materials need a thorough makeover, in close collaboration with the users, in terms of content, design, and functionality; (c) the increased digital competence the teachers have gained through the project has given them a deeper insight into assessing and evaluating digital learning materials, and also a voice to share these insights, resulting in several side projects together with the EdTech developers to raise quality and improve usability; and finally, (d) sorting out how to handle the different kinds of data collection, sharing, and storage issues, as well as preparing, distributing, and retrieving ethical permits and consent forms, is extremely complex and time-consuming, and needs to be planned and initiated very early in the implementation cycle.

A systemic change of well-established practices is a daunting task to take on in any context. In an educational context, several crucial issues need to be addressed for the change to start taking place, and the process will always vary between different schools. Our findings from the S-NoMAD instrument thus far show variation in the implementation of digital learning materials in the classrooms between schools. It is no secret that professional development of teachers’ digital competencies takes time and embraces many aspects other than those related specifically to teachers. More general issues include school leadership and organization, technical solutions, implementation drivers, and the participation of EdTech developers as equal research and development partners. It is fair to say that the systematic implementation of educational technologies and digital learning materials is highly complex and requires motivation, collaboration, understanding, knowledge, and significant resources in terms of time, staff, and funds.

Implementation science applied in educational settings offers several advantages. Our initial results indicate that professional development of teachers’ digital competencies is a necessary process for the adoption of digital learning materials. The longitudinal scaffolding provided by the structured workshops and school teams contributes to the effective adoption of the innovation. This result is in line with other findings using implementation science, which have demonstrated that a systematic approach to adopting and integrating new practices, programs, or technologies is crucial for the success of a program (Damschroder et al., 2009; Blase et al., 2018) and can serve as methodological scaffolding. By understanding the factors influencing successful implementation, school leaders, principals, and teachers can ensure the effective adoption of innovations, leading to improved teaching and learning outcomes. Data and research findings inform the implementation process, ensuring that interventions or changes are based on sound evidence rather than assumptions, leading to more informed choices and increasing the likelihood of positive outcomes.

In our project, we have included the EdTech developers as project partners, and they, too, see the benefits of collaborating with teachers and receiving invaluable product improvement feedback. As the EdTech developers have expressed, this close relationship would not have developed were it not for the project, which makes use of a key aspect of implementation science: identifying and addressing challenges and barriers to change (Durlak & DuPre, 2008). From a broader perspective, overcoming barriers within educational settings entails both reducing resistance to new teaching methods and addressing logistical challenges. Our experience is that, by systematically addressing barriers and leveraging facilitators, implementation science can help create an environment conducive to successful educational change. We have also found that partner-based collaboration between researchers, teachers, and EdTech developers in promoting new teaching methods, supported by digital learning materials, can lead to positive implementation outcomes, as has been demonstrated in other contexts (Domitrovich et al., 2008).

Educational settings often present several limitations, and our project has not been immune from these. They include: the complexity of the educational system, resistance to change, resource constraints, measurement and evaluation challenges, lack of stakeholder involvement, and the dynamic nature of educational policies. However, these limitations are not unique to education, and implementation science has shown that, for example, overcoming resistance and fostering a culture of receptiveness to change is a significant challenge in the implementation of new practices (Fixsen et al., 2013). If stakeholders are not adequately engaged in the planning and execution of changes, it can lead to a disconnection between proposed interventions and the needs of those directly affected, hindering the success of implementation efforts (Blase et al., 2018; Fixsen et al., 2019). Furthermore, the dynamic nature of education policy can create uncertainty and disrupt sustained efforts to implement and scale up effective practices (Damschroder et al., 2009). All these limitations are being put to the test in Sweden today by a government that has swung back in favor of restricting the use of digital learning materials in the primary classroomFootnote 5. Teachers and principals are left frustrated, and also confused, by these actions. This is why the systematic work provided by implementation science in this project may provide a structure for focusing on the value that digital learning materials can bring to teachers and students.

The implementation-informed approach of our research program brings a new element, Systematic, to the TPACK model, which we propose as TPACK(S) (see Fig. 3). The S for Systematic, added to the TPACK acronym as applied in our research program, provides structure and sustainability to the organization while allowing flexibility in implementation. Through this systematic approach, we see that teachers have started to think about and reflect on the use of digital materials, bringing a more conscious integration into everyday practice. Currently, the development of teachers’ digital competencies in Sweden is provided through elective courses on digital tools or programming for pre-service and in-service teachers. We argue that single courses are not enough to fully integrate digital learning materials within teacher practice. The TPACK(S) structure of this project could provide long-term scaffolding to enable effective and sustainable implementation of digital learning materials and the development of teachers’ digital competencies.

Fig. 3 Systematic – to the TPACK Framework

Conclusions

Educational technologies are now in widespread use. Indeed, the global market value of educational technology is impressive, currently estimated at over $268 billion, with projections suggesting an increase to over $404 billion by 2025Footnote 6. There is a limited but growing body of research that demonstrates the effectiveness of these technologies in teaching and learning contexts (Verbruggen et al., 2021; Sarama et al., 2008). However, despite these advancements, the integration of educational technologies into everyday teaching practice has not been widely achieved, as our research demonstrates. Simply having access to digital technologies does not guarantee effective and sustained use (Niederhauser & Lindstrom, 2018). Clearly, schools should employ digital resources in a manner that enhances teaching and learning, offering tangible benefits to both educators and students. Our project demonstrates an enthusiasm and willingness to pursue this goal. However, the deployment of educational technology is highly dependent on the specific context, indicating that the implementation of any innovation will vary across different environments and even within the same environment over time (Niederhauser et al., 2018). Therefore, we argue that implementation science guidance can provide a systematic and structured approach to the application of educational technologies in teachers’ everyday practices.

In summary, our research effort, although only halfway, provides an important example of the application of implementation science in the field of education. We show that it is important to take into account the views of teachers and to co-design technology that meets their needs in relation to teaching and learning goals. This approach, informed by implementation science, means that there is no single solution that fits all schools, no quick fix achievable within a single school year, and no individual research method that provides all the answers. In line with this more dynamic and collaborative approach, we have better understood the complexity of each step within the implementation plan, but have also recognized that workarounds and alternative solutions are feasible and can be found. Perhaps most important of all, we have learned that teachers are willing to put in both time and effort when their professionalism is taken seriously, and when their ideas, suggestions, and concerns are earnestly listened to and acted upon.