Many educational technology enterprises, researchers and policy makers assume that technology will transform or even revolutionize classrooms all over the world (Reich, 2020). Technology facilitating personalized learning in particular is seen as a key force impacting educational practice (Kolchenko, 2018; Reich, 2020). Reviews and meta-analyses show that digital personalized learning (DPL) research is expanding, not solely because of its expected potential to leverage students’ motivation and performance, but also because of its ability to support teachers who face large heterogeneous class groups (see Bernacki et al., 2021; Major & Francis, 2020; Van Schoors et al., 2021; Xie et al., 2019; Zheng et al., 2022).

Although the role of the teacher is very important when implementing DPL in education, it is mostly absent from contemporary literature (Van Schoors et al., 2021). The question of how teachers perceive the value of DPL in their classroom is therefore essential. On the one hand, teachers’ perceptions have a profound impact on the general technology adoption process in education (Vanderlinde & Van Braak, 2010). On the other hand, teachers’ expectations can help policy makers and software developers make more useful and resourceful contributions in the DPL field (Major & Francis, 2020; Walkington & Bernacki, 2020). Hence, by considering teachers’ perceptions of and expectations towards opportunities (or challenges), the reconciliation of human and digital tutors can be enhanced.

Therefore, teachers’ perceptions concerning DPL are the basic premise of this contribution and were examined through an online survey. In total, 370 teachers participated and reported on their (1) technology use for personalization, (2) perceptions towards adaptivity and dashboards in DPL tools and (3) expectations of support in view of implementing DPL. First, we introduce a framework on DPL. Second, we report on the methodology used. Third, we present the results, in which we distinguish teacher profiles. Finally, we reflect on these profiles and propose follow-up research with a specific focus on teacher perceptions when implementing DPL tools in the classroom.

Theoretical Background

The theoretical background consists of two sections. In the first section, we introduce the definition and possible benefits of DPL, followed by a discussion of technology software, adaptivity and visualizing learning data. In the second section, we elaborate on expanding teacher responsibilities, competences and support for implementing DPL tools.

DPL

DPL is a relatively “young” area of research that is becoming more extensive and complex along with ubiquitous digitization (Groff, 2017; Kolchenko, 2018; Shemshack & Spector, 2020). According to Van Schoors et al. (2021), DPL can be defined by setting apart the following characteristics of personalization in DPL tools: (1) various learner characteristics are considered, (2) different aspects of a learning environment can be adapted, (3) personalization can be driven by the teacher, learner or tool itself and (4) teachers might enhance personalization through the use of learner data visualized by the tool.

Although reviews and meta-analyses are rather scarce, DPL has been found to be beneficial for cognitive outcomes such as higher learning achievement, and for non-cognitive outcomes such as engagement (Major & Francis, 2020; Pfeiffer et al., 2021; Zheng et al., 2022). In addition, DPL is also expected to benefit teachers: DPL tools can invoke reflection on the numerous fields of interest and knowledge levels within a heterogeneous class group (Baker, 2016; Holmes et al., 2018; SRI International, 2018).

Technology Software and Adaptivity

Given the many expectations concerning teacher and student benefits, DPL tools proliferated and became popular in education (Aleven et al., 2017; Baker, 2016; Basham et al., 2016; Xie et al., 2019). In the wide range of DPL tools, there is a variety of built-in personalization systems. These systems provide unique learning experiences: while some supply adaptive recommendations, others facilitate the provision of adaptive learner content (Groff, 2017; Shemshack & Spector, 2020).

To untangle the complexity within adaptivity genres, many authors have developed classifications or frameworks. One example is the framework of Vandewaetere and Clarebout (2014), depicting a four-dimensional view of adaptivity genres. They discuss the time, target, method and source of adaptation. The first dimension, time of adaptation, pertains to three possible moments an adaptation can take place: before the learning activity starts (static), during the learning activity (dynamic) and a combination of both (dual pathway). A second dimension, the target of adaptation, relates to what is being personalized. Some examples are content, presentation, instruction and support (Vandewaetere & Clarebout, 2014). A third dimension is the method of adaptation, which pertains to how the adaptations are made: either system-controlled (adaptations made by the developer or the instructor), learner-controlled (adaptations made by the learner) or a combination of both (shared control). The source of adaptation is the fourth dimension and refers to what drives the adaptation. Regarding the source, Vandewaetere and Clarebout (2014) distinguish cognitive, affective and behavioral learner characteristics; the latter concern the interaction between the learner and the system.

In addition to Vandewaetere and Clarebout (2014), Groff (2017) as well as Bulger (2016) acknowledge a spectrum of DPL tools or technology focusing on varying intensities of adaptivity (instead of types of adaptivity). At the beginning of the spectrum, they situate the category ‘Data Driven Systems’, comprising management systems that offer pre-determined learning materials based on students’ mastery level (Bulger, 2016). Groff (2017) similarly refers to data-driven learning technology as Learning Management Systems (LMSs). LMSs can provide students with individual learning pathways, evaluations and recommendations based on learner data (Bulger, 2016; Groff, 2017). Further up the spectrum are adaptive software genres that move beyond pre-determined learning materials. Bulger (2016) classified these tools as ‘Adaptive Learning Tools’. They offer more dynamic data-driven learning material according to learners’ behaviors or competences. Groff (2017) states that in this case, dynamic adaptation is made possible via machine learning, which goes beyond the pre-determined decision tree used in LMSs. At the end of the spectrum are the ‘Intelligent Tutor Systems’ (ITSs), often referred to as the highest category of DPL tools (Bulger, 2016). ITSs subsume a pro-active model, often referred to as a ‘system-based tutor’, which provides real-time instruction by analyzing students’ learning needs and progress (Bulger, 2016; Groff, 2017). In addition, most ITSs can track mental processes and diagnose errors. A promising generation of more sophisticated ITSs, also named conversational ITSs, is currently being developed: for example, ITSs that consider affective learner characteristics, give more qualitative system-based feedback, acknowledge conceptual reasoning and stimulate deep user-system dialogue (Groff, 2017).
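The contrast Groff (2017) draws between the pre-determined decision trees of LMSs and the machine-learned adaptation of adaptive tools can be illustrated with a minimal sketch. The thresholds, labels and the `next_activity` function below are purely hypothetical illustrations, not drawn from any cited system:

```python
def next_activity(mastery, attempts):
    """Hypothetical pre-determined decision tree of the kind an LMS might use.

    Every branch is authored in advance by a developer or instructor; by
    contrast, adaptive tools and ITSs would learn such decision rules from
    learner data rather than hard-code them.
    """
    if mastery < 0.5:
        # Struggling learners get remediation; persistent failure escalates
        # to the human teacher rather than more automated practice.
        return "remedial-exercises" if attempts < 3 else "teacher-referral"
    if mastery < 0.8:
        return "practice-set"
    return "enrichment-task"

print(next_activity(0.4, attempts=1))  # → "remedial-exercises"
```

The key limitation, as Groff (2017) notes, is that such a tree can only react to situations its authors anticipated, whereas machine-learned models can adapt to patterns in learner behavior.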

The aforementioned examples of technology software are not discrete, and distinctions between types can be ambiguous. As Groff (2017) illustrates: “For example, a learning management system may or may not include data-driven learning capabilities. At the same time, a game-based learning environment may also be data-driven but not be a learning management system” (p. 11). In addition, Groff (2017) points out that educational technology companies are often quick to attribute the label “personalized”, despite the rather low-quality adaptivity of the technology at hand.

Visualizing Learning Data on Dashboards

Some DPL technologies include learning analytics, which measure, aggregate, analyze and visualize learner data (Maseleno et al., 2018; Schwendimann et al., 2017; Teasley, 2017). Visualizations are generally presented through real-time visual interfaces or displays, also called dashboards. Specific learning activities are displayed in a meaningful way through graphs, gauges or maps (Schwendimann et al., 2017). Examples of data visualizations are completed learning content/exercises, amount of time spent on learning content/exercises, results on exercises/tests, analyses of failed tasks, progress over months/weeks/years, feedback from the system and/or the teacher, achieved goals and agendas with personal schedules or deadlines (Maseleno et al., 2018; Schwendimann et al., 2017; Teasley, 2017). These visualizations aim to encourage teachers’ awareness, reflection and analysis of students’ learning processes (Teasley, 2017; van Leeuwen et al., 2019). Important information regarding learning progress or behavior can be monitored immediately (Maseleno et al., 2018; Teasley, 2017). Most dashboards are designed for teachers; however, student dashboards are also on the rise (Maseleno et al., 2018; Schwendimann et al., 2017). Student dashboards can increase engagement and may empower students towards their own learning process and progress (Maseleno et al., 2018; Teasley, 2017).

Dashboards have the potential to enhance human-prompted feedback, initiated by the teacher (Knoop-van Campen & Molenaar, 2020; Maseleno et al., 2018; Teasley, 2017; Van Leeuwen et al., 2019). More specifically, teachers provided with the competence to interpret data—also called data-driven decision-making—can take additional pedagogical actions considering individual learners’ needs (Knoop-van Campen & Molenaar, 2020; Maseleno et al., 2018; Teasley, 2017).
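As an illustration of the kind of aggregation underlying such dashboards, the sketch below condenses hypothetical learner event logs into per-student metrics (completed tasks, time spent, failed exercises). The field names, the toy log and the `dashboard_metrics` helper are our own illustrative assumptions, not taken from any cited tool:

```python
from collections import defaultdict

def dashboard_metrics(events):
    """Aggregate raw learner events into per-student dashboard metrics.

    `events` is a list of dicts with hypothetical fields:
    student, exercise, seconds_spent, correct (bool).
    """
    metrics = defaultdict(lambda: {"completed": 0, "time_spent": 0, "errors": []})
    for e in events:
        m = metrics[e["student"]]
        m["completed"] += 1                    # completed tasks overview
        m["time_spent"] += e["seconds_spent"]  # time-on-task overview
        if not e["correct"]:
            m["errors"].append(e["exercise"])  # feeds a failure analysis view
    return dict(metrics)

# Toy log for two students
log = [
    {"student": "A", "exercise": "fractions-1", "seconds_spent": 40, "correct": True},
    {"student": "A", "exercise": "fractions-2", "seconds_spent": 75, "correct": False},
    {"student": "B", "exercise": "fractions-1", "seconds_spent": 55, "correct": True},
]
print(dashboard_metrics(log)["A"])
```

A teacher reading such aggregates, rather than raw logs, is what data-driven decision-making in the sense above amounts to: the tool summarizes, the teacher interprets and acts.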

Enactment of DPL Tools in the Classroom

With their increasingly smart features (e.g. adaptivity, dashboards), DPL tools can be utilized for multitudinous purposes such as automated adaptation of learning opportunities, coaching students, support for curation/management of the learning process and support for using student data (SRI International, 2018). Consequently, DPL tools can be seen as a source of support for teachers. However, they also complicate teachers’ roles by challenging their responsibilities and competences (Shaikh & Khoja, 2012). Allocating ample resources to empower teachers and reinforce their capacity has therefore been considered a timely issue by many educational stakeholders and researchers around the globe (Marienko et al., 2020).

Expanding Teacher Responsibilities and Competences

When using DPL to enhance students’ learning processes, teachers play a crucial role (Kolchenko, 2018; Major & Francis, 2020). As Basham et al. (2016) state, implementing DPL calls for a shift in teachers’ instructional strategies: there is a continuous responsibility to make well-informed decisions based on one’s own experience in conjunction with learner data acquired through DPL tools. Such collaborations between tool and teacher, also referred to as distributed scaffolding (see Tabak, 2004), can augment the learning process in various ways. This is, for example, the case for adaptivity. The combination of adaptive AI-based and dynamic personalized tutor-based decisions ignites amplified adaptivity, also identified as human-AI hybrid adaptivity (Holstein et al., 2020).

Holstein et al. (2020) describe four dimensions within human-AI hybrid adaptivity. The first dimension, goal augmentation, involves the reciprocal process of information sharing to improve instructional learning goals for various students. The second dimension, perceptual augmentation, pertains to leveraging complementarity in perception. What the DPL tool gathers as relevant learner information might enhance opportunities for teachers to examine and interpret the learning process in a more profound way (Holstein et al., 2020). The third dimension, action augmentation, refers to expanding the ability, capacity and availability of instructional actions (Holstein et al., 2020). The fourth dimension is decision augmentation: by informing each other, more adequate pedagogical decisions can be made (Holstein et al., 2020).

Although technology can certainly benefit teachers, they are also required to stay attentive to DPL tools’ constraints (e.g. inappropriate learning goals, false learner information, improper decisions). DPL tools are not yet flawless: fragmented or contaminated data can generate erroneous predictions or unintelligent adaptations (Baker, 2016; Basham et al., 2016). Today’s DPL tools often contain models which are still very simplistic and inadequate for acknowledging rich educational contexts (Baker, 2016; Basham et al., 2016). Thus, teachers are expected to take responsibility and use their experience-based knowledge to make critical judgments about these pitfalls (Baker, 2016; Kolchenko, 2018).

Support for Implementing DPL Tools

As DPL integration in education accelerates, teachers are progressively challenged. They need to acquire new competencies at a fast pace, competencies which are substantially different from what they learned during training (Groff, 2017; Marienko et al., 2020; Shaikh & Khoja, 2012). In this evolving process, acknowledging the complex teacher role through appropriate professionalization support is essential (Kaiser & König, 2019). Within this support, teachers’ individual perceptions and feelings must be specifically considered, as this is crucial for successful implementation (see the concerns-based adoption model by Hall & Hord, 1987; Hord et al., 1987). Similarly, teachers need help to navigate the wide range of DPL tools and guidance to deploy them in the classroom (Groff, 2017; Holmes et al., 2018; Marienko et al., 2020).

Relevant professional development resources or opportunities can take on many forms. Werquin (2010) distinguishes between formal (e.g. DPL training courses provided by teacher education or other formal institutions), informal (e.g. collegial conversations) and non-formal (e.g. websites, books, videos and e-manuals related to DPL software) learning. Grossman (1990) states that teacher education is imperative to address prior misconceptions which could debilitate mindful teaching strategies. This also applies to the integration of new innovations (such as DPL tools), given that adequate supportive initiatives can (1) encourage teachers to get used to the new technology and (2) distribute pedagogy-specific advice related to curriculum standards (Holmes et al., 2018). In turn, teacher competences and perceptions (enhanced by these learning opportunities) influence not only instructional strategies but also students’ learning outcomes (Kaiser & König, 2019).

Some studies have already examined teachers’ views of technology in general. An example is the Belgian quadrennial research project MICTIVO, which stands for monitoring information and communications technology in Flemish education (“Monitoring ICT in het Vlaamse Onderwijs”; Heymans et al., 2018). The project investigates teachers’ use of various hardware/software and their perceptions of ICT integration in the classroom. Although previous MICTIVO surveys (the 2008, 2012 and 2018 monitors) showed no frequent ICT use, teachers generally had positive perceptions of the importance and effects of ICT. The needs related to DPL specifically were not addressed in this project. In general, a focus on teachers’ needs is rather scarce in DPL research (Groff, 2017). Considering teachers’ needs through dialogue can enable better alignment of support initiatives, as well as further developments within the field of DPL.

Aims of the Study

Expanding digitization fosters a worldwide interest in DPL innovations in education: many have high expectations concerning student and teacher benefits. Due to this growing interest, research involving numerous features of DPL tools (such as adaptivity and dashboards) is likewise on the rise (see Bernacki et al., 2021; Major & Francis, 2020; Van Schoors et al., 2021; Xie et al., 2019). Alongside high expectations, DPL also invokes many challenges for teachers (expanding responsibilities, new competences…). Therefore, appropriate support to implement DPL tools is particularly salient (Groff, 2017; Marienko et al., 2020; Shaikh & Khoja, 2012). To shed light on teacher perceptions concerning DPL implementation, the following research questions are addressed:

  1. To what extent do teachers report using technology for personalized learning in their classroom?

  2. What are teachers’ perceptions towards adaptivity within DPL tools?

  3. What are teachers’ perceptions towards dashboards within DPL tools?

  4. What are teachers’ expectations regarding support in view of implementing DPL tools?

  5. What different teacher profiles with respect to (a) reported use, (b) perceptions of adaptivity, (c) perceptions of dashboards and (d) expectations of support can be identified?

Method

To provide answers to the research questions, a large-scale survey was created, piloted and used to determine teachers’ perceptions regarding the implementation of DPL. A total of 370 teachers from primary and secondary education (students aged 6–18 years old) participated. The quantified data were analyzed via descriptive and cluster analyses. The latter is a method that divides subjects within a dataset into homogeneous groups (or clusters) according to their characteristics (Romesburg, 2004). In what follows, the (1) participants, (2) instrument, (3) procedure and (4) data analysis are further explained.

Participants

The survey targeted teachers employed in primary or secondary education in Flanders (the Dutch-speaking part of Belgium). In total, 370 teachers participated. The sample consists of teachers from primary (39.73%), special needs primary (6.76%), secondary (52.43%) and special needs secondary (1.08%) education. Most of the participants are female (74.05%), which is a small overrepresentation compared to the population (see Table 1 and Appendix). The respondents’ ages range between 21 and 63 years and are distributed almost similarly to the population’s age range. Participants under 25 and over 59 years old are in the minority (4.05% and 3.24%, respectively). More than half of the participants (54.05%) have more than 15 years of teaching experience. In sum, the sample can be viewed as representative of the population regarding school grade, gender, age and educational experience (see Table 1, in which teacher percentages within the sample are compared to total teacher percentages within the population).

Table 1 Demographic characteristics of the participants (n = 370)

Instrument

To investigate teachers’ self-reported use of and perceptions towards DPL, a questionnaire was developed and sent to all Flemish primary and secondary schools (Appendix 2). The questionnaire contained four main sections, with a total of 40 questions (see Table 2): (1) current use of digital technology for personalized learning, (2) perceptions towards adaptivity within DPL tools, (3) perceptions towards dashboards within DPL tools and (4) expectations towards support for implementing DPL tools. All survey questions used a 5-point Likert-type scale ranging from negative beliefs/infrequent use (towards 1) to positive beliefs/frequent use (towards 5). The estimated time to complete the survey was approximately thirty minutes. The four sections of the survey correspond to the first four research questions; the fifth research question (i.e., the identification of teacher profiles) is addressed through an interlinked analysis of all four sections. This structure was also used to report the results.

Table 2 Sections of the questionnaire

Procedure

Development of the questionnaire was based on two sources. On the one hand, the questions were based on research literature (see Table 2). On the other hand, ten focus group interviews (including 56 teachers) were conducted exploring teachers’ perceptions of DPL, which helped refine the constructed questions (e.g. adding frequently mentioned support initiatives to the set of items related to ‘DPL implementation support’). Next, an iterative evaluation of the refined questionnaire was carried out with a focus on the content and formulation of items. In this evaluation, experts were involved from different academic fields (e.g. statistical modeling, educational technology) and from other projects in which large-scale surveys were used (e.g. MICTIVO). Finally, to (1) pilot and (2) improve the validity of the survey, a cognitive interview was performed with a small group of teachers (n = 15). Based on these teachers’ comments, final refinements were made related to comprehensibility (e.g. the addition of a short and understandable definition of DPL at the beginning of the survey) and wording (e.g. the replacement of English terms such as ‘system-control’ with Dutch equivalents), resulting in the finalized survey.

After approval from the ethical committee (case number: G-2019101978), the survey was digitized and distributed to the school principals of all Flemish schools. The survey was sent out together with an information letter (addressing the purpose of the study) and a request to distribute the mail among their educational staff. In view of representativeness, extra activities were initiated to reach specific target groups. For example, the survey was shared in an open-source network for teachers in secondary education.

Participating teachers first had to click on a link which directed them to an informed consent form with relevant information concerning the procedure and goals of the questionnaire. This form included a short introduction describing DPL as learning that takes place in a digital learning environment which adapts to characteristics of individual learners. The informed consent also clearly stated that personal data would be processed anonymously. Only when participants gave their active consent could the questionnaire be accessed.

The survey was launched mid-January 2020 and was provisionally accessible until the end of April 2020. However, administration of the survey ended early, on March 16, as schools were obliged to shift to online learning due to the Covid-19 pandemic. The mandatory switch posed a risk of distorted results, not only in terms of reported use, but also in terms of perceptions and needs.

Data Analysis

A two-fold quantitative research design was applied: (1) To examine the first four research questions, a descriptive analysis was conducted using SPSS (version 28.0.1) to explore univariate data (e.g. frequencies, averages, minimum and maximum). (2) The fifth research question was examined through a K-means cluster analysis, a method that can be used to distinguish profiles with similar and dissimilar characteristics and gather them in clusters. The cluster analysis was carried out according to the rigorous guidelines and considerations provided by Romesburg (2004). First, a data matrix was created, again in SPSS (version 28.0.1), with columns (referring to the variables) and rows (individual data points per participant). Next, the data were recoded via sum scoring to compute variables (see Table 3). For every variable, all related items of the survey were included when calculating the sum score (for example, all twelve items related to the software types were included for the variable ‘technology use’). As depicted in Table 3, all Cronbach’s α coefficients are above 0.75, indicating acceptable reliability.
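The sum scoring and reliability check described above can be sketched as follows. The toy response matrix is invented for illustration; the `cronbach_alpha` helper implements the standard formula α = k/(k−1) · (1 − Σ item variances / variance of the sum scores):

```python
from statistics import pvariance

def sum_scores(item_matrix):
    """Sum score per respondent across all items of one variable
    (rows = respondents, columns = Likert items coded 1-5)."""
    return [sum(row) for row in item_matrix]

def cronbach_alpha(item_matrix):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of sums).

    Population variance is used throughout; the (n-1) correction factors
    cancel in the ratio, so sample variance would give the same alpha.
    """
    k = len(item_matrix[0])
    items = list(zip(*item_matrix))  # transpose: one tuple per item
    item_vars = sum(pvariance(col) for col in items)
    total_var = pvariance(sum_scores(item_matrix))
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy responses: 5 respondents x 4 Likert items for one variable
responses = [
    [4, 5, 4, 4],
    [2, 2, 3, 2],
    [5, 4, 5, 5],
    [3, 3, 3, 4],
    [1, 2, 2, 1],
]
print(round(cronbach_alpha(responses), 2))  # → 0.96, above the 0.75 threshold
```

For this toy matrix, respondents who score high on one item tend to score high on the others, which is exactly what a high α indicates.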

Table 3 Internal consistency of the created variables

Taking the new variables into consideration, a K-means cluster analysis was performed, using Euclidean distances. Tabulated results of both the descriptive analysis and the K-means cluster analysis will be presented in the next section.
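A minimal sketch of K-means with Euclidean distances, the procedure performed here in SPSS, is given below. The toy sum scores and the deterministic, evenly spaced initialization are illustrative simplifications (production implementations use random initialization with multiple restarts):

```python
import math

def kmeans(points, k, iters=20):
    """Minimal K-means with Euclidean distance.

    Centers are seeded on evenly spaced points (deterministic for this
    sketch); each iteration assigns points to their nearest center and
    moves each center to its cluster mean.
    """
    step = len(points) // k
    centers = [list(points[i * step]) for i in range(k)]
    for _ in range(iters):
        # assignment step: nearest center by Euclidean distance
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: math.dist(p, centers[c]))
            clusters[j].append(p)
        # update step: move each center to the mean of its members
        for j, members in enumerate(clusters):
            if members:
                centers[j] = [sum(dim) / len(members) for dim in zip(*members)]
    labels = [min(range(k), key=lambda c: math.dist(p, centers[c])) for p in points]
    return centers, labels

# Toy sum scores (technology use, adaptivity, dashboards, support) for six teachers:
# use varies widely, perceptions barely do, mirroring the pattern reported below
data = [
    [50, 70, 68, 60],
    [48, 72, 70, 62],
    [30, 69, 67, 59],
    [28, 71, 69, 61],
    [10, 68, 66, 58],
    [12, 70, 68, 60],
]
centers, labels = kmeans(data, k=3)
print(labels)  # → [0, 0, 1, 1, 2, 2]
```

Because only the first variable differs substantially between the toy teachers, the three recovered clusters separate on technology use alone.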

Results

This study aims to investigate teacher perceptions concerning DPL in the classroom. In what follows, the results of two analyses are provided. As depicted in Fig. 1, the first is descriptive in nature and investigates, in accordance with the first four research questions, teachers’ (1) reported technology use, (2) perceptions towards adaptivity as well as dashboards and (3) expected support. The second comprises a cluster analysis which relates to the fifth research question, i.e., the identification of teacher profiles.

Fig. 1
figure 1

Overview of the results regarding the 4 research questions

RQ1: To What Extent do Teachers Report Using DPL Tools in Their Classroom?

For the first research question, the survey aimed at identifying participants’ reported technology usage to foster personalized learning during the school year. Participants were presented with a list of technology types and indicated their usage on a 5-point Likert scale from ‘never’ to ‘daily’. This list comprises twelve software types as mentioned by Bulger (2016), Groff (2017) and Heymans et al. (2018): (1) ‘learning management systems’ (for offering learning content, management of learning progress and administration), (2) ‘authoring tools’ (an environment that can be filled with one’s own learning content or self-chosen methods), (3) ‘blended learning tools’ (a combination of offline teaching and online learning), (4) ‘communication tools’ (to facilitate communication between students, fellow students and teachers), (5) ‘tools for students with disabilities’ (to support students), (6) ‘e-portfolio tools’ (online storage for tasks, essays, …), (7) ‘simulation tools’ (visualizations in 2D, 3D, virtual reality, augmented reality, …), (8) ‘assessment tools’ (software to facilitate tests and/or exams), (9) ‘games’ (educational games or tools with gamification elements such as levels, reward systems, avatars, …), (10) ‘online learning environments’ (an environment in which students are guided in exploring and practicing learning content), (11) ‘collaboration tools’ (facilitating students to interact with each other, share their learning experiences and co-create) and (12) ‘presentation tools’ (software to present learning content).

Table 4 depicts an overview of these findings. ‘Learning management systems’ are reported to be used most frequently (66.22% daily). Besides ‘online learning environments’ (28.65% weekly or even 10.54% daily) and ‘presentation tools’ (29.73% monthly), other tools are reported to be used distinctly less. For example, many participants indicate never using ‘communication tools’ (63.78%), ‘e-portfolio tools’ (62.97%) and ‘blended learning tools’ (61.89%) to support personalized learning.

Table 4 Teachers’ reported use of technology to support personalized learning

RQ2: What are Teachers’ Perceptions Towards Adaptivity Within DPL Tools?

For the second research question, the survey probed teacher perceptions towards adaptivity in DPL. Based on the work of Vandewaetere and Clarebout (2014), a set of examples concerning the source, target, method and time of adaptivity was presented to the participants, which they had to rate from ‘not important at all’ to ‘very important’.

First, participants’ perceptions towards the most optimal source of adaptation were investigated. Examples were given associated with three categories of source of adaptation: (1) affective, (2) cognitive and (3) meta-cognitive learner characteristics. Two examples related to cognitive learner characteristics, i.e., ‘student level’ and ‘student progress/pace’, were most often identified as ‘important’ (53.46% and 55.97%, respectively) and even ‘very important’ (40.25% and 36.48%, respectively). Next, the example related to meta-cognitive learner characteristics, i.e., ‘self-regulatory skills’, was rated second most ‘important’ (54.40%). Finally, two examples related to affective characteristics, i.e., ‘attitude/motivation toward the learning content’ and ‘self-confidence’, were rated ‘important’ by 49.69% and 52.20%, respectively (Tables 5 and 6).

Table 5 Source of adaptivity
Table 6 Target of adaptivity

Second, participants’ perceptions towards the most optimal target of adaptation were investigated. Again, a selection of examples related to two categories of targets, i.e., (1) content and (2) support/instruction, was presented. ‘Difficulty of exercises/content’ (category content) is most often identified as ‘important’ (61.95%) and ‘very important’ (31.45%).

Participants’ answers likewise vary between ‘important’ (62.89%) and ‘very important’ (17.30%) for ‘number of exercises/content’. For ‘order of the exercises/content’, answers vary between ‘neutral’ (27.04%), ‘important’ (48.11%) and ‘very important’ (15.72%).

Related to support/instruction, participants’ answers vary between ‘important’ and ‘very important’ for ‘degree of instruction’ (58.49% and 28.93%) and ‘degree of feedback’ (51.57% and 29.56%). For ‘number of recommendations’ (category support/instruction), however, responses were divided between ‘neutral’ (46%) and ‘important’ (40%).

Third, participants’ perceptions towards the most optimal method of adaptation were investigated. Two sets of examples (one related to teacher control and one to learner control) were presented to the participants to rate from ‘not important at all’ to ‘very important’. Each set contained seven examples: control over (1) ‘the level of tasks’, (2) ‘type of tasks’, (3) ‘sequence of tasks’, (4) ‘number of tasks’, (5) ‘instruction prior to the tasks’, (6) ‘evaluation of the tasks’ and (7) ‘guidance during the tasks’.

Participants scored most examples related to student control as ‘important’ (see Table 7). For teacher control, most examples were scored ‘important’ or ‘very important’ (see Table 8).

Table 7 Examples related to student control
Table 8 Examples related to teacher control

Fourth, participants’ perceptions towards the most optimal time of adaptation were investigated. Participants were presented with examples related to static and dynamic adaptivity. (1) ‘A DPL tool with a pre-test which measures student characteristics in advance’ (static) and (2) ‘a DPL tool that measures how a student feels during the exercises’ (dynamic) were found to be ‘important’ by 55.24% and 53.65%, respectively. A third example, related to dynamic adaptivity, i.e., ‘change of level difficulty according to performance’, was not only found to be ‘important’ (52.38%), but also ‘very important’ (40.95%) (Tables 9 and 10).

Table 9 Time of adaptivity
Table 10 Features on student dashboards

RQ3: What are Teachers’ Perceptions Towards Dashboards Within DPL Tools?

There are many possible dashboard features within DPL tools, both for students and teachers. To examine perceptions concerning these features, participants were shown two sets of dashboard features, one for students and one for teachers, as mentioned by Maseleno et al. (2018), Schwendimann et al. (2017), Teasley (2017) and van Leeuwen et al. (2019). Each set holds eight features: (1) ‘an overview of completed tasks’, (2) ‘time spent on tasks’, (3) ‘an overview of results’, (4) ‘an overview of progress’, (5) ‘a failure analysis’, (6) ‘an overview of completed learning goals’, (7) ‘feedback’ and (8) ‘a deadline planning’. Participants were asked to rate the importance of these features separately for student dashboards and teacher dashboards from ‘not important at all’ to ‘very important’. In general, teachers show positive perceptions towards both feature sets, scoring most features ‘important’ to ‘very important’. As Table 10 depicts, participants find student dashboards most optimal when containing feedback features such as ‘failure analysis’ and a ‘feedback system’ (‘very important’ by 45.54% and ‘important’ by 44.90%, respectively). They also value overviews of ‘progress’, ‘results’ and ‘completed tasks’ as ‘important’ (by 50%, 55.73% and 56.69%, respectively). In addition, deadline planning features were likewise rated as ‘important’ (51.91%). For teacher dashboards (see Table 11), participants mainly desire features to follow up the learning process such as ‘failure analysis’ and ‘completed learning goals’ (rated as ‘very important’ by 53.50% and 46.18%, respectively). In addition, overviews of ‘time spent on tasks’, ‘progress’, ‘completed tasks’, ‘feedback systems’ and ‘deadline planning’ are rated as ‘important’ (by 55.41%, 55.41%, 54.14%, 53.82% and 51.59%, respectively).

Table 11 Features on teacher dashboards

RQ4: What are Teachers’ Expectations Regarding Support in View of Implementing DPL Tools?

For the fourth research question, participants’ expectations concerning support for implementing DPL tools were examined. A list of types of support drawn from various authors (Groff, 2017; Holmes et al., 2018; Marienko et al., 2020; Shaikh & Khoja, 2012) was presented to the participants to score from ‘not important at all’ to ‘very important’. Regarding technical support, various types such as ‘e-manuals, user videos, websites’ (43.31% important, 32.80% very important) and ‘ICT coordinator/support at school’ (40.76% important, 30.89% very important) are valued by the participants. In terms of didactic support, ‘books, websites’ (51.91%) and ‘training courses’ (45.54%) were scored highest as ‘important’ (Tables 12 and 13).

Table 12 Technical support
Table 13 Didactical support

RQ5: What Different Teacher Profiles with Respect to a) Reported Use, b) Perceptions of Adaptivity, c) Perceptions of Dashboards and d) Expectations of Support Can be Identified?

For the fifth research question, a K-means cluster analysis was used to identify different groups among participants. Four variables were considered: (1) ‘technology use’, relating to participants’ reported use of technology to foster personalized learning, (2) ‘adaptivity’, relating to participants’ perceptions concerning types of adaptivity, (3) ‘dashboards’, relating to participants’ perceptions concerning student and teacher dashboard features, and (4) ‘support’, relating to participants’ expected support when implementing DPL tools. Using SPSS 28.0, three clusters among participants were identified based on respondents’ average scores in each cluster group (see Table 14 and Appendix 1).
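The clustering step can be sketched as follows. This is a minimal illustration using Python and scikit-learn rather than SPSS 28.0; the respondent scores are randomly generated placeholders, not the study’s data, and the variable names are hypothetical:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical scores of 370 respondents on the four composite variables
# (technology use, adaptivity, dashboards, support), each on a 1-5 scale.
rng = np.random.default_rng(0)
scores = rng.uniform(1, 5, size=(370, 4))

# Standardize so each variable contributes equally to the distance metric.
scaler = StandardScaler().fit(scores)
X = scaler.transform(scores)

# Partition respondents into three clusters, mirroring the reported analysis.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

# Cluster centers per variable, mapped back to the original 1-5 scale
# (the kind of summary reported in Table 14).
centers = scaler.inverse_transform(kmeans.cluster_centers_)
labels = kmeans.labels_  # cluster assignment per respondent
```

Comparing the cluster centers on the ‘technology use’ variable is what allows labeling the groups as high, medium and low technology users.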

Table 14 Cluster centers per variable

The names of the clusters ‘high technology users’, ‘medium technology users’ and ‘low technology users’ are based on what the numbers represent with respect to the Likert scale (1 = never, 5 = daily). Cluster 1 can be referred to as ‘high technology users’ as they report the most use of ‘software’. The teachers within this cluster generally have positive perceptions towards ‘adaptivity’ and ‘dashboards’, and additionally value ‘support’. Cluster 2 can be referred to as ‘medium technology users’ and shares, similar to cluster 1, positive perceptions towards ‘adaptivity’, ‘dashboards’ and ‘support’. Cluster 3 can be referred to as ‘low technology users’, given that they report the lowest use of ‘software’. It is rather remarkable that this group, similar to clusters 1 and 2, also shares positive perceptions towards ‘adaptivity’, ‘dashboards’ and ‘support’. Altogether, all three clusters seem to acknowledge the value of DPL, as the results identify little variety in perceptions towards DPL tool features (such as adaptivity and dashboards) and small variety in expectations towards implementation support. However, great variety between clusters is notable in software use for personalized learning.

Discussion

Summary and Interpretation of Findings

Many argue DPL can be beneficial for both teachers and students (Baker, 2016; Pfeiffer et al., 2021; Shaikh & Khoja, 2012). However, implementing DPL tools can be very challenging and demanding for teachers (Shaikh & Khoja, 2012). Previous work brings to light that there is a substantial difference between what teachers want and what happens in their classrooms (Van Schoors et al., under review). The same conclusion can be drawn based on the results of this study: although teachers seem to acknowledge the value of DPL tools, their reported technology use (RQ1) does not align with their positive perceptions (RQ2, RQ3) and expectations (RQ4). Thus, for several participants, there seems to be a gap between their optimal learning environment (perceptions and expectations) and their daily practice (behavior). These findings align with results from other large-scale surveys. This is for example shown by Hämäläinen et al. (2021), who found that teachers acknowledged the importance of digital technologies in the classroom but that there was also considerable variance in their ICT competences and usage.

Nonetheless, several studies (e.g. Vanderlinde & Van Braak, 2010) emphasize that differences in perceptions strongly influence ICT use. If teachers hold low expectations or perceptions, they will be less inclined to deploy the technology in the classroom. Therefore, it is somewhat surprising that the results of this study do not detect different perceptions across the three identified groups (low, medium and high technology users).

Next to perceptions, other possible reasons for variation in technology use remained undetected as well. Existing frameworks can be used as a starting point for reflection. For example, Ferede et al. (2021) investigated ICT use of teachers in higher education by assessing their experiences and opinions. Alongside determinants related to the institution (e.g. ICT plan/vision) or the individual (e.g. teacher competences and attitudes), they discuss ICT infrastructure and equipment as important assets. One explanation for the difference in technology use in this study could be a lack of adequate hardware or software in the participants’ schools. Furthermore, Ferede et al. (2021) touch upon the importance of professional development and technical support as an imperative condition to overcome implementation chasms of new technology (see e.g. Moore, 1991). It is advised to invest in tailored training, especially since the implementation of DPL tools in education is a challenging process for teachers (Groff, 2017; Marienko et al., 2020; Shaikh & Khoja, 2012). This study showed that teachers are particularly willing to engage in training, as they found several support initiatives important. It can be interesting for researchers, policymakers and software developers to further investigate specific teacher needs towards DPL implementation, so that they can optimize and refine those support initiatives. In addition, it can also be useful to consider teachers’ needs towards DPL tools. This study revealed insights into teachers’ perceptions towards adaptivity within DPL tools. For example, participants showed their interest in sources of adaptivity related to cognitive, affective and meta-cognitive learner characteristics, with scores mostly varying between ‘important’ and ‘very important’. However, several reviews and meta-analyses (e.g. Bernacki et al., 2021; Van Schoors et al., 2021) acknowledge that adaptivity within DPL tools is often limited to cognitive learner characteristics (e.g. mastery learning).

Next to adaptivity, this study also revealed teachers’ positive perceptions towards dashboards, as all features were perceived to be important. It is encouraging to see that teachers appreciate dashboard visualizations, as these can indeed assist and empower them to enhance personalization practices (Molenaar, 2021).

In sum, if researchers, policy makers and software developers further optimize support and DPL tools according to teachers’ needs, teachers will more likely see the charm of DPL, instead of all the chasms and will accordingly be more willing to implement DPL in their classrooms.

Limitations and Follow-up Research

Despite the merits of this study, there are some limitations. The results are obtained from an online self-reported survey. A first limitation is that online surveys invoke self-selection: teachers who are more invested in technology may have been more likely to participate than others we did not reach with the survey. This could imply participants with, for example, a higher acceptance of technology. In addition, a survey always invokes the risk of socially desirable responding, as participants could report falsely about their behavior (Fisher, 1993). Further research should investigate DPL use through observations to see if teachers’ reported use is similar to actual use. Another interesting pathway for further research is the investigation of the research questions in the post-COVID-19 context, to see whether there are differences or similarities compared to current results. Finally, the descriptive results of this study can only show a trend, not a causal relationship.

This study could raise awareness – in policy, research and the technology industry – on several aspects of DPL integration in education and teachers’ point of view in this matter. In general, the focus on teachers’ needs is limited (Groff, 2017). Considering teachers’ needs through dialogue and co-design has the potential to move the DPL field forward and establish more powerful co-teaching technologies (Groff, 2017). Although no differences in perceptions were identified in this study, it is acknowledged that positive teacher perceptions could lead towards better DPL implementation (e.g. Groff, 2017; Holmes et al., 2018). These perceptions can be fostered via adequate professionalization opportunities that enhance teacher competences and attitudes concerning DPL. As Groff (2017) states: “Teachers are integral to the development of these learning technologies and practices – and need significant support to do so” (p. 27).

Conclusion

By utilizing a survey, the goal of this study was to identify teachers’ reported technology use for personalized learning, their perceptions towards DPL tool features (such as adaptivity and dashboards) and their expectations towards implementation support. On the one hand, the results indicate a consensus concerning the value of DPL. Participants appreciate DPL tools and consider most adaptive features as important. On the other hand, these positive perceptions do not align with their reported technology use to foster personalized learning, as three groups of teachers were identified (low, medium and high technology users). This study provides unique insights into the teachers’ point of view, which is still underexamined in the field of DPL. However, our study also stresses the need for future research to detect possible reasons for the difference in technology use.