Abstract
Although digital personalized learning (DPL) is assumed to be beneficial for the student as well as the teacher, the implementation process of DPL tools can be challenging. Therefore, the aim of our study is to scrutinize teachers’ perceptions towards the implementation of DPL in the classroom. A total of 370 teachers from primary and secondary education (students aged 6–18 years old) were questioned through an online survey. An overview of descriptive results is presented regarding (1) teachers’ reported technology use, (2) their perceptions towards adaptivity and dashboards in DPL tools and (3) their expectations of support in view of implementing DPL. Based on a cluster analysis, three teacher clusters are distinguished. Results reveal all three clusters had positive perceptions towards DPL. Nevertheless, there is great variety in reported use of DPL tools.
Many educational technology enterprises, researchers and policy makers assume that technology will transform or even revolutionize classrooms all over the world (Reich, 2020). Technology facilitating personalized learning, especially, is seen as a key force in impacting educational practice (Kolchenko, 2018; Reich, 2020). Reviews and meta-analyses show that digital personalized learning (DPL) research is expanding, not solely because of the expected potential to leverage students’ motivation and performance, but also because of its ability to support teachers who are facing large, heterogeneous class groups (see Bernacki et al., 2021; Major & Francis, 2020; Van Schoors et al., 2021; Xie et al., 2019; Zheng et al., 2022).
Although the role of the teacher is very important when implementing DPL in education, this role is largely absent from contemporary literature (Van Schoors et al., 2021). The question of how teachers perceive the value of DPL in their classroom is essential. On the one hand, teachers’ perceptions have a profound impact on the general technology adoption process in education (Vanderlinde & Van Braak, 2010). On the other hand, teachers’ expectations can be meaningful for policy makers and software developers to establish more useful and resourceful contributions in the DPL field (Major & Francis, 2020; Walkington & Bernacki, 2020). Hence, by considering teachers’ perceptions and expectations towards opportunities (or challenges), the collaboration between human and digital tutors can be enhanced.
Therefore, teachers’ perceptions concerning DPL are the basic premise within this contribution and were scrutinized by performing an online survey. In total 370 teachers participated and reported on their (1) technology use for personalization, (2) perceptions towards adaptivity and dashboards in DPL tools and (3) expectations of support in view of implementing DPL. First, we introduce a framework on DPL. Second, we report on the methodology used. Third, we present the results in which we distinguish teacher profiles. Finally, we reflect on these profiles and propose follow-up research with a specific focus on teacher perceptions when implementing DPL tools in their classroom.
Theoretical Background
The theoretical background consists of two sections. In the first section, we introduce the definition and possible benefits of DPL, followed by a discussion of technology software, adaptivity and visualizing learning data. In the second section, we elaborate on expanding teacher responsibilities, competences and support for implementing DPL tools.
DPL
DPL is a relatively “young” area of research that is becoming more extensive and complex along with ubiquitous digitization (Groff, 2017; Kolchenko, 2018; Shemshack & Spector, 2020). According to Van Schoors et al. (2021), DPL can be defined by setting apart the following characteristics of personalization in DPL tools: (1) various learner characteristics are considered, (2) different aspects of a learning environment can be adapted, (3) personalization can be driven by the teacher, learner or tool itself and (4) teachers might enhance personalization through the use of learner data visualized by the tool.
Although reviews and meta-analyses are rather scarce, DPL has been found to be beneficial for cognitive outcomes such as higher learning achievement, and non-cognitive outcomes such as engagement (Major & Francis, 2020; Pfeiffer et al., 2021; Zheng et al., 2022). In addition, DPL is also expected to be beneficial for teachers; DPL tools can invoke reflection on the numerous fields of interest and knowledge levels within a differing class group (Baker, 2016; Holmes et al., 2018; SRI international, 2018).
Technology Software and Adaptivity
Given the many expectations concerning teacher and student benefits, DPL tools proliferated and became popular in education (Aleven et al., 2017; Baker, 2016; Basham et al., 2016; Xie et al., 2019). In the wide range of DPL tools, there is a variety of built-in personalization systems. These systems provide unique learning experiences: while some supply adaptive recommendations, others facilitate the provision of adaptive learner content (Groff, 2017; Shemshack & Spector, 2020).
To untangle the complexity within adaptivity genres, many authors developed classifications or frameworks. One example is the framework of Vandewaetere and Clarebout (2014) depicting a four-dimensional view on adaptivity genres. They discuss the time, target, method and source of adaptation. The first dimension, time of adaptation, pertains to three possible moments an adaptation can take place: before the learning activity starts (static), during the learning activity (dynamic) and a combination of both (dual pathway). A second dimension, the target of adaptation, relates to what is being personalized. Some examples are content, presentation, instruction and support (Vandewaetere & Clarebout, 2014). A third dimension is the method of adaptation which pertains to how the adaptations are made, either system-controlled (adaptations made by the developer or the instructor), learner-controlled (adaptations made by the learner) or the combination of both (shared-controlled). The source of adaptation is the fourth dimension and refers to what drives the adaptation. Regarding the source, Vandewaetere and Clarebout (2014) distinguish characteristics of the learner which are cognitive, affective and behavioral characteristics. The latter considers the interaction between the learner and the system.
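For illustration only, the four dimensions can be expressed as a small typed model. The class, enum and value names below are our own hypothetical labels for the framework’s categories, not part of Vandewaetere and Clarebout’s (2014) terminology:

```python
from dataclasses import dataclass
from enum import Enum

class Time(Enum):
    STATIC = "before the learning activity"
    DYNAMIC = "during the learning activity"
    DUAL = "combination of both"

class Method(Enum):
    SYSTEM = "system-controlled (developer or instructor)"
    LEARNER = "learner-controlled"
    SHARED = "shared-controlled"

class Source(Enum):
    COGNITIVE = "cognitive learner characteristics"
    AFFECTIVE = "affective learner characteristics"
    BEHAVIORAL = "behavioral learner characteristics (learner-system interaction)"

@dataclass
class Adaptation:
    # One point in the four-dimensional space of the framework
    time: Time
    target: str          # e.g. "content", "presentation", "instruction", "support"
    method: Method
    source: Source

# Hypothetical example: exercise difficulty adapted on the fly from interaction data
example = Adaptation(Time.DYNAMIC, "content", Method.SYSTEM, Source.BEHAVIORAL)
```

Modeling an adaptation as one value per dimension makes clear that the dimensions are orthogonal: any time of adaptation can, in principle, be combined with any target, method and source.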
In addition to Vandewaetere and Clarebout (2014), Groff (2017) as well as Bulger (2016) both acknowledge a spectrum of DPL tools or technology focusing on varying intensities of adaptivity (instead of types of adaptivity). At the beginning of the spectrum, they situate the category ‘Data Driven Systems’, comprising management systems that offer pre-determined learning materials based on students’ mastery level (Bulger, 2016). Groff (2017) similarly refers to data-driven learning technology as Learning Management Systems (LMSs). LMSs can provide students with individual learning pathways, evaluations and recommendations based on learner data (Bulger, 2016; Groff, 2017). Further up the spectrum are adaptive software genres that move beyond pre-determined learning materials. Bulger (2016) classified these tools as ‘Adaptive Learning Tools’. They offer more dynamic data-driven learning material according to learners’ behaviors or competences. Groff (2017) states that in this case, dynamic adaptation is made possible via machine learning, which goes beyond the pre-determined decision tree used in LMSs. At the end of the spectrum are the ‘Intelligent Tutor Systems’ (ITSs), often referred to as the highest category of DPL tools (Bulger, 2016). ITSs subsume a pro-active model, often referred to as a ‘system-based tutor’, which provides real-time instruction by analyzing students’ learning needs and progress (Bulger, 2016; Groff, 2017). In addition, most ITSs can track mental processes and diagnose errors. A promising generation of more sophisticated ITSs, also named conversational ITSs, is currently being developed: for example, ITSs that consider affective learner characteristics, give more qualitative system-based feedback, acknowledge conceptual reasoning and stimulate deep user-system dialogue (Groff, 2017).
The aforementioned examples of technology software are not discrete, and distinctions between types can be ambiguous. As Groff (2017) illustrates: “For example, a learning management system may or may not include data-driven learning capabilities. At the same time, a game-based learning environment may also be data-driven but not be a learning management system” (p.11). In addition, Groff (2017) points out that educational technology companies are often quick to attribute the label “personalized”, despite the rather low-quality adaptivity of the technology at hand.
Visualizing Learning Data on Dashboards
Some DPL technologies include learning analytics, which measure, aggregate, analyze and visualize learner data (Maseleno et al., 2018; Schwendimann et al., 2017; Teasley, 2017). Visualizations are generally presented through real-time visual interfaces or displays, also called dashboards. Specific learning activities are displayed in a meaningful way through graphs, gauges or maps (Schwendimann et al., 2017). Examples of data visualizations are completed learning content/exercises, amount of time spent on learning content/exercises, results on exercises/tests, analyses of failed tasks, progress over months/weeks/years, feedback from the system and/or the teacher, achieved goals and agendas with personal schedules or deadlines (Maseleno et al., 2018; Schwendimann et al., 2017; Teasley, 2017). These visualizations aim to encourage teachers’ awareness, reflection and analysis of students’ learning processes (Teasley, 2017; van Leeuwen et al., 2019). Important information regarding the learning progress or behavior can be monitored immediately (Maseleno et al., 2018; Teasley, 2017). Most dashboards are designed for teachers; however, student dashboards are also on the rise (Maseleno et al., 2018; Schwendimann et al., 2017). Student dashboards can increase engagement and may empower students towards their own learning process and progress (Maseleno et al., 2018; Teasley, 2017).
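To make the aggregation step concrete, the sketch below rolls hypothetical raw log events up into the kind of per-student figures a dashboard typically displays (completed exercises, time on task, accuracy). The event fields and student names are our own illustrative assumptions, not taken from any particular DPL tool:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class LogEvent:
    # One logged interaction: who attempted which exercise, for how long, correctly or not
    student: str
    exercise: str
    seconds: int
    correct: bool

def dashboard_summary(events):
    """Aggregate raw log events into per-student dashboard figures."""
    raw = defaultdict(lambda: {"done": set(), "seconds": 0, "correct": 0, "attempts": 0})
    for e in events:
        r = raw[e.student]
        r["done"].add(e.exercise)          # completed exercises
        r["seconds"] += e.seconds          # time on task
        r["attempts"] += 1
        r["correct"] += int(e.correct)
    return {
        student: {
            "exercises_completed": len(r["done"]),
            "minutes_on_task": round(r["seconds"] / 60, 1),
            "accuracy": round(r["correct"] / r["attempts"], 2),
        }
        for student, r in raw.items()
    }

# Hypothetical events for one student: two exercises, one answered correctly
events = [LogEvent("s1", "ex1", 90, True), LogEvent("s1", "ex2", 30, False)]
summary = dashboard_summary(events)
```

A real dashboard would compute such aggregates continuously and render them as the graphs, gauges or maps described above; the essential step, however, is exactly this reduction from raw interaction logs to a handful of interpretable indicators per student.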
Dashboards have the potential to enhance human-prompted feedback, initiated by the teacher (Knoop-van Campen & Molenaar, 2020; Maseleno et al., 2018; Teasley, 2017; Van Leeuwen et al., 2019). More specifically, teachers provided with the competence to interpret data—also called data-driven decision-making—can take additional pedagogical actions considering individual learners’ needs (Knoop-van Campen & Molenaar, 2020; Maseleno et al., 2018; Teasley, 2017).
Enactment of DPL Tools in the Classroom
With their increasing smart features (e.g. adaptivity, dashboards), DPL tools can be utilized for multitudinous purposes such as automated adaptation of learning opportunities, coaching students, support for curation/management of the learning process and support for using student data (SRI international, 2018). Consequently, DPL tools can be seen as a source of support for teachers. However, they also complicate teachers’ roles by challenging their responsibilities and competences (Shaikh & Khoja, 2012). Allocating ample resources to empower teachers and reinforce their capacity has therefore been considered a timely issue for many educational stakeholders and researchers around the globe (Marienko et al., 2020).
Expanding Teacher Responsibilities and Competences
When using DPL to enhance students’ learning processes, teachers play a crucial role (Kolchenko, 2018; Major & Francis, 2020). As Basham et al. (2016) state, implementing DPL calls for a shift in teachers’ instructional strategies: there is a continuous responsibility to make well-informed decisions based on one’s own experience in conjunction with acquired learner data through DPL tools. Such collaborations between tool and teacher, also referred to as distributed scaffolding (see Tabak, 2004), can augment the learning process in various ways. This is, for example, the case for adaptivity. The combination of adaptive AI-based and dynamic personalized tutor-based decisions ignite amplified adaptivity, also identified as human-AI hybrid adaptivity (Holstein et al., 2020).
Holstein et al. (2020) describe four dimensions within human-AI hybrid adaptivity. The first dimension, goal augmentation, involves the reciprocal process of information sharing to improve instructional learning goals for various students. The second dimension, perceptual augmentation, pertains to leveraging complementarity in perception. What the DPL tool gathers as relevant learner information might enhance opportunities for teachers to examine and interpret the learning process in a more profound way (Holstein et al., 2020). The third dimension, action augmentation, refers to expanding the ability, capacity and availability of instructional actions (Holstein et al., 2020). The fourth dimension is decision augmentation: by informing each other, more adequate pedagogical decisions can be made (Holstein et al., 2020).
Although technology can certainly benefit teachers, they are also required to stay attentive to the constraints of DPL tools (e.g. inappropriate learning goals, false learner information, improper decisions). DPL tools are not yet flawless: fragmented or contaminated data can generate erroneous predictions or unintelligent adaptations (Baker, 2016; Basham et al., 2016). Today’s DPL tools often contain models which are still very simplistic and inadequate to acknowledge rich educational contexts (Baker, 2016; Basham et al., 2016). Thus, teachers are expected to take responsibility and use their experience-based knowledge to make critical judgments of these pitfalls (Baker, 2016; Kolchenko, 2018).
Support for Implementing DPL Tools
As DPL integration in education accelerates, teachers are progressively challenged. They need to acquire new competencies at a fast pace, competencies which are substantially different from what they learned during training (Groff, 2017; Marienko et al., 2020; Shaikh & Khoja, 2012). In this evolving process, acknowledging the complex teacher role through appropriate professionalization support is essential (Kaiser & König, 2019). Within this support, teachers’ individual perceptions and feelings must be specifically considered, as this is crucial for successful implementation (see the concerns-based adoption model by Hall & Hord, 1987; Hord et al., 1987). Similarly, teachers need help to navigate the wide range of DPL tools and guidance to deploy them in the classroom (Groff, 2017; Holmes et al., 2018; Marienko et al., 2020).
Relevant professional development resources or opportunities can take on many forms. Werquin (2010) distinguishes between formal (e.g. DPL training courses provided by teacher education or other formal institutions), informal (e.g. collegial conversations) and non-formal (e.g. websites, books, videos and e-manuals related to DPL software) learning. Grossman (1990) states that teacher education is imperative to address prior misconceptions which could debilitate mindful teaching strategies (Grossman, 1991). This also applies to the integration of new innovations (such as DPL tools), given that adequate supportive initiatives can (1) encourage teachers to get used to the new technology and (2) distribute pedagogy-specific advice related to curriculum standards (Holmes et al., 2018). In turn, teacher competences and perceptions (enhanced by these learning opportunities) influence not only instructional strategies but also students’ learning outcomes (Kaiser & König, 2019).
Some studies have already been carried out examining teachers’ views of technology in general. An example is the Belgian quadrennial research project MICTIVO, which stands for Monitoring information and communications technology in Flemish education (“Monitoring ICT in het Vlaamse Onderwijs”, Heymans et al., 2018). The project investigates teachers’ use of various hardware/software and perceptions of ICT integration in the classroom. Although previous MICTIVO surveys (MICTIVO 2008 monitor, MICTIVO 2012 monitor, MICTIVO 2018 monitor) showed no frequent ICT use, teachers generally had positive perceptions of the importance and effects of ICT. The needs related specifically to DPL were not addressed in this project. In general, a focus on teachers’ needs is rather scarce in DPL research (Groff, 2017). Considering teachers’ needs through dialogue can provide better alignment of support initiatives, as well as further developments within the field of DPL.
Aims of the Study
Expanding digitization fosters a worldwide interest for DPL innovations in education: many have high expectations concerning student and teacher benefits. Due to this growing interest, research involving numerous features of DPL tools (such as adaptivity and dashboards) is likewise on the rise (see Bernacki et al., 2021; Major & Francis, 2020; Van Schoors et al., 2021; Xie et al., 2019). Alongside these high expectations, DPL also invokes many challenges for teachers (expanding responsibilities, new competences…). Therefore, appropriate support to implement DPL tools is particularly salient (Groff, 2017; Marienko et al., 2020; Shaikh & Khoja, 2012). To shed light on teacher perceptions concerning DPL implementation, the following research questions are addressed:
(1) To what extent do teachers report using technology for personalized learning in their classroom?

(2) What are teachers’ perceptions towards adaptivity within DPL tools?

(3) What are teachers’ perceptions towards dashboards within DPL tools?

(4) What are teachers’ expectations regarding support in view of implementing DPL tools?

(5) What different teacher profiles with respect to a) reported use, b) perceptions of adaptivity, c) perceptions of dashboards and d) expectations of support can be identified?
Method
To provide answers to the research questions, a large-scale survey was created, piloted and used to determine teachers’ perceptions regarding the implementation of DPL. A total of 370 teachers from primary and secondary education (students aged 6–18 years old) participated. The quantified data were analyzed via descriptive and cluster analyses. The latter is a method that distinguishes subjects within a dataset into homogeneous groups (or clusters) according to their characteristics (Romesburg, 2004). In what follows, the (1) participants, (2) instrument, (3) procedure and (4) data analysis are further explained.
Participants
The survey targeted teachers employed in primary or secondary education in Flanders (the Dutch-speaking part of Belgium). In total, 370 teachers participated. The sample consists of teachers from primary (39.73%), special needs primary (6.76%), secondary (52.43%) and special needs secondary (1.08%) education. Most of the participants are female (74.05%), which is a small overrepresentation compared to the population (see Table 1 and Appendix). The respondents’ age ranges between 21 and 63 years and is distributed similarly to the population’s age range. Participants under 25 and over 59 years old are a minority, with 4.05% and 3.24% respectively. More than half of the participants (54.05%) have more than 15 years of teaching experience. In sum, the sample can be viewed as representative of the population regarding school grade, gender, age and educational experience (see Table 1, in which teacher percentages within the sample are compared to total teacher percentages within the population).
Instrument
To investigate teachers’ self-reported use and perceptions towards DPL, a questionnaire was developed and sent to all Flemish primary and secondary schools (Appendix 2). The questionnaire contained four main sections, with a total of 40 questions (see Table 2): (1) Current use of digital technology for personalized learning, (2) Perceptions towards adaptivity within DPL tools, (3) Perceptions towards dashboards within DPL tools and (4) Expectations towards support for implementing DPL tools. All survey questions utilized a 5-point Likert-type scale to quantify and rank teachers’ use and perceptions, ranging from negative beliefs/infrequent use (towards 1) to positive beliefs/frequent use (towards 5). The estimated time to complete the survey was approximately thirty minutes. All four sections of the survey are separately related to the first four research questions. The fifth research question (i.e., identification of teacher profiles) is related to an interlinked analysis of all four sections. This structure was also used to report the results in section five (results).
Procedure
Development of the questionnaire was based on two sources: On the one hand, the questions were based on research literature (see Table 2). On the other hand, ten focus group interviews (including 56 teachers) were conducted exploring teachers’ perceptions on DPL, which helped refine the constructed questions (e.g. adding frequently mentioned support initiatives to the set of items related to ‘DPL implementation support’). Next, an iterative evaluation of the refined questionnaire was carried out with a focus on the content and formulation of items. In this evaluation, experts were involved from different academic fields (e.g. statistical modeling, educational technology) and from other projects in which large-scale surveys were used (e.g. MICTIVO). Finally, to (1) pilot and (2) improve the validity of the survey, cognitive interviews were performed with a small group of teachers (n = 15). Based on these teachers’ comments, final refinements were made related to comprehensibility (e.g. the addition of a short and understandable definition of DPL at the beginning of the survey) and wording (e.g. the replacement of English terms such as ‘system-control’ with Dutch equivalents), resulting in the finalized survey.
After approval from the ethical committee (case number G-2019101978), the survey was digitized and distributed to the school principals of all Flemish schools. The survey was sent out together with an information letter (addressing the purpose of the study) and a request to distribute the mail among their educational staff. In view of representativeness, extra activities were initiated to reach specific target groups. For example, the survey was shared in an open-source network for teachers in secondary education.
Participating teachers first had to click on a link which directed them to an informed consent form with relevant information concerning the procedure and goals of the questionnaire. This form included a short introduction describing DPL as learning that takes place in a digital learning environment which adapts to characteristics of individual learners. It also clearly mentioned that personal data would be processed anonymously. Only after participants gave their active consent could the questionnaire be accessed.
The survey was launched mid-January 2020 and provisionally accessible until the end of April 2020. However, administration of the survey ended early, on March 16, as schools were obliged to shift to online learning due to the Covid-19 pandemic. The mandatory switch posed a risk of distorted results, not only in terms of reported use, but also in terms of perceptions and needs.
Data Analysis
A two-fold quantitative research design was applied: (1) To examine the first four research questions, a descriptive analysis was conducted using SPSS (version 28.0.1) to explore univariate data (e.g. frequencies, averages, minimum and maximum). (2) The fifth research question was examined through a K-means cluster analysis, a method that can be used to distinguish profiles with similar and dissimilar characteristics and gather them in clusters. The cluster analysis was carried out according to rigorous guidelines and considerations provided by Romesburg (2004). First, a data matrix was created, again via the software SPSS (version 28.0.1), with columns (referring to the variables) and rows (individual datapoints per participant). Next, the data were recoded via sum scoring to compute variables (see Table 3). For every variable, all related items of the survey were included when calculating the sum score (for example: we included all twelve items related to the software types for the variable ‘technology use’). As depicted in Table 3, all Cronbach’s α coefficients are above 0.75, indicating acceptable reliability.
Taking the new variables into consideration, a K-means cluster analysis was performed, using Euclidean distances. Tabulated results of both the descriptive analysis and the K-means cluster analysis will be presented in the next section.
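For readers unfamiliar with the method, the following minimal Lloyd’s-algorithm sketch shows K-means with Euclidean distances. The two-dimensional synthetic ‘profiles’ are hypothetical and only illustrate how well-separated respondents end up in distinct clusters; the actual analysis was run in SPSS on four sum-score variables:

```python
import math
import random

def kmeans(points, k, iters=100, seed=0):
    """Minimal Lloyd's K-means: points is a list of equal-length tuples.
    Returns (centers, clusters)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize centers on k random points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest center (Euclidean distance)
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centers[i]))
            clusters[nearest].append(p)
        # Update step: each center moves to the mean of its cluster
        new_centers = [
            tuple(sum(dim) / len(cluster) for dim in zip(*cluster)) if cluster else centers[i]
            for i, cluster in enumerate(clusters)
        ]
        if new_centers == centers:  # converged
            break
        centers = new_centers
    return centers, clusters

# Hypothetical profiles (reported use, perception score): two clearly separated groups
profiles = [(1.0, 4.0), (1.2, 4.1), (0.9, 3.9),
            (4.0, 4.2), (4.1, 4.0), (3.9, 4.1)]
centers, clusters = kmeans(profiles, k=2)
```

Note that plain K-means requires choosing k in advance and is sensitive to initialization; production analyses typically rerun with several random starts, as SPSS does internally.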
Results
This study aims to investigate teacher perceptions concerning DPL in the classroom. In what follows, the results of two analyses are provided. As depicted in Fig. 1, the first is descriptive in nature and investigates, in accordance with the first four research questions, teachers’ (1) reported technology use, (2) perceptions towards adaptivity as well as dashboards and (3) expected support. The second comprises a cluster analysis which relates to the fifth research question, i.e., the identification of teacher profiles.
RQ1: To What Extent do Teachers Report Using DPL Tools in Their Classroom?
For the first research question, the survey aimed at identifying participants’ reported technology usage to foster personalized learning during the school year. Participants were presented with a list of technology types to indicate their usage on a 5-point Likert scale from ‘never’ to ‘daily’. This list comprises twelve software types as mentioned by Bulger (2016), Groff (2017) and Heymans et al. (2018): (1) ‘learning management systems’ (for offering learning content, management of learning progress and administration), (2) ‘authoring tools’ (an environment teachers can build with their own learning content or self-chosen methods), (3) ‘blended learning tools’ (combination of offline teaching and online learning), (4) ‘communication tools’ (to facilitate communication between students, fellow students and teachers), (5) ‘tools for students with disabilities’ (to support students), (6) ‘e-portfolio tools’ (online storage for tasks, essays, …), (7) ‘simulation tools’ (visualizations in 2D, 3D, virtual reality, augmented reality, …), (8) ‘assessment tools’ (software to facilitate tests and/or exams), (9) ‘games’ (educational games or tools with gamification elements such as levels, reward systems, avatars…), (10) ‘online learning environments’ (an environment in which students are guided in exploring and practicing learning content), (11) ‘collaboration tools’ (facilitating students to interact with each other, share their learning experiences and co-create) and (12) ‘presentation tools’ (software to present learning content).
Table 4 depicts an overview of these findings. ‘Learning management systems’ are reported to be used by the most teachers and at the highest frequency (66.22% daily). Besides ‘online learning environments’ (weekly 28.65% or even daily 10.54%) and ‘presentation tools’ (monthly 29.73%), other tools are reported to be used distinctly less. For example, many participants indicate never using ‘communication tools’ (63.78%), ‘e-portfolio tools’ (62.97%) and ‘blended learning tools’ (61.89%) to support personalized learning.
RQ2: What are Teachers’ Perceptions Towards Adaptivity Within DPL Tools?
For the second research question, the survey probed for teacher perceptions towards adaptivity in DPL. Based on the work of Vandewaetere and Clarebout (2014), a set of examples concerning the source, target, method and time of adaptivity was presented to the participants, which they rated from ‘not important at all’ to ‘very important’.
First, participants’ perceptions towards the most optimal source of adaptation were investigated. Examples were given, associated with three categories of source of adaptation: (1) affective, (2) cognitive and (3) meta-cognitive learner characteristics. Two examples related to cognitive learner characteristics, i.e., ‘student level’ and ‘student progress/pace’, were most often identified as ‘important’ (53.46% and 55.97%, respectively) and even ‘very important’ (40.25% and 36.48%, respectively). Next, the example related to meta-cognitive learner characteristics, i.e., ‘self-regulatory skills’, was the second most often rated ‘important’ (54.40%). Finally, two examples related to affective characteristics, i.e., ‘attitude/motivation toward the learning content’ and ‘self-confidence’, were rated ‘important’ by 49.69% and 52.20%, respectively (Tables 5 and 6).
Second, participants’ perceptions towards the most optimal target of adaptation were investigated. Again, a selection of examples related to two categories of targets, i.e., (1) content and (2) support/instruction, was presented. ‘Difficulty of exercises/content’ (category content) was most often identified as ‘important’ (61.95%) and ‘very important’ (31.45%).

Participants’ answers likewise varied between ‘important’ (62.89%) and ‘very important’ (17.30%) for ‘number of exercises/content’. For ‘order of the exercises/content’, answers varied between ‘neutral’ (27.04%), ‘important’ (48.11%) and ‘very important’ (15.72%).

Related to support/instruction, participants’ answers varied between ‘important’ and ‘very important’ for ‘degree of instruction’ (58.49% and 28.93%) and ‘degree of feedback’ (51.57% and 29.56%). For ‘number of recommendations’ (category support/instruction), by contrast, responses ranged between ‘neutral’ (46%) and ‘important’ (40%).
Third, participants’ perceptions towards the most optimal method of adaptation were investigated. Two sets of examples (one related to teacher control and one to learner control) were presented to the participants to rate from ‘not important at all’ to ‘very important’. Each set contained seven examples: control over (1) ‘the level of tasks’, (2) ‘type of tasks’, (3) ‘sequence of tasks’, (4) ‘number of tasks’, (5) ‘instruction prior to the tasks’, (6) ‘evaluation of the tasks’ and (7) ‘guidance during the tasks’.

Participants scored most examples related to student control as ‘important’ (see Table 7). For teacher control, most examples were scored ‘important’ and ‘very important’ (see Table 8).
Fourth, participants’ perceptions towards the most optimal time of adaptation were investigated. Participants were presented with examples related to static and dynamic adaptivity: (1) ‘a DPL tool with a pre-test which measures student characteristics in advance’ (static) and (2) ‘a DPL tool that measures how a student feels during the exercises’ (dynamic) were found to be ‘important’ by 55.24% and 53.65% respectively. A third example related to dynamic adaptivity, i.e., ‘change of level difficulty according to performance’, was not only found to be ‘important’ (52.38%), but also ‘very important’ (40.95%) (Tables 9 and 10).
RQ3: What are Teachers’ Perceptions Towards Dashboards Within DPL Tools?
There are many possible dashboard features within DPL tools, both for students and teachers. To examine perceptions concerning these features, participants were shown two sets comprising dashboard features for students and teachers as mentioned by Maseleno et al. (2018), Schwendimann et al. (2017), Teasley (2017) and van Leeuwen et al. (2019). Each set holds eight features: (1) ‘an overview of completed tasks’, (2) ‘time spent on tasks’, (3) ‘an overview of results’, (4) ‘an overview of progress’, (5) ‘a failure analysis’, (6) ‘an overview of completed learning goals’, (7) ‘feedback’ and (8) ‘a deadline planning’. Participants were asked to rate the importance of these features separately for student dashboards and teacher dashboards from ‘not important at all’ to ‘very important’. In general, teachers show positive perceptions towards both feature sets, scoring most features important to very important. As Table 11 depicts, participants find student dashboards most optimal when containing feedback features such as ‘failure analysis’ and a ‘feedback system’ (by 45.54% ‘very important’ and 44.90% ‘important’ respectively). They also value overviews of ‘progress’, ‘results’ and ‘completed tasks’ as ‘important’ (by 50%, 55.73% and 56.69% respectively). In addition, deadline planning features were likewise rated as ‘important’ (51.91%). For teacher dashboards (see Table 11), participants mainly desire features to follow up the learning process such as ‘failure analysis’ and ‘completed learning goals’ (respectively rated as ‘very important’ by 53.50% and 46.18%). In addition, overviews of ‘time spent on tasks’, ‘progress’, ‘completed tasks’, ‘feedback systems’ and ‘deadline planning’ are rated as ‘important’ (by 55.41%, 55.41%, 54.14%, 53.82% and 51.59% respectively).
RQ4: What are Teachers’ Expectations Regarding Support in View of Implementing DPL Tools?
For the fourth research question, participants’ expectations were examined concerning support for implementing DPL tools. A list of types of support mentioned by various authors (Groff, 2017; Holmes et al., 2018; Marienko et al., 2020; Shaikh & Khoja, 2012) was presented to the participants to score from ‘not important at all’ to ‘very important’. Regarding technical support, various types such as ‘e-manuals, user videos, websites’ (43.31% important, 32.80% very important) and ‘ICT coordinator/support at school’ (40.76% important, 30.89% very important) are valued by the participants. In terms of didactic support, ‘books, websites’ (51.91%) and ‘training courses’ (45.54%) were scored highest as ‘important’ (Tables 12 and 13).
RQ5: What Different Teacher Profiles with Respect to a) Reported Use, b) Perceptions of Adaptivity, c) Perceptions of Dashboards and d) Expectations of Support can be Identified?
For the fifth research question, a K-means cluster analysis was used to identify different groups among participants. Four variables were considered: (1) ‘technology use’ relating to participants’ reported use of technology to foster personalized learning, (2) ‘adaptivity’ relating to participants’ perceptions concerning types of adaptivity, (3) ‘dashboards’ relating to participants’ perceptions concerning student and teacher dashboard features, (4) ‘support’ relating to participants’ expected support when implementing DPL tools. By using the software SPSS 28.0, three clusters among participants were identified based on average respondents’ scores in each cluster group (see Table 14 and Appendix 1).
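The clustering step above can be sketched in code. The study itself used SPSS 28.0; the sketch below uses scikit-learn instead, and the data, variable names and random seed are illustrative assumptions, not the authors’ dataset.

```python
# Hypothetical sketch of the K-means clustering step described in the text.
# The study used SPSS 28.0; scikit-learn is substituted here, and the data
# below are randomly generated stand-ins for the 370 respondents.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Four composite variables per teacher: technology use, adaptivity,
# dashboards and support (e.g. aggregated Likert scores per scale).
X = rng.uniform(1, 5, size=(370, 4))

# Standardize so no variable dominates the Euclidean distances, then fit k=3.
X_std = StandardScaler().fit_transform(X)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X_std)

labels = kmeans.labels_            # cluster assignment for each teacher
centers = kmeans.cluster_centers_  # average (standardized) profile per cluster
```

Inspecting `centers` row by row is the analogue of comparing average respondents’ scores per cluster group, which is how the ‘high’, ‘medium’ and ‘low technology user’ labels were derived in the study.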
The names of the clusters, ‘high technology users’, ‘medium technology users’ and ‘low technology users’, are based on what the numbers represent with respect to the Likert scale (1 = never, 5 = daily). Cluster 1 can be referred to as ‘high technology users’ as they report the most use of ‘software’. The teachers within this cluster generally have positive perceptions towards ‘adaptivity’ and ‘dashboards’, and additionally value ‘support’. Cluster 2 can be referred to as ‘medium technology users’ and shares, similar to cluster 1, positive perceptions towards ‘adaptivity’, ‘dashboards’ and ‘support’. Cluster 3 can be referred to as ‘low technology users’, given they report the lowest use of ‘software’. It is rather remarkable that this group, similar to clusters 1 and 2, also shares positive perceptions towards ‘adaptivity’, ‘dashboards’ and ‘support’. Altogether, all three clusters seem to acknowledge the value of DPL, as the results identify little variety in perceptions towards DPL tool features (such as adaptivity and dashboards) and small variety in expectations towards implementation support. However, great variety between clusters is notable in software use for personalized learning.
Discussion
Summary and Interpretation Findings
Many argue DPL can be beneficial for both teachers and students (Baker, 2016; Pfeiffer et al., 2021; Shaikh & Khoja, 2012). However, implementing DPL tools can be very challenging and demanding for teachers (Shaikh & Khoja, 2012). Previous work brings to light that there is a considerable gap between what teachers want and what happens in their classrooms (Van Schoors et al., under review). The same conclusion can be drawn from the results of this study: although teachers seem to acknowledge the value of DPL tools, their reported technology use (RQ1) does not align with their positive perceptions (RQ2, RQ3) and expectations (RQ4). Thus, for several participants, there seems to be a gap between their optimal learning environment (perceptions and expectations) and their daily practice (behavior). These findings align with results from other large-scale surveys. This is, for example, shown by Hämäläinen et al. (2021), who found that teachers acknowledged the importance of digital technologies in the classroom but also displayed large variance in their ICT competences and usage.
Nonetheless, several studies (e.g. Vanderlinde & Van Braak, 2010) emphasize that differences in perceptions strongly influence ICT use: teachers with low expectations or perceptions will be less inclined to deploy technology in the classroom. It is therefore somewhat surprising that the results of this study do not detect different perceptions across the three identified groups (low, medium and high technology users).
Next to perceptions, other possible reasons for variation in technology use remained undetected as well. Existing frameworks can be used as a starting point for reflection. For example, Ferede et al. (2021) investigated ICT use of teachers in higher education by assessing their experiences and opinions. Alongside determinants related to the institution (e.g. ICT plan/vision) or the individual (e.g. teacher competences and attitudes), they discuss ICT infrastructure and equipment as important assets. One explanation for the difference in technology use in this study could be a lack of adequate hardware or software in the participants’ schools. Furthermore, Ferede et al. (2021) touch upon professional development and technical support as imperative conditions to overcome implementation chasms of new technology (see e.g. Moore, 1991). It is advised to invest in tailored training, especially since the implementation of DPL tools in education is a challenging process for teachers (Groff, 2017; Marienko et al., 2020; Shaikh & Khoja, 2012). This study showed that teachers are particularly willing to engage in training, as they found several support initiatives important. It can be interesting for researchers, policymakers and software developers to further investigate specific teacher needs towards DPL implementation, so that they can optimize and refine those support initiatives. In addition, it can also be useful to consider teachers’ needs towards DPL tools. This study revealed insights into teachers’ perceptions towards adaptivity within DPL tools. For example, participants showed their interest in sources of adaptivity related to cognitive, affective and meta-cognitive learner characteristics, with scores mostly varying between ‘important’ and ‘very important’. However, several reviews and meta-analyses (e.g. Bernacki et al., 2021; Van Schoors et al., 2021) acknowledge that adaptivity within DPL tools is often limited to cognitive learner characteristics (e.g. mastery learning).
Next to adaptivity, this study also revealed teachers’ positive perceptions towards dashboards, as all features were perceived to be important. It is encouraging to see that teachers appreciate dashboard visualizations, as these can indeed assist and empower them to enhance personalization practices (Molenaar, 2021).
In sum, if researchers, policy makers and software developers further optimize support and DPL tools according to teachers’ needs, teachers will more likely see the charm of DPL instead of all the chasms, and will accordingly be more willing to implement DPL in their classrooms.
Limitations and Follow-up Research
Despite the merits of this study, there are some limitations. The results are obtained from an online self-reported survey. A first limitation is that online surveys invoke less accountability: teachers who are more invested in technology may have been more inclined to respond, so the sample could overrepresent participants with, for example, a higher acceptance of technology. In addition, a survey always invokes the risk of socially desirable responding, as participants could report falsely about their behavior (Fisher, 1993). Further research should investigate DPL use through observations to see if teachers’ reported use matches their actual use. Another interesting pathway for further research is investigating the research questions in the post-COVID-19 context to see whether there are differences or similarities compared to the current results. Finally, the descriptive results of this study can only show a trend, not a causal relationship.
This study could raise awareness in policy, research and the technology industry on several aspects of DPL integration in education and teachers’ point of view in this matter. In general, a focus on teachers’ needs is limited (Groff, 2017). Considering teachers’ needs through dialogue and co-design has the potential to move the DPL field forward and establish more powerful co-teaching technologies (Groff, 2017). Although no differences in perceptions are identified in this study, it is acknowledged that positive teacher perceptions could lead to better DPL implementation (e.g. Groff, 2017; Holmes et al., 2018). These perceptions can be fostered via adequate professionalization opportunities, which enhance teacher competences and attitudes concerning DPL. As Groff (2017) states: “Teachers are integral to the development of these learning technologies and practices – and need significant support to do so” (p. 27).
Conclusion
By means of a survey, the goal of this study was to identify teachers’ reported technology use for personalized learning, their perceptions towards DPL tool features (such as adaptivity and dashboards) and their expectations towards implementation support. On the one hand, the results indicate a consensus concerning the value of DPL: participants appreciate DPL tools and consider most adaptive features important. On the other hand, these positive perceptions do not align with their reported technology use to foster personalized learning, as three distinct groups of teachers were identified (low, medium and high technology users). This study provides unique insights into the teacher point of view, which is still underexamined in the field of DPL. However, our study also stresses the need for future research to detect possible reasons for the differences in technology use.
References
Aleven, V., McLaughlin, E. A., Glenn, R. A., & Koedinger, K. R. (2017). Instruction based on adaptive learning technologies. In R. E. Mayer & P. Alexander (Eds.), Handbook of Research on Learning and Instruction (2nd ed., pp. 522–560). Routledge.
Baker, R. S. (2016). Stupid tutoring systems, intelligent humans. International Journal of Artificial Intelligence in Education, 26(2), 600–614. https://doi.org/10.1007/s40593-016-0105-0
Basham, J. D., Hall, T. E., Carter, R. A., Jr., & Stahl, W. M. (2016). An operationalized understanding of personalized learning. Journal of Special Education Technology, 31(3), 126–136. https://doi.org/10.1177/0162643416660835
Bernacki, M. L., Greene, M. J., & Lobczowski, N. G. (2021). A systematic review of research on personalized learning: Personalized by whom, to what, how, and for what purpose(s)? Educational Psychology Review, 33, 1675–1715. https://doi.org/10.1007/s10648-021-09615-8
Bulger, M. (2016). Personalized learning: The conversations we’re not having. Data and Society, 22(1). Retrieved January 01, 2022, from https://www.datasociety.net/pubs/ecl/PersonalizedLearning_primer_2016.pdf
Ferede, B., Elen, J., Van Petegem, W., Hunde, A. B., & Goeman, K. (2021). Determinants of instructors’ educational ICT use in Ethiopian higher education. Education and Information Technologies. https://doi.org/10.1007/s10639-021-10606-z
Fisher, R. J. (1993). Social desirability bias and the validity of indirect questioning. Journal of Consumer Research, 20(2), 303. https://doi.org/10.1086/209351
Groff, J. S. (2017). Personalized learning. Retrieved January 01, 2022, from https://curriculumredesign.org/wp-content/uploads/PersonalizedLearning_CCR_April2017.pdf
Grossman, P. L. (1990). The making of a teacher: Teacher knowledge and teacher education. Teachers College Press.
Grossman, P. L. (1991). Overcoming the apprenticeship of observation in teacher education coursework. Teaching and Teacher Education, 7, 345–357. https://doi.org/10.1016/0742-051X(91)90004-9
Hall, G., & Hord, S. (1987). Change in schools: Facilitating the process. State University of New York Press.
Hämäläinen, R., Nissinen, K., Mannonen, J., Lämsä, J., Leino, K., & Taajamo, M. (2021). Understanding teaching professionals’ digital competence: What do PIAAC and TALIS reveal about technology-related skills, attitudes, and knowledge? Computers in Human Behavior, 117, 106672. https://doi.org/10.1016/j.chb.2020.106672
Heymans, P. J., Godaert, E., Elen, J., van Braak, J., & Goeman, K. (2018). MICTIVO2018. Monitor voor ICT-integratie in het Vlaamse onderwijs. Eindrapport van O&O-opdracht: Meting ICT-integratie in het Vlaamse onderwijs (MICTIVO). KU Leuven / Universiteit Gent.
Holmes, W., Anastopoulou, S., Schaumburg, H., & Mavrikis, M. (2018). Technology-enhanced personalised learning: Untangling the evidence. Robert Bosch Stiftung GmbH, Stuttgart. Retrieved January 01, 2022, from http://www.studie-personalisiertes-lernen.de/en/
Holstein, K., Aleven, V., & Rummel, N. (2020). A Conceptual Framework for Human–AI Hybrid Adaptivity in Education. In I. I. Bittencourt, M. Cukurova, K. Muldner, R. Luckin, & E. Millán (Eds.), Artificial Intelligence in Education (vol. 12163, pp. 240–254). Springer International Publishing. https://doi.org/10.1007/978-3-030-52237-7_20
Hord, S. M., Rutherford, W. L., Huling-Austin, L., & Hall, G. E. (1987). Taking Charge of Change. Association for Supervision and Curriculum Development.
Kaiser, G., & König, J. (2019). Competence measurement in (mathematics) teacher education and beyond: Implications for policy. Higher Education Policy, 32(4), 597–615. https://doi.org/10.1057/s41307-019-00139-z
Knoop-van Campen, C., & Molenaar, I. (2020). How teachers integrate dashboards into their feedback practices. Frontline Learning Research, 37–51. https://doi.org/10.14786/flr.v8i4.641
Kolchenko, V. (2018). Can modern AI replace teachers? Not so fast! Artificial intelligence and adaptive learning: Personalized Education in the AI age. HAPS educator, 22(3), 249–252. https://doi.org/10.21692/haps.2018.032
Major, L., & Francis, G. A. (2020). Technology-supported personalised learning: Rapid evidence review. EdTechHub. https://doi.org/10.5281/zenodo.3948175
Marienko, M., Nosenko, Y., Sukhikh, A., Tataurov, V., & Shyshkina, M. (2020). Personalization of learning through adaptive technologies in the context of sustainable development of teachers’ education. E3S Web of Conferences, 166, 10015. https://doi.org/10.1051/e3sconf/202016610015
Maseleno, A., Sabani, N., Huda, M., Ahmad, R., AzmiJasmi, K., & Basiron, B. (2018). Demystifying learning analytics in personalised learning. International Journal of Engineering & Technology, 7(3), 1124. https://doi.org/10.14419/ijet.v7i3.9789
Molenaar, I. (2021). Personalisation of learning: Towards hybrid human-AI learning technologies. In OECD Digital Education Outlook - Pushing the Frontiers with Artificial Intelligence, Blockchain and Robots (pp. 57–77). https://read.oecd-ilibrary.org/education/oecd-digital-education-outlook-2021_589b283f-en#page125
Moore, G. A. (1991). Crossing the Chasm: Marketing and Selling High-Tech Goods to Mainstream Customers. Harper Business.
Pfeiffer, A., Bezzina, S., Dingli, A., Wernbacher, T., Denk, N., & Fleischhacker, M. (2021). Adaptive Learning and Assessment: From The Teachers’ Perspective, 375–379. https://doi.org/10.21125/inted.2021.0103
Reich, J. (2020). Failure to disrupt: Why technology alone can’t transform education. Harvard University Press.
Romesburg, H. (2004). Cluster Analysis For Researchers. Lulu Press.
Schwendimann, B. A., Rodriguez-Triana, M. J., Vozniuk, A., Prieto, L. P., Boroujeni, M. S., Holzer, A., Gillet, D., & Dillenbourg, P. (2017). Perceiving learning at a glance: A systematic literature review of learning dashboard research. IEEE Transactions on Learning Technologies, 10(1), 30–41. https://doi.org/10.1109/TLT.2016.2599522
Shaikh, Z. A., & Khoja, S. A. (2012). Role of teacher in personal learning environments. Digital Education Review, (21), 23–32. Retrieved January 01, 2022, from https://www.learntechlib.org/p/55234/
Shemshack, A., & Spector, J. M. (2020). A systematic literature review of personalized learning terms. Smart Learning Environments, 7, 33. https://doi.org/10.1186/s40561-020-00140-9
SRI International (2018). Using Technology to Personalize Learning in K–12 Schools. SRI International. Retrieved January 01, 2022, from https://www.sri.com/work/publications/using-technology-personalize-learning-k-12-schools
Tabak, I. (2004). Synergy: A complement to emerging patterns of distributed scaffolding. Journal of the Learning Sciences, 13(3), 305–335. https://doi.org/10.1207/s15327809jls1303_3
Teasley, S. D. (2017). Student facing dashboards: One size fits all? Technology, Knowledge and Learning, 22(3), 377–384. https://doi.org/10.1007/s10758-017-9314-3
van Leeuwen, A., Rummel, N., & van Gog, T. (2019). What information should CSCL teacher dashboards provide to help teachers interpret CSCL situations? International Journal of Computer-Supported Collaborative Learning, 14(3), 261–289. https://doi.org/10.1007/s11412-019-09299-x
Van Schoors, R., Elen, J., Raes, A., & Depaepe, F. (2021). An overview of 25 years of research on digital personalised learning in primary and secondary education: A systematic review of conceptual and methodological trends. British Journal of Educational Technology, 52(5), 1798–1822. https://doi.org/10.1111/bjet.13148
Vanderlinde, R., & van Braak, J. (2010). The e-capacity of primary schools: Development of a conceptual model and scale construction from a school improvement perspective. Computers & Education, 55(2), 541–553. https://doi.org/10.1016/j.compedu.2010.02.016
Vandewaetere, M., & Clarebout, G. (2014). Advanced Technologies for Personalized Learning, Instruction, and Performance. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (pp. 425–437). Springer New York. https://doi.org/10.1007/978-1-4614-3185-5_34
Walkington, C., & Bernacki, M. L. (2020). Appraising research on personalized learning: Definitions, theoretical alignment, advancements, and future directions. Journal of Research on Technology in Education, 52(3), 235–252. https://doi.org/10.1080/15391523.2020.1747757
Werquin, P. (2010). Recognition of non-formal and informal learning: Country practices. Retrieved January 01, 2022, from https://www.oecd.org/edu/skills-beyond-school/44600408.pdf
Xie, H., Chu, H. C., Hwang, G. J., & Wang, C. C. (2019). Trends and development in technology-enhanced adaptive/personalized learning: A systematic review of journal publications from 2007 to 2017. Computers & Education, 140, 103599. https://doi.org/10.1016/j.compedu.2019.103599
Zheng, L., Long, M., Zhong, L., & Gyasi, J. F. (2022). The effectiveness of technology-facilitated personalized learning on learning achievements and learning perceptions: A meta-analysis. Education and Information Technologies, 1–24. https://doi.org/10.1007/s10639-022-11092-7
Acknowledgements
This study was made possible by the i-Learn project which aims to make an open portal that supports DPL-tools for Flemish primary and secondary schools. We thank imec-living-labs for digitalizing the survey. We also thank the participants who took part in this study.
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Statements on Data, Ethics and Conflict of Interest
Within the broader research project of i-Learn, all data collections are approved by an ethics committee (file number: G-2019 10 1798). The author declares that there is no conflict of interest/involvement with entities that have (non-)financial interest in this research.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix
Appendix
Appendix 1: Extra Comment Section
i. Population data, used to indicate representativeness of the sample, was extracted from the Flemish Department of Education website (www.onderwijs.vlaanderen.be). All data sourced from the schoolyear 2019–2020.
ii. Percentages are rounded up (if the next decimal is five or above) and down (if the next decimal is four or less) to the nearest number of decimals.
iii. Bold text is used to highlight the highest percentages.
iv. For ‘software’ we did not average the scores; instead we used the original sum of scores because, in comparison with ‘adaptivity’, ‘dashboards’ and ‘support’, the differences of the Likert scale were not proportional.
Appendix 2: An Overview of the Survey Questions
No | Part one: Current use of digital technology for personalized learning 5-point Likert scale: (1) Never, (2) 1–3 times a year, (3) Monthly, (4) Weekly, (5) Daily |
---|---|
1 | How often did you use learning management systems to support personalized learning? |
2 | How often did you use authoring tools to support personalized learning? |
3 | How often did you use tools for blended learning to support personalized learning? |
4 | How often did you use communication tools to support personalized learning? |
5 | How often did you use tools for students with disabilities to support personalized learning? |
6 | How often did you use e-portfolio tools to support personalized learning? |
7 | How often did you use simulation tools to support personalized learning? |
8 | How often did you use assessment tools to support personalized learning? |
9 | How often did you use games to support personalized learning? |
10 | How often did you use online learning environments to support personalized learning? |
11 | How often did you use collaboration tools to support personalized learning? |
12 | How often did you use presentation tools to support personalized learning? |
No | Part two: Perceptions towards adaptivity within DPL tools 5-point Likert scale: (1) Not important at all, (2) Not important, (3) Neutral, (4) Important, (5) Very important |
---|---|
Source of adaptation | |
1 | To what extent do you think it is important for a digital personalized tool to take into account attitude/motivation of the student? |
2 | To what extent do you think it is important for a digital personalized tool to take into account level of the student? |
3 | To what extent do you think it is important for a digital personalized tool to take into account self-confidence of the student? |
4 | To what extent do you think it is important for a digital personalized tool to take into account progress/pace of the student? |
5 | To what extent do you think it is important for a digital personalized tool to take into account self-regulatory skills of the student? |
Target of adaptivity | |
1 | To what extent do you think it is important for a digital personalized tool to adapt the difficulty of exercises/content? |
2 | To what extent do you think it is important for a digital personalized tool to adapt the number of exercises/contents? |
3 | To what extent do you think it is important for a digital personalized tool to adapt the order of exercises/content? |
4 | To what extent do you think it is important for a digital personalized tool to adapt the degree of feedback? |
5 | To what extent do you think it is important for a digital personalized tool to adapt the degree of instruction? |
6 | To what extent do you think it is important for a digital personalized tool to adapt the number of recommendations? |
Method of adaptation | |
1 | To what extent do you think it is important for a student to have control over the level of tasks? |
2 | To what extent do you think it is important for a student to have control over the type of tasks? |
3 | To what extent do you think it is important for a student to have control over the sequence of tasks? |
4 | To what extent do you think it is important for a student to have control over the number of tasks? |
5 | To what extent do you think it is important for a student to have control over the instruction of tasks? |
6 | To what extent do you think it is important for a student to have control over the evaluation of tasks? |
7 | To what extent do you think it is important for a student to have control over the guidance of tasks? |
8 | To what extent do you think it is important for a teacher to have control over the level of tasks? |
9 | To what extent do you think it is important for a teacher to have control over the type of tasks? |
10 | To what extent do you think it is important for a teacher to have control over the sequence of tasks? |
11 | To what extent do you think it is important for a teacher to have control over the number of tasks? |
12 | To what extent do you think it is important for a teacher to have control over the instruction of tasks? |
13 | To what extent do you think it is important for a teacher to have control over the evaluation of tasks? |
14 | To what extent do you think it is important for a teacher to have control over the guidance of tasks? |
Time of adaptation | |
1 | To what extent do you think it is important for a digital personalized tool to challenge students during practice with different levels of difficulty according to their performances? |
2 | To what extent do you think it is important for a digital personalized tool to start with a test/questionnaire to identify certain characteristics (e.g. prior knowledge, interests,…) of the learner in advance? |
3 | To what extent do you think it is important for a digital personalized tool to consider how the student feels during practice (e.g. a student can indicate how he/she feels)? |
No | Part three: Perceptions towards dashboards within DPL tools 5-point Likert scale: (1) Not important at all, (2) Not important, (3) Neutral, (4) Important, (5) Very important |
---|---|
1 | To what extent do you think it is important for a student dashboard to include an overview of completed tasks? |
2 | To what extent do you think it is important for a student dashboard to include an overview of time spent on tasks? |
3 | To what extent do you think it is important for a student dashboard to include an overview of results? |
4 | To what extent do you think it is important for a student dashboard to include an overview of progress? |
5 | To what extent do you think it is important for a student dashboard to include a failure analysis? |
6 | To what extent do you think it is important for a student dashboard to include an overview of completed learning goals? |
7 | To what extent do you think it is important for a student dashboard to include an overview of feedback? |
8 | To what extent do you think it is important for a student dashboard to include an overview of deadlines? |
9 | To what extent do you think it is important for a teacher dashboard to include an overview of completed tasks? |
10 | To what extent do you think it is important for a teacher dashboard to include an overview of time spent on tasks? |
11 | To what extent do you think it is important for a teacher dashboard to include an overview of results? |
12 | To what extent do you think it is important for a teacher dashboard to include an overview of progress? |
13 | To what extent do you think it is important for a teacher dashboard to include a failure analysis? |
14 | To what extent do you think it is important for a teacher dashboard to include an overview of completed learning goals? |
15 | To what extent do you think it is important for a teacher dashboard to include an overview of feedback? |
16 | To what extent do you think it is important for a teacher dashboard to include an overview of deadlines? |
No | Part four: Support for implementing DPL tools 5-point Likert scale: (1) Not important at all, (2) Not important, (3) Neutral, (4) Important, (5) Very important |
---|---|
1 | To what extent do you think support from the software provider is important in order to implement DPL tools? |
2 | To what extent do you think e-manuals, user videos and websites are important in order to implement DPL tools? |
3 | To what extent do you think support from the ICT department or school support is important in order to implement DPL tools? |
4 | To what extent do you think meetings with colleagues are important in order to implement DPL tools? |
5 | To what extent do you think books/websites are important in order to implement DPL tools? |
6 | To what extent do you think informal conversations are important in order to implement DPL tools? |
7 | To what extent do you think training courses are important in order to implement DPL tools? |
8 | To what extent do you think online platforms are important in order to implement DPL tools? |
9 | To what extent do you think pedagogical support services are important in order to implement DPL tools? |
Rights and permissions
Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Van Schoors, R., Elen, J., Raes, A. et al. The Charm or Chasm of Digital Personalized Learning in Education: Teachers’ Reported Use, Perceptions and Expectations. TechTrends 67, 315–330 (2023). https://doi.org/10.1007/s11528-022-00802-0
Accepted:
Published:
Issue Date:
DOI: https://doi.org/10.1007/s11528-022-00802-0