Abstract
The COVID-19 pandemic challenged universities to maintain teaching, leading to online classes becoming the standard teaching mode and accelerating digitalization. Learning from the influence of these developments on students’ technology commitment may hold valuable information for various stakeholders. The present study investigated the development of three facets of technology commitment in higher education during the first two semesters under the COVID-19 pandemic: technology acceptance, technology competence belief, and technology control belief. The sample consisted of N = 132 graduate students at one German university who filled out questionnaires at two measurement points in two waves. The change in all three facets of technology commitment over time was examined with latent change models. There was a significant increase in technology competence belief. This change was stronger for students in the second COVID-19 semester than for those in the first COVID-19 semester. Participants’ age, sex, and the number of webinars attended during the semester of data collection had no significant effect on the change in the three facets of technology commitment. Overall, the present study provides new insights into the development of technology commitment during the COVID-19 pandemic, proposes an explanatory approach for the change in technology commitment, and emphasizes the relevance of direct experience with technology in the development of technology competence belief at different skill levels. The results indicate that students can increase their level of technology competence belief by engaging directly with new technology.
1 Introduction
In December 2019, COVID-19 emerged and paralyzed the world for an indeterminate time. The obvious impact of the pandemic triggered a need for location-independent workplaces and acted as an accelerator of (involuntary) digitalization. Gärtner et al. [1] pointed out that the COVID-19 pandemic led to difficulties in university life and teaching operations due to the sudden change in contextual factors and limited experience with digital education. Combined with Yüce’s studies [2,3,4], it becomes clear that various nations struggled with the same problems and that experiencing oneself dealing effectively with technology is an essential task of contemporary times. Distance teaching in higher education went from a less popular option to the means of choice. Coincidentally, it opened an exceptional chance to study how social interaction within online classes in the form of web conferences, the most widely applied digital form of teaching [5], shaped the technology commitment of a broad user spectrum. In this exceptional situation, the majority of students at German universities were facing a largely new technology by participating in webinars instead of face-to-face teaching. These exceptional circumstances made it possible to investigate students' development of technology commitment during an unprecedented digitalization push. Given the importance of investigating technology commitment during COVID-19, it is not surprising that other authors have already focused on this topic [6, 7]. However, previous studies have produced contradictory findings: while Berling et al. [6] found an increase in technology competence belief during the COVID-19 pandemic, Rogge et al. [7] found increases in technology acceptance and technology control belief, but not in technology competence belief.
Based on these contradictory findings, the present study aimed to generate further evidence concerning change in technology commitment during COVID-19. More specifically, it examined change in technology commitment in higher education students during the first and second semesters of the COVID-19 pandemic. To this end, the three facets of technology commitment were measured according to Neyer et al.’s [8] technology commitment model at the beginning and the end of online classes. Hypotheses about changes in the facets of technology commitment were derived based on theoretical deliberations with a particular focus on experiential learning processes, as postulated in experiential learning theory [9, 10], as well as empirical findings on change in technology commitment during COVID-19 [6, 7].
A change in technology commitment during the COVID-19 pandemic would be of great interest to various interest groups, such as students, lecturers, and curriculum managers. First and foremost, students could benefit from improving their technology commitment, as this may also improve their learning experiences. They could be more empowered to use digital tools and resources, leading to a richer and more interactive learning environment for students [11]. At the same time, a change in students' willingness to use technology would be important for lecturers. Teachers could use appropriate technological tools profitably depending on the students' technology commitment [12]. Curriculum managers could use the results to optimize and further develop their degree programs with regard to comprehensible technology interaction, ensuring that students have acquired the necessary technology skills at various points in their studies [12].
2 Theoretical background
2.1 Technology commitment
Neyer et al.’s [8] model of technology commitment is based on Davis’ [13,14,15] technology acceptance model, which explains technology use by technology acceptance, as well as more recent developments in personality psychology that consider individual differences in technology-related beliefs. The technology acceptance model is a variation of the theory of reasoned action [16, 17] but does not implement personality factors like competence and control belief, which are considered in more recent developments of the theory of reasoned action [18]. One attempt to close this research gap was made by Arning and Ziefle [19], who studied users’ attitudes toward technology and their performance and considered personality characteristics as moderators. More recently, Neyer et al. [8] criticized the technology acceptance model for missing personality traits and proposed an integrative model of technology commitment that consists of three facets: technology acceptance, technology competence belief, and technology control belief.
Technology acceptance represents an individual’s subjective opinion about technology and technological change. It refers to individuals’ personal relationship with technology rather than the perceived importance of technological innovation for society. During the COVID-19 pandemic, internet-based videoconferencing in the form of online classes made it possible to maintain teaching at universities. The experience of being able to take part in online classes, in contrast to the alternative of not taking part and being slowed and inhibited in one’s own university career, should have led to increased acceptance of online classes and their advancements at this time.
Technology competence belief is defined as the expectation of being able to act proficiently in interactions with technology. It can be considered a self-concept that is based on individual experiences with technology and the expected ability to transfer existing experience to new technology and use it effectively. During the COVID-19 semesters, many students had the experience of using online classes successfully. This experience could have fostered the self-perception of acting competently with new technologies.
Technology control belief relates to individuals’ expectations concerning the results of their technology-related actions. It reflects the extent to which individuals perceive technology as controllable. With increasing experience with online conferences, one’s own handling of this technology is likely to become more confident. With increased use of a technology, one encounters similar problems repeatedly, which one can handle more easily with growing experience. As a result of the experience of satisfactorily handling problems with web conferences, one could develop the perception of being able to control this concrete technology to some degree.
Technology commitment has been considered in various studies and from different perspectives, which emphasizes the importance of the construct. For example, Wicki et al. showed in their study on user behavior regarding autonomous vehicles that technology commitment significantly influences the choice of whether to use an autonomous vehicle [20]. Furthermore, Kretschmer and Terhaen evaluated a virtual reality-based training course from the field of intralogistics and examined, among other things, how technology commitment relates to usability, user experience, mental workload, and intrinsic motivation. Their study found a significant negative correlation between the technology acceptance facet and mental workload [21]. In addition to the contexts of automated transport and work-related virtual training, the importance of technology commitment has also been examined in health- and education-related studies. For example, in a study by Voderholzer et al. on whether technology commitment affects the outcomes of videoconference-based cognitive behavioral therapy, it was found that neither technology commitment nor a sustainable working alliance affected therapy success [22]. Ratz et al. investigated the influence of technology commitment on the identification of health-related lifestyles associated with dropout from web-based intervention studies on physical activity in older age. They found that although latent lifestyle profiles were associated with dropout, they were independent of technology commitment [23].
In education, a study by Reich-Stieber and Eyssel on attitudes toward educational robots showed that technology commitment is a significant predictor of attitudes toward educational robots [24]. Furthermore, some exploratory studies have examined technology commitment during COVID-19. In one study, which mainly focused on the relationships between mobile work and work ability, Berling et al. [6] measured technology competence at two measurement points before and after the outbreak of the pandemic in a sample of employees in Germany. They found a significant increase in technology competence across time, suggesting that the pandemic may have led to an increase in this technology commitment facet.
In contrast to Berling et al. [6], Rogge et al. [7] examined the development in all three facets of technology commitment after the beginning of the pandemic. Specifically, they measured technology commitment at three measurement points (T1, T2, T3) with three cohorts of teacher trainees studying in a bachelor’s program at a German university at the beginning (T1) and the end of the lecture period (T2) and at the end of the semester (T3) in the first three semesters under COVID-19. They found an increase in technology acceptance in the summer semester of 2021 as well as increases in technology control belief in the summer semester of 2020 and the winter semester of 2020/21. Interestingly, technology competence belief showed no significant change in any semester under COVID-19 in their study.
2.2 Experiential learning theory
Experiential learning theory [9, 10] describes the process of learning through experience as well as adult development and offers a theoretical framework for experience-led education. To corroborate hypotheses about change in the facets of technology commitment, the focus in this section is on the core concept of the experiential learning theory: the learning cycle.
The learning cycle depicts a framework of learning that gains its dynamic from two dialectical processes: action versus reflection and experience versus abstraction. All four modes of the learning cycle are understood as experiences: learners grasp experience through Concrete Experience (CE) and Abstract Conceptualization (AC) and transform experience through Reflective Observation (RO) and Active Experimentation (AE). Integrating these patterns moves the learning cycle, in which a CE provides the basis for RO, which is transformed into an AC that is validated through AE.
For example, consider a student who struggles with an online class platform because she cannot hear and speak with the other students (CE). The student processes this experience and concludes she does not use the platform correctly. To investigate possible errors in her operation of the platform, she observes other students who also use the platform and compares her settings with those of her classmates (RO). Subsequently, she develops ideas about why she has experienced technical malfunctions (AC). After that, she tries several settings on the platform and enters the next online course with modified settings (AE). Consequently, she can participate actively and interact with others during the online class and has a much better experience (CE). In sum, she understands that it is productive to consider setting options to be able to participate in online classes adequately and satisfactorily.
Real-world examples like this illustrate that experiential learning theory, especially the concept of the learning cycle, applies to human–computer interaction and can explain technology-related learning through experience with the relevant technology. Furthermore, it is likely that the described learning experiences contribute to an increase in technology commitment toward the technology used.
3 The present study
The present study examined change in the three technology commitment facets—technology acceptance, technology competence belief, and technology control belief—in a sample of university students studying in the Master of Education during the first two semesters of the COVID-19 pandemic. A longitudinal study including two waves (first and second COVID-19 semester) was conducted, with two measurement points at the beginning and end of the respective semester (T1 and T2). An increase in all facets from T1 to T2 was hypothesized. Moreover, it was tested in an exploratory manner whether change in the various facets depended on survey wave, number of webinars attended in the respective semester, age, or sex.
Considering the theoretical deliberations above, as well as the fact that an increase in each facet of technology commitment during COVID-19 has been found in at least one empirical study [6, 7], increases in all three facets of technology commitment were expected. Specifically, an increase in technology acceptance was anticipated for theoretical reasons, as new digital technologies made it possible to keep teaching during the COVID-19 pandemic. This experience should have led to increasing acceptance of technology and technological advances among students. Regarding technology competence belief, an increase was expected, as students should have had the experience of successfully operating web conferences during the COVID-19 pandemic. According to experiential learning theory, growing experience with web conferencing should produce an increase in competence in using this digital technology, which should be reflected in higher technology competence belief. With regard to technology control belief, an increase was expected because, with growing experience with web conferencing, one should become more and more confident in dealing with it. Moreover, with repeated use of web conferencing, one encounters similar problems, which one can solve more easily as experience grows. As a result of this knowledge and the experience of being able to solve problems with web conferencing, an increasing impression of being able to control technology (here, the specific technology of web conferencing) to a certain extent is likely.
4 Method
4.1 Context of the study
The study took place at the beginning of the COVID-19 pandemic in the context of a research project that aimed to examine the impact of the pandemic on the student body and university operations. Due to the contact restrictions at the time of the project administration, many universities had switched their face-to-face teaching to online teaching, which went hand in hand with implementing new digital technologies, such as video conferencing. The fact that German students did not have sufficient digital competence before the pandemic [25] made an investigation of the development of technology commitment in the context of the pandemic particularly interesting. In the present study, students were confronted with a new teaching mode using webinars as a substitute for the face-to-face teaching in teacher training programs that was not feasible during the COVID-19 pandemic. The data were collected within seminars in which the lecturers sent the students links to online surveys, which the students completed during the seminars.
4.2 Research design
The present quantitative field study used a longitudinal design with two measurement points, T1 at the start and T2 at the end of the semester, in two waves (summer semester of 2020 and winter semester of 2020/2021). At each measurement point, participants were asked to fill out questionnaires on the construct of technology commitment, including the facets of technology acceptance, technology competence belief, and technology control belief.
This longitudinal approach allowed the same variables to be observed over a longer period to examine changes and developments in technology commitment. Thus, with regard to the research question of the present study, the longitudinal approach had the central advantage that changes in technology commitment during the COVID-19 pandemic could be examined. Moreover, this design allowed testing for possible differences in change in technology commitment between the first and second semesters under COVID-19.
4.3 Sample
The sample consisted of 132 students (65.15% female) with an average age of M = 25.2 years (SD = 5.07). All participants were teacher trainees in the Master of Education and attended an online class in educational psychology at one German university during the first year of the COVID-19 pandemic. On average, participants were in the ninth semester of their teacher training (M = 9.08, SD = 2.77). All participants studied several subjects: 80% studied a language, 73% social sciences, 27% natural sciences, and 20% art or sport. During online teaching, lecturers and students were at home on their own devices. On average, participants attended M = 4.86 online classes (SD = 3.94) in the semester of data collection. A power analysis showed that this sample size would have allowed the detection of effects of d = 0.3 with a power of 0.93 at an alpha level of 0.05 in paired-sample t-tests (two-sided). Although the hypotheses were tested using more sophisticated analyses (see Data Analyses section), this power analysis indicated that the study had sufficient power.
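The reported power value can be reproduced with standard statistical software. The following Python sketch is only an illustration (the tool the authors actually used is not stated); it recomputes the power of a two-sided paired-sample t-test with N = 132, d = 0.3, and α = 0.05:

```python
# Recompute the reported a-priori power (illustrative; the authors'
# actual software is unknown). A paired-sample t-test on N pairs has
# df = N - 1 and noncentrality parameter d * sqrt(N).
from statsmodels.stats.power import TTestPower

power = TTestPower().solve_power(effect_size=0.3, nobs=132,
                                 alpha=0.05, alternative="two-sided")
print(round(power, 2))  # 0.93, matching the value reported above
```

The computed value of approximately 0.93 agrees with the power reported in the text.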
4.4 Procedure
The data were collected during the summer semester of 2020 (first wave) and the winter semester of 2020/2021 (second wave). Thus, the survey period covered nearly the whole first year of the COVID-19 pandemic. Data collection took place in 13 online courses on educational psychology. Data at T1 were collected in the week following the first session; data at T2 in the week following the last session of each course. Data were collected using online questionnaires in LimeSurvey [26]. Every participant was assigned an individual identification number to ensure complete anonymity and to link the individual data from the two measurement points. The questionnaire was part of a larger survey for quality management of digital education during the COVID-19 pandemic. Participants rated their technology commitment at both measurement points. At T1, they also provided demographic information. At T2, they indicated how many webinars they had attended in the current semester. Completing each questionnaire took around 15 min.
To meet ethical requirements, participants were informed in the seminar about the objectives and course of the study, as well as about their rights and the voluntary nature of participation. This information also appeared in written form at the beginning of the online survey. Subsequently, participants were presented with a consent form, which they had to agree to in order to take part in the study. With their consent, they agreed to the collection, processing, and storage of their data, as well as to the transmission of the data as part of the study. Declining consent had no disadvantages or other consequences for participants.
4.5 Measures
4.5.1 Technology commitment
The three facets of technology commitment were measured with the 12-item technology commitment scale [8]. This scale consists of four items each to measure technology acceptance (e.g., “With regards to new technical developments, I am very curious”), technology competence belief (e.g., “For me, dealing with technical innovations is usually too much of a challenge”), and technology control belief (e.g., “It’s on me, if I’m successful with the usage of recent technological developments—it has little to do with chance or luck”). Participants rated all items at each measurement point on a 7-point Likert scale ranging from 1 = does not apply at all to 7 = fully applies. The negatively formulated items (those measuring technology competence belief) were reverse-coded. Cronbach’s α for all constructs at T1 and T2 ranged from 0.85 to 0.89. The scale has also proven reliable in previous studies, with Cronbach’s α ranging from 0.71 to 0.89 across all facets [10, 24, 27].
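The two scoring steps described above, reverse-coding the negatively worded competence-belief items on the 1–7 scale and computing Cronbach's α per facet, can be sketched as follows. The data are simulated for illustration, and the helper names are not from the original study:

```python
# Sketch of the scoring steps described above (simulated data, not the
# study's; function names are illustrative, not from the original study).
import numpy as np

def reverse_code(item_scores, scale_min=1, scale_max=7):
    """Reverse-code a 1-7 Likert item: 1 -> 7, 2 -> 6, ..., 7 -> 1."""
    return scale_max + scale_min - item_scores

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_persons, k_items) array of responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

rng = np.random.default_rng(0)
# 132 hypothetical respondents answering 4 negatively worded items
true_score = rng.normal(4, 1, size=(132, 1))
raw = np.clip(np.round(true_score + rng.normal(0, 0.8, size=(132, 4))), 1, 7)
scored = reverse_code(raw)  # reverse-coding leaves alpha unchanged
print(round(cronbach_alpha(scored), 2))
```

Because reverse-coding is an affine transformation, it does not change α; it only ensures that all items point in the same direction before scale scores are formed.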
Support for the scale’s validity was provided by Neyer et al. [8] in three studies with different samples. The authors examined the scale’s construct validity by deriving hypotheses, based on previous studies, about the correlations between the technology commitment scale and demographic variables, personality traits, intelligence, indicators of successful aging, and other scales for technology evaluation and experience. The results of the present study were compared with those of Neyer et al.’s studies [8]. For example, based on prior findings, Neyer et al. expected gender differences in the sense of higher levels of the technology commitment facets for men and were able to confirm this, a pattern that was replicated in the present study. Moreover, Neyer et al. assessed convergent, discriminant, and incremental validity based on correlations with related scales on technology use and technology experience. Criterion validity was assessed by testing whether technology commitment predicts current technology use and technology biography. Factorial validity was examined across the entire sample.
4.5.2 Control variables
As control variables, participants’ age, sex, and the number of webinars in which they had participated in the semester of data collection were considered. In addition, it was considered whether the data came from the first or second wave.
4.6 Data analyses
The analyses were conducted in Mplus 8.3 [28]. For model estimation, the implemented robust maximum likelihood estimator (MLR) was used. This estimator provides standard errors and goodness-of-fit statistics that are robust to possible non-normality [29]. Technology acceptance, technology competence belief, and technology control belief were specified as latent variables to account for measurement error at the indicator level. Effect coding was used to scale the latent variables to maintain the original metric [30]. As depicted in Table 1, scalar measurement invariance for all three facets of technology commitment was confirmed and specified to ensure comparability of the latent variables over time [31]. Although the Δχ2-test was significant in the model comparison for technology control belief, scalar measurement invariance was assumed, considering the small changes in the comparative fit index (CFI) of less than 0.01 and in the root mean square error of approximation (RMSEA) of less than 0.015 [32, 33]. To account for residual effects that could not be attributed to the latent constructs, correlations between the same indicator variables across time were allowed [34].
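The decision rule applied here, retaining invariance despite a significant Δχ2 as long as ΔCFI stays below 0.01 and ΔRMSEA below 0.015 [32, 33], can be expressed as a small helper. This is only a sketch; the function name and example values are ours, not the authors':

```python
# The measurement invariance decision rule described above: despite a
# significant chi-square difference test, invariance is retained when the
# changes in CFI and RMSEA stay below the cutoffs from [32, 33].
# (Helper name and example values are illustrative, not from the study.)
def invariance_tenable(delta_cfi, delta_rmsea):
    return abs(delta_cfi) < 0.01 and abs(delta_rmsea) < 0.015

print(invariance_tenable(0.004, 0.009))  # True: retain scalar invariance
print(invariance_tenable(0.020, 0.010))  # False: invariance not supported
```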
Three univariate latent change score models were computed to estimate change in technology acceptance, technology competence belief, and technology control belief over time [35, 36]. By way of illustration, Fig. 1 shows the model used to examine change in technology competence belief. In each model, the score at T2 was decomposed into the score at T1 and the difference between T2 and T1. To this end, the path coefficients from the facet at T1 and from the change score to the facet at T2 were fixed to 1. The intercepts of the facet at T1 and of the change score were freely estimated, whereas the intercept and the residual variance of the facet at T2 were set to 0. The facet at T1 and the change score were allowed to correlate. Furthermore, the change score and the facet at T1 were regressed on participants’ age, sex (0 = male, 1 = female), number of online courses attended, and the wave in which the data collection took place (0 = summer semester 2020, 1 = winter semester 2020/21) to examine whether these variables were related to the initial scores or to changes in the facets.
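For intuition, the decomposition underlying these models can be illustrated on manifest scale scores. The following sketch is only a simplified analogue of the latent models estimated in Mplus: it ignores measurement error and uses simulated data, not the study's:

```python
# Manifest-score analogue of the latent change decomposition T2 = T1 + Delta
# (simplified: ignores measurement error; data are simulated, not the study's).
import numpy as np

rng = np.random.default_rng(1)
t1 = rng.normal(4.5, 1.0, 132)  # hypothetical facet scores at T1
# Simulate change that is negatively related to the initial level
delta = 0.3 - 0.3 * (t1 - t1.mean()) + rng.normal(0, 0.4, 132)
t2 = t1 + delta  # the defining identity of the change score model

mean_change = (t2 - t1).mean()  # analogue of the latent change mean
r_initial_change = np.corrcoef(t1, t2 - t1)[0, 1]
print(round(mean_change, 2), round(r_initial_change, 2))
```

In the latent specification, the same identity is imposed by fixing both paths to the T2 facet to 1, so that the change variable absorbs all systematic change between the measurement points.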
Some participants did not fill out the questionnaire either at T1 (n = 12) or at T2 (n = 55). To deal with these missing values, the full information maximum likelihood (FIML) procedure implemented in Mplus was used. Under the missing at random (MAR) assumption, this model-based procedure for dealing with missing data is unbiased and preserves statistical power. Since no observations are deleted under FIML, it is considered superior to traditional methods such as listwise deletion [37].
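A small mock demonstration of why FIML was preferred: with the missingness pattern reported above, listwise deletion would retain only 132 - 12 - 55 = 65 complete cases, whereas FIML uses all available observations under MAR. The data below are simulated, not the study's:

```python
# Mock demonstration (not the study's data) of what listwise deletion would
# cost here: 12 participants missing at T1 and 55 at T2 leave 65 complete cases.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({"t1": rng.normal(4.5, 1, 132),
                   "t2": rng.normal(4.8, 1, 132)})
df.loc[:11, "t1"] = np.nan    # 12 participants without T1 data
df.loc[12:66, "t2"] = np.nan  # 55 participants without T2 data
print(len(df.dropna()))       # 65 complete cases under listwise deletion
```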
The model fit was evaluated by referring to the common χ2-statistic as well as the CFI and RMSEA. To interpret the estimated model as a sufficient fit to the data, CFI values should not be lower than 0.9, and RMSEA values should not be larger than 0.08 [38].
5 Results
5.1 Preliminary analyses
As can be seen in Table 2, there were very strong positive correlations between technology acceptance at T1 and T2, as well as between technology competence belief at T1 and T2. In contrast, the correlation between technology control belief at T1 and T2 was only moderately to strongly positive. The correlations between the different facets of technology commitment were moderately to strongly positive, except for the nonsignificant correlation between technology competence belief and technology control belief at T2.
5.2 Change score models
Table 3 presents the results of the latent change score models analyzing the changes in the three facets of technology commitment. As shown in Table 3, the fit of all three models was good (all RMSEA ≤ 0.05, all CFI ≥ 0.96).
5.2.1 Change in technology commitment
In contrast to Hypotheses 1 and 3, the change scores of technology acceptance and technology control belief were nonsignificant (both ΔM ≤ 0.13, p ≥ 0.23), indicating a high stability of these facets across time. However, consistent with Hypothesis 2, the change score of technology competence belief was significantly positive (ΔM = 0.30, p < 0.001, d = 0.73), indicating a moderate increase in this facet (see also Fig. 2).
5.2.2 Variance of the difference scores
The variances of the change scores varied over a wide range. While the variances of the change scores of technology acceptance and technology control belief were significant (0.53 ≤ s2 ≤ 1.18, p ≤ 0.001), the variance of the change score of technology competence belief was not (s2 = 0.06, p = 0.35).
5.2.3 Correlations between initial levels and change scores
Correlations between the initial levels and change scores within the facets were moderately negative (− 0.56 ≤ r ≤ − 0.30), implying that participants with lower initial levels in a facet showed a stronger increase (or a weaker decrease) than participants with higher initial levels. However, the correlations between initial levels and change scores were only significant for technology acceptance and technology control belief (p ≤ 0.02), whereas the correlation was not significant for technology competence belief (p = 0.07).
5.2.4 Control variables
Participants’ age, sex, and number of webinars attended showed no significant effect on any change score. However, the wave of data collection had a positive effect on change in technology competence belief (b = 0.35, p = 0.02, β = 0.68), indicating a stronger increase in the second wave. In addition, the control variables showed no effect on the initial scores, with the exception of sex, which showed significant negative effects on the initial levels of all three technology commitment facets, indicating higher technology commitment for males compared to females.
6 Discussion
The aim of this study was to investigate changes in technology commitment among university students confronted with a new learning context during the COVID-19 pandemic. Based on experiential learning theory as well as the theoretical deliberations and empirical findings presented in the theoretical sections, it was expected to find an increase in all three facets of technology commitment: technology acceptance, technology competence belief, and technology control belief. In the following sections, the study’s main findings will be discussed, as well as its strengths and limitations.
6.1 Increase in technology competence belief
The results of this study showed that technology competence belief increased over the period of one semester. This observed change suggests that increasing experience with technologies during COVID-19 may have affected students’ self-perceived competence in dealing with technologies. In terms of experiential learning theory [9, 10], students’ competence belief may have been enhanced as a consequence of increasing experience with webinars and other digital technologies used during the pandemic, in that students moved continuously “along the learning cycle” while gaining and transforming experience with new technologies. Thus, students may not only have learned how to use new technologies but may also have perceived this increase in competence. However, since the number of webinars attended had no significant impact on students’ change in technology competence belief, students’ participation in online courses was likely, at best, partly responsible for their increase in technology competence. This is also suggested by the fact that the increase was stronger in the second COVID-19 semester, when students already had prior experience with online teaching.
It is noteworthy that, despite the significant change in mean technology competence belief across time, the very strong correlation between technology competence belief at T1 and T2 of r = 0.96 indicates an extremely high stability of this technology commitment facet (which was also higher than that of the other two facets examined). In line with this, the variance of the change score of technology competence belief also indicated no significant interindividual differences. Thus, students seemed to have benefited to a similar extent from the changes during COVID-19 in terms of gains in technology competence belief.
The present results contrast with those of Rogge et al. [7], who also investigated the facets of technology commitment during COVID-19 and found no significant increase in technology competence belief. However, Rogge et al.’s difference scores for the individual semesters also show a trend toward a slight increase in technology competence belief. The difference in significance may be due to sampling. While Rogge et al. [7] surveyed Bachelor's students, only Master's students were surveyed in this study. On the one hand, Master's students have more academic experience, enabling them to acquire knowledge and skills more effectively [e.g., 39]. In addition, the ability to adapt increases with the years studied [e.g., 40], which may mean that Master's students are better able to adapt to new academic challenges. Thus, they may also be able to build up technology competence belief more quickly than the Bachelor's students who constituted the sample in Rogge et al.’s study [7]. In contrast, Bachelor's students are more likely to struggle to adapt to academic challenges in a university context [41], which might also apply when facing a new technology such as online webinars. On the other hand, it is not clear in the study by Rogge et al. [7] whether the students surveyed took part in courses at all, whereas in the present study lecturers surveyed all participants during ongoing seminars. Furthermore, Rogge et al. [7] collected data at three measurement points in their longitudinal study but only reported results and changes from the first to the second measurement point. This raises the question of the extent to which the results would change if the data from the third measurement point were also considered.
Students’ digital competence is a critical factor for digital education, yet it was weak before COVID-19, especially in Germany. For example, in their study on the digital media competencies of German students and their academic and career success, Senkbeil et al. [25] showed that 20% of German first-year university students did not have a sufficient level of digital competence to start studying and that 53% of students in the 6th semester did not have the digital competence expected for their study phase. Considered from this point of view, the present results suggest that, despite all the tragedy, the circumstances of the COVID-19 pandemic benefited the student population, at least with regard to their technology competence belief. This conclusion is further supported by the results of Berling et al. [6], who examined the relation between mobile work and work ability under COVID-19 pandemic conditions and also found an increase in technology competence belief after the beginning of the pandemic.
6.2 No change in technology acceptance and technology control belief
Technology acceptance did not change during the first two semesters of the COVID-19 pandemic in the sample examined in this study. One explanation for this finding could be that the switch to digital teaching at university was accompanied by implementation difficulties on the part of students, universities, and educators. Furthermore, it can be assumed that students desired face-to-face teaching during the COVID-19 pandemic, which inhibited an increase in technology acceptance. Considering Davis’ [13,14,15] technology acceptance model, according to which technology acceptance is predicted by the usefulness and ease of use of a particular technology, online teaching may simply not have outperformed face-to-face teaching on these criteria, which would explain why no change in technology acceptance was found.
Participants’ technology control beliefs also did not change throughout the considered period. One possible explanation for this finding is that the technology used by all participants in this study (i.e., web conferences) is not always controllable. For example, factors on the technology side, such as an unstable internet connection, slow computer speed, or an outdated device, can lead to malfunctions and, as a result, decreased perceptions of technology control. Furthermore, the nonsignificant change in technology control belief may reflect social influence: learning that others, especially significant others such as friends or people toward whom participants held particularly positive attitudes, had experienced technology control problems during webinars may have created a shared attitude among participants that videoconferencing was difficult to control.
Another possible reason could be that the integration of new technologies (e.g., online teaching instead of face-to-face teaching) in response to the COVID-19 pandemic was organized differently at different universities, for example, depending on pre-pandemic experience with educational technology, the university’s crisis communication, and available resources [42]. According to experiential learning theory, this could have resulted in different levels of acceptance of, and control belief regarding, the technology in question.
Interestingly, the results concerning change in technology acceptance and technology control belief also contrast with those of Rogge et al. [7], who found increases in technology acceptance in the summer semester of 2021 and in technology control belief in the summer semester of 2020 and the winter semester of 2020/2021. Possible causes have already been discussed in the previous subsection.
6.3 Strengths and limitations
The present study used a longitudinal design with two measurement points and estimated multivariate latent change models to examine change in three facets of technology commitment among university students attending online courses during the COVID-19 pandemic. Due to the pandemic, students were confronted with the direct switch from face-to-face to digital teaching. In this sense, the pandemic offered a unique chance to examine the development of technology commitment following a sudden change in the use of digital technologies at university and beyond. However, the study also has several limitations.
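For readers less familiar with this approach, the core of a univariate latent change (latent difference score) model can be sketched in Mplus syntax roughly as follows. This is a minimal illustration with hypothetical factor and indicator names, not the authors’ actual syntax (which is available in the OSF repository):

```
MODEL:
  ! T1 and T2 factors, each measured by three hypothetical indicators
  ! (measurement invariance constraints are omitted here for brevity)
  tcb1 BY i1t1 i2t1 i3t1;
  tcb2 BY i1t2 i2t2 i3t2;

  ! Define the latent change factor so that tcb2 = tcb1 + delta
  delta BY tcb2;        ! loading fixed to 1 by Mplus default
  tcb2 ON tcb1@1;       ! autoregression fixed to 1
  tcb2@0; [tcb2@0];     ! residual variance and intercept of tcb2 fixed to 0

  [delta]; delta;       ! mean change and change-score variance
  delta WITH tcb1;      ! change may correlate with the initial level
```

In this specification, the mean of delta corresponds to the average change tested for significance, and its variance captures interindividual differences in change.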
First, there was no control group to examine change in technology commitment under “natural” conditions (i.e., face-to-face classes) at the university. Accordingly, no causal effects of the COVID-19 pandemic on change in technology commitment can be derived. However, two other longitudinal studies conducted before the pandemic by Trauzettel [26] and Mertens et al. [43] found no significant changes in the three technology commitment facets. It should be noted that these studies were conducted with older participants. Nevertheless, considering their findings, it seems rather plausible that the increase in technology competence belief found in this study is at least partly due to the switch to digital teaching, as well as to other changes during the pandemic, especially regarding the use of technology.
Second, technology competence was not measured directly with tests; only self-assessments regarding technology competence belief were collected. Thus, it remains uncertain whether participants’ actual digital competences also increased during this study. However, several studies have shown that self-concepts of ability and actual abilities are strongly correlated and show reciprocal effects on each other (e.g., [44, 45]). Considering this, it seems conceivable that the participants in this study may also have shown an increase in their actual digital competence. The longitudinal investigation of reciprocal effects between self-concept and actual technology competence would be an interesting challenge for future studies.
Third, it is unclear to what extent the findings of this study can be generalized to other populations. This study only considered a specific sample of teacher trainees from one German university. Since the pandemic has ended, it is no longer possible to collect additional data to examine changes in technology commitment during COVID-19. Another point that Berling et al. [6] also discussed concerning their study’s results is that a selection bias could have occurred due to the voluntary nature of participation, meaning that more motivated students took part in the study, which could have distorted the results in a positive direction. In addition, Bailey and Kurland [46] pointed out another bias in studies on mobile work: participants may respond more positively because they prefer mobile work and are interested in a positive evaluation of mobile work and, consequently, a continuation of this work mode. With regard to the present study, it is possible that students preferred online teaching to face-to-face teaching and, therefore, responded more positively, in the sense that they rated their technology competence belief higher at T2 than at T1. However, this seems unlikely, as students in Germany reported that the sudden switch to online teaching and increasing isolation in their studies placed a heavy burden on them [47]. At the same time, the findings of Berling et al. [6] suggest that an increase in technology competence belief during the pandemic also occurred in other populations.
Fourth, when comparing results, it may be important to consider how a new technology or teaching mode is implemented. Our study examined the development of students’ technology commitment, but not the technical implementation of teaching by the universities. It could make a big difference whether online teaching is suddenly implemented as an emergency measure or is well thought out and planned [48]. For example, Reiss-Andersson [49] pointed out the negative effects of unfavorable digitalization of teaching. Based on this finding, one could assume that the switch to online teaching may have had different effects on student populations at different universities, depending on whether online teaching was implemented within or outside the context of COVID-19.
Finally, it is unclear whether the change in technology competence belief in this study persisted over time or whether participants’ technology competence belief decreased after the pandemic when universities returned to face-to-face teaching. In this study, the participants were followed only for the period of one semester. Thus, no conclusions can be drawn about their long-term development in technology commitment.
Further research should be encouraged in line with the limitations listed here. Since the study was conducted during the COVID-19 pandemic, it is impossible to replicate it under exactly the same conditions. However, future studies could test technology commitment and its facets in an experimental design and obtain data through external validation (e.g., ratings by lecturers) to exclude bias due to self-report, such as the impression of having become more competent without an actual gain in competence. The urgency to deal with a technology could be experimentally evoked, and a control group could be compared with the test group to determine whether external circumstances influence technology commitment. With such a design, it would be possible to derive causal relationships for the development of technology commitment.
In addition, different populations should be tested in a comparable design to assess the generalizability of the effect. Ultimately, it would also be interesting to reassess participants from studies conducted during the COVID-19 pandemic to check whether the change in technology commitment is only a short-term or a long-lasting effect. It is hoped that other researchers collected datasets during the pandemic that would allow the development of technology commitment to be examined during the pandemic and beyond.
6.4 Summarized integration into other studies and implications
It would be desirable if this study were able to explain the differences between the single studies. However, this seems difficult, as various factors (e.g., characteristics of the sample or learning environment) may be responsible for the partly divergent findings. In this context, it should also be noted that Rogge et al. [7] found increases in technology acceptance and technology control belief only in some cohorts, so even their findings were not completely consistent between groups of students from the same university. Rather than speculating about reasons for the differences in findings, especially between this study and that of Rogge et al. [7], further data on the development of technology commitment during COVID-19 should be analyzed, should other researchers have collected them. Such additional studies and their integration in meta-analyses or systematic reviews may help to obtain more valid estimates of changes in technology commitment during COVID-19 and to identify possible moderators of the strength of the effects.
Notwithstanding this, it is important to emphasize once more that the existing studies examining change in technology commitment show that technology commitment can increase among students, possibly as a result of confrontation with new technologies, partly forced by COVID-19. Consequently, students should be encouraged to engage with new technologies and digital learning environments to increase their technology commitment. This holds especially since it can be assumed that higher technology competence belief leads to higher actual technology competence [44], which in turn opens up new opportunities, for example, concerning the use of innovative digital teaching and learning methods. Improved technology skills could enable students to collaborate more effectively in digital environments, which in turn could positively impact the development of certain soft skills, such as social competence during teamwork [50]. Moreover, improved technology skills should enhance students’ opportunities in the job market, as many employers demand such skills [51]. Syahrin et al. [52] underlined these points in their post-COVID-19 study by showing that student teachers rated themselves as more technically competent after the end of the pandemic. They also hypothesized that student teachers with a higher level of technology competence belief were more likely to use technology in the classroom later in their careers and to support their students regarding technology and its use.
This contrasts with the findings of Pathiranage and Karunaratne [53], who compared pre- and post-COVID-19 results in their systematic literature review on the role of teachers concerning technology in education. They found that teachers rated themselves as very technologically proficient before COVID-19 but significantly less so after the pandemic. According to the authors, this may be because online teaching during the COVID-19 pandemic made teachers aware of their limitations in dealing with technology. Based on this finding, the authors emphasized the need for regular and adequate further training in the application of new technology in educational contexts. The present study can contribute here by showing how technology competence belief can be promoted, with a focus on students: by giving learners opportunities to engage with new technology, gain personal experience with it, and, in a further step, use it to solve given problems and thus gain further experience with the new technology.
Reiss-Andersson [49] highlighted the relevance of decision-makers in the education sector with regard to digitalization and the teaching of technology skills. On the one hand, there is no clear responsibility for promoting digital skills. On the other hand, a lack of digital skills among decision-makers for schools undermines the support of school principals and thus inhibits the digitalization process. The author stated that responsibility for digitalization processes on the part of school principals lies with the decision-makers, who should provide clear guidelines, a long-term plan, regular evaluation, and further training for employed teachers, particularly given that unfavorable use of technology can also lead to negative consequences.
Pathiranage and Karunaratne [53] also pointed out that teachers with higher technology competence beliefs were more likely to use technology in the classroom than teachers with lower technology competence beliefs. Considering that Rogge et al. [7] and the present study found increases in students’ technology commitment facets after exposure to new technology, it is important for curriculum leaders to plan future teachers’ exposure to technology in the sense of experiential learning, with appropriate curricula, in order to promote their technology commitment. This could later be reflected in an increased integration of technology into the classroom and the corresponding promotion of their own students’ technology commitment in their own teaching activities. In this regard, the present study proposes a possible model for promoting students’ and future teachers’ technology commitment, which could be incorporated into guidelines and curricula to promote an appropriate level of technology commitment among students of the respective semester or grade level.
Thus, this emphasizes the relevance of research on technology commitment during the COVID-19 pandemic for students, lecturers, and curriculum managers. Importantly, building on this knowledge of the developments during the pandemic offers opportunities to adapt and further develop learning environments.
6.5 Conclusion
In this study, the development of technology acceptance, technology competence belief, and technology control belief as facets of technology commitment was examined among students in higher education in the context of the COVID-19 pandemic. An increase over time was observed for the technology competence belief facet. This increase highlights the dynamic nature of technology commitment, particularly in response to the unprecedented educational shifts during the pandemic, and sets the stage for a deeper exploration of its long-term implications.
The present study’s findings are of particular interest for a broader understanding of technology commitment and its change over time, particularly under the COVID-19 pandemic. The study proposes an experiential learning approach to explain how technology commitment can change and how it changed among students during the COVID-19 pandemic. Thus, it contributes to the theoretical understanding of the development of technology commitment and supports the assumptions of experiential learning theory, applied here to webinars, with respect to change in technology competence belief. To summarize, research on change in technology commitment during COVID-19 suggests that all three facets of technology commitment increased in individual populations, whereas no study reported a decrease in any technology commitment facet. These results correspond to the hypotheses postulated in the present research about an increase in the various facets through experiential learning. As a practical implication, this study offered suggestions for various interest groups, such as students, lecturers, and curriculum managers, on how to promote technology commitment.
However, future research is needed to examine the extent to which higher technology competence belief is actually predictive of actual technology competence. Accordingly, researchers are invited to follow the considerations made here, dig deeper into the matter, and address the questions raised in this paper in future studies with different samples and using diverse methods.
Data availability
Data and MPlus Syntax are available in the Open Science Framework repository and can be retrieved from https://osf.io/ztc4p/?view_only=8544e5f4c9024ab1bc260c265b2fc79a.
References
Gärtner A, Gollwitzer M, König LM, Tibubos AN. Chancen und Herausforderungen digitaler Lehre [Opportunities and challenges of digital teaching]. Psychol Rundsch. 2021;72(4):273–5. https://doi.org/10.1026/0033-3042/a000555
Yüce E. The immediate reactions of EFL learners towards total digitalization at higher education during the Covid-19 pandemic. J Theor Educ Sci. 2022;15(1):1–15. https://doi.org/10.30831/akukeg.939836.
Yüce E, Seitova M, Şentürk B. Online learning self-efficacy in using technology among Turkish and Kazakh EFL teachers. Asia Pac Educ Res. 2023. https://doi.org/10.1007/s40299-023-00784-4.
Yüce E, Çetin KZ. Pre-service EFL teachers’ perceptions towards online education and online teaching writing skills during the Covid-19 pandemic: a phenomenological research. In: Köksal D, Ulum ÖG, Genç G, editors. Undividing digital divide. Cham: Springer; 2023. p. 123–45. https://doi.org/10.1007/978-3-031-25006-4_7.
Breitenbach A. Digitale Lehre in Zeiten von Covid-19: Risiken und Chancen [Digital teaching in times of Covid-19: risks and opportunities]. Marburg; 2021. https://doi.org/10.25656/01:21274.
Berling I, Jöllenbeck M, Stamer T, Ochsmann E. Association between mobile work and work ability: a longitudinal study under the impact of the COVID-19 pandemic. Int Arch Occup Environ Health. 2022. https://doi.org/10.1007/s00420-022-01849-5.
Rogge F, Wagner S, Nowak V, Liebner S, Entrich SR, Krauskopf K, Knigge M. Die Entwicklung technikbezogener Überzeugungen im Rahmen inklusionspädagogischer Professionalisierung im Lehramtsstudium Sekundarstufe während der Corona-Semester [The development of technology-related beliefs in the context of inclusive pedagogical professionalization in secondary teacher education during corona semesters]. QfI Qualifizierung für Inklusion. 2023. https://doi.org/10.21248/Qfl.108.
Neyer FJ, Felber J, Gebhardt C. Entwicklung und Validierung einer Kurzskala zur Erfassung von Technikbereitschaft [Development and validation of a short scale for the assessment of technology commitment]. Diagnostica. 2012;58(2):87–99. https://doi.org/10.1026/0012-1924/a000067.
Kolb DA, Rubin IM, McIntyre JM. Organizational psychology: an experiential approach. Hoboken: Prentice Hall; 1971.
Kolb AY, Kolb DA. Experiential learning theory as a guide for experiential educators in higher education. Exp Learn Teach Higher Educ. 2017;1(1):7–44.
Dumford AD, Miller AL. Online learning in higher education: exploring advantages and disadvantages for engagement. J Comput High Educ. 2018;30(3):452–65. https://doi.org/10.1007/s12528-018-9179-z.
Bates AW. Teaching in a digital age: Guidelines for designing teaching and learning. Tony Bates Associates Ltd. 2015. https://opentextbc.ca/teachinginadigitalage/
Davis FD. Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Q. 1989;13(3):319–40. https://doi.org/10.2307/249008
Davis FD. User acceptance of information technology: system characteristics, user perceptions and behavioral impacts. Int J Man Mach Stud. 1993;38(3):475–87. https://doi.org/10.1006/imms.1993.1022.
Davis FD, Venkatesh V. A critical assessment of potential measurement biases in the technology acceptance model: three experiments. Int J Hum Comput Stud. 1996;45(1):19–45. https://doi.org/10.1006/ijhc.1996.0040.
Ajzen I, Fishbein M. Understanding attitudes and predicting social behavior. Hoboken: Prentice-Hall; 1980.
Fishbein M, Ajzen I. Belief, attitude, intention and behavior: an introduction to theory and research. Reading: Addison-Wesley; 1975.
Ajzen I. The theory of planned behavior. Organ Behav Hum Decis Process. 1991;50:179–211. https://doi.org/10.1016/0749-5978(91)90020-T.
Arning K, Ziefle M. Understanding age differences in PDA acceptance and performance. Comput Hum Behav. 2007;23(6):2904–27. https://doi.org/10.1016/j.chb.2006.06.005.
Wicki M, Guidon S, Becker F, Axhausen K, Bernauer T. How technology commitment affects mode choice for a self-driving shuttle service. Res Transp Bus Manag. 2019;32: 100458. https://doi.org/10.1016/j.rtbm.2020.100458.
Kretschmer V, Terharen A. Serious games in virtual environments: cognitive ergonomic trainings for workplaces in intralogistics. In: Ahram T, editor. Advances in human factors in wearable technologies and game design. AHFE 2018. Advances in intelligent systems and computing, vol. 795. Cham: Springer; 2019. p. 266–74. https://doi.org/10.1007/978-3-319-94619-1_26.
Voderholzer U, Beintner I, Backes B, Esguerra E, Hessler-Kaufmann JB. Implementing videoconference CBT for depression in routine outpatient care: outcome, working alliance, and influence of patients’ technology commitment. Verhaltenstherapie. 2021;31(3):238–47. https://doi.org/10.1159/000513643.
Ratz T, Voelcker-Rehage C, Pischke CR, Muellmann S, Peters M, Lippke S. Health-related lifestyle and dropout from a web-based physical activity intervention trial in older adults: a latent profile analysis. Health Psychol. 2021;40(8):481–90. https://doi.org/10.1037/hea0001091.
Reich-Stiebert N, Eyssel F. Learning with educational companion robots? Toward attitudes on education robots, predictors of attitudes, and application potentials for education robots. Int J Soc Robots. 2015;7:875–88. https://doi.org/10.1007/s12369-015-0308-9.
Senkbeil M, Ihme JM, Schöber C. Wie gut sind angehende und fortgeschrittene Studierende auf das Leben und Arbeiten in der digitalen Welt vorbereitet? Ergebnisse eines Standard-Setting Verfahrens zur Beschreibung von ICT-bezogenen Kompetenzniveaus [How well prepared are prospective and advanced students for living and working in the digital world? Results of a standard-setting procedure to describe ICT-related competence levels]. Z Erzieh. 2019;22:1359–84. https://doi.org/10.1007/s11618-019-00914-z.
LimeSurvey Project Team, Schmitz C. LimeSurvey: an open source survey tool [computer software]. LimeSurvey Project; 2012. https://www.limesurvey.org/de/
Trauzettel F. Evaluation präventiver und gesundheitsförderlicher Aspekte von Serious Games im Alter [Evaluation of preventive and health-promoting aspects of serious games in old age] [doctoral dissertation]. Berlin: Humboldt-Universität zu Berlin; 2021. https://doi.org/10.18452/22328.
Muthén LK, Muthén BO. Mplus user´s guide. 8th ed. Los Angeles: Muthén & Muthén; 2017.
Yuan K-H, Bentler PM. Three likelihood-based methods for mean and covariance structure analysis with nonnormal missing data. Sociol Methodol. 2000;30:167–202. https://doi.org/10.1111/0081-1750.0.
Little TD, Slegers DW, Card NA. A non-arbitrary method of identifying and scaling latent variables in SEM and MACS models. Struct Equ Model. 2006;13(1):59–72. https://doi.org/10.1207/s15328007sem1301_3.
Little TD, Preacher KJ, Selig JP, Card NA. New developments in latent variable panel analyses of longitudinal data. Int J Behav Dev. 2007;31:357–65. https://doi.org/10.1177/0165025407077757.
Chen FF. Sensitivity of goodness of fit indexes to lack of measurement invariance. Struct Equ Model. 2007;14:464–504. https://doi.org/10.1080/10705510701301834.
Cheung GW, Rensvold RB. Evaluating goodness-of-fit indexes for testing measurement invariance. Struct Equ Model. 2002;9:233–55. https://doi.org/10.1207/S15328007SEM0902_5.
Marsh HW, Hau K-T. Assessing goodness of fit: is parsimony always desirable? J Exp Educ. 1996;64:364–90. https://doi.org/10.1080/00220973.1996.10806604.
Geiser C. Data analysis with Mplus. New York: Guilford Press; 2013.
McArdle JJ. Latent variable modeling of differences and changes with longitudinal data. Ann Rev Psychol. 2009;60:577–605. https://doi.org/10.1146/annurev.psych.60.110707.163612.
Enders CK. Applied missing data analysis. New York: Guilford Press; 2010.
West SG, Taylor AB, Wu W. Model fit and model selection in structural equation modeling. In: Hoyle RH, editor. Handbook of structural equation modeling. New York: Guilford Press; 2012. p. 209–31.
Guerra-Carrillo B, Katovich K, Bunge SA. Does higher education hone cognitive functioning and learning efficacy? Findings from a large and diverse sample. PLoS ONE. 2017;12(8): e0182276. https://doi.org/10.1371/journal.pone.0182276.
Oliveira ÍM, Marques C. The role of career adaptability and academic engagement in college student’s life satisfaction. Int J Environ Res Public Health. 2024;21(5):596. https://doi.org/10.3390/ijerph21050596.
Cameron RB, Rideout CA. ‘It’s been a challenge finding new ways to learn’: first-year students’ perceptions of adapting to learning in a university environment. Stud High Educ. 2020;47(3):668–82. https://doi.org/10.1080/03075079.2020.1783525.
Kaqinari T, Makarova E, Audran J, Döring AK, Göbel K, Kern D. The switch to online teaching during the first COVID-19 lockdown: a comparative study at four European universities. J Univ Teach Learn Pract. 2021;18(5):10. https://doi.org/10.13140/RG.2.2.26652.41604.
Mertens A, Rasche P, Theis S, Seinsch T, Boddin M, Küpper R, Bröhl C, Wille M, Zweck A, Brandl C, Nitsch V, Schäfer K. Health technology use in Germany among older adults (part I): short time changes in information and communication technology. In: Duffy VG, editor. Digital human modeling and applications in health, safety, ergonomics and risk management. Health operations management, and design. HCII 2022. Lecture notes in computer science, vol. 13320. Cham: Springer; 2022. https://doi.org/10.1007/978-3-031-06018-2_8.
Marsh HW, Craven RG. Reciprocal effects of self-concept and performance from a multidimensional perspective: beyond seductive pleasure and unidimensional perspectives. Perspect Psychol Sci. 2006;1(2):133–63. https://doi.org/10.1111/j.1745-6916.2006.00010.x.
Wolff F, Sticca F, Niepel C, Götz T, Van Damme J, Möller J. The reciprocal 2I/E model: an investigation of mutual relations between achievement and self-concept levels and changes in the math and verbal domain across three countries. J Educ Psychol. 2021;113:1529–49. https://doi.org/10.1037/edu0000632.
Bailey DE, Kurland NB. A review of telework research: findings, new directions, and lessons for the study of modern work. J Organ Behav. 2002;23:383–400. https://doi.org/10.1002/job.144.
Werner AM, Tibubos AN, Mülder LM, Reichel JL, Schäfer M, Heller S, Pfirrmann D, Edelmann D, Dietz P, Rigotti T, Beutel ME. The impact of lockdown stress and loneliness during the COVID-19 pandemic on mental health among university students in Germany. Sci Rep. 2021;11:22637. https://doi.org/10.1038/s41598-021-02024-5.
Hodges CB, Moore S, Lockee BB, Trust T, Bond MA. The difference between emergency remote teaching and online learning. EDUCAUSE Review. 2020. https://bit.ly/3dzG1U0
Reiss-Andersson J. Leading the digitalization process in K-12 schools – the school leaders’ perspective. Educ Inf Technol. 2024;29(3):2585–603. https://doi.org/10.1007/s10639-023-11935-x.
Cimatti B. Definition, development, assessment of soft skills and their role for the quality of organizations and enterprises. Int J Qual Res. 2016;10(1):97–130. https://doi.org/10.18421/IJQR10.01-05.
Sá MJ, Serpa S. Transversal competencies: their importance and learning processes by higher education students. Educ Sci. 2018;8(3):126. https://doi.org/10.3390/educsci8030126.
Syahrin S, Almashiki K, Alzaanin E. The impact of COVID-19 on digital competence. Int J Adv Comput Sci Appl. 2023;14(1):511–9. https://doi.org/10.14569/IJACSA.2023.0140156.
Pathiranage A, Karunaratne T. Teachers’ agency in technology for education in pre- and post-COVID-19 periods: a systematic literature review. Educ Sci. 2023;13(9):917. https://doi.org/10.3390/educsci13090917.
Acknowledgements
We thank Friederike Helm, Thorben Jansen, Julia Jensen, Sonja Krämer, Jennifer Meyer, and Steffen Zitzmann for their help with data collection.
Funding
Open Access funding enabled and organized by Projekt DEAL. No funds, grants, or other support was received.
Author information
Contributions
Conceptualization: Fabian Wolff; Methodology: Fabian Wolff; Formal analysis: Leonard Puderbach, Fabian Wolff; Investigation: Alexandra Petrak, Fabian Wolff; Writing: Leonard Puderbach; Writing—review and editing: Alexandra Petrak, Fabian Wolff; Supervision: Fabian Wolff.
Ethics declarations
Ethics approval and consent to participate
All study participants gave their consent to the processing of their data after being informed about the content and purpose of the study. In accordance with the guidelines of the German Psychological Society (DGPs), ethical approval was not required for this study, because participants were fully informed about the aims and procedures of the studies before taking part in the study and because they were not expected to take any risks by participating in the study.
Competing interests
The authors declare no competing interests.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Puderbach, L., Petrak, A. & Wolff, F. “Can you Hear me?” Change of technology commitment during the first two semesters under COVID-19. Discov Educ 3, 149 (2024). https://doi.org/10.1007/s44217-024-00240-5