First-year students in Higher Education are expected to take ownership of their own learning, which requires well-developed self-regulated learning competences. This demand is further amplified in purely online settings such as open distance learning, MOOCs, or disruptive circumstances like the COVID pandemic. Time management skills are an essential component of this process and the target of this study, in which 348 students took a course under two conditions: the control group attended the semester in an unchanged way, while students in the experimental group were invited weekly to estimate and log their workload and time allocations via “reflection amplifiers” delivered on their mobile devices. While no major difference in time management and learning performance was observable, the data reveal that perceived time allocation and prescribed study-time differ substantially. These results raise questions, on the students’ side, about the potential of qualitative (self-reported) learning analytics to raise awareness of where time investments go. On the teachers’ side, the results highlight the need to plan the curricular workload more carefully, specifically for first-year students.
1 Introduction

Students in their first year at university (freshmen) face a very different organization of learning compared to secondary school. A deficient adaptation process leads to growing helplessness, ultimately culminating in withdrawal [2,3,4,5,6,7,8]. Among the prominent factors influencing success in the first year is the increased workload, which requires a renewed awareness of time-related issues. Needed and available time, the quantity and difficulty level of learning activities, teacher versus student estimations [11, 12], the strategic balance between superficial and deep learning, and time requirements in single courses and in the curriculum as a whole are all entangled and dynamic parameters that freshmen have to deal with, often without being adequately prepared for it, as shown by studies of low- and high-profile students.
Time allocation issues have also grown in importance since the Bologna reform and its European Credit Transfer System (ECTS), which introduced a generalized and comparable time quantum. Workload has de facto entered official documents as a curricular measure (one credit accounts for 25 h of studying, both via contact hours with teachers and via self-study or practice) and has become an explicit concern for Higher Education (HE) institutions. However, because study-time is highly student- and task-dependent, not much is known about possible discrepancies between the theoretical and the real amount of invested time [13, 14]. Curriculum developers, therefore, regularly refer to estimates based on experience and sometimes rough guesses. Daele et al. pinpoint mismatches between students’ and teachers’ study-time expectations and perceptions, concluding that teachers lack tools allowing them to estimate more objectively the quantity of study-time needed for their course. Samoilova et al. observe that, both in online and blended learning contexts, measuring student workload is essential for optimizing learning, but workload research is still under development. Gerrard et al. and Prégent et al. highlight the importance of appropriate workload planning and recommend the coordination of more integrated curricula by academic staff. Black shows that students’ study-time is often unequally distributed between subject modules and that the resulting irritation carries the danger, in severe cases, of subject withdrawals and dropout, especially in the first university year. Various internal and external factors might affect the curricular equilibrium as envisaged by the university course developers: uneven prior knowledge of the subject domains, a disparate number of homework tasks requested in each subject, or a poor distribution of assignments over the term. Additionally, diversity in self-regulated learning (SRL) skills affects student efficacy.
Claessens et al. conclude their review of empirical studies on time management by stating that the relationship between time management and academic performance is not well understood.
An improved knowledge of study-time allocation is, therefore, vital not only for students but also for teaching staff. Designing syllabi and individual learning activities with an appropriate workload, both within and between parallel and successive courses, is no trivial undertaking, and it is an even bigger challenge in times of a pandemic, when large portions of the curriculum pivot toward distance and blended learning, imparting an even more critical role than before to self-regulated learning and time management skills. Recent research on time management in college students concludes that time management is a significant self-regulatory process through which students actively manage when and for how long they engage in the activities deemed necessary for reaching their academic goals. Indeed, becoming a “professional student” implies taking an active role in managing one’s own studying, learning, or academic engagement [21, 22]. However, despite the overall importance granted to SRL, the evidence that it should not be taken for granted among freshmen, and the indications from research that its underpinning skills, including time management, can be learnt [18, 25], reports of concrete programs, tactics and tools likely to sustain the development of such skills remain scarce in HE [26,27,28,29,30,31]. This stands in contrast to more frequent contributions concerned with measurement instruments and scales for those skills, their correlations with achievement or stress [32, 33], or specific inquiries into procrastination behaviors.
Even more striking, while at national and international levels, HE institutions have been tasked with providing environments also supportive of study-time skills, that is, “behaviors that aim at achieving an effective use of time while performing certain goal-directed activities” (p. 262), only a few attempts emerge in the literature to harness new technological possibilities for guiding appropriate interventions and educating students to better self-regulate their study-time by devoting deliberate attention to it. Among this modicum of ideas offered to teachers and institutions for tackling the challenge of training students in effective time management strategies, one can nevertheless pinpoint the recent push toward innovative use of educational datasets for pedagogic purposes: learning analytics has recently found a fertile cross-disciplinary application in promoting and facilitating self-regulated learning [37,38,39]. Claessens et al. suggest that self-monitoring and logging of study-time by students have the potential to improve the perceived control of time and, therefore, student satisfaction, while reducing stress and anxiety. Tabuenca et al. use self-observation data on mobile devices to create awareness, as a helpful mechanism to stimulate reflection and make students seek better time management strategies to master the intended workload. By offering educators possibilities to reflect on their teaching effectiveness through the interaction footprints of their students, Learning Analytics (LA) can also inform important pedagogic planning that has hitherto largely depended on guesswork, such as curricular workload and the awarding of credits for student effort. Recent research investigated the institutional conditions and aspirations for implementing LA and concluded that teaching staff aspired to adapt the curriculum and to improve student support using these new sources of knowledge about teaching and learning practice.
By measuring and monitoring students’ actual time spent on tasks per subject, useful comparisons of credit awards (ECTS points) between different modules of a course and different phases of the syllabus can be drawn, with the goal of informing curricular adaptations, arriving at realistic and balanced student workloads, and improving current pedagogical models by modifying their interaction model [41, 43,44,45]. A recent survey study highlighted that students themselves displayed a strong interest in receiving regular updates on their learning that facilitated their SRL [42, 46]. Similarly, institutions indicated an interest in knowing what is in each course.
The aim of the current study, therefore, is to explore the effect of a weekly prompt to reflect on workload and study-time management, conveyed through smartphones. Its originality is to combine a technology-enhanced approach based on learning analytics with a more traditional pedagogical emphasis on meta-learning, defined as an awareness and understanding of the phenomenon of learning itself, as opposed to pure subject knowledge. Among the many concrete aspects of learning that can be made an object of meta-learning, study-time management is the target in our case. Three research questions guided this study:
Research question 1 (RQ1). Do students who conduct regular reflection exercises on their study-time (experimental group) outperform the rest of the students (control group) regarding their time management skills and learning performance? The assumption here is that regular reflection exercises on study-time throughout the semester will positively impact students’ time management skills and, consequently, their learning performance. The objective behind RQ1 is to determine whether students’ time and learning management skills can be enhanced through increased awareness.
Research question 2 (RQ2). Do time logs help first-semester students to organize their study-time according to their expected workload? The assumption here is that students do not distribute their study-time (workload) coherently with the ECTS distribution in first-semester modules. In this study, first-semester students are given the opportunity to reflect on their time management using time logs.
Research question 3 (RQ3). What evidence can be extracted using learning analytics and reflection amplifiers with first-semester students toward smoothing the transition to higher education? The rationale here is that data supplied by students will give a fuller picture of the role of time allocation in first-semester university courses and deliver valuable information for evidence-based decision making by course developers and academic workload policies.
2 Method

Self-observation mechanisms were combined with performance data to understand how time allocation occurs in regular university courses. The multi-part longitudinal design of the experiment consisted of two types of data collection. Firstly, diagnostic tests along the course assessed, on a regular basis, students’ overall knowledge in the Computer Fundamentals module; this involved the entire cohort. Secondly, in the experimental group, an intervention took place whereby an external researcher invited students to reflect on their study-load allocation across the six different subject disciplines. Students were asked, through dedicated mobile prompts called “reflection amplifiers”, to log their weekly study-time spent on Computer Fundamentals using their own mobile device. The experiment ran over an entire first semester at the university.
2.1 Course and participants
The first-year semester of a computer sciences degree at Technical University of Madrid comprises six modules. Mathematical Analysis (MA), Discrete Mathematics (DM), Computer Fundamentals (CF) and Programming Fundamentals (PF) each carry an equal weight of 6 ECTS (the equivalent of 4 h face-to-face plus 5.4 h of self-organized study per week, distributed over 16 weeks), whereas the Programming Lab (PL) and Operating Systems Lab (OS) modules carry a 3 ECTS workload. Alarmed by the high dropout and failure rates in previous years within the CF module, this study aimed at supporting students in reflecting on the time devoted to studying in their first semester, with special attention to the CF module. Hence, interventions were always performed during CF classes. The module comprises both theoretical lectures and practical workshops. A total of 796 students were enrolled in the course. At the start of the semester, they were all invited to take part in the experiment.
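As a quick check, the 5.4 h figure follows directly from the ECTS definition given above (1 credit = 25 h of total study):

```latex
6~\text{ECTS} \times 25~\tfrac{\text{h}}{\text{ECTS}} = 150~\text{h total},
\qquad
\frac{150~\text{h}}{16~\text{weeks}} \approx 9.4~\tfrac{\text{h}}{\text{week}},
\qquad
9.4~\text{h} - 4~\text{h (contact)} \approx 5.4~\tfrac{\text{h}}{\text{week}}~\text{self-study}.
```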
2.2 Reflection amplifiers

Weekly structured introspective episodes on self-perceived efficacy (see “Appendix”) were designed as “reflection amplifiers” (RAs). This appellation refers to compact, structured and repeated reflection affordances offered to students in order to make aspects of their learning (here: time management) deliberate objects of attention. Interspersed with a first-order learning task, reflection amplifiers steer learners’ attention toward a meta-learning level (here: a measurement of time investment and time management strategies). To students’ natural focus on the learning task at hand, RAs add an explicit focus on the learning dispositions and processes that are at play in this very task. RAs are purposely designed as “quick-to-fill” meta-learning affordances in order not to divert learners from the first-order task. These prompts present delimited, structured, and concise opportunities to develop observations about aspects of the learning process and strategies. RAs provide a single engagement point with a defined target of reflection. They assume that, by being invited to interpret their learning actions, learners will develop an increased awareness of and an intensified presence in the learning process itself. Specific to our study, the RAs were presented on the students’ mobile devices as a personal response system raising two multiple-choice questions (Fig. 1a, b) related to time management aspects: first, the most time-consuming module of the week, and, second, the estimated time invested in this module (scale: less than 1 h / between 1 and 2 h / between 2 and 4 h / between 4 and 6 h / more than 6 h). This focus on an individual module makes sense because the six modules mentioned above run in parallel and, therefore, demand that student time be divided up. Proper time management measurements were also requested (Fig. 1c).
Additionally, three RA prompts were sent every week (see “Appendix A”) inviting students to reflect on and report how they were using their time to learn across the six modules comprising their first semester at the university.
2.3 Measure instruments
In this study, the following variables were explored to investigate the research questions cited above:
2.3.1 Learning performance
During the semester, three different types of assessments measured students’ knowledge progression. Firstly, a Moodle test measured students’ knowledge at the end of each of the five teaching blocks comprising the CF module; these online tests included ten multiple-choice questions that students had to complete in 20 min. Secondly, a practical exam at the end of the course asked students to solve a problem on paper and simulate the solution both on the computer (using Multisim software) and on the digital-analog training system (using the ETS-7000 hardware). Finally, an overall written exam at the end of the course measured students’ learning outcomes in the subject. The overall assessment values ranged from 0 (lowest learning outcome) to 10 (highest learning outcome). The marks obtained in the three types of tests were used to relate self-reported study-time to perceived efficacy and performance.
2.3.2 Time management
There are few validated scales for measuring time management skills. The most relevant, according to Claessens et al., are the following: the time management behavior scale (TMBS), which comprises three subscales: (a) setting goals and priorities, (b) mechanics of time management, and (c) preference for organization; the time structure questionnaire (TSQ), which focuses on five different aspects: (i) sense of purpose, (ii) structured routine, (iii) present orientation, (iv) effective organization, and (v) persistence; and the time management questionnaire (TMQ), which includes items on attitudes toward time management and planning the allocation of time. Since this research study focuses on self-regulated time toward knowledge achievement in the context of a higher education course, we decided to use the TMQ, as it is the most frequently tested in an educational context. The TMQ consists of three subscale constructs: (1) short-range planning, (2) long-range planning, and (3) time attitudes. It uses an 18-item scale with a 5-point Likert-type response format, with values ranging from strongly agree (5) to strongly disagree (1). Pre- and post-questionnaires using the TMQ were administered to capture the time management experiences of students in the entire cohort, i.e., the experimental and the control group.
3 Results

This section presents the results of a quantitative analysis of the data collected through the instruments mentioned above: the pre/post time management questionnaires (TMQ) completed by the control and experimental groups, the reflection exercises in which students in the experimental group logged their weekly study-time, and the grades obtained by the control and experimental groups in the final assessment. A total of 348 students out of 796 accepted the consent form and completed the pre-questionnaire (88.51% male, 11.49% female).
3.1 Effects of being prompted by reflection amplifiers
This study aimed to explore the effects of being prompted by RAs on learning performance. It was expected that students logging study-time would improve their time management skills and obtain higher grades in the assessment as a consequence of reflection episodes suggested to the experimental group (RQ1).
Cronbach’s alpha coefficients indicated reliable values for all TMQ subscales, ranging from 0.70 to 0.83 (see Table 1). These scores demonstrate adequate internal consistency; Nunnally suggested that a score reliability of 0.70 or better is acceptable. A Shapiro–Wilk test was conducted to verify the normal distribution assumption required for an analysis of variance (ANOVA). The p-values (higher than 0.05) and the inspection of the Q-Q plots confirm that the time management samples are normally distributed. The grades sample, however, deviates from normality.
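For readers unfamiliar with the reliability measure, Cronbach’s alpha is computed from the item-level variances and the variance of the summed scores. The following minimal sketch (pure Python; the Likert responses are made up for illustration, not the study’s data) shows the calculation:

```python
from statistics import pvariance

def cronbach_alpha(items):
    """Cronbach's alpha from item-level scores.

    items: list of k lists; items[i] holds item i's scores for
    every respondent (all lists have equal length).
    """
    k = len(items)
    # Sum of the variances of the individual items.
    sum_item_vars = sum(pvariance(col) for col in items)
    # Variance of each respondent's summed (total) score.
    totals = [sum(resp) for resp in zip(*items)]
    total_var = pvariance(totals)
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Three hypothetical 5-point Likert items answered by five respondents.
demo = [[5, 4, 4, 3, 2], [5, 5, 4, 3, 1], [4, 4, 5, 3, 2]]
print(round(cronbach_alpha(demo), 2))  # → 0.93
```

Alpha rises when items co-vary strongly (respondents who score one item high tend to score the others high), which is why it is read as internal consistency of a subscale.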
The number of students participating in the reflection exercise decreased as the course progressed. Therefore, the experimental group (n = 45) finally comprised students who had participated in all measurements: (1) logged their study-time at least 3 times out of the 6 week-points; (2) participated in the final CF assessment; (3) completed the pre and the post TMQ questionnaire. The control group involved 93 students who agreed to participate in the experiment, participated in the final CF assessment, and completed the pre and the post TMQ questionnaire.
The figures presented in Table 2 show that TMQ means decreased from the pre-questionnaire to the post-questionnaire in both the control and the experimental group. Regarding the assessment in the CF module, the results show only a slight difference between the control group (M = 2.25; SD = 2.15) and the experimental group (M = 2.36; SD = 2.27), with a marginally higher mean in the latter.
An analysis of variance (ANOVA) was performed to confirm that there were no significant differences between the control and the experimental group in the pre-test. The results presented in Table 3 show no significant differences in the initial measure, with the exception of long-range planning (Pr. = 0.02 < 0.1). These results confirm that students from both groups started the semester without meaningful differences in time management skills. A second ANOVA was performed to identify significant time management variations in the post-test as a consequence of the interventions in the experimental group. Contrary to our expectations, the results yielded non-significant differences between the means, with the exception of long-range planning (Pr. = 0.07 < 0.1). Additionally, a third ANOVA aimed at identifying significant differences in learning performance (CF assessment) as a consequence of the treatment. Likewise, the results yielded non-significant differences.
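As a reference for the procedure, a one-way ANOVA reduces to comparing between-group variability with within-group variability. The sketch below (pure Python, with illustrative groups rather than the study’s data) computes the F statistic on which the reported p-values are based:

```python
from statistics import mean

def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA over a list of sample groups."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_mean = mean(x for g in groups for x in g)
    # Between-group variability: group means around the grand mean.
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group variability: observations around their own group mean.
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))

f = one_way_anova_f([[1, 2, 3], [4, 5, 6]])  # → 13.5
```

A large F means the group means differ more than the within-group scatter would explain; the p-value then follows from the F distribution with (k − 1, N − k) degrees of freedom.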
3.2 Study-time distribution in the freshmen cohort
Table 4 illustrates the distribution of votes by module, as collected along the semester (Fig. 1a above). Although the top four modules (MA, DM, CF, PF) are all officially weighted the same in the curriculum (6 ECTS, equivalent to 150 h of study), students weighted them very differently in terms of perceived time requirements. The MA module received 3.6 times as much temporal attention as the CF module, despite equal credits. Similarly, PL received 3 times as much temporal attention as the OS module.
Figure 2 illustrates the longitudinal distribution of study-time over the entire semester (6 measurements). Here it becomes obvious that the demands per module differ over the runtime of the course, with a spike in PL at week-point 3 and in CF at week-point 4. Nonetheless, as also illustrated in Table 4, MA comes out as the most time-intensive part over the entire period, with the exception of week-point 4. These results confirm the assumption (RQ2) that students do not distribute their study-time (workload) coherently with the ECTS distribution.
3.3 Correlation between time logged and learning performance
In order to investigate RQ2, a Pearson’s correlation analysis was run (Table 5) to determine the relationships between the mean quantity of time logged throughout the course (Time), the number of logs performed throughout the course (Number of logs), the TMQ measure taken at the end of the course (Time management), and the grades obtained in the final assessment (Assessment). Pearson’s r indicates the strength of the linear relationship between two variables, with values ranging from −1 to 1. Values closer to 1 (−1) depict a stronger positive (negative) correlation, meaning that the second variable tends to increase (decrease) as the first variable increases, and vice versa. The closer the value is to 0, the weaker the correlation. The strength of the correlation can be described verbally using the guide that Evans suggested for the absolute value of r: 0.00–0.19 “very weak”; 0.20–0.39 “weak”; 0.40–0.59 “moderate”; 0.60–0.79 “strong”; 0.80–1.0 “very strong”.
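As a minimal sketch of the statistic itself (pure Python, with illustrative vectors rather than the study’s variables), Pearson’s r is the covariance of the two samples normalized by the product of their standard deviations:

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    # Numerator: co-deviation of the paired observations.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    # Denominator: product of the two samples' deviation magnitudes.
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson_r([1, 2, 3], [2, 4, 6])  # perfectly linear pair, r ≈ 1
```

The normalization is what bounds r to [−1, 1] and makes it unit-free, so logged hours and questionnaire scores can be compared on the same scale.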
In RQ2, we anticipated that time management (TMQ) skills would be positively correlated with logging time (i.e., Number of logs and Time logged). On the one hand, the results of the analysis do not depict a significant correlation between time management skills and the number of logs. On the other hand, the correlation analysis suggests a significant, weak positive correlation between time management skills and the quantity of time logged (Time). The correlation between the number of logs and the quantity of time logged is trivially expected and was consequently discarded from further analysis. Additionally, we expected that the learning performance measured in the different assessments throughout the course would be positively correlated with time management and logging time. However, the results do not suggest any correlation between learning performance (Assessment) and the rest of the measurements. This result is similar to those of other studies, as recounted by Broadbent and Poon (p. 10).
3.4 Findings from reflection episodes toward improved accessibility
This work analyzes the results obtained during the case study to identify the potential of learning analytics and reflection amplifiers to make learning time a deliberate object of attention for new HE students (see “Appendix”). The following aspects were investigated to explore the potential of fostering time management skills toward an improved adaptation of students in their first semester at the university (RQ3).
3.4.1 Weekly time spent on the course
Students in the experimental group were prompted across different week-points (Appendix RA-B in wp. 1, 2, 3, 5) about how much time they had devoted to studying Computer Fundamentals, on a scale ranging from 1 to 5 (1 = less than 1 h; 2 = between 1 and 2 h; 3 = between 2 and 4 h; 4 = between 4 and 6 h; 5 = more than 6 h). A total of 554 reports were collected throughout the semester, resulting in M(SD) = 2.07(0.80). This hints at 1–2 h weekly, far from the 5.4 h of the Bologna calculation for this 6 ECTS module, which might further explain the low grades obtained in the assessments. Dropouts would perhaps devote even less time to studying the module.
3.4.2 Knowledge of theoretical study-time
In the presented CF module example, 6 ECTS are equivalent to approximately 5.4 weekly hours of self-study outside the classroom. In week-point 3, we asked students whether they knew how much time they were supposed to devote to self-study for a 6 ECTS subject (Appendix RA-F). Only 23% answered correctly (‘between 4 and 6 h’). Some 1% of the students responded ‘less than 1 h’; 4% reported ‘between 1 and 2 h’; 25% reported ‘between 2 and 4 h’; 31% reported ‘more than 6 h’; and 13% replied that they did not know.
3.4.3 Spreading effect of time-consuming modules
In week-point 2 (n = 105), students who reported an imbalance in their workload (Appendix RA-E in wp. 2) clearly pointed to Mathematical Analysis (44%), Programming Fundamentals (38%), and Discrete Mathematics (11%) as the modules most influential on their time allocation. Students considered that these most time-consuming modules of the week affected the time they should have devoted to the rest of the subjects: 52% of the students indicated that those subjects had contributed to an imbalance in their weekly study-time.
3.4.4 Study-time anticipation
In week-point 5 (n = 81), students were invited to report how much time they had devoted to Computer Fundamentals that week (RA-B). Additionally, they were invited to look ahead and estimate how much time they expected to devote to the same subject the following week (RA-C): 49% of the students responded that they would devote the same amount of time, 37% that they would devote more time, and 12% that they would devote less time. Reports from the following data point show that only 16% actually devoted more time, 53% the same time, and 31% less time.
3.4.5 Work at home
A prompt in week-point 6 (n = 77) aimed at exploring which activities required most of the students’ study-time at home (RA-H). Multiple answers were possible for this item: 35% reported “study for exams”, 30% “make exercises”, 17% “write assignments”, 9% “prepare practical workshops”, 3% “other activities”, 2% “look for documentation”, and 2% “install and/or understand new software”.
3.4.6 Familiarity with subject modules
The post-questionnaire included an extra item inviting students (n = 175) from both the control and experimental groups to quantify their previous knowledge of the modules when they started university (0 = nothing, 1 = minor things, 2 = most things, 3 = everything). The mean (and standard deviation) values were disparate across modules: 1.24 (1.41) for Mathematical Analysis, 0.79 (2.12) for Discrete Mathematics, 0.78 (0.71) for Programming Fundamentals, 0.77 (0.71) for Programming Lab, 0.58 (1.41) for Computer Fundamentals, and 0.50 (0) for Operating Systems Lab.
4 Discussion and implications
This work focused on the abrupt change in study methodology that students undergo as they move from secondary to higher education. In this transition, students move from a teacher-guided education to one in which they must guide their own learning tasks and organize their time according to their particular circumstances (e.g., expertise, available tools). Most students in their initial university semester face, for the first time, the challenge of managing their study-time (workload) in a balanced way, considering the complexity and specific needs of each course remit. In the worst case, this leads some students to withdraw from modules prematurely or even abandon their studies. This work underscored the need to promote students’ ability to reflect on their own methods by making study-time a deliberate object of attention. Here, reflection amplifiers and learning analytics were combined and evaluated to bridge this gap and to facilitate universal access to higher education by fostering time management skills. Accordingly, the following research questions were investigated.
In RQ1, we explored whether students who conducted regular reflection exercises on their study-time outperformed the rest of the students with regard to their time management skills and learning performance. Contrary to our expectations, the resulting estimates of time on task among subjects confirmed other similar studies [14, 57] in being sometimes dramatically off target with respect to the expected curricular presets. Since no notable improvement in either time management (as compared with the control group) or assessment grades was observable, the longitudinal aspect also suggests that, left to their own devices in terms of study-time management, students develop in a diverse range of directions. The data confirm the impression that first-semester university students are not in the habit of regulating their study-time effectively and cannot be expected to do so by themselves. This may be an indication of a more general lack of SRL competences that would, therefore, need further support from HE institutions. We may speculate that with specific guidance and training in study-time management, the results would perhaps look different. This reminds us that, like any aspect of self-regulation, the enhancement of time management requires reliable feedback and cannot be learnt entirely through SRL. Such feedback could take different forms and intensities, from mirroring their learning analytics data back to students (possibly enriched with some kind of social yardstick) to more complex and time-consuming oral feedback by a pedagogical adviser. A substantial number of students left the course, without us being able to unambiguously identify the reasons for such withdrawals. From other research investigations, we note that time-related issues are seen as the single biggest factor for dropout [57, 59].
In RQ2, we explored whether time logs can help first-semester students to organize their study-time in view of their expected workload. This study aimed at a better understanding of the amount and allocation of study-time in first-semester university courses. Our data showed that self-reported time allocation and prescribed study-time differ substantially among students and modules (Table 4). In this respect, we used learning analytics to show that students do not distribute their workload coherently with the ECTS distribution in first-semester modules (Fig. 2). Through self-observation mechanisms, students were able to produce information related to study-time that is likely to help institutions to better plan their curricula, while at the same time making students more aware of where their time investments go, thus providing evidence-based input for reflection.
The inquiry looked specifically at students’ perceived time management behavior as part of their SRL over the period of one semester, comparing one subject module with parallel-running ones. The experimental group was prompted, but not guided, to self-report and reflect on their study-time investment over the respective week. On the one hand, the results of the correlation analysis presented in Table 5 suggest that students who logged more time reported better time management skills. On the other hand, neither the number of time logs nor the quantity of time logged was related to students’ learning performance. These results are consistent with previous studies investigating SRL and improved academic achievement (p. 10), and inconsistent with others [53, 60]. Therefore, further research is needed to investigate whether time logs (reflection exercises) integrated into all subject modules running in parallel during the first-year semester might lead to better time management skills and learning performance in longitudinal studies. On the whole, and despite the lack of observed effects of RAs on performance and on the development of time-management skills, the results point to three fundamental challenges: (1) establishing the “right” frequency of reflective episodes during learning while preventing under- and over-prompting effects; (2) fixing the “right” length of the reflective episodes. In this study, RAs were kept relatively short, and the important looming question is: can a reflection be short and useful?; (3) assessing the “right” total time of exposure to RAs needed to anchor “habits of reflection” without taxing cognitive resources for first-order tasks; otherwise, reflection periods could themselves fuel time-management problems.
In RQ3, we aimed to explore the potential of learning analytics and reflection amplifiers for first-semester students in facilitating the transition to higher education. The weak correlation between time management and academic achievement, as well as the fact that in our study even persistent students’ time management skills and perceived self-efficacy decreased (if only slightly) over the term, further indicates that irregularities in student study-time depend less on the time management skills students bring to the course and more on the study organizers, who bear responsibility for arranging more balanced environments and better integrated and orchestrated syllabi across the curriculum. This finding is in line with Gerrard et al. , who call for better coordinated scaffolding arrangements for first-year students.
Since time management skills decreased slightly from the first part to the last part of the semester (Table 2), even as non-persistent students removed themselves from the course, greater pressure on time-related skills seems to be inherent in the curriculum. HE institutions have a role to play in measuring and promoting time management skills to counter the potential stress and work overload caused by varying demands on student time and effort. Data provided by students through a combination of learning analytics and reflection amplifiers delivered valuable information for evidence-based decisions to develop academic workload policies. On the one hand, most students (77%) did not know how much time they should devote to study outside contact hours given the number of ECTS credits assigned to a module (see Sect. 3.4 above). On the other hand, the results of the time logs show that students distribute their study-time unevenly throughout the semester without balancing the expected effort for each specific module. Factors like previous knowledge of the subject area (familiarity), the number and difficulty of assignments required throughout the semester, or the assessment type are course elements that affect learning experiences and student engagement [17, 20].
Students do not consider the number of ECTS credits assigned to a module when balancing their weekly workload. They are unaware of the hours that correspond to each subject (see Sect. 3.4.2). Freshmen seem to devote more time to subjects that are familiar to them through prior knowledge (see Sect. 3.4.6). The most time-consuming modules seem to penalize students’ performance in the rest of the modules (see Sect. 3.4.3). Hence, teachers should instruct students about the time requirements of each subject using learning analytics and identify unexpected scenarios to support students as soon as possible. Regular time logs and qualitative learning analytics are useful tools for instructional designers to identify when students are at risk and dropouts begin to occur, and whether these are associated with certain subject modules or specific learning activities (see Sect. 3.4.5).
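To make the ECTS-to-hours relation concrete, the following minimal sketch (not part of the study’s actual tooling; module names and logged hours are hypothetical) converts ECTS credits into an expected weekly self-study budget using the rule of thumb stated in the reflection prompts, namely that a 6-ECTS module implies about 5 h of weekly study outside class, and compares it with a student’s self-reported time logs:

```python
# Illustrative sketch: deriving expected weekly self-study hours from ECTS
# credits and flagging over-/under-investment against a student's time logs.
# Assumption (from the course's rule of thumb): 6 ECTS -> ~5 h/week outside class.

HOURS_PER_ECTS_PER_WEEK = 5 / 6  # derived from "6 ECTS = 5 h/week"

def expected_weekly_hours(ects: float) -> float:
    """Expected weekly self-study hours for a module of the given ECTS weight."""
    return ects * HOURS_PER_ECTS_PER_WEEK

def workload_gaps(logged: dict) -> dict:
    """Difference (in hours) between logged and expected weekly study-time.

    `logged` maps module name -> self-reported hours this week; every module
    is assumed to carry 6 ECTS, as in the semester under study.
    """
    return {module: round(hours - expected_weekly_hours(6), 1)
            for module, hours in logged.items()}

# Hypothetical week of self-reported time logs:
logs = {"Computer Fundamentals": 7.0, "Mathematics": 2.0, "Programming": 5.0}
print(workload_gaps(logs))  # positive = over-investment, negative = under-investment
```

A dashboard built on such gaps could show a freshman at a glance that a familiar module is absorbing hours that the ECTS distribution assigns elsewhere.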
While students’ temporal awareness may not directly affect their academic performance, there is some indication that perceived time requirements influence students’ decision to stay on a course to a greater extent than the actual time spent. Students’ views on individual subjects are influenced by the amount of time they allocate to them. In simplified terms, a module is portrayed as “hard” or “easy” not necessarily based on the level of cognitive difficulty alone, but also in terms of the time and effort needed to succeed. This subjective opinion, even when not based on actual measured time, can become the cause of higher stress levels and work overload, potentially leading to withdrawal. An environment where this can be observed even more clearly is MOOCs. Here, again, the single biggest factor for dropping out is time-related .
Despite the non-conclusive relationship between time management and student grades, the development of SRL in pre-university education, which is to continue in higher education, can be seen as an imperative challenge for the sector as a whole. However, SRL requires flexible and adaptable environments and structures that allow for personalized approaches. If structural constraints are too rigid, there is no room for SRL to flourish. Students, therefore, need “space” to improve their learning strategies, evaluate them against perceived personal success and progress, and personalize their learning experiences. Learning analytics, providing evidence-based feedback, can support such efforts, but, at the same time, teaching staff are required to establish realistic and integrated workload expectations that can also serve as benchmarks. Learning analytics measurements on a regular, if not continuous, basis would be able to inform faculty on how to adapt their syllabi in terms of complexity, volume, assignments and assessments to arrive at a workload balance avoiding spikes and troughs of activity that put unbalanced pressure on students. In this way, it is the alignment of learning designs combined with learning analytics that would enable improved curricula leading to better learning and higher retention. This view is also expressed in the growing body of recent literature on “learning analytics for learning design” (LALD) (cf. e.g., [36, 62, 63]). Our hitherto limited experiment still supports the view expressed by Claessens et al.  that effective time management through self-monitoring and logging of study-time by the students has the potential to improve the perceived control of time and student satisfaction, while reducing stress and anxiety. We can add to this that it will not work in isolation, but only in concert with institutional and curricular arrangements.
Further research is needed to see whether more flexible, but clearly communicated self-determination of student study-time would result in clearer linkages between SRL time management and academic performance. Educational stakeholders should clearly identify and report how many weekly hours of study per subject are required to obtain the best learning outcomes. These beacons would probably help students to better plan their week, setting well-defined personal goals. At the same time, the curricular orchestration of simultaneously running learning designs can be helped by regular iterations of study-time inquiries through learning analytics data.
In our multi-part investigation, raising awareness among students on how they spend their study-time turned out not to be a valid measure for enhancing their time management skills, nor did it lead to better achievements. While we found the regular self-observation and self-reporting by way of time logs generally useful, automated tracking and feedback on how well students distribute their time perhaps would have helped to keep them better engaged. In a similar fashion to leisure-time monitoring and self-awareness via various mobile apps and wearable technologies (cf. the “Quantified Self” movement ), closer unobtrusive monitoring providing feedback in case of unusual behaviors would help to understand where time is being devoted, but also to potentially identify bad practices of both teachers and students.
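The kind of closer, unobtrusive monitoring alluded to above could be prototyped very simply. The sketch below is a hypothetical illustration (not part of the study’s instrumentation; the data and threshold are invented) of automated feedback that flags weeks in which a student’s logged study-time deviates strongly from their own baseline, using a plain z-score rule:

```python
# Hypothetical sketch of automated time-log feedback: flag weeks whose logged
# hours deviate from the student's own mean by more than `threshold` standard
# deviations. Threshold and data are illustrative, not from the study.
from statistics import mean, stdev

def unusual_weeks(weekly_hours, threshold=1.5):
    """Return 0-based indices of weeks with atypical logged study-time."""
    if len(weekly_hours) < 3:
        return []  # too little history to judge
    mu, sigma = mean(weekly_hours), stdev(weekly_hours)
    if sigma == 0:
        return []  # perfectly regular logging, nothing to flag
    return [i for i, h in enumerate(weekly_hours)
            if abs(h - mu) / sigma > threshold]

hours = [5, 6, 5, 4, 6, 0, 5, 14]  # includes an idle week and a cramming spike
print(unusual_weeks(hours))
```

With this illustrative threshold, the 14-hour cramming spike in the final week is flagged; such a signal could prompt a reflective nudge to the student, or alert staff to a deadline pile-up in the curriculum.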
Our data clearly show an imbalance across a single curriculum as experienced by the students, despite equal weight in terms of ECTS credits. Exploiting the findings of this research exercise could, therefore, initiate innovative actions in curricular management to arrive at greater congruence in the learning experiences of individuals across parallel modules and courses. On these insights, a curricular planning system for balanced student workload can be created and further developed, making use of emergent learning analytics tools and methods, while at the same time motivating students to self-monitor and self-evaluate their time investments through their learning data, in order to alleviate stress factors and to reduce dropout and procrastination caused by work overload. The next step of this investigation is the establishment of an academic trans-disciplinary workgroup to discuss curricular adaptations in tasks, timetable, etc., and the reasoning behind the student workload of individual subjects. It will be interesting to compare student experiences with staff expectations and arguments. The direction this research will take is to further investigate the time-related course elements and study dispositions of students, so as to arrive at a full understanding of student workload and curricular expectations, combined with deferring greater ownership of the learning process to the students through SRL.
Coming back to the research question, from a methodological point of view, the triangulation process applied in the experiment suggests that learning analytics should not be dealt with in isolation. A refined picture of study-time in the course is obtained when learning analytics (applied to tests or exams) is coupled with structured, frequent, and active information input by students. This input (here obtained through reflection amplifiers) can in turn become a basis of its own, whereupon a learning analytics process can be applied.
Exploiting the findings of this research exercise could, therefore, initiate innovative actions in curriculum management to arrive at greater congruence in the learning experiences of individuals across parallel modules and courses. It holds the promise of improving universal access by reducing student withdrawals on the grounds of work overload.
Several aspects limited this research. On the one hand, all data measurements analyzed in this article were collected in the sessions of a single subject module (CF). It would be of interest to investigate what the results would have been had the measurements and reflection exercises been performed in the sessions of all subject modules comprised within the first semester. On the other hand, data collection was carried out every two weeks rather than every week. This may have contributed to students’ failure to internalize the reflection process as a routine integrated in their weekly duties as professional learners .
van Rooij, E.C.M., Jansen, E.P.W.A., van de Grift, W.J.C.M.: First-year university students’ academic success: the importance of academic adjustment. Eur. J. Psychol. Educ. 33, 749–767 (2018). https://doi.org/10.1007/s10212-017-0347-8
Aldridge, S., Rowley, J.: Conducting a withdrawal survey. Qual. High. Educ. (2001). https://doi.org/10.1080/13538320120045085
Meeuwisse, M., Severiens, S.E., Born, M.P.: Reasons for withdrawal from higher vocational education. A comparison of ethnic minority and majority non-completers. Stud. High. Educ. 35, 93–111 (2010). https://doi.org/10.1080/03075070902906780
Harrison, N.: The impact of negative experiences, dissatisfaction and attachment on first year undergraduate withdrawal. J. Furth. High. Educ. (2006). https://doi.org/10.1080/03098770600965383
Charlton, J.P., Barrow, C., Hornby-Atkinson, P.: Attempting to predict withdrawal from higher education using demographic, psychological and educational measures. Res. Post-Compuls. Educ. (2006). https://doi.org/10.1080/13596740500507904
Wilcox, P., Winn, S., Fyvie-Gauld, M.: “It was nothing to do with the university, it was just the people”: The role of social support in the first-year experience of higher education. Stud. High. Educ. (2005). https://doi.org/10.1080/03075070500340036
Goldfinch, J., Hughes, M.: Skills, learning styles and success of first-year undergraduates. Act. Learn. High. Educ. (2007). https://doi.org/10.1177/1469787407081881
Yorke, M.: Student withdrawal during the first year of higher education in England. J. Inst. Res. Australas. 8, 17–35 (1999)
Prégent, R., Huguette, B., Anastassis, K.: Enseigner à l’université dans une approche-programme: guide à l’intention des nouveaux professeurs et chargés de cours. Presses inter Polytechnique (2009)
Karjalainen, A., Alha, K., Jutila, S.: Give me time to think: determining student workload in higher education; written as part of the project titled “Five years, two degrees”, funded by the Ministry of Education, 2004–2006. Oulu University Press, Finland (2006)
Kember, D.: Interpreting student workload and the factors which shape students’ perceptions of their workload. Stud. High. Educ. 29, 165–184 (2004)
Kyndt, E., Berghmans, I., Dochy, F., Bulckens, L.: ‘Time is not enough.’ Workload in higher education: a student perspective. High. Educ. Res. Dev. 33, 684–698 (2014)
Tampakis, A., Vitoratos, E.: Estimation of students workload: correlation of teaching and learning methods with examination results. A case study. In: Internationalisation and the Role of University Networks Proceedings of the 2009 EMUNI Conference on Higher Education and Research, pp. 1–20 (2009)
Gerrard, M.D., Newfield, K., Asli, N.B., Variawa, C.: Are students overworked? Understanding the workload expectations and realities of first-year engineering. In: ASEE Annual Conference & Exposition, Columbus, Ohio (2017). https://doi.org/10.18260/1-2--27612
Daele, A., Berthiaume, D., Rochat, J.M., Sylvestre, E.: Évaluer la charge de travail des étudiants: enjeux, méthodes et propositions pour l’organisation des cursus universitaires. Actes des Commun. par affiche des Commun. Individ. (Sessions 1 à 6), pp. 440–441 (2012)
Samoilova, E., Keusch, F., Wolbring, T.: Learning analytics and survey data integration in workload research. Zeitschrift für Hochschulentwicklung. Spec. Ed. Learn. Anal. Implic. High. Educ. 12, 65–78 (2017). https://doi.org/10.3217/zfhe-12-01/04
Black, T.S.: Education students’ first year experience on a regional university campus (2015)
Claessens, B.J.C., Van Eerde, W., Rutte, C.G., Roe, R.A.: A review of the time management literature. Pers. Rev. 36, 255–276 (2007). https://doi.org/10.1108/00483480710726136
Nordmann, E., Horlin, C., Hutchison, J., Murray, J.-A., Robson, L., Seery, M.K., MacKay, J.R.D.: Ten simple rules for supporting a temporary online pivot in higher education. PLOS Comput. Biol. 16, e1008242 (2020). https://doi.org/10.1371/journal.pcbi.1008242
Wolters, C.A., Brady, A.C.: College students’ time management: a self-regulated learning perspective. Educ. Psychol. Rev. (2020). https://doi.org/10.1007/s10648-020-09519-z
Zimmerman, B.J.: Attaining self-regulation: A social cognitive perspective. In: Handbook of Self-Regulation. Elsevier, pp. 13–39 (2000)
Panadero, E.: A review of Self-regulated Learning: Six models and four directions for research. Front. Psychol. 8 (2017). https://doi.org/10.3389/fpsyg.2017.00422
Zimmerman, B.J., Bonner, S., Kovach, R.: Developing self-regulated learners: beyond achievement to self-efficacy (1996). https://doi.org/10.1037/10213-000
Ramdass, D., Zimmerman, B.J.: Developing self-regulation skills: the important role of homework. J. Adv. Acad. 22, 194–218 (2011). https://doi.org/10.1177/1932202X1102200202
Stoeger, H., Ziegler, A.: Evaluation of a classroom based training to improve self-regulation in time management tasks during homework activities with fourth graders. Metacogn. Learn. 3, 207–230 (2008). https://doi.org/10.1007/s11409-008-9027-z
Häfner, A., Stock, A., Oberst, V.: Decreasing students’ stress through time management training: an intervention study. Eur. J. Psychol. Educ. 30, 81–94 (2015). https://doi.org/10.1007/s10212-014-0229-2
Dörrenbächer, L., Perels, F.: More is more? Evaluation of interventions to foster self-regulated learning in college. Int. J. Educ. Res. 78, 50–65 (2016). https://doi.org/10.1016/j.ijer.2016.05.010
van der Meer, J., Jansen, E., Torenbeek, M.: ‘It’s almost a mindset that teachers need to change’: first-year students’ need to be inducted into time management. Stud. High. Educ. 35, 777–791 (2010). https://doi.org/10.1080/03075070903383211
Ukpong, D.E., George, I.N.: Length of study-time behaviour and academic achievement of social studies education students in the University of Uyo. Int. Educ. Stud. 6, 172–178 (2013). https://doi.org/10.5539/ies.v6n3p172
Lincoln, M., Adamson, B.J., Covic, T.: Teaching time and organizational management skills to first year health science students: does training make a difference? J. Furth. High. Educ. 28, 261–276 (2004). https://doi.org/10.1080/0309877042000241742
Stevens, A.E., Hartung, C.M., Shelton, C.R., LaCount, P.A., Heaney, A.: The effects of a brief organization, time management, and planning intervention for at-risk college freshmen. Evid. Based Pract. Child Adolesc. Ment. Heal. 4, 202–218 (2019). https://doi.org/10.1080/23794925.2018.1551093
Macan, T.H., Shahani, C., Dipboye, R.L., Phillips, A.P.: College students’ time management: correlations with academic performance and stress. J. Educ. Psychol. 82, 760 (1990)
Kimber, C.T.: The effect of training in self-regulated learning on math anxiety and achievement among preservice elementary teachers in a freshman course in mathematics concepts. Diss. Abstr. Int. A, Humanit. Soc. Sci. 70 (2009)
Johnson, P.E., Perrin, C.J., Salo, A., Deschaine, E., Johnson, B.: Use of an explicit rule decreases procrastination in university students. J. Appl. Behav. Anal. 49, 346–358 (2016). https://doi.org/10.1002/jaba.287
Thomvas, V., Koivuniemi, M., Järvenoja, H., Greller, W.: SLIDEshow deliverable IO6: Evidence-based stakeholder reports (2019)
Pishtari, G., Rodríguez-Triana, M.J., Sarmiento-Márquez, E.M., Pérez-Sanagustín, M., Ruiz-Calleja, A., Santos, P., P. Prieto, L., Serrano-Iglesias, S., Väljataga, T.: Learning design and learning analytics in mobile and ubiquitous learning: a systematic review. Br. J. Educ. Technol. 51, 1078–1100 (2020). https://doi.org/10.1111/bjet.12944
Kim, D., Yoon, M., Jo, I.-H., Branch, R.M.: Learning analytics to support self-regulated learning in asynchronous online courses: a case study at a women’s university in South Korea. Comput. Educ. 127, 233–251 (2018)
Kitto, K., Lupton, M., Davis, K., Waters, Z.: Designing for student-facing learning analytics. Australas. J. Educ. Technol. 33, 152–168 (2017). https://doi.org/10.14742/ajet.3607
De Laet, T., Broos, T., Pinxten, M., Vanhoudt, J., Verbert, K., Van Soom, C. and Langie, G.: Learning and study strategies: a learning analytics approach for feedback. In: European First Year Experience Conference (2017)
Tabuenca, B., Kalz, M., Drachsler, H., Specht, M.: Time will tell: the role of mobile learning analytics in self-regulated learning. Comput. Educ. 89, 53–74 (2015). https://doi.org/10.1016/j.compedu.2015.08.004
Greller, W., Drachsler, H.: Translating learning into numbers: a generic framework for learning analytics. Educ. Technol. Soc. 15, 42–57 (2012)
Tsai, Y.-S., Scheffel, M., Gašević, D.: Enabling systematic adoption of learning analytics through a policy framework. In: European Conference on Technology Enhanced Learning, pp. 556–560 (2018). https://doi.org/10.1007/978-3-319-98572-5_44
Greller, W., Santally, M.I., Boojhawon, R., Rajabalee, Y., Kevin, R.: Using learning analytics to investigate student performance in blended learning courses. J. High. Educ. Dev. 12, 37–63 (2017)
Gašević, D., Dawson, S., Rogers, T., Gasevic, D.: Learning analytics should not promote one size fits all: the effects of instructional conditions in predicting academic success. Int. High. Educ. 28, 68–84 (2016)
Dunbar, R.L., Dingel, M.J., Prat-Resina, X.: Connecting analytics and curriculum design: process and outcomes of building a tool to browse data relevant to course designers. J. Learn. Anal. 1, 223–243 (2014)
Hausman, M., Verpoorten, D., Defaweux, V., Detroz, P.: Learning analytics: a lever for professional development of teachers? In: Handbook of Research on Operational Quality Assurance in Higher Education for Life-Long Learning, 308–335 (2019)
Drachsler, H., Greller, W.: The pulse of learning analytics understandings and expectations from the stakeholders. In: Proceedings of the 2nd International Conference on Learning Analytics and Knowledge, pp. 120–129 (2012). https://doi.org/10.1145/2330601.2330634
Biggs, J.B.: The role of meta-learning in study processes. Br. J. Educ. Psychol. 55, 185–212 (1985). https://doi.org/10.1111/j.2044-8279.1985.tb02625.x
Ellis, R.A., Han, F., Pardo, A.: Improving learning analytics—combining observational and self-report data on student learning. J. Educ. Technol. Soc. 20, 158–169 (2017)
Tabuenca, B., Kalz, M., Ternier, S., Specht, M.: Stop and think: exploring mobile notifications to foster reflective practice on meta-learning. IEEE Trans. Learn. Technol. 8, 1–12 (2015). https://doi.org/10.1109/TLT.2014.2383611
Verpoorten, D.: Reflection amplifiers in self-regulated learning (2012). http://dspace.ou.nl/handle/1820/4560
Bond, M.J., Feather, N.T.: Some correlates of structure and purpose in the use of time. J. Pers. Soc. Psychol. 55, 321 (1988)
Britton, B.K., Tesser, A.: Effects of time-management practices on college grades. J. Educ. Psychol. 83, 405–410 (1991). https://doi.org/10.1037/0022-0663.83.3.405
Nunnally, J.C., Bernstein, I.: Psychometric Theory. McGraw-Hill, New York, USA (1967)
Evans, J.D.: Straightforward Statistics for the Behavioral Sciences. Brooks/Cole Publishing Company (1996)
Broadbent, J., Poon, W.L.: Self-regulated learning strategies & academic achievement in online higher education learning environments: a systematic review. Int. High. Educ. 27, 1–13 (2015). https://doi.org/10.1016/j.iheduc.2015.04.007
Bowyer, K.: A model of student workload. J. High. Educ. Policy Manag. 34, 239–258 (2012)
Greller, W.: Personalised Evidence-Based Practice Framework. IO3 of the SLIDEshow project (2018)
Whitelock, D., Thorpe, M., Galley, R.: Student workload: a case study of its significance, evaluation and management at the Open University. Distance Educ. 36, 161–176 (2015)
Iqbal, A.K., Zeb, A., Ahmad, S., Ullah, R.: Relationship between university students’ time management skills and their academic performance. Rev. Econ. Dev. Stud. 5 (2020). https://doi.org/10.26710/reads.v5i4.900
Clow, D.: MOOCs and the funnel of participation. In: Proceedings of the Third International Conference on Learning Analytics and Knowledge, pp. 185–189 (2013). https://doi.org/10.1145/2460296.2460332
Lockyer, L., Heathcote, E., Dawson, S.: Informing pedagogical action: aligning learning analytics with learning design. Am. Behav. Sci. 57, 1439–1459 (2013)
Rodríguez-Triana, M.J., Martínez-Monés, A., Asensio-Pérez, J.I., Dimitriadis, Y.: Scripting and monitoring meet each other: aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. Br. J. Educ. Technol. 46, 330–343 (2015)
Wolf, G.: Know thyself: Tracking every facet of life, from sleep to mood to pain. Wired Mag. 365 (2009)
Tabuenca, B., Verpoorten, D., Ternier, S., Westera, W., Specht, M.: Fostering reflective practice with mobile technologies. In: Proceedings of the 2nd Workshop on Awareness and Reflection in Technology Enhanced Learning, pp. 87–100 (2012)
Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature. This work was partially funded by the Madrid Regional Government through the e-Madrid-CM Project under grant S2018/TCS-4307, a project which is co-funded by the European Structural Funds (FSE and FEDER). Partial support has also been received from the European Commission through the Erasmus + Strategic Partnerships for higher education project TEASPILS (2020-1-ES01-KA203-082258). This publication reflects the views only of the authors and funders cannot be held responsible for any use which may be made of the information contained therein.
Conflict of interest
The authors declare that they have no conflict of interest.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix A: Reflection amplifiers (A–H) prompted in 6 week-points (wp) over the semester
Which modules required most of your study-time this week?
How much time did you devote to study the most time-consuming module? (not counting class hours)
How much time did you devote to study Computer Fundamentals this week? (not counting class hours)
Which modules required most of your study-time this week?
Do you think those modules imbalance your study-time in the rest of the modules?
How much time did you devote to study Computer Fundamentals this week? (not counting class hours)
Which modules required most of your study-time this week?
The Computer Fundamentals module has 6 ECTS credits. Do you know how much study-time / WEEKLY work corresponds to it? (not counting class hours)
How much time did you devote to study Computer Fundamentals this week? (not counting class hours)
Which modules required most of your study-time this week?
A module of 6 ECTS credits requires 5 h of study-time per week outside the classroom. How much time did you devote to the most time-consuming module this week? (not counting class hours)
How much study-time do you expect to devote to Computer Fundamentals in the current week? (not counting class hours)
Which modules required most of your study-time this week?
How much time did you devote to study Computer Fundamentals this week? (not counting class hours)
How much study-time do you expect to devote to Computer Fundamentals next week? (not counting class hours)
Which modules required most of your study-time this week?
What activities within those modules required more time throughout the course?
How much time did you devote to study Computer Fundamentals this week? (not counting class hours)
Tabuenca, B., Greller, W. & Verpoorten, D. Mind the gap: smoothing the transition to higher education fostering time management skills. Univ Access Inf Soc 21, 367–379 (2022). https://doi.org/10.1007/s10209-021-00833-z
- Curriculum development
- Design for all
- Learning analytics
- Mobile learning
- Self-regulated learning
- Time management