1 Introduction

With the growing availability of digital technology, students are exposed to a constant stream of digital distractions, which can lead to conflicts with their learning goals. On one hand, students may be drawn to engaging in enjoyable activities, such as browsing social media or watching entertaining videos. On the other hand, students may also recognize the importance of achieving long-term academic goals, which require focus and effort. This self-control conflict between immediate gratification and delayed rewards is a growing problem that negatively impacts students’ academic performance and psychological well-being (Jamet et al., 2020; Masood et al., 2020; Ratan et al., 2021; Tran et al., 2019). Studies have shown that students turn to digital distractions frequently, often averaging only a few minutes of uninterrupted study time before switching to digital media (Calderwood et al., 2014; Rosen et al., 2013). Nonetheless, there are tools that students can use to mitigate the negative effects of these distractions.

In recent years, digital self-control tools (DSCTs) have emerged to support the self-control of students (Biedermann et al., 2021; Lyngs et al., 2019). DSCTs come in a variety of forms. There are, for instance, website blockers that allow users to block access to digital content on their devices (e.g., Kim et al., 2017b), visualizations that show users how much time they spend on digital content, in the hope that this will lead to a change in behaviour (e.g., Y.-H. Kim et al., 2016), or reminders that pop up when a user spends too much time on digital distractions (e.g., Kim et al., 2019a, b). Aside from these very common examples, there are many more such tools with a variety of ingenious features to support self-control and mitigate distractions (Lyngs et al., 2019). Some of these we will present in later sections of this paper, and we encourage interested readers to visit the app stores of their preferred device to view the diversity of tools for themselves.

Indeed, most digital devices now come with pre-installed DSCTs, such as the “Digital Wellbeing” app on Android devices, or the “Focus” app on iOS devices. The investment of device manufacturers in tools to reduce device use, despite their vested interest in the opposite, underscores that digital distractions have been recognized as a severe problem. A yet unresolved question is whether DSCTs actually address this problem effectively, especially when it comes to alleviating the negative effects of distractions on academic achievement (Biedermann et al., 2021).

The solution appears obvious: a student who suffers from frequently watching videos instead of learning could install a website blocker. Once the content is no longer accessible, the problem should be solved. Why, then, does this not happen whenever a student suffers from digital distractions?

As the first possible obstacle to the widespread use of self-control tools, we suspect that they may simply not be well known enough. There are all kinds of helpful things that people do not know about, and we assume that DSCTs are no exception. Even the pre-installed apps are not activated by default, and users must discover them on their devices. To our knowledge, there has been no research to date on how widespread the use of DSCTs is.

A second barrier may be found in a lack of motivation. Students might be aware of DSCTs, but not be sufficiently motivated to use them to limit digital distractions. They might recognize that distractions are an issue but feel that watching videos is simply too enjoyable, and a website blocker would prevent this enjoyment. Perhaps the deadline for the essay is several weeks in the future.

The next hurdle is selecting the right DSCT. Due to differences in the way a particular feature is implemented, and due to interindividual differences in how one responds to a particular feature, not every person benefits to the same extent from the same DSCT.

1.1 The role of differences in DSCTs

1.1.1 Differences due to implementation details

Tools that nominally have the same mechanism can implement their self-control features very differently. This, of course, affects how users perceive the tool, because expectations of effectiveness and ease of use play a major role in the adoption of technologies (Granić & Marangunić, 2019; Nguyen, 2022). For example, the success of content blocking is related to the ease with which the blocking can be disabled. When users had to enter a code of random digits before they could lift a block, the effectiveness was significantly higher for longer codes. At the same time, the longer codes also involved more effort, and therefore tended to be less popular (Kim et al., 2019). Similar observations were made when users set time limits for device usage: limits worked well only in conjunction with an automatic lockout from the device once the limit was reached. If users could extend their limits freely, they mostly did just that and clicked their reminders away (Kim et al., 2019a, b). In general, DSCTs that do not apply any restrictions and rely purely on users monitoring themselves tend to be ineffective (Loid et al., 2020; Terry et al., 2016; Zimmermann, 2021).

1.1.2 Differences in habitual media use

Of course, there is a plethora of individual differences that may lead to differential preferences for a DSCT. But the actual effectiveness, independent of preference, could be influenced by the extent to which the media use is habitual. Habits are automatic and unconscious behaviours triggered by environmental cues that bypass conscious, goal-oriented decision-making processes (Wood & Rünger, 2016). This automaticity can render certain DSCTs, which require cognitive effort and conscious decision-making, less effective. Following the taxonomy of Lyngs et al. (2019), DSCTs with self-tracking and goal-advancement features work by comparing current behaviour with specified goals, which implies that conscious decisions about goals are being made. Preventive features such as blocking, on the other hand, should be better suited to curbing unwanted habitual behaviour because they remain active even when no conscious goal is driving the behaviour.

1.1.3 Challenges due to dual use of media

We also see a special challenge when using DSCTs while learning: the same platforms can be relevant for both on- and off-task purposes. YouTube has a lot of entertaining videos, but it also hosts lecture videos and a lot of high-quality educational content. Similarly, a social network might be the source of interruptions, but also the place where other students exchange essential information about a course (Hrastinski & Aghaee, 2012). Thus, completely preventing access is not always a viable option. Users have to micro-manage and activate or deactivate their DSCT whenever they start or stop learning, possibly even depending on the specific task they work on. Micro-managing requires the user to switch from their original task to the task of managing their tool. Task switching requires effort and can result in poorer performance of the original task (Rubinstein et al., 2001). Consequently, it appears that users will start to circumvent or even disable restrictive mechanisms frequently (cf. Kim et al., 2019).

In the previous sections, we have sketched the following scenario: on the one hand, there are multiple interventions, some of which have been shown to be effective in a number of studies (Holte & Ferraro, 2020; Kim et al., 2019b; Ko et al., 2015; Lyngs et al., 2020; Tseng et al., 2019). On the other hand, we see that digital distractions continue to be a problem that DSCTs seem to not fully address. We have gone through several possible reasons for this: insufficient awareness, deficiencies in the implementation, and a poor fit between the individual and the chosen tool.

In this study, we examine whether and to what extent these assumptions hold true in a sample of higher education students.

2 The present study

We administered a survey to investigate the prevalence of different DSCT features among students, their knowledge about them, and the perceived helpfulness of DSCT features for reducing digital distractions during learning. We focus on higher education students because they are typically of full age and, therefore, rather free in their time management, with less external regulation. Even in university classrooms, media use is typically unregulated (Wammes et al., 2019). It can be assumed that DSCT use among higher education students is voluntary and self-determined. Of course, DSCTs can also have a benefit for school children, but the environment is fundamentally different, with more external regulation from school policies (Tandon et al., 2020) and from parents (Nikken & De Haan, 2015).

We tailored our questions to the perception of individual features (e.g., website blocking, goal setting) rather than specific tools. This was to avoid the potential confounding influence of multifaceted tools, which typically combine numerous features such as website blocking, usage visualization, and goal-prompting within a single application (Lyngs et al., 2019). This has the benefit of generating potential insights that are not tied to specific applications and instead shed light on the underlying mechanisms and affordances associated with different features (Ko et al., 2015; Kovacs et al., 2018).

First, we wanted to rule out that DSCTs fail simply because users are not aware that they exist. Put differently, if users do not know about a solution, then it cannot help them. Thus, our first research question (RQ) addressed the knowledge about and actual use of DSCTs.

RQ1-Knowledge & use: How widespread are knowledge about and use of DSCT features?

Building on this, our second research question aimed to investigate whether participants perceived DSCT features as actually helpful in mitigating the negative impact of digital distractions.

RQ2-Helpfulness: How do participants perceive the helpfulness of DSCT features in reducing digital distractions during learning?

Further, we examine the connection between habitual use of media and the helpfulness of DSCTs features. Habitual behaviour bypasses goal-directed behaviour, and we therefore suspect that DSCTs which rely on an individual’s assessment of conscious goals are less useful in these cases (Chang et al., 2017; Pinder et al., 2018).

RQ3-Habits: How are the perceived helpfulness of DSCT features and habitual use of digital media correlated?

There are indications that users are often frustrated with the DSCTs that they use, and that these frustrations may cause users to stop using an otherwise helpful tool (Lyngs et al., 2022). We would therefore like to learn more about why users stop using a tool after having adopted it.

RQ4-Stopping: What are the reasons that people have for stopping the use of a DSCT?

3 Methods

To investigate our research questions, we employed a correlational survey design, sampling a wide variety of people to account for the breadth of experiences with and attitudes toward DSCTs. In addition to the quantitative section of our survey, we also gathered qualitative information via free-text responses. For recruitment, we reached participants via social media, word of mouth, mailing lists, and the Prolific recruitment platform.

3.1 Participants

A total of 403 participants completed the questionnaire over a period of three months between May and August 2022. Ethical approval for the study was obtained from the ethics committee of the DIPF | Leibniz Institute for Research and Information in Education. All participants had to consent to the use of their data for the study prior to starting the questionnaire.

We excluded all participants who did not complete the full questionnaire (n = 46). Due to our focus on the use of DSCTs for learning scenarios, we excluded all participants who were not enrolled students (university, school, or vocational) or Ph.D. candidates (n = 132). To ensure data quality, the questionnaire contained a screener question; participants who did not respond to this question as instructed were excluded from the analysis (n = 35). The final number of participants was 273. The mean age of the participants was 27.18 years (SD = 7.56); 84 were male, 186 female, and three diverse. Countries of residence were Germany (141), the UK (77), the US (24), Ireland (10), Australia (4), Austria (4), Italy (3), the Netherlands (3), Portugal (2), France (1), the Philippines (1), Norway (1), Greece (1), and Colombia (1). Regarding their occupation, 219 responded that they were enrolled students (202 university, 17 vocational), and 54 were Ph.D. candidates.

3.2 Instruments

3.2.1 Questions about the extent and effect of digital distractions during learning

We used one item to ask about the frequency of digital distractions in general (scale from 1 (never) to 5 (very often)), one item to ask whether digital distractions are a hindrance to reaching their learning goals (scale from 1 (does not apply) to 4 (applies)), one item to ask about suffering from digital distractions (scale from 1 (does not apply) to 4 (applies)), and one item for the self-efficacy of being able to reduce digital distractions during learning (scale from 1 (does not apply) to 4 (applies)). Satisfaction with learning outcomes was assessed with two items (scale from 1 (dissatisfied) to 4 (satisfied)) (e.g., “In general, how satisfied are you with the results that you get from your learning?”, Cronbach’s 𝛼 = 0.74).

3.2.2 Habituality of media use

We measured the degree of habitual media use with a modified version of the self-report habit index (SRHI) developed by Verplanken and Orbell (Verplanken & Orbell, 2003). The SRHI is a widely used measure of habit strength and has been shown to have good reliability and validity in previous research (Gardner et al., 2012; Verplanken & Orbell, 2003). We used only the four items that, according to Gardner et al. (2012), can measure the automaticity of habits (termed the “Self-Report Behavioural Automaticity Index”; SRBAI). These items are: ‘[Behaviour X is something…]’ ‘…I do automatically’, ‘…I do without having to consciously remember’, ‘…I do without thinking’, and ‘…I start doing before I realise I’m doing it’. Each of these items was rated on a Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree).

We used these items to measure the habitual use of the following behaviours: using social media (“Using social media while or instead of learning is something that …”, 𝛼 = 0.91), watching videos (“Watching videos or series while or instead of learning is something that …”, 𝛼 = 0.92), surfing the internet (“Surfing the internet while or instead of learning is something that …”, 𝛼 = 0.92), gaming (“Playing video games (e.g. on PC, console, or smartphone) while or instead of learning is something that …”, 𝛼 = 0.89), and chatting (“Communicating with others via messenger services while or instead of learning is something that…”, 𝛼 = 0.89). To capture the presence of habitual media-related distractions in general, we report each participant’s highest SRBAI score across these behaviours (MaxSRBAI).
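
To make the scoring concrete, the following is a minimal sketch of how the SRBAI and MaxSRBAI scores could be computed. The data frame layout and column names are hypothetical; the sketch only illustrates the procedure of averaging the four automaticity items per behaviour and taking the participant-wise maximum.

```python
# Minimal sketch of the SRBAI scoring, assuming a wide-format data frame in which
# each behaviour has four item columns (e.g., "social_1" ... "social_4") rated from
# 1 (strongly disagree) to 5 (strongly agree). Column names are hypothetical.
import pandas as pd

BEHAVIOURS = ["social", "video", "surfing", "gaming", "chatting"]

def score_srbai(df: pd.DataFrame) -> pd.DataFrame:
    """Average the four automaticity items per behaviour and take the
    participant-wise maximum across behaviours (MaxSRBAI)."""
    scores = pd.DataFrame(index=df.index)
    for behaviour in BEHAVIOURS:
        items = [f"{behaviour}_{i}" for i in range(1, 5)]
        scores[behaviour] = df[items].mean(axis=1)
    scores["MaxSRBAI"] = scores[BEHAVIOURS].max(axis=1)
    return scores

# Two fictitious participants for illustration:
example = pd.DataFrame({
    "social_1": [5, 2], "social_2": [4, 1], "social_3": [5, 2], "social_4": [4, 1],
    "video_1": [3, 5], "video_2": [2, 5], "video_3": [3, 4], "video_4": [2, 5],
    "surfing_1": [1, 3], "surfing_2": [1, 3], "surfing_3": [2, 2], "surfing_4": [1, 3],
    "gaming_1": [1, 1], "gaming_2": [1, 1], "gaming_3": [1, 1], "gaming_4": [1, 1],
    "chatting_1": [4, 2], "chatting_2": [4, 2], "chatting_3": [3, 2], "chatting_4": [4, 2],
})
print(score_srbai(example))
```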

3.2.3 Knowledge and perceived helpfulness of self-control tools

Knowledge of and prior experience with DSCT features were rated on a per-feature basis, with questions about one feature on one page. We composed the list of features based on existing DSCT research (Biedermann et al., 2021; Lyngs et al., 2019; Roffarello & De Russis, 2022), personal experience, and feedback from pilot studies.

  • Autoclose: A feature that automatically closes apps or websites after a specified amount of time.

  • Blocking: A feature that blocks access to specific apps or websites.

  • Delay: A feature that uses delay of gratification mechanisms, such as making the user wait or solve a task before accessing an app or website.

  • Modification: A feature that removes or modifies particularly distracting features from websites.

  • Gamification: A feature that uses game-like elements, such as rewards, to motivate users to engage in less distracting behaviour.

  • Goals: A feature that prompts users to create and track progress towards specific goals.

  • Pomodoro: A feature that supports the use of the Pomodoro technique, where the user specifies time intervals (typically 25 min) that are reserved for focused work, after which they are allowed a short break (typically 5 min) to do whatever they like.

  • Compare: A feature that enables users to share and compare their progress with others, which can help users stay motivated and accountable.

  • Screenshare: A feature that allows users to create learning groups and monitor each other’s device use to encourage less distraction during learning.

  • Visualizations: A feature that provides visualizations of device usage to help users monitor and reflect on their own behaviour.

For each of these features, we created a written description and an example of a tool that has this feature at its core. We asked the participants whether they were aware that the feature exists and whether they currently use it or have used it in the past. We asked whether they think that this feature is helpful to reduce digital distractions on a scale from 1 (does not apply) to 5 (applies). We used perceived helpfulness here because it became apparent in the piloting of our questionnaire that participants found terms like effectiveness more difficult to assess than the subjective perceived helpfulness.

If a participant reported that they had previously used a DSCT but stopped doing so, or if they stated that they used a DSCT feature less than they used to, we asked them to elaborate on the reasons in a free-text field. Participants also had to enter the name of the tool containing this feature in a text field.

3.2.4 Coding of free text responses

The free-text responses about reasons for stopping tool use were first coded by one author in a round of open coding by reading all responses and creating an initial coding scheme. The remaining authors then evaluated the initial scheme by independently applying the codes to 50 responses. After discussing modifications to the coding scheme, the first author coded all the responses with the revised coding scheme, and the other authors each coded half of the responses. Pairwise intercoder agreements were 85.82% and 77.42% (M = 81.62%). The remaining discrepancies were resolved through group discussion until a consensus was reached.
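
For transparency, the following sketch shows how a pairwise percent agreement of this kind can be computed. It is simplified in that it assumes one code per response, whereas in our coding a response could carry multiple codes; the example code labels are fabricated.

```python
# Simplified sketch of pairwise percent agreement between two coders, assuming a
# single code per response. (With multiple codes per response, code sets would
# have to be compared instead.) The example labels are fabricated.
def percent_agreement(codes_a: list[str], codes_b: list[str]) -> float:
    assert len(codes_a) == len(codes_b), "both coders must code the same responses"
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100 * matches / len(codes_a)

coder_1 = ["no_longer_needed", "too_restrictive", "annoyance", "circumvention"]
coder_2 = ["no_longer_needed", "too_restrictive", "lack_of_motivation", "circumvention"]
print(f"{percent_agreement(coder_1, coder_2):.2f}%")  # 75.00%
```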

In total, the participants gave 382 responses. Six responses were incomprehensible, and in 21 cases the participants either misread the question or responded to a different question, bringing the number of coded responses to 355. A response could receive multiple codes. We used the MaxQDA software for the coding process.

4 Results

4.1 Extent and effect of digital distractions during learning

Regarding the frequency of digital distractions, no participant stated that they never distracted themselves with digital media. Meanwhile, 19 (6.96%) said that they do so rarely, 80 (29.30%) now and then, 114 (41.75%) often, and 60 (21.98%) very often. For the question of whether the participants suffered from the digital distractions, the average rating was 2.80 (SD = 0.95). Goal hindrance was rated similarly, with a mean of 2.80 (SD = 0.83). However, most people thought they could manage to reduce their digital distractions, and the mean rating for the self-efficacy question was 2.83 (SD = 0.86) (see Fig. 1).

Fig. 1 Distribution of responses to the questions regarding the extent of digital distractions during learning

4.2 Knowledge and use of self-control tools

We explored which DSCT features the participants knew about, and which DSCT features the participants had previous experience with. For details about knowledge and usage, see Table 1. Knowledge of features varied between 6.59% (n = 18) of participants for Delay and 66.67% (n = 182) of participants for Screenshare. Prior experience (either current use or previous use that had since stopped) was highest for Screenshare (n = 82, 30.04%) and particularly low for Delay (n = 5, 1.83%) and Compare (n = 5, 1.83%). Nineteen participants (6.96%) did not know a single DSCT feature, and 141 (51.65%) stated that they currently did not use any of the DSCT features that we asked about. Since these could mostly be participants who did not suffer from distractions at all, we checked the sub-group of sufferers (i.e., those who selected “rather applies” or “applies” on the suffering question). In this sub-group of 175 sufferers, 12 participants (4.40% of the full sample) did not know any, and 86 (31.50% of the full sample) did not currently use any DSCT features.

The specific DSCTs used by the participants can be found in the OSF repository (https://osf.io/8zmb7/?view_only=10162da5b5a54e18bd9a7e9830fd0638). For brevity, we will not list all statistics here, but we noted that the goal category contained several habit trackers, to-do list apps, and calendars, which did not appear to be explicitly related to reducing media use. Upon investigation, we found that the description of the goal support feature in our questionnaire allowed for the interpretation that this referred to general goal-setting tools, and not only goals related to media use.

Table 1 Statistics about tool use and knowledge

4.3 Helpfulness of self-control tools

Next, we analysed the ratings for the perceived helpfulness of the DSCT features. We were primarily interested in the ratings from the participants with prior experience. Across all DSCT features, there was a trend that participants who had prior experience with a feature rated its helpfulness higher than participants without prior experience.

The mean difference between the rating from participants with prior experience and participants without prior experience (∆) was most prominent for Gamification (∆ = 0.85, d = 0.71, p = 0.0035), Modification (∆ = 1.16, d = 1.55, p = 0.0006) and Screenshare (∆ = 0.64, d = 0.58, p = 0.0001). Leaving out the three features with barely any ratings, the highest rating among experienced users was for Modification (M = 4.56, SD = 0.53), and the lowest for Visualization (M = 3.48, SD = 1.10), with a mean difference between both of 1.07 (d = 1.25, p = 0.0049). See Table 1 for details regarding the perceived helpfulness of all features.
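
As an illustration of how such comparisons can be computed, the following sketch uses Welch’s t-test and a pooled-SD Cohen’s d with fabricated rating vectors; it demonstrates the type of analysis rather than our exact analysis code.

```python
# Illustrative sketch (not our analysis code): comparing helpfulness ratings of
# users with and without prior experience via Welch's t-test and Cohen's d with
# a pooled SD. The rating vectors are fabricated.
import numpy as np
from scipy import stats

def cohens_d(x: np.ndarray, y: np.ndarray) -> float:
    nx, ny = len(x), len(y)
    pooled_sd = np.sqrt(((nx - 1) * x.std(ddof=1) ** 2 +
                         (ny - 1) * y.std(ddof=1) ** 2) / (nx + ny - 2))
    return (x.mean() - y.mean()) / pooled_sd

experienced = np.array([5, 4, 5, 4, 5, 3, 4, 5])
inexperienced = np.array([3, 4, 3, 2, 4, 3, 3, 4, 2, 3])

t_stat, p_value = stats.ttest_ind(experienced, inexperienced, equal_var=False)
print(f"delta = {experienced.mean() - inexperienced.mean():.2f}, "
      f"d = {cohens_d(experienced, inexperienced):.2f}, p = {p_value:.4f}")
```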

The ratings for the helpfulness of the features were skewed towards rating the features as helpful, and participants rarely rated the features as completely unhelpful (see Fig. 2). Even participants who reported that they had stopped using a tool and explicitly said that they did not perceive it as helpful at all still rated the corresponding features as at least somewhat helpful, or even very helpful.

Fig. 2 Distribution of the ratings for the perceived helpfulness of the features

Note. For each feature, the left half (shaded blue) represents the participants with no prior experience with the feature. The right half (shaded orange) represents the ratings of those participants who either currently use or previously used the feature.

4.4 Habituality of media use in relation to self-control tools

The SRBAI scores (see Table 2) were highest for social media and chatting, and lowest for gaming. A Shapiro-Wilk test showed that the SRBAI values were not normally distributed (p < 0.0001 for all SRBAI scores). The number of participants for whom this behaviour was not at all present (i.e., a score of 1) was highest for gaming (n = 102).

Table 2 Descriptive statistics
Table 3 Spearman Correlations between MaxSRBAI and the other measures

We correlated the perceived helpfulness of DSCT features with the MaxSRBAI scores (see Table 3). Only participants with prior experience with a feature were included in this analysis, as we aimed to investigate the actual experience rather than hypothetical perceptions. This analysis is exploratory, and we interpret the confidence intervals of the correlations in the following manner: if a confidence interval does not include zero, we interpret this as a sign that further research should investigate that particular correlation. Conversely, if a confidence interval includes zero, we cautiously interpret this as a lack of evidence supporting the effect.
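
The following sketch illustrates one way to obtain a Spearman correlation together with an approximate confidence interval (via the Fisher z-transformation); the data vectors are fabricated, and the sketch demonstrates the type of computation rather than our exact analysis code.

```python
# Illustrative sketch (not our analysis code): Spearman's rho with an approximate
# 95% confidence interval via the Fisher z-transformation. Data are fabricated.
import numpy as np
from scipy import stats

def spearman_with_ci(x, y, alpha=0.05):
    rho, p = stats.spearmanr(x, y)
    z = np.arctanh(rho)                     # Fisher z-transform of rho
    se = 1.0 / np.sqrt(len(x) - 3)          # approximate standard error
    crit = stats.norm.ppf(1 - alpha / 2)
    ci = (np.tanh(z - crit * se), np.tanh(z + crit * se))
    return rho, ci, p

max_srbai   = [4.25, 3.00, 5.00, 2.50, 3.75, 4.50, 2.00, 3.25]
helpfulness = [2, 3, 1, 4, 3, 2, 5, 3]
rho, ci, p = spearman_with_ci(max_srbai, helpfulness)
print(f"rho = {rho:.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}], p = {p:.3f}")
```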

For MaxSRBAI, we found a small negative correlation with Visualization and a small positive correlation with Screenshare (see Table 3). All other correlations were small, and their confidence intervals included zero.

4.5 Reasons for stopping the use of self-control tools

The participants often stopped using a DSCT after they had used it for some time (see the “Quit” column in Table 1), and they elaborated on the reasons for this in free-text responses. We have listed the codes for each tool in detail in the supplementary table under https://tinyurl.com/uba426xf. When quoting participant responses, we give the id of the response preceded by a hashtag (e.g., #42 denotes the response with the id 42).

4.5.1 Tools no longer needed

Stopping the use of a DSCT did not always indicate that the tool was not helpful for participants. The most common reason for stopping the use of a DSCT (n = 130) was that participants no longer needed it, either because they were in a different phase of their life or because they only required it temporarily, for instance during particularly stressful phases, as one participant noted: “Because exams ended” (#249).

A smaller group of participants (n = 15) stopped using DSCTs because they felt that their self-control had improved sufficiently and that they had gained greater control over their digital habits (“I felt that my social media usage decreased and I am not as distracted anymore”, #164). However, we could not determine whether these improvements were actually due to using a DSCT, as the participants did not provide enough information to draw such conclusions.

4.5.2 Lack of motivation or self-control

Twelve responses indicated a self-reported lack of discipline, motivation, or self-control to keep using DSCTs. These responses were most common for the Blocking and Autoclose tools, with one participant stating, “I got bored and needed stimulation” (#72).

4.5.3 Balance of restrictiveness

We also identified issues related to the balance of restrictiveness among participants’ use of DSCTs. Specifically, 17 responses indicated that participants began to circumvent the restriction mechanisms put in place by the tools. For example, one participant stated, “[…] I would find other ways to open distracting apps, e.g., opening the app on my tablet […]” (#134), while another participant mentioned that they “[…] used to just extend the limit time […]” (#83).

Conversely, in 15 responses, participants indicated that they stopped using a tool because it was too restrictive and prevented access to content they needed to complete their tasks, or to content they wanted to access during leisure time. For instance, one participant reported that the tool would “block apps I needed when studying, e.g., YouTube” (#114), while another participant said, “[…] I use my phone to listen to music whilst I work and the app interferes with that” (#112).

4.5.4 Negative emotions in tool use

Our findings suggest that several participants experienced negative emotions when using their tools. Feelings of stress and/or anxiety when using DSCTs were mentioned in 15 responses. Frequent self-reflection did not appear to be beneficial for everyone, as one participant stated, “[Visualization] made me feel guilty for spending so much time on my phone […]” (#177).

Additionally, a small number of participants (n = 7) became annoyed by DSCTs, with the tools themselves becoming more of a distraction than an assistance. For example, one participant noted that a blocking tool “distracted and restricted me more than it helped…” (#108).

4.5.5 Feature-specific reasons

Further reasons were specific to a particular feature type. Gamification features lost their appeal for the participants in eight cases because the rewards ceased to be interesting for the participants. One participant noted, “Because I had reached the maximum rewards and then I would have had to start all over again.” (#299, translated). For DSCTs with goal reminders, the participants indicated four times that the notifications were too frequent or at the wrong time (“Because at some point I was annoyed by the notifications”, #203). Furthermore, in four responses they also found that managing and keeping up with the goals became too much work. In the responses to Screenshare features, the participants noted that they had issues with their learning group. A recurring topic (n = 11) was that the group itself became a distraction (“The meetings were rather complicated and not always productive.”, #164). In 13 responses, the participants mentioned that they had trouble finding a group for screensharing. For Pomodoro, 17 of the participants responded that using fixed time intervals did not fit their mode of working (“It wasn’t suited for the type of assignments I do. It breaks the flow of my thinking and isn’t suited for time-consuming assignments like programming.”, #262).

5 Discussion

Our study found that the majority of participants in our sample experienced digital distractions that they perceived as interfering with the achievement of their learning goals. Self-control tools, which are promoted as a solution to this problem, are not universally perceived as helpful, and students lack awareness of potentially helpful features. Our qualitative analysis of people’s reasons for stopping DSCT use provides insights into why DSCTs fail for some users.

5.1 Use and knowledge

In our sample, even the most popular DSCT features were far from universally known. Crucially, a small group of the students who suffered from distractions (12 of 175) was not aware of any DSCT feature at all. Making DSCTs better known to this population could help more people. This is underlined by the observation that participants rated all features as more helpful when they had previous experience with them. Therefore, we believe that increased promotion of existing DSCTs is a necessary step to address the problem on a larger scale. Campaigns to increase awareness of the potential of DSCTs could be a good way to address this issue (Kim et al., 2017a). However, there is the limitation that not all features are equally available on all devices. Apple, in particular, is highly restrictive regarding the apps allowed in its App Store. For example, third-party apps are prohibited from reading the usage times of other apps or from preventing apps from opening in any way. In relation to this situation, Lyngs and colleagues (2022) have noted that restrictive platforms must therefore take on the challenge of covering the multitude of possible user scenarios themselves, as they cannot rely on third-party developers to do so.

5.2 Helpfulness

Our study revealed differences in the perceived helpfulness of different DSCT features, with the greatest discrepancy between the most helpful feature (Modification) and the least helpful feature (Visualization). Somewhat unexpected was the widespread use and positive reception of screen sharing. Despite the challenges of finding a good group to screen share with, this seems a promising strategy that deserves further attention. Overall, however, we caution against interpreting these results as an overall ranking, as each feature was found to be helpful by some of its users. Instead, we emphasize the importance of tailoring DSCTs to the specific needs and preferences of the user.

The results for the correlation between habitual usage and the perceived helpfulness of DSCT features are somewhat in line with our expectations. Our findings did support the idea that pure self-awareness interventions, such as usage visualizations, are less helpful for highly habitual media use. The small positive correlation with Screenshare fits into this narrative as well, considering that engaging in an online session together with peers constitutes a change in the social context, which is well-suited to disrupting unwanted habits (Lally & Gardner, 2013).

On the other hand, we initially assumed that goal support features would be less helpful in mitigating habitual behaviour because these tools rely on individuals consciously comparing their actions with their goals, a process that is unlikely to occur in habitual behaviour (Fiorella, 2020). This would make such tools unsuitable for their purpose. However, this was not the case in our analyses. We suspect that one reason for the missing negative correlation lies in the specific tools that the participants used: the tools they named included many to-do lists and calendars that were not necessarily related to reducing digital distractions. Our feature description might not have been explicit enough, leading participants to report their use of general productivity tools.

5.3 Challenge of dual-purpose platforms

A complication specific to using DSCTs during learning is the dual purpose of platforms such as YouTube, which contain both educational and entertainment content. Using social media for educational purposes is well established (Hrastinski & Aghaee, 2012), and our results show that this also influences DSCT acceptance. Several participants stated that they required access to a certain platform, which was prevented by their blocker, and so they had to disable it. This makes smooth usage difficult to achieve and naturally leads to frustration.

A way to make platforms less distracting while keeping them usable would be the use of feature removal tools (e.g., the “HabitLab” tools by Kovacs et al., 2018) that remove particularly distracting parts. Specifically for YouTube, which is currently the most relevant dual-purpose platform, a recent study by Lukoff et al. (2023) investigated the use of an adaptable commitment device: users would start their visit on YouTube by explicitly stating their intent (e.g., entertainment or focus) and thus received a different interface. This was well received and led to greater satisfaction and goal alignment when using the platform. However, while this worked well in a study setting, it is doubtful that it will quickly find its way into practice, especially as a mobile app, as it is seemingly detrimental to the business case of the platforms. Mobile apps are the greater challenge in this regard, because feature modifications work quite well in browser extensions, which simply have to modify the HTML content, something that is comparatively trivial to do. The content in smartphone apps cannot be modified as easily for everyday use (Datta et al., 2022). Currently, users can only resort to installing alternative apps (Zhang et al., 2022), and these alternatives are often not available. Thus, although there are in principle good solutions for dual-purpose platforms, in practice they are not available to all users.

5.4 The various reasons for stopping the use of self-control tools

Overall, the reasons for discontinuing DSCTs supported the idea that the same tool does not work for everyone. For some participants who are unhappy with their tools, a reasonable first suggestion would be to try another tool to see whether it suits them better. This is especially true because some pointed out rather trivial usability issues, e.g., Pomodoro timers that were too inflexible, where alternatives with more flexibility certainly exist. Again, this highlights that users should be made more aware of the large variety of existing DSCTs.

5.4.1 Seasonal and infrequent use

By far the most common reason why our participants stopped using their DSCTs was that they only needed them during stressful periods like exam preparation. This finding is in accordance with previous observations that DSCTs are often used to help with specific, undesirable tasks rather than to limit usage time in general (Lyngs et al., 2022). With short-term and intermittent use of DSCTs, there is less chance of achieving lasting behaviour change. Shorter time frames are usually insufficient to break undesired habits or build beneficial ones, as habit formation typically requires several weeks of context-dependent cue-response repetitions (Lally et al., 2010; Stojanovic et al., 2020; Wood & Rünger, 2016). Therefore, users should not expect longer-lasting behaviour change from this type of usage pattern. In such scenarios, user education is pivotal so that users understand under which circumstances they can anticipate lasting behaviour change and when they should expect only transient support.

5.4.2 The role of motivation

The notion that DSCT use in education is infrequent or seasonal also has implications for the understanding of the motivation to use tools. Our observations indicate that a lack of motivation can lead to discontinuing DSCT use. Participants reported instances where they had simply “lost motivation” regarding their tools, or instances where they got “too lazy” to continue using their DSCT. The motivation to change behaviour also fluctuates in other domains, e.g., smoking (Zhou et al., 2009) or weight loss (Elfhag & Rössner, 2005). However, in contrast to these examples, users typically do not seek to permanently eliminate digital media from their lives (Lukoff et al., 2018; Monge Roffarello & De Russis, 2019), and the motivation to use a tool might only be temporary. This distinction necessitates a model of motivation for DSCT use that incorporates this temporal dimension.

Various theories and models may provide inspiration for constructing this motivational model, including the transtheoretical model of behaviour change (Heller et al., 2013) or self-determination theory (Deci & Ryan, 2012). However, these models should be critically assessed for their applicability given the specific nature of DSCT use. For example, the transtheoretical model emphasizes the need for users to be willing to change. In the context of DSCT use, however, this willingness may fluctuate, peaking during periods of high pressure and waning during more relaxed periods. These unique, often transient, demands on DSCTs need to be recognized in future research. Understanding users’ motivations and how they change over time is crucial, particularly for adherence to DSCT use.

5.4.3 Managing self-control tool activation

The qualitative statements also show that it is a recurring problem that restrictive DSCTs require a certain amount of effort and have to be repeatedly switched on and off. In the context of learning, this means that the tool has to be activated for the learning phases and then deactivated again for leisure media use. In the short term, this leads to frustration and annoyance; in the long term, users may simply forget about tools that they have not used for a while.

A possible solution could be DSCTs that activate automatically. Context-aware (Schilit et al., 1994) activation has previously been explored in some instances. It has been promising for highly controlled workplace scenarios (Tseng et al., 2019), which are, however, not comparable to self-directed learning at home. Other context-aware triggers that have been studied were based on the time of day (Löchtefeld et al., 2013), physical presence in a classroom (Kim et al., 2017a), or the recognition of longer periods without movement (Kim et al., 2018). However, the context-awareness was only evaluated superficially, and we have doubts that these are sufficient and reliable triggers for the realities of everyday learning, where the time and location of learning and entertainment vary and are interchangeable. If the triggers interfere too much with the learners and just get in their way, they will not find long-lasting acceptance.

A potential improvement could be context-awareness of learning activities (Ciordas-Hertel et al., 2022). Given that learning materials are increasingly available on online learning platforms, a learner’s activity could also serve as a contextual trigger to activate restrictions. A monitoring application could detect that a learning platform was visited, thereby starting the contextual activation. This could reduce the burden on users to frequently activate and deactivate their tools manually.
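
As a thought experiment, such a contextual trigger could be as simple as matching visited URLs against a list of known learning platforms and lifting restrictions after a period without learning activity. The sketch below is purely hypothetical: the domain list, the timing rule, and the blocker logic are placeholders rather than an existing API.

```python
# Hypothetical sketch of contextual activation: restrictions switch on when a
# known learning platform is visited and switch off after a period without any
# learning-related activity. Domains, threshold, and blocker logic are placeholders.
from urllib.parse import urlparse

LEARNING_DOMAINS = {"moodle.example.edu", "ilias.example.edu", "lms.example.org"}
INACTIVITY_LIMIT_S = 30 * 60  # lift restrictions after 30 minutes without learning activity

class ContextualBlocker:
    def __init__(self) -> None:
        self.active = False
        self.last_learning_visit = float("-inf")

    def on_url_visited(self, url: str, timestamp: float) -> None:
        domain = urlparse(url).netloc.lower()
        if domain in LEARNING_DOMAINS:
            self.last_learning_visit = timestamp
            self.active = True   # a real tool would enable its blocklist here
        elif self.active and timestamp - self.last_learning_visit > INACTIVITY_LIMIT_S:
            self.active = False  # deactivate once learning appears to have stopped

blocker = ContextualBlocker()
blocker.on_url_visited("https://moodle.example.edu/course/view.php?id=42", 0.0)
print(blocker.active)  # True: restrictions are on without manual activation
blocker.on_url_visited("https://www.youtube.com/watch?v=abc", 45 * 60)
print(blocker.active)  # False: lifted after 30 minutes without learning activity
```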

To mitigate the difficulties with dual-use platforms, adaptive DSCTs could furthermore experiment with content classification to distinguish between content that is relevant for learning and content that is purely for leisure. Machine Learning models for content classification have shown promising results in other domains (Yousaf & Nawaz, 2022) and could present a way to distinguish distractions from learning content without forcing the user to manage long lists of black- or whitelisted content.
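
A minimal sketch of what such a classifier could look like is shown below, using a TF-IDF representation and logistic regression from scikit-learn on a handful of fabricated video titles. This is an illustration of the general idea only; a usable system would require far more labelled data and careful evaluation, and is not something we have built.

```python
# Minimal sketch of a learning-vs-leisure content classifier (scikit-learn).
# The labelled titles are fabricated for illustration; a real system would need
# substantially more training data and careful evaluation.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

titles = [
    "Linear Algebra Lecture 3: Eigenvalues and Eigenvectors",
    "Introduction to Statistics: Hypothesis Testing Explained",
    "Organic Chemistry Exam Review Session",
    "Top 10 Funniest Cat Videos of the Year",
    "Epic Gaming Fails Compilation",
    "Celebrity Gossip Roundup of the Week",
]
labels = ["learning", "learning", "learning", "leisure", "leisure", "leisure"]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(titles, labels)

print(classifier.predict(["Calculus Lecture: Integration by Parts"]))  # likely "learning"
print(classifier.predict(["Funny Dog Compilation 2024"]))              # likely "leisure"
```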

5.5 Limitations and suggestions for future research

A limitation to the generalizability of our results is that our sample of participants was limited to English and German speakers. Cultural differences in device use could yield different results in other regions (Gray & Schofield, 2021; Kononova & Chiang, 2015).

Some observations hint at the issue that our methodology for measuring helpfulness was insufficient to detect complex nuances like “This would help me in principle, however…”. Another issue is that what feels good might not actually help to change behaviour. For example, users prefer non-restrictive interventions even when they are aware of their shortcomings (Zimmermann, 2021). The perceived helpfulness of an intervention may be high, and it may feel helpful, but not necessarily result in behaviour change (Patel et al., 2015).

While we cannot answer these questions regarding effectiveness, our analysis of the reasons for stopping tool use yielded additional insights about DSCTs.

Another limitation is that combinations of DSCT features are very typical (Lyngs et al., 2022). For example, the “Digital Wellbeing” app on Android systems has usage visualizations, usage goals, and automatic closing features. We saw several such apps, like the “Forest” app, in multiple feature categories. We tried to circumvent this limitation by asking about features instead of specific tools, but disentangling the helpfulness of a single feature from a tool with multiple features can be challenging. Some participants stated that they stopped using a feature because it did not help them, but still rated it as helpful, which suggests that some of the participants could not sufficiently express their thoughts about the DSCTs.

A general limitation of cross-sectional studies on the topic of DSCTs is that they cannot incorporate the dynamics of the tool use. The participants often used DSCTs only for short, stressful phases, e.g., during exams. Afterward, they stopped again. This is understandable, but it potentially impacts the helpfulness of DSCTs because the tool usage does not become habitual. These phenomena can only be addressed with longitudinal studies that observe how the use of DSCTs waxes and wanes over a learner’s career.

Moreover, our study focused on the perceived helpfulness and not the actual effectiveness. Surveys and self-reports are unlikely to be sufficient to measure the true effectiveness of DSCT features (Parry et al., 2021). Tracking digital activity requires a sophisticated tracking system. It must be able to track multiple devices simultaneously, otherwise, activity on the smartphone may be detected, but not activity on the notebook, or vice versa. It must also distinguish between cases where media use takes place during learning and those where it takes place during leisure (Biedermann et al., 2023). In addition, as already mentioned, the role of internal states must be adequately considered, which requires frequent prompting of these states, for example via experience sampling (van Berkel et al., 2017). Such a tracking setup is technically challenging, but necessary to advance research on DSCTs in academic learning.

6 Conclusion

Digital self-control tools present a wide range of approaches to the urgent and growing problem of digital distractions. Our findings underscore that digital self-control tools can provide valuable assistance in mitigating digital distractions. However, their success is not universal; the perceived helpfulness varies significantly among individuals. One key obstacle impeding the utility of digital self-control tools is the lack of user awareness about available tools. This is especially noteworthy as there can be a mismatch between a user’s needs and the features provided by a specific tool. Hence, it is essential for users to explore and find digital self-control tools that best align with their specific requirements.

Our qualitative assessment exploring why participants discontinued their use of these tools identified several general factors, such as the fine line between restrictiveness and accessibility, and the challenges posed by dual-purpose platforms. We also unveiled factors particular to certain tool features that can directly inform their enhancement.

In summary, while the use of self-control tools is widespread, numerous minor obstacles hinder their potential effectiveness in learning environments. Addressing these issues will substantially enhance the support we can provide to students dealing with digital distractions. As we navigate the expanding landscape of digital education, these considerations are key to leveraging information technologies to facilitate a more productive and focused learning experience.