Introduction

Self-Regulated Learning (SRL) characterizes learners as active participants in their own learning process who study how they learn and how learning helps them to achieve their goals (Winne, 2010; Zimmerman, 1989). For a learner to successfully self-regulate their learning, sufficient cognitive ability and motivation must be met with sufficient metacognition: the knowledge of one’s own cognitive processes and products, and the skills to regulate cognitive aspects of the learning process (Flavell, 1979; Schraw et al., 2006). In this study we examine whether metacognition can be improved through self-explication of metacognitive processes in a digital SRL-tool.

In the past two decades, researchers have studied digital tools for supporting metacognition and SRL (Azevedo, 2005; Hadwin & Winne, 2001; Winters et al., 2008), with the majority of research focusing on embedding metacognitive support within the content of domain-specific digital learning environments (Azevedo et al., 2012; Broadbent et al., 2020). For example, a digital learning environment designed to offer instruction and practice for mathematical problems may be augmented with instructional support, promoting help-seeking and self-monitoring (e.g., Arroyo et al., 2014). Alternatively, a digital tool could offer such support independently of any domain-specific content. Such domain-general metacognitive support could be offered detached from, but in parallel to, ongoing learning. Potential benefits of domain-general support are that learners can identify and isolate metacognitive knowledge and skills that apply across different learning situations and altogether have more opportunities to practice and improve their learning (Derry & Murphy, 1986; Osman & Hannafin, 1992). While ample research addresses digital metacognitive support in a domain-specific and embedded way (e.g., Bannert & Mengelkamp, 2013; Schwonke et al., 2013), current research lacks insights into the design, use, and effects of detached and domain-general digital metacognitive support.

In this paper, we study a detached digital SRL-tool supporting domain-general metacognition through self-explication: prompting learners to make otherwise implicit metacognition concrete. We focus on the improvement of metacognition of learners in higher education, who have some experience in learning but tend to exhibit ineffective learning behaviors. First, we introduce the key concepts of SRL, metacognition, and digital instructional support. Second, we present the design of the tool and the domain-specific and domain-general metacognitive support implemented to help learners. Third, we discuss the evaluation of the tool in an in-vivo quasi-experiment aiming to assess effects, use, and learners' perceptions of the tool. The paper concludes by discussing the results and formulating implications for design as well as future research.

Background

Self-regulated learning and metacognition

SRL encompasses cognitive, metacognitive, behavioral, and affective aspects of learning and has become an important conceptual framework for educational research (Panadero, 2017; Winne & Hadwin, 1998; Zimmerman, 1989). While various models co-exist in literature, SRL is generally described as learner behaviors during three cyclic phases: (1) a preparatory phase (task analysis, goal-setting, and strategic planning), (2) a performance phase (enacting strategies and tactics, monitoring performance and progress, and adapting goals, plans and strategies), and (3) an appraisal phase (reflection, adaptations for future performance) (Panadero, 2017; Puustinen & Pulkkinen, 2001).

Different research perspectives on SRL have identified a large number of factors involved. A social perspective of SRL relates learning to the reciprocal influence of personal, behavioral, and environmental factors (Zimmerman, 1989). Correspondingly, learners employ SRL-strategies such as self-evaluation, seeking social assistance, or environmental structuring. An affective perspective of SRL relates learning to emotional and motivational processes that occur during learning (Boekaerts, 1997; Boekaerts & Cascallar, 2006). A metacognitive perspective of SRL emphasizes the cognitive and metacognitive processes involved in learning (Azevedo et al., 2006; Efklides, 2014; Winne, 2010; Winne & Hadwin, 1998).

In this paper we focus on this metacognitive perspective and how students in higher education could benefit from metacognition in learning. First, learners use metacognitive skills to estimate their ability, make predictions about their performance, and accordingly set realistic goals, make strategic plans, and monitor and regulate their learning effort (Pintrich, 2002; Schraw & Moshman, 1995; Veenman & Spaans, 2005). Second, learners use metacognitive knowledge of what strategies are available, how to implement these strategies, and under which conditions these strategies are effective (Ertmer & Newby, 1996; Pintrich, 2002; Schraw, 1998; Schraw & Moshman, 1995). Third, learners have beliefs about their learning and such metacognitive theories are used to steer cognition through metacognitive processes (Bjork et al., 2013; Dweck, 1986; Schraw & Moshman, 1995; Winne & Nesbit, 2009).

Consider, for example, a learner who thinks that learning will be more effective when more concerted effort is invested (metacognitive theory), who may know that, for them, part of the effort should involve discussion of the materials with peers (metacognitive knowledge), and may correspondingly plan and schedule such sessions in advance (metacognitive skills). However, metacognitive theories are not necessarily correct and metacognitive knowledge is not necessarily optimal. Consider, alternatively, a learner who believes that learning is mostly about repeating the material (metacognitive theory), may only know cramming for the test as a strategy (metacognitive knowledge), and may find that, upon monitoring progress, learning does not proceed as well as hoped (metacognitive skills). Metacognitive support of SRL can thus seek to (i) encourage learners to apply, evaluate, and improve their metacognitive theories in response to evidence gathered during learning, (ii) expand and improve metacognitive knowledge of learners, (iii) improve the occurrence and quality of metacognitive skills, or any combination thereof.

Students entering higher education have previous experience with learning from primary and, especially, secondary education. However, they need to make a transition from one educational phase to the next, in which they are increasingly expected to self-regulate learning and take individual responsibility for, and control of, learning in pursuit of more complex learning outcomes (Kane et al., 2014). At the same time, development of metacognition is known to continue well into adolescence and young adulthood (Schneider, 2008). Students who make active use of metacognition perform better than students who do not, and are more aware of how metacognitive knowledge can be used to improve cognitive processing of learning material (Meijer et al., 2013; Romainville, 1994; Veenman et al., 2006). An effective way of improving learning for such students is to improve their metacognitive awareness by fostering reflection on their own approach to learning (Brown & Palincsar, 1989; Meijer et al., 2013; Romainville, 1994).

Metacognitive support

SRL and metacognition can be improved through instructional support (Callender et al., 2016; McCormick et al., 2013). Three common and effective types of metacognitive support are direct instruction (Kim et al., 2009; Schraw, 1998; Zepeda et al., 2015), metacognitive scaffolding (Arroyo et al., 2014; Azevedo & Jacobson, 2008), and metacognitive prompting (Bannert & Mengelkamp, 2013; Hoffman & Spatariu, 2008). Direct instruction can, for example, be used to explain what metacognitive strategies are, and how and when to use them effectively (e.g., Jansen et al., 2020). Metacognitive scaffolding can support metacognitive processes, for example by letting a virtual character announce and explain each step of a learning task (e.g., Molenaar et al., 2011). Metacognitive prompts are typically used (i) as a cue to remind a learner of and focus attention on metacognitive processing (e.g., Fiorella & Mayer, 2012; Merriënboer & De Bruin, 2019), (ii) as a request to self-explain current understanding with the aim of triggering metacognitive monitoring and regulation (e.g., McNamara, 2009; Yeh et al., 2010), or (iii) as a combination thereof (e.g., Bannert & Reimann, 2012). However, previous research has not investigated the use of prompts primarily to enable learners to self-explicate metacognitive processing with the purpose of examining and improving metacognition. Metacognitive theories can be improved when learners apply them to learning, evaluate them for merit, and adjust them in response to evidence (Bjork et al., 2013; Schraw & Moshman, 1995). Self-explication, when prompted, allows learners to examine such otherwise implicit metacognitive theories. As the goal is for learners to, eventually, self-initiate regulation in the absence of any support, the design of such tools must provide sufficient support while not precluding opportunities for learners to self-regulate (Arroyo et al., 2014; Broadbent et al., 2020; Griffin et al., 2013; Hattie et al., 1996). Prompting learners to explicate, examine, and improve their metacognitive processes during learning could thus support SRL while allowing for sufficient learner control.

Metacognitive support can be delivered through digital tools (Altıok et al., 2019; Bannert & Mengelkamp, 2013; Connor et al., 2019), which generally fall into one of two categories: embedded instruction within domain-specific digital learning environments and detached instruction provided outside of, and prior to or in parallel to, ongoing domain-specific training (Broadbent et al., 2020; Osman & Hannafin, 1992). Embedded instruction typically (i) augments domain-specific content with cognitive tools aiding information processing (Bannert et al., 2009; Winne, 2010; Winne et al., 2006), (ii) uses data gathered from learning to provide meaningful feedback and support to learners to help them overcome particular challenges (Winne et al., 2006), and (iii) makes use of interactive and multimedia environments to situate SRL-support (McQuiggan & Hoffmann, 2008; Sabourin et al., 2013). Detached instruction, in contrast, makes few assumptions about the content of learning, and instead focuses on supporting metacognition during different parts of the learning process (Broadbent et al., 2020; Derry & Murphy, 1986; Osman & Hannafin, 1992). An example of detached instruction is offering video-based training of SRL through a dedicated digital learning environment (Jansen et al., 2020).

Metacognition is in part domain-specific, with limited transfer to other learning situations, and in part domain-general and transferable between different domains (McCormick et al., 2013; Schraw, 1998; Veenman et al., 2006; Wang, 2015). Domain-specific metacognitive knowledge (e.g., knowing the steps to solve an equation) and skills (e.g., checking whether a solution is plausible) are embedded in ongoing learning, making acquisition more straightforward (Bannert & Mengelkamp, 2013; Lin, 2001; Veenman et al., 2006). Domain-general metacognitive knowledge (e.g., knowing oneself as a learner, knowing general learning strategies) and skills (e.g., planning, monitoring, and regulating learning) can be applied effectively across a wide range of learning situations (Broadbent et al., 2020; Osman & Hannafin, 1992; Wang, 2015). Domain-general metacognitive instruction is agnostic to the content of learning and can thus be offered embedded in or detached from domain-specific instruction. While domain-specific metacognitive support is easier for students to connect to their learning, domain-general support can be applied across many different settings of learning. From a design perspective, the challenge is to make metacognitive support generic enough to replicate across different domains while remaining specific enough for students to apply. Here, detached instruction allows learners to more easily identify potential transfer to future learning situations (Derry & Murphy, 1986; Osman & Hannafin, 1992; Veenman et al., 2006).

Outline

Previous research has focused predominantly on embedded and domain-specific digital metacognitive support for specific elements of SRL (Azevedo, 2020; Bannert & Mengelkamp, 2013; Merriënboer & De Bruin, 2019; Veenman et al., 2006). However, little is known about domain-general and detached digital metacognitive support across all phases of SRL, or about self-explicating otherwise implicit metacognitive processes. The present study investigates the design of detached digital metacognitive support for students in higher education. The three key research questions are:

  • Can metacognition of learners be improved through self-explication within a digital SRL-tool that is detached from domain-specific learning?

  • Can detached metacognitive support be domain-general or must there be a connection with domain-specific learning?

  • How do learners make use of, sustain use of, and perceive the use of such a detached digital SRL-tool?

The remainder of this paper presents a digital tool that supports self-explication. After the design of the tool is presented, we report an evaluation of how the tool affects learners, how learners use the tool, and how learners perceive using it. Finally, the results and their implications for the design and research of digital metacognitive support are discussed.

Design of a digital self-explication tool

Concept

The design goal for the tool was to improve metacognition by encouraging learners to make connections between (i) their knowledge, beliefs, and assumptions about learning, (ii) an ongoing and concrete learning process, and (iii) improvements made to this learning process for current as well as future learning tasks.

The following conceptual model of metacognition during SRL was created to facilitate the design (see Fig. 1). The conceptual model was derived from the COPES-model (Winne & Hadwin, 1998), is supported by ample empirical evidence and is widely used in studying computer-supported learning (Greene & Azevedo, 2007; Panadero, 2017; Winne & Nesbit, 2009).

Fig. 1 Conceptual model of metacognition during self-regulated learning

Task-relevant learner knowledge is represented as either task knowledge or metacognition (metacognitive theories, strategies, and tactics) (cf. Ertmer & Newby, 1996; Schraw & Moshman, 1995; Winne & Hadwin, 1998). The model combines the preparatory, performance, and appraisal phases of SRL with five facets of learning: (i) the conditions for learning (e.g., task conditions and cognitive conditions), (ii) the operations involved in learning (e.g., tactics and strategies), (iii) the (meta)cognitive products that result from learning (e.g., task definition, plan), (iv) the evaluations that are made of learning (e.g., judgments of learning), and (v) the standards that learning is held to (e.g., expectations based on past performance).

For each phase, the model indicates how (meta)cognitive activities are informed by task-relevant knowledge, and how each activity is assumed to result in (meta)cognitive products through self-observation, self-judgment, and self-reaction (Winne & Hadwin, 1998; Zimmerman, 1989). As such, the conceptual model defines two specific ways in which learners adapt their learning in response to observations and judgments. First, metacognitive monitoring and control lead to adaptations of the current task definition, goals and performance expectations, and plans (local update). Second, reflection on the learning process itself leads to adaptations of metacognitive knowledge (global update).

The design rationale for the tool, then, is to encourage learners to make informed local and global updates to learning, using self-explication to allow them to inspect their metacognitive processes, and to eventually replace belief-based judgments and predictions with those based on experience (Bjork et al., 2013; Winne & Hadwin, 1998).

Metacognitive support

The support for metacognitive processes during SRL is indicated in the conceptual model (see Fig. 2). The primary support within the tool was prompting learners to self-explicate otherwise implicit metacognitive processes and products during different phases of SRL. Five categories of metacognitive processes affecting learning were created: (1) applying metacognitive knowledge to current learning, (2) goal-setting, (3) strategic planning, (4) monitoring and controlling learning by adjusting previous goals and plans, and (5) making adaptations to metacognitive knowledge. As such, three key phases of SRL (2–4) were augmented with applying and adapting metacognitive knowledge (1 + 5). The organization of learning into five distinct categories containing specific prompts can in itself be considered metacognitive scaffolding (6), and further support was implemented as direct instruction of particular metacognitive strategies (7).

Fig. 2 Metacognitive support indicated in the conceptual model

For each category, a main prompt was created that would directly ask a learner to make a key metacognitive process explicit. To make it easier for learners to understand and respond to the prompts, more colloquial phrasing was used to describe a prompt category (e.g., “ideas about learning” instead of “metacognitive theories”, “checks” instead of “monitoring and control”). Within each category, multiple refined prompts were available to improve the quality of the responses. The refined prompts were created to let learners consider aspects and perspectives of the current metacognitive process that they might not have thought of themselves. Each refined prompt was presented as a question accompanied by an instruction, providing learners with both an open-ended and a concrete way of responding. The main prompts, the refined prompts, and how they relate to metacognitive components of SRL are shown in Table 1.

Table 1 Five categories of metacognitive self-explication prompts

Metacognitive support was made progressively available to avoid overwhelming learners and precluding self-initiated metacognitive processing. Per category, the main prompt was always available. Responding to a prompt, updating a previous response, or otherwise interacting with the tool for a set amount of time contributed to unlocking further support in the form of cards. Each card either presented one of the refined prompts (6–9 per category) or highlighted a metacognitive strategy (1 per category). The metacognitive strategy cards provided a form of direct instruction by explaining a strategy, when to use the strategy, and examples of how to implement the strategy. Direct instruction was included to complement self-explication with concrete help, such that eventually most learners would be able to make relevant responses to the prompts.
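To make this mechanism concrete, the following sketch shows one minimal way such progressive unlocking could work. It is an illustration under assumptions, not the tool's actual implementation: the names (Card, CategoryState) and the interaction threshold are hypothetical.

```python
# Minimal sketch of progressive unlocking; the names and the threshold of
# three interactions are illustrative assumptions, not the tool's code.
from dataclasses import dataclass

@dataclass
class Card:
    kind: str              # "refined_prompt" or "strategy"
    text: str
    unlocked: bool = False

@dataclass
class CategoryState:
    main_prompt: str       # always available
    cards: list            # refined prompt cards plus one strategy card
    progress: int = 0      # interactions counted towards the next unlock

    def register_interaction(self, threshold: int = 3) -> None:
        """Count a response, update, or timed interaction; unlock the
        next refined-prompt card once the threshold is reached."""
        self.progress += 1
        if self.progress >= threshold:
            self.progress = 0
            for card in self.cards:
                if card.kind == "refined_prompt" and not card.unlocked:
                    card.unlocked = True
                    break
        # When all refined prompts are unlocked, the strategy card follows.
        if all(c.unlocked for c in self.cards if c.kind == "refined_prompt"):
            for card in self.cards:
                if card.kind == "strategy":
                    card.unlocked = True
```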

Implementation

All materials were discussed in a focus group with students in higher education and were reviewed independently by two educational experts. Adjustments to organization, presentation, and wording were made accordingly. The digital tool was then implemented as a web-application, which could be accessed on any device via a browser. A reserved and contrast-rich visual style, including icons as well as text, was used to maximize accessibility and usability.

The main menu of the tool displays the five prompt categories (see Fig. 3). Learners could freely navigate through the different categories as available and add, review, or update their responses as desired. The tool was offered in either English or Dutch, and learners could adjust this language setting within the tool as desired.

Fig. 3 Main menu of the tool with the five categories of learning

For each category, a separate screen could be accessed from the menu (see Fig. 4). This screen displayed the main question prompt (e.g., “What are your goals?”), an instruction (e.g., “Think of the current period/block of your study and the courses within that period.”), and the learner’s current response for this prompt (e.g., “Your current goals are:”). Any changes were saved automatically or when the learner pressed the “Save changes” button.

Fig. 4 Category screen with the main prompt for the goals category

Below the main prompt section, any unlocked cards with refined prompts were shown (see Fig. 5). Newly unlocked cards were shown with a sparkling star icon and a green background to draw attention. Learners could write responses to such cards, which were saved as a chronological series of replies.

Fig. 5 Unlocked cards with refined prompts below the main prompt

When all refined prompt cards for a category were unlocked, one of the metacognitive strategy cards was automatically unlocked (see Fig. 6). These cards would describe a specific strategy (e.g., “Seeking information: gathering relevant additional information”), explain when to use this strategy (e.g., “Use when you feel you need more info before proceeding with the task.”), and provide concrete examples of implementing the strategy (e.g., “Read through the chapters of a book or reader.”).

Fig. 6 An unlocked card highlighting a metacognitive strategy

Summary

In summary, the tool was intended to work as follows. The tool prompts learners (i) to make explicit their beliefs about learning, (ii) to explicitly formulate goals and plans for learning, (iii) to explicitly monitor learning, (iv) to make local updates to learning by adjusting goals and plans if needed, and (v) to make explicit any improvements that could apply to similar future learning situations. The tool further allows learners to remain in control and freely navigate back and forth between these prompts to make adjustments as needed. The tool supports learners through refined prompts, which direct attention to specific metacognitive aspects of SRL and altogether improve the quality of responses, as well as through direct instruction of metacognitive strategies. As such, the tool represents a detached form of digital metacognitive support of SRL based on learners self-explicating their metacognitive processes and products.

Methods

The objective of this study was to examine how self-explication of metacognition within a detached digital SRL-tool affects metacognition in learners. Additionally, we aimed to compare effects between domain-specific and domain-general metacognitive support. Finally, we wanted to evaluate how learners use and perceive the use of such a tool.

Study design

The study was an in-vivo quasi-experiment, with students assigned to experimental groups on a per-class basis. The study adopted a within-subject pre-test/post-test design with between-groups comparisons. Mixed methods were used to collect data, with a primary focus on quantitative, confirmatory analysis, complemented by qualitative, exploratory analysis to identify underlying motivations and perceptions.

Intervention

The intervention in this study was the digital tool as presented previously. As part of the experimental condition, the tool could be presented in a domain-specific or a domain-general configuration. In the domain-specific configuration, all prompts and instructions were phrased in terms of the domain of learning. Examples of such domain-specific prompts were “What do I already know about game design?”, “How can I increase my understanding of game design?”, or “When would you use or not use these strategies for learning how to design games?”. These prompts instructed students to explicate learning in terms of the domain-specific concepts they were already involved with as part of their study program, thereby attempting to bridge the gap between detached support and students' ongoing learning. This configuration requires that the designers have some knowledge of the subject matter of the educational context in which the tool is used, which correspondingly limits when and where it can be used. However, the configuration does not take into account any unique aspects of the subject-matter content: its domain specificity lies in the phrasing of the prompts, which can be replicated for various educational contexts with limited effort.

In the domain-general configuration, a generic phrasing was used, referring to a course without making assumptions about its contents. Examples of the same three prompts in a domain-general phrasing were “What do I already know about the topics of this course?”, “How can I increase my understanding of the course material?”, and “When would you use or not use these strategies for studying in a course?”. These prompts instructed students to explicate learning in more general terms, leaving it up to them to make a connection to their ongoing learning. This configuration of the tool can be applied in many educational contexts and incorporates no knowledge of the subject matter.

While the role of the prompts is the same in both configurations, their specific phrasing has implications for the design of the tool and for where and when the tool can be applied. Nevertheless, we hypothesize that students can use both configurations in a similar way and with similar effects.
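To illustrate this design dimension, the sketch below stores both phrasings for a prompt slot and selects one at runtime. The data layout is a hypothetical simplification; the example texts are the prompts quoted above.

```python
# Hypothetical simplification: one prompt slot, two phrasings, with the
# active configuration selecting between them.
PROMPTS = {
    "prior_knowledge": {
        "domain_specific": "What do I already know about game design?",
        "domain_general": "What do I already know about the topics of this course?",
    },
    "deepen_understanding": {
        "domain_specific": "How can I increase my understanding of game design?",
        "domain_general": "How can I increase my understanding of the course material?",
    },
}

def get_prompt(slot: str, configuration: str) -> str:
    """Return the phrasing of a prompt slot for the active configuration."""
    return PROMPTS[slot][configuration]

# e.g., get_prompt("prior_knowledge", "domain_general")
```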

Participants

The participants in this study were 1st-year students of a program in multimedia design at a polytechnic (also referred to as a university of applied sciences) in The Netherlands. Within this program, students prepare for a major in visual design (taught in Dutch to mostly Dutch students) or in game design (taught in English to a mix of Dutch and international students). The default language for communication, instructions, and the tool was based on the main language of the specific major.

From a representative exploratory study of metacognition among students of the same program (12% response rate; N = 110; 69 male, 42 female, and 6 nonbinary; average age M = 20.8, SD = 3.2), we found an average metacognitive awareness of 64.1% of the maximum score (M = 67.7, SD = 11.5), indicating both previous experience with learning and ample room for improvement.

An introductory session was scheduled for each class, and 192 participants who completed the informed consent procedure and the pre-test were recruited. Between the pre-test and post-test, 72 participants withdrew from active participation in the experiment, including 3 participants who did not use the offered intervention at all. The number of participants completing the experiment was N = 120 (52 female, 66 male, and 2 nonbinary), aged 16–28 (M = 19.47, SD = 2.03), with 1–4 years of experience in higher education (M = 1.39, SD = 1.08).

Students in the domain-specific group (N = 48) worked with the tool in the domain-specific configuration, while students in the domain-general group (N = 42) worked with the tool in the domain-general configuration. The comparison group (N = 30) did not work with a digital tool but did receive similar instructions and exercises. This design, with a comparison group lacking only the digital tool, allowed us to examine the added value of the working mechanisms of the digital tool, rather than just the introduction of such a tool in general.

Measures

The following measures were taken during this study, as outlined in Table 2.

Table 2 Outline of measures taken during experiment

Via the pre-test, we asked participants for their age and gender, as well as how many years they had been enrolled in higher education (including the current year). Additionally, three validated scales were administered: 6 items measured need for cognition (Lins de Holanda Coelho et al., 2018), 19 items measured metacognitive awareness (MAI; Harrison & Vallin, 2018; Schraw & Dennison, 1994), and 10 items measured general self-efficacy (Schwarzer & Jerusalem, 1995). The scale items were presented as statements about learning, and participants were asked to express how typical each statement was of their learning, with answering options ranging from 1 (“not at all typical of me”) to 5 (“very typical of me”).

As we were not in a position to collect participants’ previous or future grades, we asked them to predict their learning performance in terms of a grade.

As it is recommended that measures of metacognition are taken in multiple ways (cf. Veenman et al., 2006; Wang, 2015), we combined a scale-based method (MAI) with an observation-based method (log data). The digital tool was equipped with an event logging system, which saved relevant interactions along with a unique user-id and timestamp. From these events, we counted the number of metacognitive activities performed within the tool as all updates of ideas, goals, plans, checks, and improvements, as well as any comments made in response to a card. The elapsed time between subsequent events by the same user was also calculated. If this time reached the cut-off of 5 min, it was not counted towards usage time, and any event occurring after a gap of this length or longer was marked as the start of a new session. As such, we obtained estimates of frequency of use (i.e., number of sessions) and duration of use (i.e., total elapsed time within such sessions).
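The following sketch illustrates how sessions and usage time can be reconstructed from such logs under the 5-min cut-off; the event format (user-id, timestamp in seconds) is an assumption about the log data, not the logging system's actual schema.

```python
# Sketch of session reconstruction from event logs: gaps of 5 min or
# longer start a new session and do not count towards usage time.
from collections import defaultdict

CUTOFF_SECONDS = 5 * 60

def usage_per_user(events):
    """events: iterable of (user_id, timestamp_in_seconds) tuples."""
    by_user = defaultdict(list)
    for user_id, ts in events:
        by_user[user_id].append(ts)

    stats = {}
    for user_id, stamps in by_user.items():
        stamps.sort()
        sessions, duration = 1, 0
        for prev, curr in zip(stamps, stamps[1:]):
            gap = curr - prev
            if gap >= CUTOFF_SECONDS:
                sessions += 1     # gap marks the start of a new session
            else:
                duration += gap   # only within-session time is counted
        stats[user_id] = {"sessions": sessions, "duration_s": duration}
    return stats
```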

Via the post-test, we measured metacognitive awareness, self-efficacy, and expected performance in the same way as during the pre-test. Furthermore, all participants were asked to rate and comment on how easy, enjoyable, effortful, and useful they found the training received during the study. Additional questions regarding usability, usefulness, and required effort of the tool were presented only to participants in the intervention groups, as were requests for suggested improvements to the tool.

Procedure

The procedure is outlined in Table 3. All communication and all sessions were provided by the same host, in the main language of the major of choice.

Table 3 Outline of the experimental procedure

In the first week, all students received direct instruction on metacognition and beliefs about learning. The instruction explained the relevant concepts and emphasized potential benefits of this approach. The two intervention groups then received instructions to access the tool and log some of their ideas about learning. The comparison group completed a similar assignment without the tool.

In the second week, a per-class session was scheduled, during which students received direct instruction on setting goals and making plans. Subsequently, the intervention groups completed assignments to set goals and make plans with the tool, whereas the comparison group did so without the tool.

At the beginning of week three, all students were reminded via email to check up on their previously logged beliefs, goals, and plans, and to make changes or updates as needed. During the third week, the intervention groups received a short assignment during class, asking them to monitor their learning progress and identify improvements for learning using the tool. The comparison group received a similar instruction via email.

The post-test was made available during the fourth week, and students were invited via email to respond. After three days, all students who had not yet responded were reminded to do so. Five days before closing the post-test, a final reminder was sent. A monetary reward of €5,- was offered to all participants who completed the pre-test and the post-test, and attended 50% of the scheduled sessions. All eligible participants who opted to receive the reward were paid in the seventh week.

Hypotheses and exploratory questions

For this study, we have formulated hypotheses as well as exploratory questions. First, we expect a positive effect of using the tool on learning in both the domain-specific and the domain-general configuration:

  • H1: metacognitive awareness is increased between pre-test and post-test when working with the tool, and this change is larger than when working without the tool.

  • H2: metacognitive awareness is not affected differently by a domain-specific or domain-general tool.

Second, we expect that use of the tool accounts for these effects:

  • H3: use of the tool is not different between a domain-specific or domain-general tool.

  • H4: use of the tool correlates positively with changes in metacognitive awareness.

Third, we want to examine student perceptions of working with the tool:

  • EQ1: which students use, and sustain use of, the tool over time?

  • EQ2: how do students perceive the tool in terms of ease of use, enjoyability, required effort, and usefulness?

  • EQ3: how do students perceive how the tool affects their learning?

Results

Effects of the intervention

To assess whether there was a positive within-subjects effect of the intervention on metacognitive awareness, three paired-samples one-tailed t-tests were conducted. A Bonferroni correction was applied to control the family-wise error rate.
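For illustration, a minimal sketch of this analysis, assuming per-group arrays of pre-test and post-test MAI scores (variable names are placeholders; SciPy ≥ 1.6 is needed for the one-tailed option):

```python
# One-tailed paired t-test per group with a Bonferroni-corrected alpha.
from scipy.stats import ttest_rel

ALPHA = 0.05 / 3  # Bonferroni correction for three within-group tests

def within_group_increase(pre, post):
    """Test whether post-test scores exceed pre-test scores."""
    t, p = ttest_rel(post, pre, alternative="greater")
    return t, p, p < ALPHA

# e.g.: t, p, significant = within_group_increase(pre_specific, post_specific)
```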

Table 4 shows the results, indicating that on average metacognitive awareness increased within all groups between pre-test and post-test. For the domain-specific and domain-general groups, the confidence intervals of the differences do not contain zero and the effect sizes are small to medium; however, only the increase within the domain-specific group was significant at an alpha level of .05/3 = .017 (H1). The increase in the comparison group is of limited size and its confidence interval contains zero.

Table 4 Within-subjects comparison of metacognitive awareness

Given the quasi-experimental design, we checked and confirmed that metacognitive awareness at the pre-test was not different between the three groups, F(2,119) = .158, p = .854.

To assess whether the increase in metacognitive awareness scores differed between groups, an ANOVA was conducted on the post-test scores. The assumption of equal error variance was confirmed using Levene’s test, F(2,117) = .080, p = .923. No significant effects of the intervention on the post-test metacognitive awareness scores were found (H2), F(2,119) = .334, p = .717, η2 = .045. Contrasts showed non-significant differences between the domain-specific group and the comparison group (1.708, SE = 2.29, p = .457), and between the domain-general group and the comparison group (.429, SE = 2.35, p = .856).
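A corresponding sketch of the between-groups checks, again with placeholder arrays of post-test scores per group:

```python
# Levene's test for equality of error variances, then a one-way ANOVA
# on the post-test scores of the three groups.
from scipy.stats import f_oneway, levene

def between_groups(specific, general, comparison):
    lev_stat, lev_p = levene(specific, general, comparison)
    f_stat, f_p = f_oneway(specific, general, comparison)
    return {"levene": (lev_stat, lev_p), "anova": (f_stat, f_p)}
```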

Our analyses regarding need for cognition, self-efficacy, and expected performance did not yield relevant results.

Use of the intervention

Students within the intervention groups (N = 90) worked with the tool up to 37 min (M = 9.95, SD = 6.54), over the course of 1 through 6 sessions (M = 2.87, SD = 1.29). The number of metacognitive activities within the tool varied widely (M = 8.62, SD = 6.37).

Usage of the tool was compared between the domain-specific and domain-general group (see Table 5). The number of sessions within the domain-general group was significantly higher than within the domain-specific group (H3). The interaction time and metacognitive activities were not significantly higher.

Table 5 Comparison of usage between domain-specific and domain-general groups

Correlational analysis was conducted to assess the relation between use of the tool and changes in metacognitive awareness. Positive correlations were found between changes in metacognitive awareness and number of sessions (r = .244, p = .034), interaction time (r = .083, p = .434), and metacognitive activities (r = .176, p = .096), although only the correlation with number of sessions was significant (H4).

To examine which students sustained use of the intervention over time, we compared students who completed the pre-test and the post-test (completers) with students who withdrew at some point after the pre-test. Indeed, among withdrawers in the intervention groups (N = 43), use of the tool was significantly less frequent, of shorter duration, and involved fewer metacognitive activities (see Table 6). This indicates that withdrawing occurred not just right before the post-test, but was spread out over the three-week period between pre-test and post-test.

Table 6 Comparison of tool use between withdrawers and completers

The results further showed that withdrawers (N = 72) had significantly lower a priori metacognitive awareness (M = 60.03, SD = 10.64) than completers (M = 64.39, SD = 10.17), t(190) = 2.829, p = .005, d = .422. No significant differences were found for age, years in higher education, need for cognition, or self-efficacy. This indicates that, among the measured variables, prior metacognitive awareness best predicted sustained tool use (EQ1).

Perceptions of the intervention

Participants were asked to evaluate how easy, enjoyable, low in effort, useful for themselves, and useful for others they perceived the training to be (EQ2; see Fig. 7). While no significant differences between groups were found, students within the comparison group appeared to find the training easier, more enjoyable, and less effortful than students in the intervention groups. Furthermore, the domain-general group appeared to find the tool less effortful than the domain-specific group.

Fig. 7 Quantitative results of the evaluation questionnaire

The remarks of the participants in the intervention groups were analyzed to identify perceptions of how the tool affected learning (EQ3). The relative gains in metacognitive awareness between pre-test and post-test, and duration of tool use relative to the average duration, were used to verify whether such perceptions were warranted.

Four reasons for a perceived lack of impact were identified (see Table 7). The perceived lack of impact was corroborated by limited metacognitive gains for the group of students who found they already knew how to learn, as well as for the group of students who found the tool of limited applicability to the type and level of their study activities. However, the perception was not corroborated for the group of students who cited a lack of interest, motivation, or relevance, nor for the group of students who found the tool not sufficiently appealing: both groups used the tool above average and had substantial metacognitive gains.

Table 7 Reasons for a perceived lack of impact of using the tool on learning, combined with relative change in metacognitive awareness and tool use relative to average tool use

Seven ways in which the tool was perceived as having an impact on learning were identified (see Table 8). Perceived impact was generally corroborated by substantial metacognitive gains and above average use of the tool. However, limited or negative metacognitive gains were associated with a perceived impact on making plans. Furthermore, a small negative effect on metacognition and below-average use of the tool was associated with a perception of improved ease of learning.

Table 8 Clarification of perceived impact of the intervention on learning

Finally, participants were asked to suggest improvements for the tool. Some respondents indicated no improvements were needed (e.g., “it’s good for now” or “it serves its purpose”), while many remarks suggested specific features be implemented (e.g., a calendar of learning activities, using data to identify best practices among students of a course, or the option to adjust or add your own prompts). The most frequently requested feature was an option to receive reminders to check up on learning within the tool. The remaining remarks suggested improvements that are related to the self-explanation approach and detached presentation of the tool, as shown in Table 9.

Table 9 Suggested improvements to the tool

Discussion

In this paper we investigated the design of detached digital metacognitive support. Self-explication of metacognition across all phases of SRL was compared between a domain-specific and a domain-general implementation. We focused on students in higher education, with specific attention for how these learners use and perceive such a tool.

Conclusions

The results show that a digital tool prompting learners to self-explicate learning, in combination with scaffolding and direct instruction, can improve metacognition. Furthermore, in contrast with current recommendations of embedding metacognitive support in domain-specific content, a detached implementation of metacognitive support was demonstrated to be effective. However, user feedback underlines that any detached metacognitive support still needs to be applicable to current learning, and is preferably concrete and specific. Further research on embedded and detached metacognitive support is recommended.

The effect of domain-specific metacognitive support was confirmed, even when learners used the support relatively little over a relatively short period of time. The effect of domain-general metacognitive support could not be confirmed. However, both quantitative and qualitative analysis warrant further research. While the domain-specific tool was more effective, the domain-general tool was used more actively. Perhaps the domain-general approach requires more effort from learners to achieve similar effects, although learners perceived it as slightly easier and requiring slightly less effort. Alternatively, the domain-general support could have appealed more to students. Since domain-general support can be used repeatedly across different learning situations, this type of support has high potential for adoption across a curriculum and, as such, of offering more frequent and diverse opportunities for learners to develop metacognitive awareness.

The results show that use of the tool was limited in frequency, duration, and metacognitive activities. Predominantly, the tool was used during the scheduled sessions and in response to a cue by the host. Correspondingly, participants suggested receiving notifications to attend to the metacognitive support within the tool. Alternatively, a lack of self-initiated use outside of the sessions may be due to a perceived lack of relevance, corroborating results found by Narciss et al. (2007) and Jansen et al. (2020). We found that this perceived lack of relevance was warranted for a group of students who already knew how to learn and did not find much added value in the current tool. Future work could identify what support, if any, could be provided to somewhat proficient learners.

The results also show that students with lower metacognition are less likely to make use of and sustain use of the available support. This signals a key problem with implementing metacognitive support: it is complicated to administer such an intervention to those who would benefit from it the most. While both domain-specific and domain-general digital metacognitive support can be effective, it is a prerequisite that students regularly use the available support. Previous research provides some indications that learners' metacognitive knowledge and skills affect both the quality and quantity of tool use (cf. Clarebout et al., 2013).

Limitations

In this study we collected insights for a specific group of learners (i.e., young adult students) within a specific educational context (i.e., institutional higher education in The Netherlands). This group of learners is, for example, likely to have previous learning experiences within an institutional context. The phrasing of the prompts used in the present study is also somewhat specific to this group and context. As such, our findings can be considered relevant for similar situations but may not generalize beyond the studied group.

In this study, metacognition was primarily assessed through a self-report measure, which may not accurately reflect actual learning behavior. While learners believed their metacognitive knowledge and skills had improved, only analysis of learning behaviors in terms of activities or performance could provide accurate insights into whether this is actually the case. Furthermore, the metacognitive perspective adopted in this study must be seen within the broader construct of SRL. In the present study, a measure of performance, such as grades, was unavailable, and the detached approach prevented observations of learning activities. However, qualitative findings corroborate the quantitative results, providing some indication that learning behaviors were affected. Future studies should include measures of performance and learning behaviors to enable a more accurate analysis of the impact of metacognition on learning.

In this study, the domain-specific and domain-general configurations of the tool were studied as two end points of a design dimension. While the domain-general configuration can be viewed as one end point (as it could not be less specific), the domain-specific configuration is not necessarily the most domain-specific configuration possible (as it could be made more specific still). For example, different mechanisms could be introduced that take into account the specific learning tasks and required problem-solving steps to offer more specific support. In the present study, the domain-specific prompts were phrased in a domain-specific way to make them easier for learners to interpret and apply. However, the prompts did not make use of unique aspects of the subject-matter learning content. It would be interesting to further study different configurations to assess what level of support is most effective and how domain-specific and domain-general components of metacognitive support interact.

Future research

The present study confirms that a key challenge for future research is to engage learners with lower metacognition to make use of available support. We foresee two different approaches to address this challenge in future research, both of which leverage a broader perspective of SRL to improve metacognitive support.

The first approach is to increase tool use by improving the relevance of the support for most learners. Since different learners have different needs for support, this implies that the support needs to be adapted to individual learners. This is possible within a digital tool when there are ways to measure the relevant variables within the tool, for example through self-reported metacognitive knowledge or learning performance. For example, for learners who already know how to learn well, the self-explication of metacognitive strategies could be omitted; however, they may still find it relevant to keep track of their goals and plans. Similarly, support can be adapted to the learning situation. For example, in this study, some learners found the content of the tool mismatched the study level (introductory) and study type (experiential learning). To the extent that such insights about the study context can be incorporated, tools could be made to provide more relevant content.

The second approach is to increase tool use by making the tool easier and more appealing to use. For example, learners could be cued to use the tool through digital reminders sent from the tool or through an intervention by a teacher. However, the goal of self-regulated learning is to self-initiate such activities. Providing such cues essentially scaffolds the desired behavior and, for self-regulation to occur, such cues should be faded over time. Self-initiated use could be promoted through habit formation, for example by using gamification to reward behavior and by using cues that fade over time to establish self-initiation. Alternatively, self-initiated use could be promoted by increasing perceived task value, for example by providing learners with insights regarding their progress (e.g., demonstrating task value) or by making the support more engaging and motivating (e.g., increasing perceived task value). Such research should incorporate motivational aspects of metacognition (e.g., Efklides, 2011, 2014) and address these within the design of the intervention.

Future research and design of digital support of metacognition and SRL should incorporate how learners perceive, value, use, and sustain use of available support on the road towards self-initiated self-regulation of learning.