We propose the self-regulation view in writing-to-learn as a promising theoretical perspective that draws on models of self-regulated learning and on cognitive load theory. According to this perspective, writing has the potential to scaffold self-regulated learning because of the cognitive offloading that written text generally offers as an external representation and memory aid, and because of the offloading that specifically results from the genre-free principle in journal writing. However, to enable learners to optimally exploit this learning opportunity, journal writing needs to be instructionally supported. Accordingly, we have set up a research program—the Freiburg Self-Regulated-Journal-Writing Approach—in which we developed and tested different instructional support methods to foster learning outcomes by optimizing cognitive load during self-regulated learning by journal writing. We highlight the main insights of our research program, which are synthesized from 16 experimental and 4 correlational studies published in 16 original papers. Specifically, we present results on (1) the effects of prompting germane processing in journal writing, (2) the effects of providing worked examples and metacognitive information to support students in effectively exploiting prompted journal writing for self-regulated learning, (3) the effects of adapting and fading guidance in line with learners’ expertise in self-regulated learning, and (4) the effects of journal writing on learning motivation and motivation to write. The article closes with a discussion of several avenues along which the Freiburg Self-Regulated-Journal-Writing Approach can be developed further to advance research that integrates self-regulated learning with cognitive load theory.
Writing, like reading, is fundamental to academic learning both in school and at university (Phillips and Norris 2009). In academic writing tasks such as learning journals or essays, learners are supposed to develop their ideas about subject matter in a self-determined way and thereby construct sustainable knowledge. With the advent of a scientific writing pedagogy in the early 1970s, the idea was born that writing is a natural tool for thinking and learning. This idea of writing as a learning tool (see also Tynjälä et al. 2012) was first suggested by the educational reform movement “writing across the curriculum” in the UK and soon spread to many high schools and universities in the USA (Britton et al. 1975; Emig 1977). In this article, we present the main tenets of our research program on writing-to-learn, which focuses on journal writing (called the Freiburg Self-Regulated-Journal-Writing Approach in the following). In our definition of writing-to-learn, we take up the old idea of writing as a tool for thinking and learning, meaning that, with appropriate instructional support, writing activities such as journal writing may serve students as a beneficial medium for enacting knowledge construction processes that result in deep comprehension of subject matter, increased learning motivation, and long-term retention.
In the following, we will first discuss diverging classic theoretical views on writing-to-learn. Based on a critique of these approaches, we will then derive and characterize in detail a new theoretical perspective, called the self-regulation view in writing-to-learn. Following this theoretical discussion, we will present and discuss major results of the empirical studies of our research program, in which we systematically tested different support methods to optimize cognitive load and learning outcomes in writing learning journals.
Theoretical Perspectives on Writing-to-Learn
In research on writing-to-learn, two competing theoretical views can be distinguished (Nückles et al. 2009). Following the so-called romantic view (Galbraith 1992), the idea of writing as a tool for thinking and learning is rooted in the work of Vygotsky, who regarded language as fundamental to the development of reasoning and thought. Building on Vygotsky’s work, James Britton (1980), the founder of the British writing-across-the-curriculum movement, argued that a great deal of the knowledge stored in our long-term memory is tacit and therefore not directly accessible to us. According to Britton’s shaping-at-the-point-of-utterance hypothesis (Klein 1999), it is only by articulating our thoughts in the course of writing that this tacit knowledge becomes explicit and our thoughts take shape. Thus, the very act of writing itself should induce germane cognitive load, that is, evoke knowledge construction processes that inevitably result in learning (Sweller et al. 1998). Galbraith (1992) termed this view the “romantic” stance in writing-to-learn to highlight the idea that learning would emerge from spontaneous, expressive writing without consideration of rhetorical forms.
From another prominent perspective on writing, the so-called classic view (Galbraith 1992), in contrast, writing does not appear to be a natural learning medium at all, in particular when regarded through the lens of cognitive load theory. Early writing theorists such as Flower and Hayes (1980) or Bereiter and Scardamalia (1987) characterized writing as complex problem solving that is likely to produce cognitive overload, especially in novice writers. Based on empirical analyses of think-aloud protocols, Hayes and Flower (1986) emphasized the goal-directed nature of such problem solving: A writing plan based on a hierarchy of writing goals is generated, this plan is translated into written text, and the produced text is revised. Hayes and Flower conceived of these steps as interactive and recursive complex cognitive processes.
Building on Hayes and Flower (1986), Bereiter and Scardamalia (1987) proposed a theory of how writing may contribute to learning which is diametrically opposed to the romantic view sketched above. Following Bereiter and Scardamalia (1987), the complex task of writing requires the writer to integrate knowledge from two different problem spaces: (1) a content space that represents a writer’s knowledge and beliefs about subject matter (i.e., “what do I want to say?”) and (2) a rhetorical space that comprises the writer’s knowledge about rhetorical goals and schemata (i.e., “how do I say what I mean?”). Bereiter and Scardamalia proposed that the writer’s dialectical movement between these two problem spaces in seeking to satisfy both content and rhetorical requirements may produce learning. According to this assumption, germane cognitive load is induced when, during planning, a writer selects specific knowledge elements from the content problem space and attempts to translate them into written text by drawing on rhetorical schemata such as metaphor (Ortony 1993) or Toulmin’s schema of argument (consisting of claim, data, and warrant; see Toulmin 1958). Bereiter and Scardamalia (1987) assumed that through this working-out of content elements by instantiating rhetorical schemata, the writer’s knowledge becomes reorganized or transformed: For example, a writer works out an envisaged line of argument and realizes in the course of writing how the information selected from long-term memory in the content problem space has to be presented as data and warrant in the draft in order to convincingly support an intended claim. Following Bereiter and Scardamalia, it is through such knowledge transforming (i.e., the reorganization and elaboration of a writer’s content knowledge through the instantiation of rhetorical schemata) that learning is produced.
As rhetorical schemata play a constitutive role in this conception of writing-to-learn, Galbraith (1992) termed the writing-as-problem-solving perspective the “classic” view in research on writing-to-learn.
The two perspectives on writing-to-learn mentioned above make quite distinct assumptions about the specific cognitive activities involved in writing on which learners should focus their mental effort in order to expand their knowledge and comprehension of subject matter. Following the romantic view, the mere activity of articulating one’s ideas about subject matter in written text should entail learning. Hence, with regard to the design of writing tasks, forms of spontaneous and expressive writing, which allow writers to freely develop their ideas about subject matter, should yield the greatest learning gains. Following the classic view, on the other hand, learners should explicitly be encouraged to invest mental effort in rhetorical writing, that is, to focus on the rhetorical aspects of writing and to try to instantiate a particular genre, for example, an argumentative essay or a research report in line with APA guidelines, as perfectly as possible.
However, a closer inspection reveals that the assumptions of both the romantic view and the classic view regarding how to best achieve learning by writing can be called into question. On the one hand, it is obvious that writing to instantiate a particular genre (e.g., a research report in psychology) imposes high cognitive load on the writer and is likely to overtax novice writers. Every professor who has ever supervised a bachelor’s, master’s, or PhD thesis knows how demanding it is—even for graduate students—to produce a coherent and rhetorically well worked-out empirical journal article. Accordingly, Scardamalia and Bereiter (1991) regularly found in their expert-novice studies that expert writers were much better than novices at controlling their text production in line with rhetorical goals. However, their empirical studies ultimately left open the question of whether novice writers with a low level of rhetorical genre knowledge (see Winter-Hölzl et al. 2016) can indeed deepen and expand their knowledge by trying to conform to a particular rhetorical genre.
On the other hand, with regard to the romantic view, it is questionable whether unguided expressive writing would indeed lead to substantial learning gains. For example, Nückles et al. (2004) had university students write learning journal entries as follow-up coursework to weekly seminar sessions. For writing the learning journal, the students received only a brief and informal introduction. Nückles and colleagues found that, on average, the students showed a rather low level of cognitive knowledge construction activities in their journal entries; at the same time, the amount of actually realized knowledge construction correlated strongly with learning outcomes as measured by a test at the end of the term.
In line with this result, the available empirical evidence generally suggests that the effects of writing-to-learn tasks typically are positive but rather small. In their meta-analysis of school-based writing-to-learn assignments, which included different types of “informational” writing such as summaries, reports, or descriptions of scientific processes, Bangert-Drowns et al. (2004) obtained an average effect size of 0.26 standard deviations, which can be regarded as a small effect (Cohen 1988). This finding constitutes rather weak evidence for both the romantic and the classic view of writing-to-learn. On the other hand, the meta-analysis by Bangert-Drowns et al. (2004) also showed that writing tasks that included metacognitive prompts encouraging students to reflect on their knowledge, comprehension difficulties, and learning processes had a significantly larger effect on learning success (Cohen’s d = 0.44) than writing tasks without such prompts. Hence, the learning effect of writing might specifically depend on the scaffolding that it provides for metacognitive and self-regulatory processes. Accordingly, Bangert-Drowns et al. (2004) concluded that the stimulation of metacognitive reflection in writing-to-learn should be especially promising if it is combined with a thorough application of cognitive knowledge construction activities.
The Self-Regulation View in Writing-to-Learn: Focusing Mental Effort on the Content Space Rather Than on the Rhetorical Space
In our approach to writing-to-learn, we sought to incorporate the main conclusions of the discussion of the theoretical perspectives and empirical results sketched above. To this end, we adopted journal writing as our writing task. In a learning journal, learners typically write down their reflections on previously presented learning contents. In addition, they should ask themselves what they do not understand and what can be done to close this gap in understanding. In this way, learners can apply beneficial cognitive strategies such as organization and elaboration strategies as well as metacognitive strategies such as monitoring and regulation. We consider learning journals as a promising way of conducting follow-up coursework and as a method to foster self-regulated learning by writing.
In the learning journal entry in Fig. 1, the first sentence represents a type of organization strategy as the student identified the main points of the last lesson. In the next paragraph, the student applies an elaboration strategy by providing an example for non-specific immune defense. Then, another organization strategy follows by distinguishing different types of cells executing specific immune defense. Afterwards, the student shows an episode of negative monitoring as she articulates difficulties in understanding the functioning of the helper cells. Last, the student regulates her comprehension by developing ideas of how to overcome her comprehension difficulty (see Fig. 1).
In line with the romantic view, the writing of such a learning journal is a self-determined form of writing that allows learners to freely develop their ideas about subject matter and to personally select which aspects of a learning episode require deeper reflection. Contrary to the classic view, learners do not need to follow a certain rhetorical structure during this reflection because—unlike genres such as argumentative essays or scientific reports—learning journals specifically do not have a conventionalized rhetorical structure. However, the results of Nückles et al. (2004) suggest that learners spontaneously tend to keep their invested mental effort at a minimum during journal writing (see also Feldon et al. 2019; Shenhav et al. 2017) and thus do not sufficiently engage in germane processes of self-regulated learning such as elaboration, organization, monitoring, and the regulation of knowledge gaps. Consequently, in our journal writing approach, learners are instructionally supported to invest sufficient mental effort in germane processing. Specifically, this support consists of prompts eliciting cognitive learning strategies (elaboration and organization) as well as metacognitive learning strategies (monitoring and regulation; see Table 1 for example prompts). Inspired by the meta-analytic results of Bangert-Drowns et al. (2004), we termed this idea of combining free, expressive writing with the systematic prompting of self-regulatory processes the self-regulation view of writing-to-learn.
Our idea of instructionally supporting students to invest sufficient effort in germane processes of self-regulated learning without imposing genre-driven cognitive load aligns well with research question 3 of the Effort Monitoring and Regulation (EMR) framework proposed by de Bruin et al. (in press). The framework builds on the model of Nelson and Narens (1994), which distinguishes between an object level and a meta-level of cognitive processing. At the meta-level, judgments of learning and regulation decisions (referred to as control) take place. Via the process of monitoring, the meta-level receives input from the object level, at which learners engage with the content that is to be learned (see Fig. 2). Cognitive load is assumed to have direct links with both levels. The execution of both object-level and meta-level processes imposes cognitive load. Conversely, cognitive load can influence both learners’ monitoring and their regulation processes. For instance, monitoring can be influenced because learners monitor cognitive load and use it as a cue for judging their level of comprehension; regulation can be affected because learners might decide, on the basis of their perceived cognitive load, whether investing further mental effort is useful and possible (see de Bruin et al. in press).
In the EMR framework, the question of how cognitive load on self-regulated learning tasks can be optimized is one of three main research questions (see Fig. 2; for detailed information on the other two research questions, see de Bruin et al. in press). With our approach to journal writing, we seek to provide answers to this research question in particular.
Cognitive Offloading in Journal Writing
From the perspective of cognitive load theory, journal writing appears especially promising for facilitating self-regulated learning for two reasons: First, writing naturally leads writers to externalize their thoughts on paper or on a computer screen. Externalizing one’s thoughts in written text preserves them, allowing the writer to reread them and develop them further (Klein 1999). Hence, through externalization, the information processing load on working memory is greatly reduced, so that cognitive capacity is freed for metacognitive reflection (i.e., monitoring and regulation). At the same time, the externalized thoughts may act as feedback for the writer that triggers associative processes through spreading activation and both facilitates and guides idea generation. Galbraith (1992, 2009) has described this dynamic interaction between the writer and the emerging text using connectionist modeling (see Rogers and McClelland 2004) and termed it the implicit knowledge-constituting process in writing. Thus, the potential of writing as a scaffold for self-regulated learning can theoretically be underpinned by the advantages written text offers as an external representation and memory aid (Klein 1999).
Second, cognitive offloading is further achieved by the fact that—as argued above—learners do not have to meet prescribed rhetorical standards in journal writing. We call this particularity of journal writing the “genre-free principle.” Accordingly, a high-quality learning journal entry is by definition not expected to be a rhetorically well-shaped text. Thus, in terms of classic linguistic criteria such as text cohesion (Lachner et al. 2017), audience design (Traxler and Gernsbacher 2011), or readability (e.g., Kincaid et al. 1975), a learning journal entry may appear rather imperfect and idiosyncratic from a reader’s perspective, but may nevertheless prove highly beneficial to the writer herself with regard to the learning progress achieved by writing this entry. We assume that the genre-free principle facilitates self-regulated learning by journal writing precisely because the writer is relieved of the burden of investing substantial mental effort in instantiating rhetorical schemata, which may be regarded as extrinsic to the goal of comprehending subject matter, especially for novice writers. Consequently, because learners do not have to focus much on the rhetorical aspects of writing, the available cognitive capacity can be fully invested in germane processing of subject matter.
We suggest that the genre-free principle in journal writing can be considered, at least on a more general level, a particular variant of the goal-free effect in cognitive load theory (e.g., Paas and Kirschner 2012; Sweller et al. 2019). In cognitive load theory, it is assumed that trying to reach a specific goal (e.g., finding the solution to a particular mathematics problem, as is usually required) leads to strategies that do not contribute to learning (e.g., means-ends analysis), imposes extraneous (i.e., unproductive) cognitive load on working memory, and reduces learning outcomes. It is more effective to provide unspecific goals such as “calculate the value of as many variables as you can” so that learners can focus on learning-relevant (germane) aspects (here: the to-be-learned solution steps). Similarly, setting a specific rhetorical format as the writing goal would induce writing strategies aimed at sticking to this format (genre). These strategies are extraneous to applying cognitive and metacognitive strategies to the learning contents (i.e., germane processes). Hence, taking away the goal of sticking to a rhetorical format (i.e., the genre-free principle) allows learners to devote more of their cognitive capacity to germane processing.
What Types of Germane Processing Should Be Instructionally Supported According to the Self-Regulation View?
Based on learning strategy research (e.g., Mayer 2002; Weinstein and Mayer 1986) and models of self-regulated learning (e.g., Winne and Hadwin 1998; Zimmerman 2008), several core cognitive and metacognitive learning strategies can be distinguished whose application in journal writing should result in germane processing. We termed these processes “strategies” to highlight that learners are expected to enact them intentionally when writing a learning journal. On the object level, the core cognitive strategies are organization and elaboration (Mayer 2002). Through organization strategies, a writer may, for example, identify the main ideas of the learning contents, establish links between concepts, and structure the learning contents in a meaningful way. Via elaboration strategies, the writer fleshes out her ideas, particularly by generating examples to illustrate abstract concepts, by using analogies to relate new concepts to familiar ones, and through the critical discussion of contents. Following Mayer (2002, 2009), organization and elaboration are at the heart of meaningful learning because they enable the learner to organize the learning contents into a coherent whole and to integrate new information with prior knowledge, thereby enabling deep understanding and long-term retention. As organization and elaboration are assumed to inevitably produce learning, we consider them germane processes.
On the meta-level, journal writing might in particular facilitate metacognitive strategies such as the monitoring and regulation of comprehension. Comprehension monitoring by writing a learning journal entry particularly enables the identification of knowledge gaps and comprehension difficulties, for example, when a learner fails to find an appropriate example to elucidate the meaning of an abstract concept, or if the learner has difficulty in resolving a contradiction that became apparent by working-out a line of thought. If such impasses are detected during the writing process, the learner could plan to enact remedial activities in order to overcome the identified difficulties and augment their understanding. In the context of this regulation, the learner would return to remedial organization and elaboration strategies. To the extent that learners successfully detect and overcome impasses in their comprehension of subject matter by executing monitoring and regulation strategies, we consider monitoring and regulation also as important germane processes.
Models of self-regulated learning such as the model by Zimmerman (2008) describe the interplay between cognitive and metacognitive strategies as a cyclical and interactive process (see Fig. 3), in which the execution of a particular cognitive or metacognitive strategy recursively initiates an adjacent cognitive or metacognitive process. Thus, completing one learning cycle may naturally segue into further learning cycles until a level of comprehension is achieved that is personally regarded as satisfactory.
Prompting learners to engage in the outlined cognitive and metacognitive strategies of self-regulated learning during journal writing has proven highly beneficial. To foreshadow the effects of prompted journal writing on learning outcomes, a mini meta-analysis following Goh et al. (2016) showed a medium-to-large effect size in favor of journal writing instructionally supported by prompts over unsupported journal writing, Hedges’ g = 0.78, SE = 0.14, p < .0001. This effect is almost twice as large as the average effect size that Bangert-Drowns et al. (2004) obtained in their meta-analysis for writing interventions using prompts for metacognitive reflection. Hence, the self-regulation view of writing-to-learn, which is reflected in the outlined journal writing approach, is highly promising. Likely, the substantial benefits of journal writing are due not only to the fact that we designed journal writing as a self-regulated learning task, but also to the fact that the learners were provided with prompts to optimize cognitive load in performing this task (see Table 1 for example prompts).
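For readers who want to see the aggregation behind such a mini meta-analysis made concrete, the inverse-variance (fixed-effect) pooling of study-level effect sizes in the spirit of Goh et al. (2016) can be sketched in a few lines of Python. The effect sizes and standard errors below are hypothetical placeholders for illustration only, not the actual values from our studies:

```python
# Illustrative fixed-effect mini meta-analysis (cf. Goh et al. 2016).
# Each study contributes a Hedges' g and its standard error; studies are
# weighted by the inverse of their sampling variance (1/SE^2).
# NOTE: the g/SE values below are hypothetical, for illustration only.
studies = [
    {"g": 0.65, "se": 0.30},
    {"g": 0.90, "se": 0.25},
    {"g": 0.75, "se": 0.35},
]

weights = [1 / s["se"] ** 2 for s in studies]  # inverse-variance weights
pooled_g = sum(w * s["g"] for w, s in zip(weights, studies)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5          # SE of the pooled estimate

print(f"pooled g = {pooled_g:.2f}, SE = {pooled_se:.2f}")
```

More precise studies (smaller SE) thus pull the pooled estimate toward their own effect size, which is the defining feature of inverse-variance weighting.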
How to Optimize Cognitive Load in Self-Regulated Learning by Journal Writing
In view of research question 3 of the EMR framework (i.e., how to optimize cognitive load on self-regulated learning tasks), we will summarize and reflect on the main insights of our research program on diverse instructional support procedures that aim to optimize learning outcomes by optimizing cognitive load during self-regulated learning by journal writing. These insights are synthesized from 16 experimental and 4 correlational studies published in 16 original papers on the Freiburg Self-Regulated-Journal-Writing Approach (all 16 papers are marked with an asterisk in the reference list; see also Table 2 for an overview).
Historically, we developed our approach to journal writing primarily for university students, to improve self-regulated learning in follow-up coursework at the university, and later extended our intervention studies to younger students. When school students participated, we took care that they could be assumed to have sufficient mastery of transcription skills, which can roughly be assumed to be achieved by entry into secondary school (Wilson and Braaten 2019). Transcription is a writer’s ability to transform the words she or he wants to say into written symbols on a page; it comprises the subskills of spelling and handwriting or typing (Graham and Harris 2000). We regarded sufficient mastery of transcription skills as an important precondition for successful journal writing because the execution of these skills can bind considerable working memory resources, especially if the writer cannot carry them out fluently and efficiently. Thus, with regard to the goal of journal writing, that is, deepening one’s comprehension of subject matter by applying cognitive and metacognitive strategies, paying too much attention to how to get the words onto the page would create undesirable extraneous cognitive load. For this reason, we focused on students who could be assumed to already have some mastery of the mechanics of writing (see Graham and Harris 2000).
Some of our studies were conducted in the laboratory and some in the field. In the laboratory studies, students typically received different combinations of prompts for writing a single learning journal entry about a videotaped lecture they had previously viewed. Prompts were typically delivered via a prepared Word document in which learners typed their learning journal. In some studies, students received further instructional support in addition to the prompts (e.g., a worked-out example journal entry or meta-strategic information relating to the prompted strategies, i.e., information about how, when, and why to use the prompted strategies; see Paris et al. 1983; Zohar and Peled 2008). After they had finished their journal entry, the students took a comprehension test. In some experiments, they took the same test again one or several weeks later to measure their retention of the learning contents. In the field experiments, students usually wrote several learning journal entries, typically once a week after a class or seminar session, over a period of about 3 to 6 weeks or, in some studies, over a whole term (12 weeks). In these field studies, students typically received an extended introduction to journal writing, which included a presentation of the cyclical model of cognitive and metacognitive processes involved in self-regulated learning by journal writing (see Fig. 3) and a modeling of how the strategies should be applied in writing. We then either experimentally varied different combinations of prompts or compared a prompted journal writing condition with a no-writing control condition. Prompts were delivered via a printed worksheet or a card.
Generally, we measured the quantity of cognitive and metacognitive strategies in the learning journals using a coding scheme that differentiated between rehearsal, elaboration, organization, monitoring, and remedial (regulation) strategies (Chi 1997). In addition, in some studies, we rated the quality of the strategies (e.g., Glogger et al. 2012; Roelle et al. 2017), usually on 6-point rating scales ranging from 1 (very low quality) to 6 (very high quality). At least two independent raters coded and rated the learning journals after having achieved good inter-rater reliabilities (usually ICC > .85, Cohen’s kappa > .80). Appendix Table 3 shows examples of learning journal excerpts, together with the coding of the corresponding learning strategy category. The findings from the research program on the Freiburg Self-Regulated-Journal-Writing Approach are described in more detail in the following subsections.
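As an illustration of the reliability statistics mentioned above, Cohen's kappa for two raters is computed from the observed agreement corrected for the agreement expected by chance. The sketch below uses hypothetical category codes for ten journal segments; it shows only the standard two-rater formula, not our actual analysis pipeline:

```python
# Minimal sketch of Cohen's kappa for two raters (hypothetical data).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of segments coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement, from each rater's marginal frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codings of ten journal segments into strategy categories
# (elab = elaboration, org = organization, mon = monitoring, reg = regulation).
a = ["elab", "org", "org", "mon", "elab", "reg", "org", "mon", "elab", "org"]
b = ["elab", "org", "org", "mon", "org",  "reg", "org", "mon", "elab", "org"]
print(f"kappa = {cohens_kappa(a, b):.2f}")
```

With nine of ten segments coded identically, the raw agreement is .90, but kappa is somewhat lower because part of that agreement would be expected by chance alone.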
Prompting Germane Processing in Journal Writing
Given the aforementioned germane processes involved in self-regulated learning, the question arises of how germane cognitive load can be increased effectively (see research question 3 by de Bruin et al. in press). That is, how can the cognitive strategies of elaboration and organization and the metacognitive strategies of monitoring and regulation be activated in an optimal way? As mentioned, unsupported learners tend to use journal writing in sub-optimal ways: Nückles et al. (2004) found that journal writing without instructional support resulted in an almost complete absence of metacognitive strategies and clear deficits in the use of cognitive strategies. Prompts can be used to address such deficits. Prompts are hints or questions that induce productive learning processes (Bannert and Reimann 2012; Zheng 2016). We conceive of prompts as strategy activators, following Reigeluth and Stein (1983). That is, we assume that learners are, in principle, capable of using learning strategies but do not use them spontaneously, at least not to a satisfactory degree (e.g., Bannert 2009; Nückles et al. 2004). In our research program, we developed sets of cognitive and metacognitive prompts to increase germane processing in journal writing (see Table 1) and investigated whether and how they encourage learners to enact powerful learning strategies and improve learning outcomes.
Experiments that varied the provision of prompts clearly showed that strategy prompts strongly increased learners’ use of the prompted strategies in the learning journals (e.g., an elaboration prompt increases elaboration; Berthold et al. 2007; Glogger et al. 2009; Nückles et al. 2009; Schwonke et al. 2006). In addition, the increased use of strategies was accompanied by enhanced learning outcomes (see the mini meta-analysis reported above). Several studies further found that the effect on comprehension (and sometimes retention) was mediated by strategy use; that is, the prompts are effective through the learning strategies they activate. In Berthold et al. (2007), the use of cognitive strategies mediated learning outcomes. Roelle et al. (2017) found, in two experiments, that the quality of organization strategies mediated learning outcomes in a conceptual knowledge test. These findings underline the germane nature of the learning strategies and show that the prompts are the activators of this germane processing.
The prompting germane processing principle in journal writing, however, refers not only to the mere increase in strategy use. Much of our research program has concentrated on finding the optimal combination (e.g., of cognitive and metacognitive prompts) and sequencing of different types of prompts in order to maximize germane cognitive load and to foster deep comprehension and the acquisition of sustainable knowledge.
How to Combine Prompts to Optimize Germane Processing
Do learners need both cognitive prompts and metacognitive prompts (i.e., for monitoring and regulation) to optimize germane processing in journal writing? Berthold et al. (2007) found that undergraduate students who received cognitive prompts, or a combination of cognitive and metacognitive prompts, learned more than students without prompts. Students provided only with metacognitive prompts, however, did not learn more than those without prompts. They applied a large number of metacognitive strategies but very few cognitive strategies. The two successful groups, in contrast, used a balanced amount of cognitive and metacognitive strategies in their learning journals. The use of cognitive learning strategies mediated the effect on learning outcomes.
Is the unbalanced use of metacognitive strategies a problem? Are the prompted metacognitive strategies rather detrimental to learning (i.e., inducing not only germane load)? Nückles et al. (2009) replicated the experiment of Berthold et al., but this time gave all learners time and access to the learning material during a later writing phase to better allow for metacognitive regulation of detected comprehension problems and the application of remedial strategies. Accordingly, during this writing phase, students were given the opportunity to revise their learning journal. Compared to Berthold et al. (2007), Nückles et al. also included a further experimental condition in which prompts for the planning of remedial strategies were added to the formerly used cognitive and metacognitive prompts. More specifically, the learners received either (a) no prompts, (b) cognitive prompts (elaboration and organization, see cognitive prompts with superscript “b” in Table 1), (c) all types of metacognitive prompts (superscript “b” in Table 1: monitoring, planning of remedial strategies), (d) mixed prompts (cognitive and metacognitive prompts) without prompts for planning of remedial strategies, or (e) mixed prompts with prompts for planning of remedial strategies (marked with superscript “a” in Table 1). The experiment successfully replicated the results of Berthold et al. (2007) with regard to the learning outcomes of cognitive prompts (i.e., condition b) and the combination of cognitive and metacognitive prompts (i.e., mixed prompts, condition d above). In contrast to the results of Berthold et al. (2007), however, metacognitive prompts alone (i.e., condition c above) also improved learning outcomes. That is, prompting metacognitive strategies alone is not detrimental, provided that the planning of remedial strategies is prompted and learners have the opportunity to implement the remedial strategies. In real-world contexts, this opportunity is usually given.
The most successful set of prompts, however, was the combination of all types of prompts (i.e., condition e, see above). That is, prompting all essential sub-processes of self-regulated learning fostered students’ comprehension best. The two most successful groups in this study again used cognitive as well as metacognitive strategies and showed a balanced use of the different strategies.
In a correlative field study, Glogger et al. (2012) had ninth graders write learning journals in mathematics over a period of 6 weeks. The quality and quantity of cognitive strategies predicted learning outcomes, controlling for prior knowledge. Learners who applied both cognitive and metacognitive strategies were particularly successful. Learners who mainly used one type of strategy performed similarly poorly as learners who hardly used strategies at all. In a conceptual replication in biology classes, again the combination of cognitive and metacognitive strategies made students most successful.
All in all, using a combination of cognitive and metacognitive prompts is a powerful instructional strategy to optimize germane processing. At the same time, it is important to activate all sub-processes of self-regulated learning in learners, that is, to prompt organization and elaboration, monitoring of comprehension, and planning of remedial cognitive strategies (see Fig. 3).
How to Sequence Prompts to Optimize Germane Processing
A further, more recent question in our research program is whether the sequence of different types of prompts is important for optimizing germane cognitive load. In our previous research, we usually gave cognitive and metacognitive prompts at the same time, but cognitive prompts were printed above metacognitive ones (e.g., Berthold et al. 2007; Nückles et al. 2009). In the correlative field study with ninth graders (Glogger et al. 2012), we prompted the following sequence, depicted as a round-trip: (1) monitoring, (2) organization, (3) elaboration, (4) monitoring again and planning remedial strategies. However, in none of the mentioned studies did we investigate to what extent learners realized the metacognitive strategies prior to the cognitive strategies or vice versa, and whether the sequence matters for learning. There is reason to expect that the sequence matters.
After engaging in metacognitive strategies, that is, identifying gaps in understanding and putting effort into closing them, learners should have a more solid knowledge base. Thus, they should be able to organize the main content in a more coherent manner and elaborate on the main content more deeply (cognitive processing) than before (e.g., Glogger-Frey et al. 2015; Mayer 2009; Roelle and Berthold 2016). On the other hand, cognitive processing of the learning contents should give cues as to how well learners have understood the contents and where exactly they have gaps in understanding (de Bruin et al. in press; Nelson and Narens 1994). Thus, engaging in cognitive processing first might inform comprehension monitoring and planning of remedial strategies to close specific gaps. It might improve subsequent metacognitive processing.
Against this background, Roelle et al. (2017) manipulated the sequence in which tenth graders responded to cognitive and metacognitive prompts (see superscript “c” in Table 1). More specifically, after attending a lecture in a first experiment, or regular school lessons in a second experiment, the students wrote learning journals as a follow-up activity. During writing, the learners were prompted to either (a) engage in the cognitive processes of organization and elaboration prior to engaging in metacognitive processes (i.e., monitoring and planning of remedial strategies) and implementing their remediation plans (cognitive-first group), or (b) engage in the same metacognitive and remedial processes prior to organizing and elaborating on the learning content (metacognitive-first group).
In both experiments, the learners in the metacognitive-first group outperformed their counterparts regarding the quality of the executed organization and metacognitive processes as well as in a conceptual knowledge test. These results can be interpreted as follows: writing down gaps in understanding as well as planning and realizing remedial strategies built a more solid knowledge base, on which the students were better able to organize the learning contents. It is also possible that engaging in comprehension monitoring (and actually identifying gaps in understanding) first helped the learners to recognize the need for engaging in deep processing of the learning content. As a consequence, they invested more effort in subsequent organization (and elaboration) processes (see the preparation-to-learn effect of knowledge gap experiences, Glogger-Frey et al. 2015; Loibl et al. 2017). Investigating this effort in future studies would contribute to answering research question 1 (How do students monitor effort?) and research question 2 (How do students regulate effort?) of the EMR Framework (de Bruin et al. in press). First evidence by Roelle et al. (2017) suggests that the sequence matters in prompting germane processing. Further possible sequences of prompts could be investigated in future research.
In summary, our research program on self-regulated learning journals suggests that learners are optimally supported in distributing germane load between the object level and the meta-level by prompting them to engage in all sub-processes involved in self-regulated learning: cognitive (elaboration, organization) and metacognitive strategies including monitoring as well as planning of remedial strategies (see Fig. 3). In addition to prompting all types of sub-processes, metacognitive strategies should be prompted first.
Effects of Worked Examples and Self-Explanations
The worked example effect is a classical cognitive load theory effect (e.g., Sweller and Cooper 1985; Sweller et al. 1998). It describes the advantage of studying worked examples as compared to learning by doing (e.g., problem solving, exploring) for the initial acquisition of cognitive skills (e.g., solving certain types of mathematical problems or engaging in effective learning strategies; e.g., Renkl 2014a, b). Note that learning by worked examples is usually optimized if the learners not only (superficially) read through the examples but are nudged to self-explain the principles (e.g., mathematical theorems, strategy guidelines) applied in the examples (self-explanation effect; e.g., Chi et al. 1989; Renkl 2017, Renkl and Eitel 2019). In recent publications, the self-explanation effect has been integrated into cognitive load theory (e.g., Sweller et al. 2019).
In our research program on the Freiburg Self-Regulated-Learning-Journals Approach, we tested whether the worked example effect can be exploited for fostering the quality of students’ learning journals. More specifically, we tested whether students react to provided strategy prompts (see our previous sections on prompting) more adequately if we show them and have them self-explain examples of good responses to such prompts before journal writing. Furthermore, we also tested whether the potentially improved responses to prompts also led to better learning outcomes.
Hübner et al. (2010) had their participating high school students read a general introduction to journal writing. In particular, they were informed about the main objectives of journal writing and possible fields of application. In addition, cognitive and metacognitive prompts for journal writing were introduced. Afterwards, half of the students received an example of a well-written learning journal entry about a fictitious physics lesson to illustrate how to respond sensibly to the introduced cognitive and metacognitive prompts. Moreover, these students were requested to self-explain the example by assigning single passages in the learning journal to the corresponding cognitive or metacognitive prompts; the students received feedback on their assignments. Following this, the high school students entered a training phase. They first watched a videotaped lecture on a topic from social psychology (topic: social pressure) and then engaged in journal writing. A comprehension-oriented posttest assessed how much the students had learned about social pressure. In a transfer session 1 week later, the students again watched a social psychology lecture (topic: destructive obedience) and engaged in journal writing. In this transfer session, all students received just the prompts (without any further support such as examples). Finally, a comprehension-oriented posttest assessed how much the students had learned about destructive obedience. Hübner et al. (2010) obtained the following main findings: Self-explaining a learning journal example enhanced in particular elaborative learning strategies as expressed in the learning journals, both in the training and in the transfer session (strong effects). A similar effect was found for metacognitive strategies (strong effect in the training session); however, this effect narrowly failed to reach statistical significance in the transfer session.
Most importantly, self-explaining a learning journal example had a strong effect on the learning outcomes on destructive obedience in the transfer session. Overall, Hübner et al. (2010) showed that the worked example effect and the self-explanation effect could be exploited to enhance transfer in the sense that the students wrote better learning journals (in particular, more elaboration) on a new topic and also achieved better learning outcomes on this topic.
Roelle et al. (2012) tested whether the examples should be provided right from the beginning, as in Hübner et al., or only after some experience in journal writing. Although worked examples are usually introduced early (see Renkl 2011, 2014b), a delayed provision of worked examples might have advantages: Even without examples, the learners receive an introduction to journal writing (see the previous description of the study by Hübner et al. 2010), which is usually a new and complex learning method, and they have to apply this method to some new learning contents (e.g., social psychology topics or mathematics). Having additionally to self-explain examples on an additional topic (i.e., physics in the case of Hübner et al. 2010) and to relate these examples to the new topic might be very demanding and potentially lead to cognitive overload in some students (see, e.g., Sweller 2006). To test the effects of immediate versus delayed presentation of examples, Roelle et al. (2012) conducted a quasi-experimental field experiment in high school mathematics (5th grade). Journal writing was used as homework after each of four mathematics lessons. In one classroom, the examples were introduced before the first learning journal entry and withdrawn after the second learning journal entry had been written; in another classroom, the examples were provided as additional support before writing the third and fourth learning journal entries. Basically, it was found that providing examples early led to qualitatively better learning journal entries in the beginning (first two lessons) and to the acquisition of more conceptual knowledge from both the first two and the last two lessons (no effects on procedural knowledge). Overall, presenting examples for productive journal writing early is superior, which is in line with cognitive load theory and the theory of example-based learning by Renkl (2014b).
Graichen et al. (2019) tried to conceptually replicate the effects of examples on the quality of learning journals and learning outcomes in the context of teacher education. More specifically, the authors had their participating student teachers (geography) write learning journal entries in which they were to integrate information from three texts providing content knowledge (here: on geography), pedagogical content knowledge (here: on geography education), and pedagogical knowledge (see Shulman 1987).Footnote 3 For this purpose, the prompts used by Hübner et al. (2010) were modified so that they also encouraged the learners (except for those of the control group) to integrate the information from the different texts, that is, to use coherence-creating strategies. Overall, the authors found mixed results. Combining prompts with examples, as compared to providing just prompts, led to more high-quality coherence-creating strategies in learning journals. However, this example effect did not lead to more knowledge or better knowledge application, as compared to prompts alone. A tentative explanation might be a utilization deficiency (e.g., Miller 2000): The execution of the unfamiliar and demanding coherence-creating strategies imposed heavy cognitive load, so that the learners were partly distracted from the learning contents. Only after more training on these strategies would applying them impose less cognitive load, and only then would they provide the expected benefit.
Note that examples of journal entries were also used in other studies on the Freiburg Self-Regulated-Journal-Writing Approach to prepare the learners for their writing task (e.g., Glogger et al. 2009; Roelle et al. 2011; Roelle et al. 2017; see Table 2). However, these studies did not experimentally vary the provision of examples. Nevertheless, using worked examples has proven to be a sensible part of the introduction to journal writing, as it enables students to engage in productive journal writing.
Overall, the available evidence suggests that self-explaining worked examples is a sensible instructional procedure to prepare the learners for journal writing, in particular, as transfer effects were repeatedly found, which is usually hard to achieve (e.g., Barnett and Ceci 2002). Self-explaining worked examples probably optimizes cognitive load during journal writing in terms of reducing extraneous activities during writing and fostering germane writing activities.
Self-management Effects in Journal Writing
A recent effect within cognitive load theory is the self-management effect (e.g., Eitel et al. in press; Mizza et al. 2020). This effect refers to enabling learners who are confronted with learning materials that violate design principles from cognitive load theory to cope successfully with this sub-optimality, so that their learning is not hampered. For example, the learners are taught (a) to detect that split attention is evident in learning materials (i.e., text and picture as separate information sources) and (b) to then apply strategies to map the two information sources onto each other. Up to now, this effect has mainly been established for the split-attention sub-optimality (Mizza et al. 2020). Note that the students usually receive very detailed instruction on how to proceed (i.e., little self-regulation) so that they can compensate for split-attention effects (e.g., Roodenrys et al. 2012).
Eitel et al. (2020) recently extended the self-management effect by emphasizing learners’ self-regulation. These authors “just” provided parsimonious information about the specifics of the learning materials (e.g., what type of information is peripheral) and counted on learners’ self-regulation to avoid potential extraneous load from sub-optimal design (i.e., no provision of detailed instruction). More specifically, before entering the learning environment, the learners were informed that the pictures (including a short accompanying text) about the consequences of lightning (i.e., seductive details) were not relevant for the present learning goal of understanding the development of lightning. Indeed, this information enabled the learners to self-manage their later learning in that they largely ignored the seductive details, so that these details did not hamper learning outcomes; the learners without this information, in contrast, were hampered in their learning by the seductive details (Eitel et al. 2020).
The positive effects of two instructional support procedures that we have found in our research program on the Freiburg Self-Regulated-Learning-Journals Approach can be re-interpreted as self-management effects, primarily in the sense of Eitel et al. (2020). First, the worked example effects that were discussed in the previous section can be regarded as a self-management effect. Such a claim may be puzzling at first glance because, in classical cognitive load research, these two effects are not related: Worked examples help learning on the content or object level (e.g., mathematics); providing information for self-management helps on the meta-level to best exploit the present learning opportunity. Note, however, that in our journal writing approach, the worked examples do not teach knowledge on the object level (e.g., physics or social psychology; see Hübner et al. 2010) but on the meta-level, that is, knowledge about how to best exploit the learning opportunity of journal writing (for an extensive discussion of different levels in worked examples, see Renkl et al. 2009). More specifically, we tried to teach the (potentially) transferable skill of self-managing journal writing by our worked examples. Indeed, we found such transfer effects in Hübner et al. (2010: enhanced learning outcomes with respect to destructive obedience).
Second, Hübner et al. (2010) not only provided worked examples as a support procedure for journal writing but also presented to half of their learners a type of informed training (Paris et al. 1983). More specifically, half of the learners received a short presentation (10 min) about the utility of the strategies to be elicited by our prompts; in particular, declarative knowledge about learning strategies and corresponding conditional knowledge (i.e., when and how to use these strategies) was taught. As expected, the metacognitive knowledge about the strategies (targeted by our prompts) that was provided by the informed training fostered learning on the object level in the training phase (topic: social pressure; strong effect) as well as in the transfer phase (topic: destructive obedience; medium effect). Again, these findings can be interpreted as a self-management effect: The informed training taught the learners on the meta-level how to effectively exploit prompted journal writing so that transfer effects to learning on the object level could be achieved.
Relating our findings to the EMR framework (research question 3), worked examples and informed training enabled the learners to effectively regulate their strategy use in journal writing on the meta-level: Tentatively, the supported learners minimized writing that was not connected to applying learning strategies, thereby reducing extraneous load. In addition, an increased focus on strategy-related writing fostered germane cognitive load.
Adapting and Fading Guidance in Line with Learners’ Expertise in Self-Regulated Learning Strategies
Inspired by the guidance-fading effect (e.g., Sweller et al. 2011b) and the expertise reversal effect (e.g., Kalyuga et al. 2003), we also delved into the role of adapting the instructional support measures that are designed to foster cognitive and metacognitive strategies of self-regulated learning to the learners’ expertise. To date, this research has yielded three main insights.
Fading Guidance Adaptively in Line with Learners’ Growing Strategic Expertise
The first main insight is that in prompting germane processing, learners’ learning strategy expertise needs to be considered. In a field experiment with university students, Nückles et al. (2010; Exp. 1) found that the benefits of prompting cognitive and metacognitive strategies, concerning both the elicitation of the respective strategies and learning outcomes, diminished when the prompts were provided over the course of 12 weeks. More specifically, similar to the abovementioned studies that involved only one or two journal entries (e.g., Berthold et al. 2007; Nückles et al. 2009), the authors found beneficial prompt effects in an initial phase of journal writing (i.e., in the first 6 weeks). By contrast, in a later phase of journal writing (i.e., the following 6 weeks), the provision of prompts led to a significant decrease in the use of cognitive and metacognitive learning strategies, and the superiority of the prompted group over the not-prompted group in terms of learning outcomes disappeared completely. One explanation for this pattern of results is that over time learners internalized the guidance provided by the prompts, which caused the prompts to change from essential guidance that helps learners to engage in germane processing to redundant guidance that requires reconciliation with internal guidance and thus mainly induces extraneous cognitive load. This explanation is supported by the second experiment of Nückles et al. (2010). In this experiment, the authors tested a procedure in which the prompts were faded once a learner had applied the respective prompted strategies with high quality in two previous journal entries. The results showed that this adaptive fading procedure fostered cognitive learning strategies and learning outcomes (but not metacognitive strategies) in subsequent journal entries, in comparison to permanent prompting. Hence, prompts should be faded out past a certain point in a manner adapted to learners’ learning strategy expertise.
Adapting Guidance to Learners’ Level of Strategic Expertise
Further evidence that the extent to which prompts contribute to germane processing in journal writing can be optimized by adapting the prompts to learners’ learning strategy expertise stems from an experiment of Schwonke et al. (2006). In this experiment, learners first wrote a learning journal entry without prompt support and subsequently were instructed to revise their drafts. For revising, the learners received either prompts that were adapted to them on the basis of their responses to a learning strategy questionnaire or a meta-strategic knowledge test; two comparison groups received either a random set of (cognitive or metacognitive) prompts or no prompts at all. With respect to both elaboration and metacognitive strategies, the adaptive prompts (the two adaptation procedures yielded similar effects) proved to be more effective than the random prompts and no prompts, whereas the random prompts were partly even worse than no prompts. Similar effects were obtained with respect to learning outcomes. The adaptive prompting fostered understanding in comparison to both random and no prompts, whereas there was no significant benefit of random prompts over no prompts. In conjunction with the abovementioned findings concerning the benefits of the adaptive fading procedure, these results make a strong case for the notion that implementing the prompting germane processing principle needs to consider learners’ learning strategy expertise.
The studies by Nückles et al. (2010) and Schwonke et al. (2006) refer to the role of inter-individual differences in learning strategy expertise within a certain student population (i.e., within university students). There is also evidence that in prompting germane processing, differences between student populations that pertain to learners’ developmental level should be considered as well. In a quasi-experiment by Glogger et al. (2009), ninth-grade high school students received cognitive and metacognitive prompts (as well as worked examples) that were similar to the ones that had proven to be effective in university students. For these learners, the prompts scarcely elicited cognitive and metacognitive strategies. One explanation for this finding is that the guidance provided by the prompts was too low for ninth-grade high school students. This explanation is underpinned by a second finding of Glogger et al. (2009). When ninth-grade high school students received prompts that were enriched by specific suggestions concerning the implementation of cognitive and metacognitive strategies (e.g., the prompt “Try to build connections between what you have learned last week and what you already know” was enriched by the suggestion “For this purpose, write down how you could apply what you have learned this week at home in your spare time,” see also the prompts with superscript “d” in Table 1), the prompts were more effective concerning the elicitation of the cognitive learning strategies of organization and elaboration. Apparently, this level of guidance was better aligned with learners’ learning strategy expertise than the rather abstract prompts that are effective for university students. In terms of metacognitive strategies, however, no beneficial effects of the specific prompts were found. 
Possible explanations for the lack of effect concerning metacognitive strategies could be that metacognitive strategies are cognitively more demanding than cognitive ones (e.g., Bannert 2007) and/or that calling one’s understanding into question might be regarded as unpleasant by many learners.
Learner Expertise and Feedback
The third main insight concerning the adaptation of instructional support to learners’ learning strategy expertise relates to the provision of feedback. In a field study, Roelle et al. (2011) provided learners writing learning journals over the course of 14 weeks with elaborated feedback on the quality of their cognitive and metacognitive strategies. The authors found that the feedback fostered the quality of cognitive learning strategies for learners who performed relatively poorly in their journal entries before the feedback. By contrast, for learners who had already shown high-quality cognitive strategies in their learning journals before the feedback, the feedback detrimentally affected the quality of cognitive strategies in subsequent journal entries (in terms of metacognitive strategies, the feedback did not entail any significant effects). This pattern of findings, which aligns with a full expertise reversal effect (e.g., Kalyuga et al. 2003), can be interpreted as follows. For the learners who merely executed low-quality cognitive learning strategies in their journal entries, the external guidance by the feedback compensated for the lack of learning strategy expertise and thus fostered cognitive learning strategy quality in subsequent entries. In contrast, for the learners who were already able to apply high-quality cognitive learning strategies, the external guidance was redundant and interfered with learners’ internal guidance thus contributing to extraneous rather than to germane cognitive load. Consequently, it detrimentally affected the quality of cognitive learning strategies in subsequent entries. Although to date there is no evidence concerning potential benefits of adaptively faded feedback, these findings suggest that in optimizing cognitive load in self-regulated learning through journal writing, instructors should consider learners’ learning strategy expertise when providing feedback.
Relatedness of Cognitive and Motivational Processes in Journal Writing
Our research program on self-regulated learning journals has also yielded insights into the relatedness of cognitive and motivational processes, whereby both beneficial and detrimental effects have been found. A first main insight of the studies that dealt with this issue is that the prompting germane processing principle entails motivational costs. Specifically, in two experiments with university students, Nückles et al. (2010) found that both permanent prompting of cognitive and metacognitive strategies over the course of 12 weeks and faded prompting of cognitive and metacognitive strategies (see the section on adapting and fading guidance in line with learners’ expertise above) yielded substantial decreases in students’ motivation to engage in journal writing over time.
One explanation for these motivational decreases could be that learners perceive the cognitive load that they have to invest in applying the prompted strategies during journal writing as a motivational cost (see Feldon et al. 2019). The concept of cost (i.e., the effort needed to complete a task and the other activities that one must give up in order to complete the task) has recently been considered a third important core component that determines learners’ motivation in expectancy-value theories, in addition to expectancy (i.e., the extent to which learners believe they can succeed in a task) and value (i.e., the extent to which the task is perceived as important; see, e.g., Barron and Hulleman 2015). The higher the costs, the lower learners’ motivation to engage in a task. Following this line of argumentation, regardless of whether learners receive instructional support that is well aligned with their expertise (i.e., faded prompting in Exp. 2 of Nückles et al.) or (at least in part) redundant (i.e., permanent prompting in Exp. 1 of Nückles et al.), it might be the case that learners perceive the effort they have to invest in the prompted germane cognitive and metacognitive strategies as relatively high and thus perceive responding to the prompts as cost-intensive. As a consequence, their motivation to engage in journal writing decreases over time.
There is also evidence, however, that even when the prompting germane processing principle is not implemented, learners’ motivation to engage in journal writing slightly decreases over time (Exp. 1 of Nückles et al. 2010). Hence, the journal writing itself likely entails some motivational costs as well. In the mentioned experiments by Nückles and colleagues, the learners might over time have perceived that the time and effort invested in journal writing had detrimental consequences for other academic tasks or for non-academic activities, which they had to give up or could pursue only with insufficient effort because of the journal writing. Future studies should delve into these motivational effects of journal writing more deeply and aim to differentiate between the perceived costs of the journal writing and the benefits of the prompting germane processing principle.
Despite the outlined detrimental effects concerning the motivation to engage in journal writing, it cannot be concluded that prompting germane processing entails detrimental motivational effects only. The detrimental effects regarding learners’ journal writing motivation are contrasted by beneficial effects regarding learners’ motivation to engage with the learning content on which they reflect in their journal entries. Specifically, Wäschle et al. (2015; Exp. 1) found that prompted journal writing performed as homework fostered seventh-grade high school students’ interest in the learning content (i.e., immunology) in comparison to three different homework tasks (i.e., answering teacher-provided questions, creating a concept map, writing a summary). This beneficial effect regarding content-related motivation, which occurred only after a delay of several weeks, was mediated via the benefits of prompted journal writing regarding comprehension. That is, because prompting germane processing fostered comprehension of the learning content, learners’ motivation to engage with the learning content in subsequent learning phases was fostered, too.
In an attempt to mitigate the detrimental effect of the prompting germane processing principle on journal writing motivation without decreasing the beneficial effect on content-related motivation, Schmidt et al. (2012) investigated the effects of prompting learners to reflect on the personal relevance of the learning content in addition to prompting the established cognitive and metacognitive learning strategies. The results of their experiment indicated that a personal utility prompt (“Why is the learning material personally relevant for you at present or future out of school?”) increased both students’ acceptance of journal writing and their content-related motivation in comparison to prompting only cognitive and metacognitive strategies. Moreover, the beneficial motivational effects were accompanied by increased learning outcomes (i.e., comprehension). Jointly, these findings suggest that the motivational costs of the prompting germane processing principle can be mitigated by prompting learners to reflect on the personal relevance of the respective learning content in their learning journals. Further support for the benefits of the personal utility prompt stems from Wäschle et al. (2015, Exp. 2). In a field study with tenth-grade high school students, the authors replicated the finding that the personal utility prompt increased content-related motivation as compared with providing only the established cognitive and metacognitive prompts. Although the increased motivation was not reflected in higher performance on a comprehension test in this study, it nevertheless fostered another facet of learning outcomes: those learners who received the personal utility prompt outperformed their counterparts on an argumentation task that required them to critically reflect on the learning content.
A final insight of the studies that dealt with the role of motivation in journal writing is that not only the type of prompts, but also the goal structure in which the journal writing is embedded matters. In an experiment with ninth-grade high school students, Moning and Roelle (2020) varied whether the journal writing task was embedded in a mastery goal structure or in a performance goal structure. The mastery goal-structured journal writing task emphasized students’ individual effort and progress. Specifically, the students were told that, by writing the learning journal, they should try to improve their comprehension of a text they had read before. Furthermore, in order to induce a self-referential feedback expectation, they were told that they would receive feedback on the improvement of their comprehension afterwards. In contrast, the performance goal-structured journal writing task emphasized high normative performance (i.e., in comparison with the performance of the other classmates). Accordingly, the students in that condition were told to demonstrate a better comprehension than their classmates of the text they had read before. Furthermore, in order to induce a normative feedback expectation, they were told that they would receive feedback concerning their own comprehension compared to the comprehension of their classmates.
The authors found that a mastery goal structure better fostered the quality of metacognitive processes (i.e., the specificity of monitoring, self-diagnosis, and regulation), learning efficiency, and learning outcomes than a performance goal structure. One explanation for this finding is that the students in the mastery goal structure group might have considered detailed comprehension monitoring as highly useful because journal writing can be used particularly to improve comprehension of initially poorly understood content. In contrast, as overtly reflecting on comprehension difficulties would have conflicted with demonstrating high competence, the students in the performance goal structure group might have considered detailed comprehension monitoring at least in part as a waste of time and effort. The outlined differences in comprehension monitoring potentially brought about the differences in learning efficiency (learning outcomes in relation to invested mental effort) and learning outcomes. High-quality comprehension monitoring might have fostered the degree to which learners deeply engaged with content they had not yet understood well, which, in comparison to engaging with already well-understood content, fostered both learning efficiency and outcomes in the mastery goal structure group.
Summary and Directions for Future Research
In the present paper, we have introduced the self-regulation view in writing-to-learn as a promising theoretical perspective that draws on models of self-regulated learning (e.g., Zimmerman 2008) and cognitive load theory (e.g., Sweller et al. 2011a, 2011b, 2019). Accordingly, we argued that writing has the potential to act as a powerful scaffold for self-regulated learning due to the cognitive offloading written text generally offers as an external representation and memory aid, and due to the offloading that specifically results from the genre-free principle in journal writing. However, to enable learners to optimally exploit this learning opportunity, the journal writing needs to be instructionally supported. Accordingly, in our research program we developed and tested instructional support methods for self-regulated journal writing. Notwithstanding the offloading nature of the journal writing (especially because of the genre-free principle), our support methods were of course intended to increase cognitive load by inducing higher levels of germane cognitive load. However, if the support is not well aligned with the learners’ strategic expertise, it can also induce extraneous load (see expertise reversal effect).
The most important support procedure is prompting cognitive and metacognitive strategies. Learning was best when prompts activated all major sub-processes of self-regulated learning (i.e., organization, elaboration, monitoring of comprehension, and planning of remedial cognitive strategies) and when metacognitive prompts preceded cognitive prompts. Other effective support procedures were informed prompting, (self-explaining) worked examples, and feedback. Informed prompting and worked examples also fostered the skill to self-manage effective journal writing on new topics. Evidence for the relevance of the expertise reversal effect was shown in our research program for (the way of) prompting (Nückles et al. 2010) and for feedback (Roelle et al. 2011). Hence, these support procedures are best used in an adaptive way (e.g., fading support with growing learner expertise). Our findings also suggest that the effort that students have to invest for the journal writing to be effective has motivational costs. These motivational costs can be buffered with prompts for reflecting on the personal relevance of the learning contents. Finally, emphasizing mastery (vs. performance) goals has been shown to benefit learning.
Relating the Results of the Freiburg Research Program on Journal Writing to Other Research on Writing-to-Learn
Experimental research on journal writing outside the Freiburg Self-Regulated-Journal-Writing Approach is scarce and has mainly been published in the form of case studies presenting anecdotal evidence from applications of journal writing in higher education (e.g., Burke and Dunn 2006; Creme 2005) and as part of classroom instruction in secondary education (Carson and Longhini 2002; Swafford and Bryan 2000). In one of the few published quasi-experimental studies, Cantrell et al. (2000) contrasted prompted journal writing with summary writing to support reading comprehension. In the journal writing condition, students were prompted to ask themselves what they already knew about the learning content, what they would like to know about the topic, and what they had learned from their reading. Thus, Cantrell et al.’s (2000) prompts were similar to the elaboration and organization prompts of the Freiburg approach to journal writing inasmuch as they asked students to activate and relate their prior knowledge to the to-be-learned information (i.e., elaboration) and to determine for themselves the main points they had learned from the reading (i.e., organization). Accordingly, Cantrell et al. found that the prompted journal writing benefitted students’ learning significantly more than the summary writing. In a related experimental study, McCrindle and Christensen (1995) contrasted journal writing with writing a conventional scientific report in a first-year biology course at university. The researchers found that students in the journal writing condition articulated more cognitive and metacognitive learning strategies on a learning strategy assessment task, acquired more complex and integrated knowledge, and performed significantly better on the final exam as compared with the students in the scientific report condition.
Together, the scarce available (quasi-) experimental evidence outside the Freiburg Self-Regulated-Journal-Writing Approach confirms the main insights from the Freiburg research program inasmuch as using the journal writing to encourage the application of cognitive and metacognitive strategies clearly improved students’ learning gains.
In a recent overview on different approaches to writing-to-learn, Klein et al. (2019) identified several other approaches besides journal writing, with summary or discourse-synthesis writing and argumentative writing as the most important ones. With regard to summary/synthesis writing and argumentative writing, the available empirical evidence is quite mixed (see Klein 1999, and Klein et al. 2019, for summaries) and difficult to compare to the Freiburg research program on journal writing. A major difference from the Freiburg approach to journal writing is that researchers investigating summary/synthesis or argumentative writing typically implemented and evaluated complex and time-intensive writing training programs including phases of modeling and writing exercises, often lasting over several weeks (e.g., Gelati et al. 2014; Martínez et al. 2015). In these training programs, the goal typically was to teach students how to write a good summary or synthesis. Accordingly, the main focus in those studies was on text quality and the acquisition of the rhetorical genre in question (i.e., what makes a good summary or a good argument?), whereas learning goals such as deep comprehension and long-term retention of subject matter were rather secondary (see, e.g., Klein et al. 2017; Nussbaum and Schraw 2007). In contrast to those approaches, acquisition of a particular text genre is not the goal of the Freiburg Self-Regulated-Journal-Writing Approach. Accordingly, by implementing instructional procedures such as strategy prompts and worked examples of journal entries, we sought to keep the rhetorical demands of journal writing as low as possible in order to optimize germane processing and learning outcomes in terms of deep comprehension and long-term retention of subject matter.
The Role of Learner Prerequisites for the Benefits of Journal Writing
Our extant research clearly shows that, on the whole, instructionally supported journal writing benefits a wide range of students, including young secondary school students aged between 11 and 14 years as well as older university students aged between 20 and 25 years (see Table 2 for an overview). Also, the tested support methods such as the provision of prompts or worked examples proved to be similarly effective both for younger secondary students (e.g., Roelle et al. 2012; Wäschle et al. 2015) and for older secondary or university students (e.g., Hübner et al. 2010; Nückles et al. 2009).
On the other hand, our studies also showed that, in order to optimize cognitive load in journal writing, instructional support methods such as the provision of prompts or feedback should be adapted to the learners’ individual strategic expertise (see the previous subsection on adapting and fading guidance in line with learners’ expertise). At the same time, strategic expertise is also linked to age (Klein et al. 2019). Thus, the younger the students, the more likely it is that they are unfamiliar with certain learning strategies (Zimmerman and Martinez-Pons 1990). Therefore, in order to benefit from journal writing, younger students may need more instructional support, for instance, through worked examples (Roelle et al. 2012) and concrete suggestions of how to execute cognitive and metacognitive strategies (Glogger et al. 2009; Klein et al. 2019). Besides learning strategy expertise, however, there are also other potentially relevant learner prerequisites, such as students’ writing skills or their prior knowledge about the learning content, which we have hitherto not focused on.
With regard to writing skills, we mainly included students in our studies at an age level where they could be assumed to have sufficient mastery of transcription skills (spelling and handwriting ability) such that the execution of these skills no longer consumes substantial working memory resources. Nevertheless, there are case studies in primary school mathematics education suggesting that beginning writers may also benefit from trying to articulate their mathematical reasoning in writing a learning journal (Gallin and Ruf 1998). Furthermore, in a recent unpublished study (Nückles 2019), young secondary school students with exceptionally low writing skills improved their learning outcomes through prompted journal writing if they received formative teacher feedback on the quality of their enacted learning strategies in addition to cognitive and metacognitive prompts. Thus, it could be fruitful for future research to investigate more systematically to what extent the benefits of journal writing are dependent on or rather independent of students’ mastery of the mechanics of writing.
Besides writing skills as a relevant learner prerequisite, delving into the role of prior knowledge about the learning content could also be fruitful, for it is reasonable to assume that such prior knowledge matters for the benefits of self-regulated journal writing. For instance, recent findings by Roelle and Nückles (2019) suggest that learners who have not yet formed a coherent and well-integrated mental representation of the learning content benefit in particular from organization and elaboration activities. On this basis, it could be assumed that the benefits of journal writing and of the support measures to enhance the object-level processes of organization and elaboration (e.g., the outlined prompting procedures) are particularly high for learners with low topic-related prior knowledge. However, there might also be some type of tipping point concerning prior knowledge at which it is no longer possible for learners to meaningfully engage in organization and elaboration because their prior knowledge is too low.
Automated Coding of Learning Journals
When we analyzed the application of learning strategies during journal writing in our previous studies (e.g., Berthold et al. 2007; Glogger et al. 2012), we used a “manual” coding procedure (see Appendix Table 3). Although we obtained good inter-rater reliabilities and successfully predicted learning outcomes by our strategy measures, which indicates validity, this procedure takes a lot of time; it is not economical. In particular, if adaptive fading of prompts (see Nückles et al. 2010) is to be used in regular teaching practice, it is necessary to develop more parsimonious coding procedures to quickly obtain the data necessary for adaptive decisions. A promising approach might be to use natural language processing techniques. Techniques for coding complex student-generated texts (automated essay scoring) have already been developed (e.g., Seifried et al. 2012; Burstein et al. 2013). A promising avenue of further research is to adapt such techniques for the automated coding of learning journals.
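To make the idea of automated coding concrete, the following minimal Python sketch tags the sentences of a journal entry with strategy categories via hand-crafted cue phrases. The categories only loosely echo the strategies discussed in this article, and the cue phrases, function name, and example entry are hypothetical illustrations; a realistic system would be trained on manually coded journal entries (e.g., via supervised text classification) rather than rely on fixed rules.

```python
import re

# Hypothetical cue phrases per strategy category (illustrative only;
# a trained classifier would replace this hand-crafted rule set).
CUE_PHRASES = {
    "organization": ["in summary", "the main points", "the key idea"],
    "elaboration": ["for example", "this reminds me of", "in my own experience"],
    "monitoring": ["i do not understand", "i am unsure", "it is unclear to me"],
}

def code_journal_entry(text: str) -> dict:
    """Count sentences containing at least one cue phrase per category."""
    sentences = [s.strip().lower() for s in re.split(r"[.!?]", text) if s.strip()]
    counts = {category: 0 for category in CUE_PHRASES}
    for sentence in sentences:
        for category, cues in CUE_PHRASES.items():
            if any(cue in sentence for cue in cues):
                counts[category] += 1
    return counts

entry = ("In summary, antibodies bind antigens. "
         "For example, this reminds me of a lock and key. "
         "I am unsure how memory cells work.")
print(code_journal_entry(entry))
# → {'organization': 1, 'elaboration': 1, 'monitoring': 1}
```

Such per-category counts could, in principle, feed adaptive decisions such as fading prompts for strategies a learner already enacts spontaneously.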
Relating the Results of the Freiburg Research Program on Journal Writing to the Effort Monitoring and Regulation Framework
With regard to the EMR framework proposed in the Editorial of this special issue (see also Fig. 2), the research program on the Freiburg approach to journal writing addresses in particular research question 3 (How do we optimize cognitive load on self-regulated learning tasks?) and partly also research question 2 (How do students regulate mental effort?). Concerning the question of how to optimize germane cognitive load in self-regulated learning, we found that prompting all essential sub-processes involved in self-regulated journal writing (see Fig. 3) resulted in the largest learning gains both in terms of deep comprehension and retention of subject matter (Nückles et al. 2009). Concerning the question of how to best sequence these learning processes, we have evidence that engaging learners in metacognitive monitoring and regulation prior to organization and elaboration was more beneficial to learning than vice versa (Roelle et al. 2017).
The latter result suggests that prompting students to monitor their current understanding of subject matter by journal writing apparently makes them aware of gaps in that understanding and thus effectively prepares them for the subsequent working-through of the learning content via organization and elaboration processes. This interpretation resembles, in some respects, recent assumptions about the mechanisms underlying productive failure (see Loibl et al. 2017). In sharp contrast to productive failure, however, in our approach to journal writing, students do not engage in problem solving (see the genre-free principle); rather, they are invited to use the journal writing as an opportunity to reflect on their ideas about subject matter in order to become aware of what they already know and what they do not know or find difficult to comprehend.
Journal Writing and Monitoring Accuracy
Relating the findings by Roelle et al. (2017) on sequencing to the research suggested on the basis of the EMR framework (see de Bruin et al. in press), it would be interesting to investigate whether learners’ metacognitive monitoring accuracy can be improved by prompting the metacognitive strategies in journal writing. To date, judgments of learning have scarcely been assessed in journal writing research. Exploring this possibility could be promising because several effective methods to enhance monitoring accuracy, such as asking learners to generate keywords (Thiede et al. 2003) or to complete a diagram (van de Pol et al. 2019, this issue; Prinz et al. 2020), similarly engage learners in writing down their responses to the respective prompts. However, unlike in these paradigms, in journal writing, learners are not required to react to demands that are related to specific parts of the learning content, as in the studies by Thiede et al. and van de Pol et al. (2019). On the contrary, the generic prompts for cognitive and metacognitive strategies used in the Freiburg research program leave it up to the learners to decide which aspects of the learning content to focus on in order to monitor their comprehension. Thus, it is a question for future research whether prompted journal writing, which gives the learner ample freedom to determine the focus of the comprehension monitoring, will prove to be as effective as the more directive methods applied in current research on monitoring accuracy (e.g., Thiede et al. 2003; van de Pol et al. 2019; see also Waldeyer and Roelle 2020). Possibly, the answer to this question differs for students with different prior knowledge levels. Furthermore, it could be fruitful to integrate the mentioned established means of enhancing judgment accuracy (e.g., the keyword method) into journal writing.
How to Encourage Effort into Monitoring and Regulation Sustainably
The available evidence suggests that the use of metacognitive learning strategies in journal writing can successfully be prompted and also contributes to germane cognitive load, as indicated by enhanced learning outcomes (Glogger et al. 2012; Nückles et al. 2009). Nevertheless, prompting metacognition in the long run (i.e., over the course of a whole term) proved to be relatively ineffective regardless of whether the prompts were adapted to the learners’ individual strategic expertise or not (Nückles et al. 2010). It is possible that learners perceived the cognitive load that they had to invest in the prompted strategies as a substantial motivational cost (Feldon et al. 2019), given that learners are generally inclined to keep their invested mental effort as minimal as possible (Shenhav et al. 2017). This inclination might apply in particular to metacognitive strategies, which require learners to call their current understanding into question. Accordingly, assuming a self-critical stance towards one’s current understanding over a longer period of time (over a whole term in the journal writing studies of Nückles et al. 2010) is likely regarded as unpleasant by many learners. Hence, the question of how to stimulate metacognitive reflection sustainably probably touches upon a fundamental constraint of human cognition. This question might not be answerable from the perspective of the individual learner but rather from a perspective that views learning and metacognition as a collective endeavor. For example, most authors would probably refrain from thoroughly revising their “already well written” articles if they were not forced to do so by reviewers and editors. Taking a critical and reflective stance towards one’s submission is initially perceived by many authors as a considerable motivational cost.
Hence, fostering metacognitive processes in journal writing sustainably might be rather achieved in a social, collaborative learning environment where the tasks of generating ideas by journal writing and taking a metacognitive critical stance towards those ideas may be socially distributed among learning partners.
The approach of allocating specific cognitive and metacognitive processes to different roles learners may adopt during the learning process has successfully been established, for example, in reciprocal teaching (e.g., Palincsar and Brown 1984). With regard to journal writing, Nückles et al. (2005) conducted a small-scale field study on reciprocal commenting on each other’s learning journals within dyadic learning partnerships in the context of a blended learning university course. The authors found that the feedback provided by one learning partner strongly influenced the degree of elaboration and organization of the other partner’s learning journal. Glogger-Frey et al. (2019) found that students who received higher-quality feedback from peers, as compared with students who received lower-quality feedback, perceived their learning outcomes as higher and felt more confident about doing well in journal writing. Thus, future research could investigate whether the willingness to sustainably apply metacognitive strategies in journal writing could also be improved by such dialogical elements in journal writing.
Introducing Journal Writing as Intentional Learning
The results of the Nückles et al. studies (Nückles et al. 2010) further suggest that the journal writing itself likely entailed some motivational costs irrespective of whether prompts for cognitive and metacognitive strategies were provided or not. These costs can be assumed because journal writing can be regarded as a learning task that creates desirable difficulties (Bjork and Bjork 2011), whose benefit typically becomes transparent for learners not immediately but only at a later time. Accordingly, when implementing journal writing as follow-up course work in our university courses, students often commented on the journal writing in their evaluation sheets at the end of the term with sentences such as “All this journal writing was hard work, but looking back now, I can see that I have learned a lot.” Hence, it is an important question for future research to investigate how students can be encouraged to maintain their invested level of mental effort in the journal writing over a longer period of time (see research question 2 of the Editorial: How do students regulate their mental effort, de Bruin et al. in press). In our previous studies, the journal writing was typically imposed on the students as follow-up course work. Therefore, they might not have sufficiently recognized the benefit of the journal writing because they considered it schoolwork that had to be done and that had little personal value. In particular, because the writing took place as an obligatory after-school assignment, it is an open question to what extent students were able, under these conditions, to conceive of the journal writing as an opportunity to freely develop their ideas about the subject matter. Accordingly, a challenge for future research is how to introduce the journal writing to students such that they will be able to perceive it as a valuable opportunity for intentional learning (see Bereiter and Scardamalia 1989).
One promising starting point for this future avenue in journal writing research could be the outlined effects of the personal utility prompts. Long-term effects of these prompts as well as potential differences between prompts that relate to the utility of the learning content and prompts that refer to the utility of the journal writing itself still need to be addressed. Nevertheless, the consistent beneficial effects on learners’ motivation already suggest that such prompts entail high potential for convincing learners of the usefulness of self-regulated journal writing.
Journal Writing as Preparation for Future Problem Solving
In this paper, we have argued that the potential of journal writing to foster self-regulated learning can especially be attributed to the genre-free principle, according to which the learner is freed from the burden of investing mental effort in the instantiation of rhetorical schemata. Hence, learners are explicitly encouraged not to conceive of journal writing as problem solving but rather as an opportunity to freely develop ideas about subject matter and to examine one’s current understanding for gaps and inconsistencies. Thus, the self-regulation view on journal writing parallels Sweller’s goal-free effect in this respect (see Sweller et al. 2011a). Due to its goal-free nature, journal writing might be particularly suited to prepare learners for future problem solving. We are currently investigating this possibility in a project on teacher education where history teacher students write learning journal entries to develop ideas for later lesson planning (see Nückles and Schuba 2019). Based on three texts providing content knowledge (here: on history), pedagogical content knowledge (here: on history education), and pedagogical knowledge (see also Graichen et al. 2019; Wäschle et al. 2015b), they independently identify and develop didactic goals for teaching history. After the journal writing, the students work out a formal plan for a history lesson. The results of this study so far suggest that the journal writing helps the students to define appropriate didactic goals and to adopt these goals for their lesson planning (“For planning my lesson, it will be important to consider students’ prior knowledge about the Second World War …”). Accordingly, the number of articulated and personally valued teaching goals in the journal writing mediated the quality of the formal lesson plans which the students produced afterwards as a transfer and application task (Nückles and Schuba 2019).
Thus, using journal writing to prepare lesson planning can be regarded as another fruitful instantiation of the goal-free principle (see Sweller et al. 2011a). Defining goals for later problem solving is basically planning, and planning is viewed (besides monitoring and evaluation) as an essential component of people’s metacognitive competence to regulate cognition (Nückles et al. 2009; Schraw 1998). Given the preliminary character of the results of Nückles and Schuba (2019), future studies should examine more broadly how journal writing can be used as an opportunity to facilitate students’ self-regulation in solving core teaching problems such as lesson planning.
Journal Writing—a Circumscribed Intervention with Wide-Ranging Effects
A final direction for further research is derived from an observation made in our high school studies on journal writing. In the studies by Glogger et al. (2009, 2012) and Wäschle et al. (2015a), it turned out that both the 9th graders of the Glogger et al. studies and the 7th graders of Wäschle et al. wrote rather short weekly journal entries ranging roughly between 100 and 350 words. Despite their brevity, however, the journal entries nevertheless strongly contributed to learning outcomes as measured by learning outcome tests that covered all central aspects of the topic (e.g., immunology in the study by Wäschle et al. 2015a) taught in the weeks before. This pattern of results suggests that the relatively local writing intervention (encouraging application of cognitive and metacognitive strategies in a weekly journal entry) positively impacted students’ learning behavior on a more global level as well, improving students’ use of learning strategies in and outside the classroom beyond the learning journal. However, to date, we do not have any data that would provide direct evidence for this assumption. Accordingly, it would be interesting to conduct observational studies to investigate to what extent journal writing indeed generally influenced the students’ use of learning strategies in the wide-ranging way that can be speculated on the basis of the studies of Glogger et al. (2009, 2012) and Wäschle et al. (2015a).
In summary, there are numerous promising directions of how to develop the Freiburg Self-Regulated-Journal-Writing Approach further. Furthermore, it is necessary to conduct replication studies in order to consolidate the results found, for example, for the positive motivational effects of journal writing (i.e., raised interest in the subject matter, see Wäschle et al. 2015a) or for the positive effect of feedback on the quality of the enacted learning strategies in the learning journals (see Roelle et al. 2011) as well as on self-efficacy and perceived learning outcomes (Glogger-Frey et al. 2019). First steps towards such consolidation of the benefits of feedback are currently being undertaken by Pieper et al. (2019), who investigated the effects of prompts, expert feedback, and worked examples in journal writing during the practical semester of teacher students. A recent quasi-experimental study by Nückles (2019) further replicated the positive effects of elaborated expert feedback on the quality of the cognitive learning strategies enacted by 7th grade low-ability writers in their learning journals. Additionally, the quality of the cognitive strategies mediated the effect of the feedback on students’ learning outcomes.
Although we have discussed numerous aspects of our journal writing approach from the perspective of cognitive load theory, our extant studies have not yet systematically included measures of cognitive load. Accordingly, it would be interesting to test whether the inclusion of specific types of prompts (e.g., a metacognitive or personal utility prompt) leads to a raised perception of (germane) cognitive load in students.
To conclude, the Freiburg Self-Regulated-Journal-Writing Approach has yielded valuable insights into how writing can be instructionally supported to effectively scaffold self-regulated learning and optimize cognitive load. At the same time, these insights open up a considerable number of new avenues for further advancing the self-regulation view in writing-to-learn and for advancing research that integrates self-regulated learning with cognitive load theory.
In the present paper, we refer to the concept of germane load as introduced by Sweller et al. (1998). Germane load refers to the effort contributing to knowledge construction (e.g., by elaboration) and adds to the intrinsic load (determined by the complexity of the learning contents in relation to the learners’ prior knowledge) as well as to the extraneous (unproductive) load. Recently, another conception of germane load has been proposed (e.g., Sweller et al. 2019). However, we adhere to the 1998 conception for several reasons: (1) it guided large parts of the cognitive load research to which we refer in this article; (2) when considering cognitive load in our research program on journal writing, we had the 1998 conception in mind; and (3) we find it helpful in our context to clearly differentiate the separate contributions to the overall load that result from the complexity of the learning contents and from the learning strategies applied when studying these contents.
The mini meta-analysis is based on those experimental studies in which the combined application of cognitive and metacognitive strategies was prompted and contrasted with unsupported journal writing. These studies are published in Berthold et al. (2007), Nückles, Hübner, and Renkl (2009), Nückles et al. (2010), and Schwonke et al. (2006). Accordingly, the calculation of Hedges’s g was based on the data of 238 participants.
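For readers less familiar with the effect-size metric used here, the following Python sketch illustrates how Hedges’s g is conventionally computed from two groups’ summary statistics (Cohen’s d with a small-sample bias correction). The function and the example values are purely illustrative and are not taken from the data of the cited studies:

```python
import math

def hedges_g(m1, m2, sd1, sd2, n1, n2):
    """Hedges's g for two independent groups (bias-corrected Cohen's d)."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp  # Cohen's d
    # Small-sample correction factor J (Hedges, 1981)
    df = n1 + n2 - 2
    j = 1 - 3 / (4 * df - 1)
    return d * j

# Hypothetical group statistics (illustrative only):
# prompted condition (m1) vs. unsupported journal writing (m2)
g = hedges_g(m1=7.5, m2=6.0, sd1=2.0, sd2=2.0, n1=30, n2=30)
print(round(g, 3))  # prints 0.74
```

An aggregate effect across several studies, as in the mini meta-analysis above, would then be obtained by weighting each study’s g by the inverse of its variance.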
Bangert-Drowns, R. L., Hurley, M. M., & Wilkinson, B. (2004). The effects of school-based writing-to-learn interventions on academic achievement: a meta-analysis. Review of Educational Research, 74(1), 29–58. https://doi.org/10.3102/00346543074001029.
Bannert, M. (2007). Metakognition beim Lernen mit Hypermedia. Erfassung, Beschreibung und Vermittlung wirksamer metakognitiver Lernstrategien und Regulationsaktivitäten [Metacognition in learning with hypermedia. Assessment, description and instruction of effective metacognitive learning strategies and regulation activities]. Waxmann.
Bannert, M. (2009). Promoting self-regulated learning through prompts. Zeitschrift für Pädagogische Psychologie/German Journal of Educational Psychology, 23(2), 139–145. https://doi.org/10.1024/1010-0652.23.2.139.
Bannert, M., & Reimann, P. (2012). Supporting self-regulated hypermedia learning through prompts. Instructional Science, 40(1), 193–211. https://doi.org/10.1007/s11251-011-9167-4.
Barnett, S. M., & Ceci, S. J. (2002). When and where do we apply what we learn? A taxonomy for far transfer. Psychological Bulletin, 128(4), 612–637. https://doi.org/10.1037/0033-2909.128.4.612.
Barron, K. E., & Hulleman, C. S. (2015). The expectancy-value-cost model of motivation. In J. D. Wright (Ed.), International encyclopedia of social and behavioral sciences (pp. 503–509). Oxford: Elsevier. https://doi.org/10.1016/B978-0-08-097086-8.26099-6.
Bereiter, C., & Scardamalia, M. (1987). The psychology of written composition. Erlbaum.
Bereiter, C., & Scardamalia, M. (1989). Intentional learning as a goal of instruction. In L. B. Resnick (Ed.), Knowing, learning, and instruction: essays in honor of Robert Glaser (pp. 361–392). Lawrence Erlbaum Associates, Inc.
*Berthold, K., Nückles, M., & Renkl, A. (2007). Do learning protocols support learning strategies and outcomes? The role of cognitive and metacognitive prompts. Learning and Instruction, 17(5), 564–577. https://doi.org/10.1016/j.learninstruc.2007.09.007
Bjork, E. L., & Bjork, R. A. (2011). Making things hard on yourself, but in a good way: creating desirable difficulties to enhance learning. In M. A. Gernsbacher, R. W. Pew, L. M. Hough, & J. R. Pomerantz (Eds.), FABBS Foundation, Psychology and the real world: essays illustrating fundamental contributions to society (pp. 56–64). Worth Publishers.
Britton, J. (1980). Shaping at the point of utterance. In A. Freedman & I. Pringle (Eds.), Reinventing the rhetorical tradition (pp. 61–66). L & S Books for the Canadian Council of Teachers of English.
Britton, J., Burgess, T., Martin, N., McLeod, A., & Rosen, H. (1975). School councils research studies: the development of writing abilities. Macmillan.
Burke, J. P., & Dunn, S. (2006). Communicating science: exploring reflexive pedagogical approaches. Teaching in Higher Education, 11(2), 219–231. https://doi.org/10.1080/13562510500527743.
Burstein, J., Tetreault, J., & Madnani, N. (2013). The E-rater® Automated Essay Scoring System. In M. D. Shermis & J. Burstein (Eds.), Handbook of automated essay scoring: current applications and future directions (pp. 55–67). Routledge.
Cantrell, R. J., Fusaro, J. A., & Dougherty, E. A. (2000). Exploring the effectiveness of journal writing on learning social studies: a comparative study. Reading Psychology, 21(1), 1–11. https://doi.org/10.1080/027027100278310.
Carson, J. G., & Longhini, A. (2002). Focusing on learning styles and strategies: a diary study in an immersion setting. Language Learning, 52(2), 401–438. https://doi.org/10.1111/0023-8333.00188.
Chi, M. T. (1997). Quantifying qualitative analyses of verbal data: a practical guide. The Journal of the Learning Sciences, 6(3), 271–315. https://doi.org/10.1207/s15327809jls0603_1.
Chi, M. T., Bassok, M. H., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: how students study and use examples in learning to solve problems. Cognitive Science, 13(2), 145–182. https://doi.org/10.1016/0364-0213(89)90002-5.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Erlbaum.
Creme, P. (2005). Should student learning journals be assessed? Assessment & Evaluation in Higher Education, 30(3), 287–296. https://doi.org/10.1080/02602930500063850.
De Bruin, A. B. H., Roelle, J., & Baars, M. (in press). Synthesizing cognitive load and self-regulation theory: a theoretical framework and research agenda. Educational Psychology Review.
Eitel, A., Bender, L., & Renkl, A. (2020). Effects of informed use: a proposed extension of the self-management effect. In S. Tindall-Ford, S. Agostinho, & J. Sweller (Eds.), Advances in cognitive load theory: rethinking teaching (pp. 168–179). Routledge.
Eitel, A., Endres, T., & Renkl, A. (in press). Self-management as a bridge between cognitive load and self-regulated learning: The illustrative case of seductive details. Educational Psychology Review.
Emig, J. (1977). Writing as a mode of learning. College Composition and Communication, 28(2), 122–128. https://doi.org/10.2307/356095.
Evens, M., Elen, J., Larmuseau, C., & Depaepe, F. (2018). Promoting the development of teacher professional knowledge: integrating content and pedagogy in teacher education. Teaching and Teacher Education, 75, 244–258. https://doi.org/10.1016/j.tate.2018.07.001.
Feldon, D. F., Callan, G., Juth, S., & Jeong, S. (2019). Cognitive load as motivational cost. Educational Psychology Review, 31(2), 319–337. https://doi.org/10.1007/s10648-019-09464-6.
Flower, L. S., & Hayes, J. R. (1980). The dynamics of composing: making plans and juggling constraints. In L. W. Gregg & E. R. Steinberg (Eds.), Cognitive processes in writing (pp. 31–50). Erlbaum.
Galbraith, D. (1992). Conditions for discovery through writing. Instructional Science, 21(1-3), 45–72. https://doi.org/10.1007/BF00119655.
Galbraith, D. (2009). Writing as discovery. In BJEP Monograph Series II, Number 6: Teaching and learning writing (pp. 5–26). British Psychological Society.
Gallin, P., & Ruf, U. (1998). Sprache und Mathematik [Language and mathematics]. New York: Kallmeyer.
Gelati, C., Galvan, N., & Boscolo, P. (2014). Summary writing as a tool for improving the comprehension of expository texts: An intervention study in a primary school. In P. D. Klein, P. Boscolo, L. Kirkpatrick, & C. Gelati (Eds.), Writing as a learning activity (pp. 191–216). Brill.
*Glogger, I., Holzäpfel, L., Schwonke, R., Nückles, M., & Renkl, A. (2009). Activation of learning strategies in writing learning journals: the specificity of prompts matters. Zeitschrift für Pädagogische Psychologie/German Journal of Educational Psychology, 23(2), 95–104. https://doi.org/10.1024/1010-0652.23.2.95
*Glogger, I., Schwonke, R., Holzäpfel, L., Nückles, M., & Renkl, A. (2012). Learning strategies assessed by journal writing: prediction of learning outcomes by quantity, quality, and combinations of learning strategies. Journal of Educational Psychology, 104(2), 452–468. https://doi.org/10.1037/a0026683.
Glogger-Frey, I., Fleischer, C., Grüny, L., Kappich, J., & Renkl, A. (2015). Inventing a solution and studying a worked solution prepare differently for learning from direct instruction. Learning and Instruction, 39, 72–87. https://doi.org/10.1016/j.learninstruc.2015.05.001.
Glogger-Frey, I., Bürgermeister, A., & Saalbach, H. (2019). Supporting formative peer-feedback on learning-strategy use by a digital tool. Paper presented in the invited SIG11 symposium at the 18th Biennial Conference of the European Association for Research on Learning and Instruction (EARLI), Aachen.
Goh, J. X., Hall, J. A., & Rosenthal, R. (2016). Mini meta-analysis of your own studies: some arguments on why and a primer on how. Social and Personality Psychology Compass, 10(10), 535–549. https://doi.org/10.1111/spc3.12267.
Graham, S., & Harris, K. R. (2000). The role of self-regulation and transcription skills in writing and writing development. Educational Psychologist, 35(1), 3–12. https://doi.org/10.1207/S15326985EP3501_2.
*Graichen, M., Wegner, E., & Nückles, M. (2019). Wie können Lehramtsstudierende beim Lernen durch Schreiben von Lernprotokollen unterstützt werden, dass die Kohärenz und Anwendbarkeit des erworbenen Professionswissens verbessert wird? [How should the writing of learning protocols be supported in order to facilitate student teachers’ self-regulated construction of coherence and acquisition of applicable knowledge for teaching?] Unterrichtswissenschaft, 47, 7–28.
Hayes, J. R., & Flower, L. S. (1986). Writing research and the writer. American Psychologist, 41(10), 1106–1113. https://doi.org/10.1037/0003-066X.41.10.1106.
*Hübner, S., Nückles, M., & Renkl, A. (2010). Writing learning journals: instructional support to overcome learning-strategy deficits. Learning and Instruction, 20(1), 18–29. https://doi.org/10.1016/j.learninstruc.2008.12.001.
Kalyuga, S., Ayres, P., Chandler, P., & Sweller, J. (2003). The expertise reversal effect. Educational Psychologist, 38(1), 23–31. https://doi.org/10.1207/S15326985EP3801_4.
Kincaid, J. P., Fishburne, R. P., Jr., Rogers, R. L., & Chissom, B. S. (1975). Derivation of new readability formulas for navy enlisted personnel. Research Branch Report 8-75.
Klein, P. D. (1999). Reopening inquiry into cognitive processes in writing-to-learn. Educational Psychology Review, 11(3), 203–270. https://doi.org/10.1023/A:1021913217147.
Klein, P. D., Haug, K. N., & Arcon, N. (2017). The effects of rhetorical and content subgoals on writing and learning. The Journal of Experimental Education, 85(2), 291–308. https://doi.org/10.1080/00220973.2016.1143795.
Klein, P. D., Haug, K. N., & Bildfell, A. (2019). Writing to learn. In S. Graham, C. A. McArthur, & M. Hebert (Eds.), Best practices in writing instruction (3rd ed., pp. 162–184). The Guilford Press.
Lachner, A., Burkhart, C., & Nückles, M. (2017). Mind the gap! Automated concept map feedback supports students in writing cohesive explanations. Journal of Experimental Psychology: Applied, 23(1), 29–46. https://doi.org/10.1037/xap0000111.
Loibl, K., Roll, I., & Rummel, N. (2017). Towards a theory of when and how problem solving followed by instruction supports learning. Educational Psychology Review, 29(4), 693–715. https://doi.org/10.1007/s10648-016-9379-x.
Martínez, I., Mateos Sanz, M. D. M., Martín, E., & Rijlaarsdam, G. (2015). Learning history by composing synthesis texts: Effects of an instructional programme on learning, reading and writing processes, and text quality. Journal of Writing Research, 7, 275–302. https://doi.org/10.17239/jowr-2015.07.02.03.
Mayer, R. E. (2002). Teaching for meaningful learning. The promise of educational psychology (Vol. 2). Prentice Hall.
Mayer, R. E. (2009). Multimedia learning. Cambridge University Press.
McCrindle, A. R., & Christensen, C. A. (1995). The impact of learning journals on metacognitive and cognitive processes and learning performance. Learning and Instruction, 5(2), 167–185. https://doi.org/10.1016/0959-4752(95)00010-Z.
Miller, P. H. (2000). How best to utilize a deficiency. Child Development, 71(4), 1013–1017. https://doi.org/10.1111/1467-8624.00205.
Mirza, F., Agostinho, S., Tindall-Ford, S., Paas, F., & Chandler, P. (2020). Self-management of cognitive load: potential and challenges. In S. Tindall-Ford, S. Agostinho, & J. Sweller (Eds.), Advances in cognitive load theory: rethinking teaching (pp. 157–167). Routledge.
*Moning, J., & Roelle, J. (2020). Self-regulated learning by writing learning protocols: do goal structures matter? Manuscript submitted for publication.
Nelson, T., & Narens, L. (1994). Why investigate metacognition? In Metacognition: Knowing about knowing. The MIT Press.
Nückles, T. (2019). Effects of expert-feedback on journal writing in physics education: a field intervention study. Unpublished thesis in the teacher education program. Freiburg: University of Education.
Nückles, M., & Schuba, C. (2019). Teachers as Informed Pragmatists: Ein theoretisches Modell und empirische Befunde zur Förderung didaktischer Argumentationskompetenz von angehenden Lehrkräften [Teachers as informed pragmatists: a theoretical model and empirical results on how to support future teachers’ didactic argumentation skills]. In J. Kilian, T. Kleickmann, M. Köller, & I. Parchmann (Eds.), Profilbildung im Lehramtsstudium. Beiträge der Qualitätsoffensive Lehrerbildung zur individuellen Orientierung, curricularen Entwicklung und institutionellen Verankerung (pp. 134–143). Federal Ministry of Education and Research.
*Nückles, M., Schwonke, R., Berthold, K., & Renkl, A. (2004). The use of public learning diaries in blended learning. Journal of Educational Media, 29(1), 49–66. https://doi.org/10.1080/1358165042000186271.
Nückles, M., Renkl, A., & Fries, S. (2005). Wechselseitiges Kommentieren und Bewerten von Lernprotokollen in einem Blended Learning Arrangement [Reciprocally commenting and evaluating learning protocols in a blended learning environment]. Unterrichtswissenschaft, 33, 227–243.
*Nückles, M., Hübner, S., & Renkl, A. (2009). Enhancing self-regulated learning by writing learning protocols. Learning and Instruction, 19(3), 259–271. https://doi.org/10.1016/j.learninstruc.2008.05.002.
*Nückles, M., Hübner, S., Dümer, S., & Renkl, A. (2010). Expertise reversal effects in writing-to-learn. Instructional Science, 38(3), 237–258. https://doi.org/10.1007/s11251-009-9106-9.
Nussbaum, E. M., & Schraw, G. (2007). Promoting argument-counterargument integration in students’ writing. The Journal of Experimental Education, 76(1), 59–92. https://doi.org/10.3200/JEXE.76.1.59-92.
Ortony, A. (1993). Metaphor and thought. Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9781139173865.
Paas, F., & Kirschner, F. (2012). The goal-free effect. In N. M. Seel (Ed.), Encyclopedia of the sciences of learning (Vol. 2, pp. 1375–1377). Netherlands: Springer.
Palincsar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 2(2), 117–175. https://doi.org/10.1207/s1532690xci0102_1.
Paris, S. G., Lipson, M. Y., & Wixson, K. K. (1983). Becoming a strategic reader. Contemporary Educational Psychology, 8(3), 293–316. https://doi.org/10.1016/0361-476X(83)90018-8.
Phillips, L. M., & Norris, S. P. (2009). Bridging the gap between the language of science and the language of school science through the use of adapted primary literature. Research in Science Education, 39(3), 313–319. https://doi.org/10.1007/s11165-008-9111-z.
Pieper, M., Roelle, J., vom Hofe, R., Salle, A. & Berthold, K. (2019). Unterstützen Leitfragen und Expertenfeedback im Lerntagebuch das Reflektieren im Praxissemester? [Do prompts and expert feedback support teacher students in reflecting by journal writing on their teaching experiences during their practical training?] In B. Drechsel, B. Kracke, & J. Sparfeldt (Chair), Psychologische Perspektiven in der Qualitätsoffensive Lehrerbildung. Interaktives Forum auf der gemeinsamen Tagung der Fachgruppen Entwicklungspsychologie und Pädagogische Psychologie (PAEPSY), Leipzig.
Prinz, A., Golke, S., & Wittwer, J. (2020). To what extent do situation-model-approach interventions improve relative metacomprehension accuracy? Meta-analytic insights. Manuscript submitted for publication.
Reigeluth, C. M., & Stein, F. S. (1983). The elaboration theory of instruction. In C. M. Reigeluth (Ed.), Instructional design theories and models: an overview of their current status (pp. 335–382). Erlbaum.
Renkl, A. (2011). Instruction based on examples. In R. E. Mayer & P. A. Alexander (Eds.), Handbook of research on learning and instruction (pp. 272–295). Routledge.
Renkl, A. (2014a). The worked examples principle in multimedia learning. In R. E. Mayer (Ed.), Cambridge handbook of multimedia learning (2nd revised ed., pp. 391–412). Cambridge University Press.
Renkl, A. (2014b). Towards an instructionally-oriented theory of example-based learning. Cognitive Science, 38(1), 1–37. https://doi.org/10.1111/cogs.12086.
Renkl, A. (2017). Instruction based on examples. In R. E. Mayer & P. A. Alexander (Eds.), Handbook of research on learning and instruction (2nd ed., pp. 325–348). Routledge.
Renkl, A., & Eitel, A. (2019). Self-explaining: learning about principles and their application. In J. Dunlosky & K. Rawson (Eds.), Cambridge handbook of cognition and education (pp. 528–549). Cambridge University Press.
Renkl, A., Hilbert, T., & Schworm, S. (2009). Example-based learning in heuristic domains: a cognitive load theory account. Educational Psychology Review, 21(1), 67–78. https://doi.org/10.1007/s10648-008-9093-4.
Roelle, J., & Berthold, K. (2016). Effects of comparing contrasting cases and inventing on learning from subsequent instructional explanations. Instructional Science, 44(2), 147–176. https://doi.org/10.1007/s11251-016-9368-y.
Roelle, J., & Nückles, M. (2019). Generative learning versus retrieval practice in learning from text: The cohesion and elaboration of the text matters. Journal of Educational Psychology, 111, 1341–1361.
*Roelle, J., Berthold, K., & Fries, S. (2011). Effects of feedback on learning strategies in learning journals: learner-expertise matters. International Journal of Cyber Behavior, Psychology and Learning, 1(2), 16–30. https://doi.org/10.4018/ijcbpl.2011040102.
*Roelle, J., Krüger, S., Jansen, C., & Berthold, K. (2012). The use of solved example problems for fostering strategies of self-regulated learning in journal writing. Education Research International, 2012, 1–14. https://doi.org/10.1155/2012/751625
*Roelle, J., Nowitzki, C., & Berthold, K. (2017). Do cognitive and metacognitive processes set the stage for each other? Learning and Instruction, 50, 54–64. https://doi.org/10.1016/j.learninstruc.2016.11.009.
Rogers, T., & McClelland, J. (2004). Semantic cognition. MIT Press.
Roodenrys, K., Agostinho, S., Roodenrys, S., & Chandler, P. (2012). Managing one's own cognitive load when evidence of split attention is present. Applied Cognitive Psychology, 26(6), 878–886. https://doi.org/10.1002/acp.2889.
Scardamalia, M., & Bereiter, C. (1991). Literate expertise. In K. A. Ericsson & J. Smith (Eds.), Toward a general theory of expertise: prospects and limits (pp. 172–194). Cambridge University Press.
*Schmidt, K., Maier, J., & Nückles, M. (2012). Writing about the personal utility of learning contents in a learning journal improves learning motivation and comprehension. Education Research International, 2012, 1–10. https://doi.org/10.1155/2012/319463
Schraw, G. (1998). Promoting general metacognitive awareness. Instructional Science, 26(1/2), 113–125.
*Schwonke, R., Hauser, S., Nückles, M., & Renkl, A. (2006). Enhancing computer-supported writing of learning protocols by adaptive prompts. Computers in Human Behavior, 22(1), 77–92. https://doi.org/10.1016/j.chb.2005.01.002.
Seifried, E., Lenhard, W., Baier, H., & Spinath, B. (2012). On the reliability and validity of human and LSA-based evaluations of complex student-authored texts. Journal of Educational Computing Research, 47(1), 67–92. https://doi.org/10.2190/EC.47.1.d.
Shenhav, A., Musslick, S., Lieder, F., Kool, W., Griffiths, T. L., Cohen, J. D., & Botvinick, M. M. (2017). Toward a rational and mechanistic account of mental effort. Annual Review of Neuroscience, 40(1), 99–124. https://doi.org/10.1146/annurev-neuro-072116-031526.
Shulman, L. S. (1987). Knowledge and teaching: foundations of the new reform. Harvard Educational Review, 57, 1–22. https://doi.org/10.17763/haer.57.1.j463w79r56455411.
Swafford, J., & Bryan, J. K. (2000). Instructional strategies for promoting conceptual change: supporting middle school students. Reading & Writing Quarterly, 16(2), 139–161. https://doi.org/10.1080/105735600278006.
Sweller, J. (2006). The worked example effect and human cognition. Learning and Instruction, 16(2), 165–169. https://doi.org/10.1016/j.learninstruc.2006.02.005.
Sweller, J., & Cooper, G. A. (1985). The use of worked examples as a substitute for problem solving in learning algebra. Cognition and Instruction, 2(1), 59–89. https://doi.org/10.1207/s1532690xci0201_3.
Sweller, J., van Merriënboer, J. J. G., & Paas, F. G. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10(3), 251–296. https://doi.org/10.1023/a:1022193728205.
Sweller, J., Ayres, P., & Kalyuga, S. (2011a). The goal-free effect. In J. Sweller, P. Ayres, & S. Kalyuga (Eds.), Cognitive load theory (pp. 89–98). Springer.
Sweller, J., Ayres, P., & Kalyuga, S. (2011b). The guidance fading effect. In J. Sweller, P. Ayres, & S. Kalyuga (Eds.), Cognitive load theory (pp. 171–182). Springer. https://doi.org/10.1007/978-1-4419-8126-4_13.
Sweller, J., van Merriënboer, J. J. G., & Paas, F. G. (2019). Cognitive architecture and instructional design: 20 years later. Educational Psychology Review, 31(2), 261–292. https://doi.org/10.1007/s10648-019-09465-5.
Thiede, K. W., Anderson, M., & Therriault, D. (2003). Accuracy of metacognitive monitoring affects learning of texts. Journal of Educational Psychology, 95(1), 66–73. https://doi.org/10.1037/0022-0663.95.1.66.
Toulmin, S. E. (1958, 2003). The uses of argument. Cambridge: Cambridge University Press.
Traxler, M., & Gernsbacher, M. A. (Eds.). (2011). Handbook of psycholinguistics. Elsevier.
Tynjälä, P., Mason, L., & Lonka, K. (2012). Writing as a learning tool: integrating theory and practice (Vol. 7). Springer Science & Business Media.
van de Pol, J., de Bruin, A. B. H., van Loon, M. H. & van Gog, T. (2019). Students’ and teachers’ monitoring and regulation of students’ text comprehension: effects of metacomprehension cue availability. Contemporary Educational Psychology, 56, 236–249. https://doi.org/10.1016/j.cedpsych.2019.02.001.
van de Pol, J., van Loon, M. H., van Gog, T., Braumann, S., & de Bruin, A. B. H. (2020). Mapping and drawing to improve students’ and teachers’ monitoring and regulation of students' learning from text: Current findings and future directions. Manuscript submitted for publication.
Waldeyer, J., & Roelle, J. (2020). The keyword effect: a conceptual replication, effects on bias, and an optimization. Metacognition and Learning. Advance online publication. https://doi.org/10.1007/s11409-020-09235-7
*Wäschle, K., Gebhardt, A., Oberbusch, E. M., & Nückles, M. (2015a). Journal writing in science: effects on comprehension, interest, and critical reflection. Journal of Writing Research, 7, 41–64. https://doi.org/10.17239/jowr-2015.07.01.03.
Wäschle, K., Lehmann, T., Brauch, N., & Nückles, M. (2015b). Prompted journal writing supports preservice history teachers in drawing on multiple knowledge domains for designing learning tasks. Peabody Journal of Education, 90, 546–559. https://doi.org/10.1080/0161956X.2015.1068084
Weinstein, C. E., & Mayer, R. E. (1986). The teaching of learning strategies. In M. C. Wittrock (Ed.), Handbook of research on teaching (pp. 315–327). Macmillan.
Wilson, H. K., & Braaten, E. B. (2019). The Massachusetts General Hospital guide to learning disabilities: assessing learning needs of children and adolescents. Springer.
Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated learning. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Metacognition in educational theory and practice (pp. 277–304). Erlbaum.
Winter-Hölzl, A., Watermann, R., Wittwer, J., & Nückles, M. (2016). Warum schreiben Promovierende bessere abstracts als Studierende? Genrewissen schlägt Textverständnis und Forschungskompetenz [Why are PhD students able to write better abstracts than undergraduates? Genre knowledge beats text comprehension and research literacy]. Unterrichtswissenschaft, 44, 7–24.
Zheng, L. (2016). The effectiveness of self-regulated learning scaffolds on academic performance in computer-based learning environments: a meta-analysis. Asia Pacific Education Review, 17(2), 187–202. https://doi.org/10.1007/s12564-016-9426-9.
Zimmerman, B. J. (2008). Investigating self-regulation and motivation: historical background, methodological developments, and future prospects. American Educational Research Journal, 45(1), 166–183. https://doi.org/10.3102/0002831207312909.
Zimmerman, B. J., & Martinez-Pons, M. (1990). Student differences in self-regulated learning: relating grade, sex, and giftedness to self-efficacy and strategy use. Journal of Educational Psychology, 82(1), 51–59. https://doi.org/10.1037/0022-0663.82.1.51.
Zohar, A., & Peled, B. (2008). The effects of explicit teaching of meta-strategic knowledge on low- and high-achieving students. Learning and Instruction, 18(4), 337–353. https://doi.org/10.1016/j.learninstruc.2007.07.001.
We thank Alex Ulyet for proofreading the manuscript. Special thanks go to Kirsten Berthold whose experimental study on prompting cognitive and metacognitive strategies marked a milestone in our research program on journal writing.
Open Access funding provided by Projekt DEAL.
Conflict of Interest
The authors declare that they have no conflict of interest.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Excerpts are close to the German originals. The part from Glogger et al., Learning strategies assessed by journal writing: Prediction of learning outcomes by quantity, quality, and combinations of learning strategies, Journal of Educational Psychology, 104, 452–468, 25.04.2020, APA, is adapted with permission.
Nückles, M., Roelle, J., Glogger-Frey, I. et al. The Self-Regulation-View in Writing-to-Learn: Using Journal Writing to Optimize Cognitive Load in Self-Regulated Learning. Educ Psychol Rev 32, 1089–1126 (2020). https://doi.org/10.1007/s10648-020-09541-1
- Journal writing
- Self-regulated learning
- Cognitive and metacognitive learning strategies
- Cognitive load theory
- Worked examples