Research on instructional and learning design is ‘booming’ in Europe, although there has been a move from a focus on content and the way to present it in a formal educational context (i.e., instruction), to a focus on complex learning, learning environments including the workplace, and access to learner data available in these environments. We even see the term ‘learning experience design’ (Neelen and Kirschner 2020) to describe the field. Furthermore, there is an effort to empower teachers (and even students) as designers of learning (including environments and new pedagogies), and to support their reflection on their own practice as part of their professional development (Hansen and Wasson 2016; Luckin et al. 2016; Wasson et al. 2016). While instructional design is an often heard term in the United States and refers to “translating principles of learning and instruction into plans for instructional materials, activities, information resources, and evaluation” (Smith and Ragan 1999), Europe tends to lean more towards learning design as the key for providing efficient, effective, and enjoyable learning experiences. This is not a switch from an instructivist to a constructivist view nor from a teacher-centred to a student-centred paradigm. It is, rather, a different mind-set where the emphasis is on the goal (i.e., learning) rather than the approach (i.e., instruction). Designing learning opportunities in a technology enhanced world builds on theories of human learning and cognition, opportunities provided by technology, and principles of instructional design. New technology both expands and challenges some instructional design principles by opening up new opportunities for distance collaboration, intelligent tutoring and support, seamless and ubiquitous learning and assessment technologies, and tools for thinking and thought. 
In this article, the authors give an account of their own and other research related to instructional and learning design, highlight related European research, and point to future research directions.
Background and Challenges
Technology enhanced learning environments are designed social and information spaces where formal, non-formal and informal learners (Van Merriënboer et al. 2009) engage with learning materials, learning artefacts (e.g., learning tools / applications), co-learners, teachers/instructors, trainers, experts, etc. The advent of information and communication technologies for learning has moved the focus of learning design from just the learning materials and their sequencing (including the earliest print learning material that was mailed to students by distance or open learning institutions) or a learning artefact (e.g., a content management system), to the learning environment as a whole. The consequence is that design is no longer concerned just with content, or a single technological learning artefact, but with learning environments, be they on-line, in a classroom, somewhere in the real world such as in the workplace or at a museum, or blended situations. Furthermore, the role of different actors in these environments (e.g., peers and team members for collaborative learning), different types of interaction (e.g., via training or collaboration scripts), or learning behaviour (e.g., self-reflection, regulation of learning of oneself or others), can be the focus of the learning design. Thus, there is a move from traditional instructional design (ID) and authoring tools that support teachers/instructors in the design of instruction, to a wider understanding of design as learning design (LD) of the learning experience and learning environment. In our view there are three major challenges for the learning designer as professional.
One challenge is empowering teachers/instructors as designers. An expert restaurant chef makes use of all of the techniques (baking, frying, steaming, cryo-cooking, sous-vide), tools (pots, pans, knives, ovens, 3D-printers), and ingredients (foods, herbs, spices, condiments) that (s)he has at her/his disposal. This expert chef also has the requisite deep knowledge, skills, attitude, and experience to know what to use with what as well as how and when to use them to create delicious, nutritious, and beautiful meals. If that chef can do this and also can organise and manage her/his restaurant (including collaboration with others within the restaurant ranging from the sous-chef to the specialty chefs to the waiters and dish-washers) in a way that showcases all three including the collaboration with other team members, then (s)he will receive one, two, or even three Michelin® stars.
Analogous to this, an expert learning designer also makes use of all of the techniques (different pedagogies and approaches to instruction and learning, from lecture to computer supported collaborative learning to games/simulations, to on-the-job training, learning communities, apprenticeship, etc.), tools (books, whiteboards, computers, mobile devices, workshops, etc.), and ingredients (content domain, cues, traditional- and 360-degree feedback, learning objectives, etc.), see Fig. 1. (S)he also has the requisite deep content knowledge, pedagogical content knowledge, and technological pedagogical content knowledge, along with the skills, attitude, and experience to know what to use with what, as well as how and when to use them to create effective, efficient, and enjoyable learning experiences. If that designer can do this and also can organise and manage her/his environment (including the necessary collaboration with others, from content area specialists, ICT specialists, and graphic user interface designers to tech-staffers and administrative personnel) in a way that showcases all of this, then (s)he will be a one, two, or even three Michelin® star designer. From such a perspective, learning designers are the source of new learning designs. How do we best support learning designers in this endeavour?
This brings a second challenge. There is, on the one hand, a growing corpus of solid empirical research evidence on what works and where/why. Learning designers can make use of this information to make evidence-informed decisions as to what tools, techniques, and ingredients they can and should employ. On the other hand, there is also a growing corpus of plausible-sounding misinformation, hypes, fads, and even ‘fake’ news (e.g., learning styles, our ability to multitask, generic twenty-first century skills) permeating journals, conventional media, blogs, etcetera, which is polluting the learning design ecosystem. Note here the use of the term evidence-informed and not evidence-based. Neelen and Kirschner (2019) note that while evidence-based practice, which is often found in healthcare and medicine, makes use of the best available research evidence of whether and why a treatment works, the clinical expertise of the health care professional, and client preferences and values, evidence-informed practice is based on scientific research but is less direct. The learning sciences deal with “muddy real-life things…that can influence what we want to achieve and whether it’s achieved…[sometimes without] ‘straightforward measurement’” (p. 4).
This is a problem because many learning designers either don’t have access to the scientific literature, which often sits behind paywalls and is freely available only to academics, or, if they do have access, are often unable to understand and interpret it with respect to what they do as learning designers because they have not been trained as learning scientists. They have often not followed a strand of education that allows them to know and understand the theories employed, and they lack the methodological and statistical knowledge and skills to judge the value of the research (i.e., its reliability and validity) and then interpret how it can or should be implemented. Also, their day-to-day practice often does not afford them the time and space to do the necessary reading.
A final and related challenge regards the growing access to digital data about students and one’s own practice. Is the data being captured actually relevant to what the designer is doing, or is it the analysis of low-hanging fruit (e.g., with lots of proxies or data that is irrelevant to the actual learning process) and/or purely correlational data (here again we see the need for statistical and methodological knowledge) from which no causal relationships can be drawn? How can the relevant and necessary data be captured and presented to teachers (and learners) in such a way that they can understand it, reflect on it, and then integrate it into their design or learning choices? How can we empower teachers/instructors to collect, analyse, interpret, and use the results for their own professional development and for improving student learning?
Learning Design in the European Context
In the Netherlands, the tradition has been more of a focus on “learning” than on “instruction”, shifting from instructional design to learning design. Jeroen van Merriënboer and Paul Kirschner, for example, have expanded the concept of Four-Component Instructional Design (4C/ID; Van Merriënboer 1997) to one of complex learning (i.e., Ten Steps to Complex Learning; Kirschner and van Merriënboer 2009; Van Merriënboer and Kirschner 2018). This is a holistic approach to designing learning environments that breaks with the compartmentalisation and fragmentation of traditional design approaches. A basic assumption of the 4C/ID model is that learning design for complex learning consists of four basic components, namely (a) learning tasks, (b) supportive information, (c) procedural information, and (d) part-task practice (see Fig. 2). Learning tasks provide the backbone of the educational program; they provide learning from varied experiences and explicitly aim at the transfer of learning. The three other components are connected to this backbone (https://www.4cid.org/about-4cid). The model is typically used for developing substantial learning or training programs, ranging in length from several weeks to several years, or programs that entail a substantial part of a curriculum for the development of competencies or complex skills, both cognitive and psycho-motor (see Fig. 2).
In Scandinavia there is a culture and tradition of design perspectives (theoretical as well as political and cultural) and a focus on designs for learning, in particular interactive and collaborative ones. In an editorial in the Designs for Learning journal (http://www.designsforlearning.nu), the editors explain
Scandinavian education, likewise, has developed a broad, democratic approach to learning, with a focus on aspects such as collaboration, creativity, inclusion, and problem solving, as well as on learner participation and responsibility … This broad approach to design and learning involves an understanding for deep commitments to democracy and democratisation, discussions of values in design and imagined futures, and how conflict and contradictions are regarded as resources in design (Sørensen et al. 2016, p.23).
Scandinavian tradition also recognises a tight relationship between design and use, where one is always designing for a future use situation (Bannon and Bødker 1991). In such a perspective, design is rooted in a human activity framework where the origin for design is the future use activity or situation, which means that design needs to start from “the present praxis of future users” (Bannon and Bødker 1991, p. 242). This raises implications for the design, implementation, use, and evaluation of technology enhanced learning environments (Wasson 2007), whether a single artefact or an on-line, classroom, or blended learning space. In particular, this implies that when we look at a technology rich learning environment we need to look at activity from both a design and a use perspective, as illustrated in Fig. 3, and that these are tightly intertwined in the institutional, pedagogical, and technological aspects of a learning environment.
On the wider European scene, there is a focus on learning design (LD) (Mor and Mogilevsky 2013; Beetham and Sharpe 2013; Laurillard 2012; Persico and Pozzi 2015), where the aim is “the creative and deliberate act of devising new practices, plans of activity, resources and tools aimed at achieving particular educational aims in a given context” (Mor and Craft 2012, p. 89). Persico and Pozzi (2015) provide a comprehensive overview of the history of LD and are of the opinion that the main difference between ID and LD is that ID “mostly focuses on methodological support to make the design process more systematic” (p. 233) and LD is focussed on “the objective of making already produced designs easier to share and reuse ... LD is based on what are perceived to be the needs of today’s individual educators, rather than those of educational technologists engaged in the systematic design of big instructional programmes.” (p. 233).
In recent years there has also been a European focus on LD and its integration with teacher inquiry and learning analytics (Mor et al. 2015; Persico and Pozzi 2015; Wasson et al. 2016). In their editorial to a special issue of the British Journal of Educational Technology (BJET) on ‘Learning design, teacher inquiry into student learning, and learning analytics: A call for action’, Mor, Ferguson, and Wasson write that the “learning design approach advocates a shift from a focus on content to a focus on the learning experience, with an aim of guiding learners as they make a transition from an initial state of mind to a desired one” (Mor et al. 2015, p. 221). Such an approach has its roots in design as a reflective practice (Schön 1992), and the idea is that teachers/instructors are practitioners who are empowered as designers and researchers of learning. Teacher inquiry is an approach to professional development and capacity building in education in which teachers study their own and their peers’ practice (Cochran-Smith and Lytle 1999). In the European project NEXT-TELL (http://www.next-tell.eu/), researchers built on teacher inquiry approaches to investigate teacher inquiry into student learning (TISL), focussing on using student data to improve teacher practice and student learning (Avramides et al. 2014; Clark et al. 2011; Hansen and Wasson 2016; Luckin et al. 2016). Such a view on the use of student data opens the door nicely to the use of learning analytics (LA), which uses data about learners and their contexts to understand and optimize learning and the environments in which it takes place (Ferguson 2012). Persico and Pozzi (2015) discuss the contribution “learning analytics can make to transform LD from a craft, based on experience, intuition and tacit knowledge, into a mature research area, grounded on data concerning the learning process and hence supporting enquiry while teachers design, run and evaluate the learning process” (p. 230).
Our Own Research and European Highlights
How Does my Research Look at the Problem?
Our research related to design has been at different levels of granularity, albeit complementary. Paul and his colleagues have looked at how to foster effective, efficient, and enjoyable environments for complex learning, focusing on research relating to learning tasks, information problem solving, and flexible learning. Barbara and her colleagues have taken a wider perspective on the design of technology enhanced learning environments. From a wider European perspective we see efforts looking at the relationship between learning design, teacher inquiry, and learning analytics.
What Specific Solutions and Approaches Are There in our Own Research?
Paul’s work has, among other things, dealt with four-component instructional design (4C/ID) for complex learning, see Fig. 4. Nowadays, the 4C/ID model receives a lot of attention because it nicely fits current trends in education: (a) a focus on the development of complex skills or professional competencies, (b) an emphasis on the transfer of what is learned in school to new situations, including the workplace, and (c) the development of skills important for lifelong learning. It is an approach for complex learning; that is, learning aimed at integrative goals where knowledge, skills, and attitudes are developed simultaneously in order to acquire complex skills and professional competencies. Its basic assumption is that blueprints for complex learning can always be described by four basic components: learning tasks, supportive information, procedural information, and part-task practice.
Practical applications of the 4C/ID model can be found around the world, and books and articles on the model have been translated into several languages, including Chinese, Dutch, German, Korean, Portuguese, and Spanish. The majority of its practical applications are not well described in the international literature, but are instead described in local publications or not published at all. Here, we provide some recent examples to give an impression of its use.
The Amsterdam University of Applied Sciences in the Netherlands, for example, designs its Android app development courses using the 4C/ID model. In their description of the design of the course, Marcellis et al. (2018) use a blended design consisting of the four components. Traditional small practice items that focus on specific aspects of the task are replaced by whole learning tasks based on professional practice, grouped in task classes that increase in complexity. Frerejean et al. (2019a) note that because “the course is intended for a varied population of students from all over the world studying full-time or part-time, the designers chose a blended design where learning tasks, supportive information, and procedural information reside in an online environment. Classroom activities can be followed by students on-site, and include modelling examples, imitation tasks in small groups, and feedback sessions led by the teacher. Other learning tasks are presented in the online learning environment and are performed individually by students” (p. XX).
Another example is from a teacher training institute in the Netherlands that aims to address the lack of information problem solving (IPS) skills training in its curriculum. Usually, design thinking leads to implementing separate IPS training sessions, modules, or assignments focusing on developing what can be called ‘general IPS skills’. This may take place in a separate course or be provided by the university library. While this approach often shows short-term gains, teaching IPS skills out of the context of the domain-specific field where they will be used often isn’t successful in the long term. To alleviate this problem, the institute in question used the 4C/ID approach, embedding the acquisition of IPS skills as a task-centred IPS training in a semester-long course on language teaching for primary school teachers. The IPS skills training was designed according to a ‘learning blueprint’ which described a series of online learning tasks – complete with the necessary supportive and procedural information – aimed at the acquisition of IPS skills in the context of vocabulary development. This blueprint was subsequently woven into the original course. Van Merriënboer and Kirschner (2018) call this second-order scaffolding of learning. For a complete description of the course and its design, see Frerejean et al. (2019b).
In her own work and that of her research group, Wasson has drawn on a sociocultural perspective on learning that views activity as central to both design and analysis, and on the Scandinavian tradition of a tight relationship between design and use. An example is found in Wasson (2007), which describes the design of a learning scenario, VisArt, in which students at three institutions, with different backgrounds (e.g., teacher education, psychology, computer science), collaborated through a groupware learning environment (Teamwave workplace) to collaboratively design a learning artefact. Figure 5 shows the institutional, pedagogical, and technological design aspects that were taken into consideration.
More recent work has focussed on Teacher Inquiry into Student Learning (TISL) and the development of the TISL Heart, a theory-practice model and method of teacher inquiry into student learning, which has a particular emphasis on the use of student learning, activity, and assessment data for professional development and better student learning (Hansen and Wasson 2016). Motivated by calls for new teacher-training models based on the twenty-first-century teaching professional who designs learning environments that offer technology for better learning and who continuously learns from their teaching (Wastiau 2014), the TISL Heart (see Fig. 6) was the result of iterative development with teachers of a theory-practice model to support teacher inquiry into student learning, one that places student data at the centre of the inquiry process. The Kick-off begins when a teacher first identifies the issues related to student learning in which s/he is interested. Related to these issues are Assumptions and beliefs that flavour the teacher’s understanding of the issues. Aware of the issues and assumptions, a manageable Research question (shown as ‘?’) is formed and feeds into the Method, which expounds how to collect student data to answer the research question. The student data are collected during teaching and assessment, and in turn feed into a Learning outcome, the analysis of which results in Feedback (for students). The results are shared (with other teachers) and used for reflection, leading to new assumptions, new practice (teaching and assessment) and, thus, further change.
Another example of design for learning is her work with Wake and Guribye on the design of location-based learning games, which send learners out into physical spaces with mobile devices that, through the use of GPS, tie the game to specific physical locations and provide context sensitivity. Thus, location-based games employ mobile technology in pedagogical designs for educational purposes and provide an immersive experience in a way not possible in the classroom. Guribye et al. (2014) describe the design, deployment, analysis, and evaluation of a mobile learning game for learning history, Premierløitnant Bielke (PB), which is embedded in a pedagogical scenario based on collaborative mobile learning. PB uses both the concrete geographical surroundings relevant to a historical setting and a storyline about the same setting to engage teams of learners in game-play. The analysis of their game-play offers insight into the interactional organisation and practical accomplishment of collaborative game-play in a location-based game.
They take this work further by exploring students as learning designers. Using an authoring tool developed for designing location-based games, Wake et al. (2018) explore the pedagogical potential of students as game designers and show how students can learn through collaboratively designing and playing location-based games. Teams of students collaboratively designed games for each other about the history of WWII in Bergen. The students were not only designing a game, but also designing for a history learning experience; a learning experience outside the game itself. The learning design scenario engaged them in collaborative designing activities, producing media content, carrying out collaborative learning activities, and integrating these with curricular goals and institutional demands. An analysis of the design scenario showed that the students engaged creatively with the learning materials and the resources available to them and transformed the source materials and concrete locations into points of interest in the location-based game. Furthermore, they “relate to the historical materials and sources to create a game and thus have to make design decisions and reflect upon how the game will be received by the other team of students” (Wake et al. 2018, p. 182).
What Are the Related Works in Europe on these Issues?
In the Netherlands, Belgium, and Germany, 4C/ID with the Ten Steps is probably the most popular learning design model, used in all educational sectors, ranging from primary education to adult learning. At the University of Maastricht, for example, there is a whole research and curriculum development programme at the master’s level in Health Sciences dedicated to this. In this respect, the scale of thinking about and using learning design is increasing. The University of Leicester, for example, offers a MOOC in learning design within EMMA, a European Multiple MOOC Aggregator (https://platform.europeanmoocs.eu/). EMMA is a multilingual learning environment, at the moment available in eight languages.
Persico and Pozzi (2015) identify and describe three main strands of research related to LD: representations, approaches, and tools. LD representations of the products of the LD process can be either text based (formalised languages or natural language narratives) or visual (graphs, flow charts, content maps, swim lanes), and represent the content domain structure and associated assessments (Persico and Pozzi 2015). LD approaches guide the decision-making for the design of a single activity or sequence of activities and are “intended to help both novice teachers, who may not be familiar with the decision criteria that are at the heart of the design process, and experienced teachers who intend to design activities with some innovative features, such as the use of a new technological tool” (Persico and Pozzi 2015, pp. 238–239). Their Italian 4Ts approach (Pozzi and Persico 2013) supports decision-making and pedagogical planning about tasks, teams, technology, and time in computer supported collaborative learning activities. Teachers or designers juggle these 4Ts in “no predetermined or mandatory order” (Persico and Pozzi 2015, p. 239). They also describe examples such as the Spanish 4SPPIces approach (Pérez-Sanagustín et al. 2012), which facilitates the design of computer-supported collaborative blended learning activities, and the French approach ISiS (Emin et al. 2009), which captures teachers’ intentions and strategies so that they can be used by other teachers to understand LDs, thus facilitating sharing and reuse.
Several tools to support LD have also been developed in Europe. These tools can be distinguished as “reflection tools and pedagogical planners, authoring and sharing tools, repositories, and delivery tools” (Persico and Pozzi 2015, p. 240). Some tools implement the LD approaches mentioned above: the Spanish LDShake tool (Hernández-Leo et al. 2011) implements the 4SPPIces approach, and ScenEdit implements the ISiS approach (Emin et al. 2010). Others they mention address new ideas, such as the Spanish tool Web Collage (Villasclaras-Fernández et al. 2011), which provides a graphical interface to help design the use of collaboration patterns/techniques such as the Jigsaw or the Pyramid; the University of Oxford LD tool Phoebe (http://www.phoebe.ox.ac.uk), which provides inspiration and practical support to those engaging in LD; and the Greek-developed tool CADMOS (Katsamani and Retalis 2011), which supports the development of a conceptual model that describes learning activities and corresponding learning resources/services and a flow model that describes the orchestration of the learning activities.
Recent work at the University of Tallinn by Eradze and colleagues investigates how a synergy between learning design and classroom observations could improve our understanding of classroom learning and address the need for evidence-based teaching and learning practices. They contextualise classroom observations within modern data collection approaches and practices and carry out a systematic literature review based on 24 works that connect learning design and classroom observations (Eradze et al. 2019). They found challenges stemming from a “need for computer-interpretable documented designs; the lack of reported systematic approaches and technological support to connect the (multimodal) observations with the corresponding learning designs; and, the predominance of human-mediated observations of the physical space, whose applicability and scalability are limited by the human resources available” (p. 1). Their conclusion is that, to tap the potential synergy between learning design and classroom observations, there is a crucial need for a technological infrastructure and for the use of standards in both design and observation.
Research on the relationship between learning design and learning analytics has also been a focus in European research in recent years. For example, in their research at the Open University UK, Toetenel and Rienties combine learning design and learning analytics: learning design provides context to empirical data about OU courses, enabling the learning analytics to give insight into learning design decisions. This research is important as it attempts to close the virtuous cycle in which learning design improves courses and enhances the quality of learning, something that has been lacking in the research literature. For example, they study the impact of learning design on pedagogical decision-making and on future course design, and the relationship between learning design and student behaviour and outcomes (Toetenel and Rienties 2016; Rienties and Toetenel 2016; Rienties et al. 2015). In Rienties et al. (2015) they present a learning design taxonomy that identifies seven types of learning activity – assimilative, finding and handling information, communicative, productive, experimental, interactive, assessment – that was developed in the Open University Learning Design Initiative (Cross et al. 2012). Rienties et al. used the learning design taxonomy to categorise learning activities in 87 teaching modules and captured the categorisations in an activity planner. They used learning analytics (cluster and correlation analyses) to compare how the designs of the 87 modules impacted student behaviour in Learning Management Systems (LMS) and their learning performance. Their results showed that the “learning design activities strongly influenced how students were engaging online”, and also “seemed to have an impact on learning performance, in particular when modules rely on assimilative activities” (p. 315).
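The general shape of such an analysis can be sketched in a few lines: describe each module by the share of study time its design allocates to each activity type in the taxonomy, cluster the modules on those design profiles, and correlate the activity shares with an aggregate outcome measure. The sketch below is purely illustrative, not the authors' actual pipeline: the seven activity labels come from the taxonomy cited above, but the data are randomly generated and the outcome variable, sample sizes, and parameter choices are all invented for the example (using the widely available NumPy and scikit-learn libraries).

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# The seven activity types of the OU Learning Design Initiative taxonomy.
ACTIVITY_TYPES = ["assimilative", "finding_info", "communicative", "productive",
                  "experimental", "interactive", "assessment"]

# Hypothetical design profiles: each row is a module, each column the share of
# study time allocated to one activity type (rows sum to 1).
n_modules = 87
raw = rng.random((n_modules, len(ACTIVITY_TYPES)))
designs = raw / raw.sum(axis=1, keepdims=True)

# Hypothetical outcome per module, artificially tied to the communicative
# share to mimic a design-outcome relationship (this is made-up data).
pass_rate = 0.5 + 0.4 * designs[:, 2] + rng.normal(0, 0.05, n_modules)

# Step 1: cluster modules by their design profiles.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(designs)

# Step 2: correlate each activity share with the outcome measure.
for name, col in zip(ACTIVITY_TYPES, designs.T):
    r = np.corrcoef(col, pass_rate)[0, 1]
    print(f"{name:>14}: r = {r:+.2f}")
```

In a real study the design profiles would come from the activity planner categorisations and the outcomes from LMS and assessment data, and correlational results of this kind would of course not license causal claims, as noted earlier in this article.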
In Rienties and Toetenel (2016) they present the results of using learning analytics (multiple regression models) to link 151 Open University UK learning modules and 111,256 students with students’ behaviour (over 400 million minutes of online behaviour), satisfaction, and performance. One result is that their findings “strongly indicate the importance of learning design in predicting and understanding Virtual Learning Environment behaviour and performance of students in blended and online environments” (p. 333), and that the “primary predictor for academic retention was the time learners spent on communication activities”, which suggests that “where possible, appropriate and well-designed communication tasks that align with the learning objectives of the course may be a way forward to enhance academic retention” (p. 333).
Another example is a workshop on Learning Design, Teacher Inquiry, and Learning Analytics that was held at the 2013 Alpine Rendez-Vous in Villard-de-Lans, France (Emin-Martínez et al. 2014; Wasson et al. 2016), where researchers investigated the relationship between these three fields. This resulted in the proposal of a model for Teacher-led Design Inquiry of Learning: an integrated model of teacher inquiry into student learning, learning design, and learning analytics that aims to capture the essence of the synergy of the three fields and to support the development of a new model of educational practice, as illustrated in Fig. 7.
Rodríguez-Triana and colleagues in Spain have been investigating how to align learning analytics (monitoring) to support teachers in the design (scripting) and orchestration of Computer Supported Collaborative Learning (CSCL) situations. Rodríguez-Triana et al. (2015) present their approach to connecting the “pedagogical decisions made at design time with the analysis of the participants’ interactions” (p. 330). In particular, the LA enables teachers to monitor whether or not their scripting design decisions are accomplished. Their monitoring-aware model of CSCL scripting describes the connections between scripting and monitoring. From the design perspective it represents “the output of the monitoring-aware design process, providing a joint picture of the pedagogical and monitoring decisions made by the teacher” (p. 336), and from the monitoring perspective it “specifies the data to be gathered and the analysis criteria” (p. 336). Finally, they provide guidelines on how to implement their approach in existing authoring, enactment, and analysis tools.
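The core idea of a monitoring-aware script, pairing each design-time decision with a criterion that can be checked against interaction data, can be illustrated with a toy sketch. Everything below is invented for illustration (the class name, the event-log format, and the "minimum contributions" criterion are our assumptions, not constructs from Rodríguez-Triana et al.'s model); it only shows how a script that carries its own monitoring criteria lets a teacher see where enactment diverges from design.

```python
from dataclasses import dataclass

@dataclass
class Phase:
    """One phase of a CSCL script, carrying both a pedagogical decision
    (the grouping) and a monitoring criterion (expected participation)."""
    name: str
    groups: dict           # group id -> list of student ids (design decision)
    expected_actions: int  # minimum contributions per student (monitoring criterion)

def check_phase(phase, log):
    """Compare logged actions against the phase's monitoring criterion.
    Returns (group, student, count) for students falling short, so the
    teacher can see where the enacted script diverges from the design."""
    flagged = []
    for group, members in phase.groups.items():
        for student in members:
            n = sum(1 for (who, _action) in log if who == student)
            if n < phase.expected_actions:
                flagged.append((group, student, n))
    return flagged

# Usage: a jigsaw-style phase with two groups and a toy event log.
phase = Phase("expert_groups", {"g1": ["ana", "ben"], "g2": ["eva", "ole"]}, 2)
log = [("ana", "post"), ("ana", "edit"), ("ben", "post"),
       ("eva", "post"), ("eva", "post")]
print(check_phase(phase, log))  # flags ben (one contribution) and ole (none)
```

A real implementation would, as the guidelines in the paper suggest, draw the log from the enactment tools and feed the flagged cases into the teacher's orchestration view rather than a print statement.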
Persico and Pozzi (2015) also investigate how informing LD with LA can improve teacher inquiry related to the three identified strands of LD research: representation, approaches, and tools. For example, in work related to research on textual representations for sharing learning designs, design patterns have been used both for recording the designs (learning design) and for analysing their use in learning systems (learning analytics) (Persico et al. 2009). Another approach is to have the learning design include a description of the data available during the learning process, which can be used for learning analytics. This can take the form of textual descriptions or of visualisations of learning indicators and analytics for the different learning designs. Work by Dias and Diniz (2013) is used to illustrate this; they use a chart, generated on the fly, that represents student and teacher interaction and that teachers can use “to tune and adapt their initial designs” (Persico and Pozzi 2015, p. 238). An example of how learning analytics can aid the effectiveness of learning design approaches is to use data about their previous application, including “data about previous learning dynamics of the same students or also other students in similar contexts, including information on task performance, learning outcomes, problems arisen and solutions adopted” (p. 240). The GLUE!-PS tool (Prieto et al. 2011), developed at the University of Valladolid in Spain, deploys pedagogical plans authored in tools such as WebCollage; because it focuses on the enactment of learning designs, it makes it easier to collect learning analytics data that can be used to enhance the enactment tool (easier, that is, than collecting data for the pedagogical planning that many tools address). Finally, Persico and Pozzi note that no tools yet make use of learning analytics to support teachers or designers in their inquiry process during the design phase.
The increased use of data to support learning design needs careful reflection. Oxford’s Rebecca Eynon’s (2013) editorial on the rise of big data use in education raises issues that need our attention. She cautions against the commercial focus on efficiency and cost-effectiveness being the driver for “the use of data to improve education ‘delivery’ and as a means of carrying out research in our field” (p. 237) without tempering this with a much-needed academic debate. She identifies three areas that need particular attention: ethics; limitations in the types of research questions that can be answered with big data; and inequality, specifically how big data can “reinforce and even exacerbate existing social and educational inequalities” (p. 239). Within Europe the focus on ethics and the social implications of data use (Eynon 2013) has been increasing in recent years, and within the learning analytics community there is a good deal of reflective work on the use of data (e.g., Ferguson and Clow 2015; Perrotta and Williamson 2018). In Europe there is less focus on predictive algorithms (for example, related to university admission) and, we would posit, less focus on demographic data than in the USA, and more focus, as reviewed above, on empowerment of teachers/instructors and their professional development. For example, in Norway it would never be accepted for a learning analytics algorithm to use data about a student’s family background (when you are 18 you are “independent” of your parents) or middle school grades (students do not even get grades until 8th grade) to predict a student’s performance at university.
European discussions around commercial companies’ use of student data (e.g., Google Classroom or Chromebook) centre on issues of conformance to data protection regulation (the GDPR) and on the use of our children’s data to create profiles of European students (a quick search of the web shows that American parents question these practices as well). UK researchers Perrotta and Williamson (2018) provide an excellent example of the epistemic differences in educational philosophies highlighted by the application of cluster analysis to a Stanford MOOC (Kizilcec et al. 2013) and its replication on an Open University UK FutureLearn MOOC (Ferguson and Clow 2015). They explain that these two papers reflect two positions: one “(Stanford’s) eager to develop a ‘data-driven science of learning’ or ‘educational data science’, that enthusiastically (and forcefully?) married educational research and computer science; the other (the OU’s) showing a degree of intellectual alignment with the tradition of ‘socially sensitive’ British educational research, with its emphasis on conversations, dialogue and contexts (Crook 1996; Laurillard 2009; Wegerif 2007)” (Perrotta and Williamson 2018, p. 12). Here we see the European focus on learning design to support a more dialogic pedagogy.
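The cluster analysis at the centre of these two MOOC studies can be sketched in miniature: a plain k-means grouping of learners by weekly engagement, which yields subpopulations akin to the “completing”, “auditing”, and “disengaging” profiles described by Kizilcec et al. The data, profile labels, and centroid seeding below are invented for the sketch; neither study’s actual features or procedure is reproduced here.

```python
def kmeans(points, centroids, iters=50):
    """Plain k-means on equal-length numeric vectors, starting from the
    given initial centroids, iterating until assignments stabilise."""
    k = len(centroids)
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster.
        new = [[sum(dim) / len(c) for dim in zip(*c)] if c else centroids[i]
               for i, c in enumerate(clusters)]
        if new == centroids:  # converged
            break
        centroids = new
    return centroids, clusters

# Each learner: fraction of course material engaged with in weeks 1-4 (invented).
learners = ([[0.9, 0.9, 0.8, 0.9], [1.0, 0.8, 0.9, 0.8]] * 5      # "completing"
            + [[0.6, 0.5, 0.5, 0.4], [0.5, 0.6, 0.4, 0.5]] * 5    # "auditing"
            + [[0.8, 0.3, 0.1, 0.0], [0.7, 0.2, 0.0, 0.0]] * 5)   # "disengaging"

# Seed one centroid per expected profile to keep the sketch deterministic.
centroids, clusters = kmeans(learners, [learners[0], learners[10], learners[20]])
```

The resulting centroids are engagement trajectories: one stays high across the weeks, one stays moderate, and one starts high and falls away, which is exactly the kind of subpopulation structure the two papers then interpret in strikingly different intellectual traditions.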
Conclusions and Future Work
What we have shown in this article is that research on learning design is alive and well in Europe, although in forms different from the foundations of traditional learning design in the 1970s and 1980s. There has been a move from a focus on content and the way to present it, to a focus on complex learning, learning environments, and the access to student data that is available in these environments. Furthermore, there is an effort to empower teachers (and even students) as designers of learning (including environments and new pedagogies), and to support their reflection on their own practice as part of their professional development. Challenges related to the role of data use in learning analytics (as discussed above) and teacher inquiry will need to be addressed, not just by researchers, but also from the perspective of teachers.
The design demands on the twenty-first-century teacher are many, and time constraints are a hindrance. As we pointed out in the introduction, teachers need to make effective use of techniques, tools, and ingredients, and to acquire the requisite deep knowledge, skills, and experience to know what to use with what, as well as how and when to use them. We need research on how best to support them in this learning design work and in the development of new pedagogies. Furthermore, as teachers need new data literacies, we need research on how we can empower teachers to collect, integrate, analyse, and interpret data, and to use the results for their own professional development and for improving student learning.
More research is needed to explore the synergies between learning design, teacher-led inquiry into student learning (TISL), and learning analytics. Mor et al. (2015) point out that these are normally separate activities with little epistemic convergence between the three fields; however, they argue that they can be seen as complementary endeavours, each informing and improving the others. Persico and Pozzi (2015) argue that LA has the potential to increase the robustness of learning design and to make “the decision-making process involved in LD better grounded in evidence and the exchanges between designers more anchored to actual teaching and learning practice” (p. 245).
Also, there is increased interest in the consequences of the availability of data for improving learning and teacher practice (see Reimann et al. 2016 for results from the European Next-TELL project – http://next-tell.eu). Ellen Mandinach introduced the concept of pedagogical data literacy, which she defines as “the ability to transform information into actionable instructional knowledge and practices by collecting, analysing, and interpreting all types of data (assessment, school climate, behavioural, snapshot, etc.) to help determine instructional steps. It combines an understanding of data with standards, disciplinary knowledge and practices, curricular knowledge, pedagogical content knowledge, and an understanding of how children learn” (Mandinach 2013). Building on the work of Mandinach and her colleague (Mandinach and Gummer 2016), Wasson and Hansen (2016) write that “data-rich work environments require new knowledge, skills, and abilities to lever the possibilities in, and beyond these classrooms. Accordingly, teacher capacity development for using ICT and data for their students’ learning and for their own professional development (Luckin et al. 2016) needs to be fostered.” (p. 56).
Finally, we see other challenges that complicate the picture. Technology is rapidly changing and providing new affordances for teaching and learning. Technology-savvy students, used to configuring their own virtual worlds and organising their own interactions with peers, instructors, and the world beyond, are in our classrooms demanding new pedagogies. How can we support teachers in dealing with these realities?
We opt for the term ‘teacher/instructor’ to make clear that we are speaking of both K-12 and higher education.
A note on terminology: In this article we use the term learning designer to mean learning experience designers, learning architects, instructional designers, instructors, trainers, teachers, content developers, content curators, their managers, senior L&D leaders – indeed, anyone who supports learners. When we say ‘learner’, we mean any person who wants or needs to learn something.
In professional circles there is a discussion about the difference in the skill sets of an instructional designer and a learning experience designer: https://blog.elearningbrothers.com/webinar-learning-experience-design-vs-instructional-design
Avramides, K., Hunter, J., Oliver, M., & Luckin, R. (2014). A method for teacher inquiry in cross-curricular projects: Lessons from a case study. British Journal of Educational Technology, 46(2), 249–264.
Bannon, L. J., & Bødker, S. (1991). Beyond the interface. Encountering artifacts in use. In J. M. Carroll (Ed.), Designing interaction: Psychology at the human-computer interface (pp. 227–253). Cambridge, UK: Cambridge University Press.
Beetham, H., & Sharpe, R. (2013). Rethinking pedagogy for a digital age: Designing for 21st-century learning. London: Routledge.
Clark, W., Luckin, R., & Jewitt, C. (2011). Methods and specifications for TISL components v1 NEXT-TELL research report D5.1. NEXT-TELL consortium, European Commission IST-285114.
Cochran-Smith, M., & Lytle, S. (1999). The teacher research movement: A decade later. Educational Researcher, 28, 15–25. https://doi.org/10.3102/0013189x028007015
Crook, C. (1996). Computers and the collaborative experience of learning. London: Routledge, Psychology Press.
Cross, S., Galley, R., Brasher, A., & Weller, M. (2012). Final project report of the OULDI-JISC project: Practice. Challenge and Change in Curriculum Design Process, Communities, Visualisation and Practice City, 2012.
Dias, S. B., & Diniz, J. A. (2013). FuzzyQoI model: A fuzzy logic-based modelling of users’ quality of interaction with a learning management system under blended learning. Computers & Education, 69, 38–59. https://doi.org/10.1016/j.compedu.2013.06.016.
Emin, V., Pernin, J.-P., & Guéraud, V. (2009). Model and tool to clarify intentions and strategies in learning scenarios design. In Proceedings of the European conference on technology-enhanced learning (EC-TEL 2009), Nice, France, 462–476.
Emin, V., Pernin, J.-P. & Aguirre, J.-L. (2010). ScenEdit: an intention-oriented authoring environment to design learning scenarios. Proceedings of the European Conference on Technology-Enhanced Learning (EC-TEL 2010), Barcelona, Spain, 626–631.
Emin-Martínez, V., Hansen, C., Rodríguez-Triana, M. J., Wasson, B., Mor, Y., Dascalu, M., Ferguson, R., & Pernin, J.-P. (2014). Towards teacher-led design inquiry of learning. eLearning Papers, 36, 1–12.
Eradze, M., Rodríguez-Triana, M. J., & Laanpere, M. (2019). A conversation between learning design and classroom observations: A systematic literature review. Education Sciences, 9(2), 91. https://doi.org/10.3390/educsci9020091.
Eynon, R. (2013). The rise of big data: What does it mean for education, technology, and media research? Learning, Media and Technology, 38(3), 237–240. https://doi.org/10.1080/17439884.2013.771783.
Ferguson, R. (2012). Learning analytics: Drivers, developments and challenges. International Journal of Technology Enhanced Learning (IJTEL), 4(5/6), 304–317.
Ferguson, R., and D. Clow. (2015). Examining engagement: Analysing learner subpopulations in massive open online courses (MOOCs). Proceedings of the Fifth International Conference on LA and Knowledge, ACM, 51–58, March 16–20.
Frerejean, J., van Merriënboer, J. J. G., & Kirschner, P. A. (2019a). Designing instruction for complex learning: 4C/ID in higher education. European Journal of Education, 54, 513–524.
Frerejean, J., Velthorst, G. J., van Strien, J. L. H., Kirschner, P. A., & Brand-Gruwel, S. (2019b). Embedded instruction to learn information problem solving: Effects of a whole task approach. Computers in Human Behavior, 90, 117–130. https://doi.org/10.1016/j.chb.2018.08.043.
Guribye, F., Wake, J., & Wasson, B. (2014). The practical accomplishment of location-based game-play: Design and analysis of mobile collaborative gaming. International Journal of Mobile Human Computer Interaction (IJMHCI), 6(3), 32–50.
Hansen, C. & Wasson, B. (2016). Teacher inquiry into student learning: The TISL heart model and method for use in teachers’ professional development. Nordic Journal of Digital Literacy, 10(1), 24-49. https://doi.org/10.18261/issn.1891-943x-2016-01-02.
Hernández-Leo, D., Romeo, L., Carralero, M. A., Chacón, J., Carrió, M., Moreno, P., et al. (2011). LdShake: Learning design solutions sharing and co-edition. Computers & Education, 57(4), 2249–2260.
Katsamani, M. & Retalis, S. (2011) Making learning design in layers. The CADMOS approach. IADIS Multi Conference on Computer Science and Information Systems, 305-312. 20-26 July. Rome, Italy. ISBN: 978-972-8939-38-0.
Kirschner, P. A., & van Merriënboer, J. J. G. (2009). Ten steps to complex learning: A new approach to instruction and instructional design. In T. L. Good (Ed.), 21st century education: A reference handbook (pp. 244–253). Thousand Oaks: Sage.
Kizilcec, R. F., Piech, C. & Schneider, E. (2013). “Deconstructing disengagement: Analysing learner subpopulations in massive open online courses.” Proceedings of the third international conference on LA and knowledge, Leuven, Belgium, ACM, 170–179, April 08–12. https://doi.org/10.1145/2460296.2460330
Laurillard, D. (2009). The pedagogical challenges to collaborative technologies. International Journal of Computer-Supported Collaborative Learning, 4(1), 5–20.
Laurillard, D. (2012). Teaching as a design science: Building pedagogical patterns for learning and technology. New York: Routledge.
Luckin, R., Hansen, C., Wasson, B., Clark, W., Avramides, K., Hunter, J. & Oliver, M. (2016). Teacher inquiry into Students' learning: Researching pedagogical innovations. In P. Reimann, S. Bull, M. Kickmeier-Rust, R. Vatrapu & B. Wasson (Eds) Measuring and visualizing learning in the information-rich classroom, 74–91. New York: Routledge. ISBN 9781138021136.
Mandinach, E. B. (2013). Data literacy vs. assessment literacy. Blog entry on Michael & Susan Dell Foundation. Retrieved from http://www.msdf.org/blog/2013/09/ellen-mandinach-data-literacy-vs-assessment-literacy/
Mandinach, E. B., & Gummer, E. S. (2016). Data literacy for teachers: Making it count in teacher preparation and practice. New York: Teachers College Press.
Marcellis, M., Barendsen, E., & van Merriënboer, J. (2018). Designing a blended course in android app development using 4C/ID. In Proceedings of the 18th Koli calling international conference on computing education research - Koli calling ‘18 (pp. 1–5). https://doi.org/10.1145/3279720.3279739.
Mor, Y., & Craft, B. (2012). Learning design: Reflections upon the current landscape. Research in Learning Technology, 20, 85–94.
Mor, Y., & Mogilevsky, O. (2013). The learning design studio: Collaborative design inquiry as teachers’ professional development. Research in Learning Technology, 21.
Mor, Y., Ferguson, R., & Wasson, B. (2015). Learning design, teacher inquiry into student learning, and learning analytics: A call for action. British Journal of Educational Technology (BJET), 46(2), 221–229. https://doi.org/10.1111/bjet.12273.
Neelen, M. & Kirschner, P. (2020). Evidence-informed learning design: Use evidence to create training which improves performance. London: KoganPage.
Pérez-Sanagustín, M., Santos, P., Hernández-Leo, D., & Blat, J. (2012). 4SPPIces: A case study of factors in a scripted collaborative-learning blended course across spatial locations. International Journal of Computer-Supported Collaborative Learning, 7(3), 443–465.
Perrotta, C., & Williamson, B. C. (2018). The social life of learning analytics: Cluster analysis and the ‘performance’ of algorithmic education. Learning, Media and Technology, 43(1), 3–16. https://doi.org/10.1080/17439884.2016.1182927.
Persico, D., & Pozzi, F. (2015). Informing learning design with learning analytics to improve teacher inquiry. British Journal of Educational Technology, 46(2), 230–248.
Persico, D., Pozzi, F., & Sarti, L. (2009). Design patterns for monitoring and evaluating CSCL processes. Computers in Human Behavior, 25(5), 1020–1027.
Pozzi, F., & Persico, D. (2013). Sustaining learning design and pedagogical planning in CSCL. Research in Learning Technology, 21. https://doi.org/10.3402/rlt.v21i0.17585.
Prieto, L. P., Asensio-Pérez, J. I., Dimitriadis, Y., Gómez-Sánchez, E., & Muñoz-Cristóbal, J. A. (2011). GLUE!-PS: A multi-language architecture and data model to deploy TEL designs to multiple learning environments. In Proceedings of the European Conference on Technology-Enhanced Learning (EC-TEL 2011) (pp. 285–298). Palermo, Italy.
Reimann, P., Bull, S., Kickmeier-Rust, M., Vatrapu, R., & Wasson, B. (Eds.). (2016). Measuring and visualising competence development in the information-rich classroom. New York: Routledge.
Rienties, B., & Toetenel, L. (2016). The impact of learning design on student behaviour, satisfaction and performance: A cross-institutional comparison across 151 modules. Computers in Human Behavior, 60, 333–341 http://www.sciencedirect.com/science/article/pii/S0747563216301327.
Rienties, B., Toetenel, L. & Bryan, A. (2015). “Scaling up” learning design: Impact of learning design activities on LMS behavior and performance. In: Fifth International Conference on Learning Analytics And Knowledge (LAK 15), 315-319. Poughkeepsie, NY: ACM.
Rodríguez-Triana, M. J., Martínez-Monés, A., Asensio-Pérez, J. I., & Dimitriadis, Y. (2015). Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations. British Journal of Educational Technology, 46(2), 330–343. https://doi.org/10.1111/bjet.12198.
Schön, D. A. (1992). Designing as reflective conversation with the materials of a design situation. Research in Engineering Design, 3(3), 131–147.
Smith, P. L., & Ragan, T. J. (1999). Instructional design. New York: Wiley.
Sørensen, B. H., Selander, S., Wasson, B., & Wennström, S. (2016). Designs for learning – Taking a step forward. Designs for Learning, 8(1), 23–24. https://doi.org/10.16993/dfl.71.
Toetenel, L., & Rienties, B. (2016). Analysing 157 learning designs using learning analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology, 47(5), 981–992.
Van Merriënboer, J. J. G. (1997). Training complex cognitive skills: A four-component instructional design model for technical training. Englewood Cliffs: Educational Technology Publications.
Van Merriënboer, J. J. G., & Kirschner, P. A. (2018). Ten steps to complex learning (Third ed.). New York: Taylor & Francis.
Van Merriënboer, J. J. G., Kirschner, P. A., Paas, F., Sloep, P. B., & Caniëls, M. C. J. (2009). Towards an integrated approach for research on lifelong learning. Educational Technology Magazine, 49(3), 3–15.
Villasclaras-Fernández, E. D., Asensio-Pérez, J. I., Hernández-Leo, D., Dimitriadis, Y., de la Fuente-Valentín, L., & Martínez-Monés, A. (2011). Implementing computer-interpretable CSCL scripts with embedded assessment: A pattern based design approach. In F. Pozzi & D. Persico (Eds.), Techniques for fostering collaboration in online learning communities: Theoretical and practical perspectives (pp. 261–277). Hershey: IGI Global Publishing.
Wake, J., Guribye, F., & Wasson, B. (2018). Learning through collaborative design of location-based games. International Journal of Computer-Supported Collaborative Learning, 13, 167–187.
Wasson, B. (2007). Design and use of collaborative network learning scenarios: The DoCTA experience. Educational Technology & Society, 10(4), 3–16.
Wasson, B., & Hansen, C. (2016). Data literacy and use for teaching. In P. Reimann, S. Bull, R. Luckin, & B. Wasson (Eds.), Measuring and visualising competence development in the information-rich classroom (pp. 56–74). New York: Routledge. ISBN 9781138021136.
Wasson, B., Hansen, C., & Mor, Y. (2016). Empowering teachers with student data. In J. Eberle, K. Lund, F. Fischer, & P. Tchounikine (Eds.), Grand challenge problems in technology enhanced learning II: MOOCs and beyond – Perspectives for research, practice, and policy making (pp. 55–59). Springer Briefs in Education. London: Springer.
Wastiau, P. (2014). From face to face to online teacher professional development – Paving the way for new teacher training models? Nordic Journal of Digital Literacy, 9(1), 4–5.
Wegerif, R. (2007). Dialogic education and technology: Expanding the space of learning (Vol. 7). Berlin: Springer.
Open Access funding provided by University of Bergen.
Conflict of Interest
The authors declare that they have no conflict of interest.
Research Involving Human Participants and/or Animals
This article does not contain any studies with human participants or animals performed by any of the authors.
As the article contains no studies with human participants, there is no informed consent to report.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
It is our experience that Europeans appear to opt for learning design as opposed to instructional design as they see the former as being more inclusive. Schools, companies, and organisations in general strive to be learning organisations where learning occurs for everyone and can happen at any time in any situation.
Wasson, B., Kirschner, P.A. Learning Design: European Approaches. TechTrends 64, 815–827 (2020). https://doi.org/10.1007/s11528-020-00498-0
- Instructional design
- Learning analytics
- Learning design
- Technology enhanced learning