Adults’ Self-Regulatory Behaviour Profiles in Blended Learning Environments and Their Implications for Design

Original research · Published in Technology, Knowledge and Learning

Abstract

Blended forms of learning have become increasingly popular. However, it remains unclear under what circumstances blended learning environments are successful. Studies suggest that blended learning challenges learners’ self-regulation, yet little is known about what self-regulatory behaviour learners exhibit in such environments. This limited understanding is problematic, since such insight is needed for effective design. The aim of this study was therefore to identify learners’ self-regulatory behaviour profiles in blended learning environments and to relate them to the designs of those environments. The self-regulatory behaviour of learners (n = 120) in six ecologically valid blended learning courses was captured. Log files were analysed, in a learning-analytics fashion, for frequency, diversity, and sequence of events. Three main user profiles were identified. The designs were described using a descriptive framework containing attributes that support self-regulation in blended learning environments. Results indicate fewer mis-regulators when more self-regulatory design features are integrated. These findings highlight the value of integrating features that support self-regulation in blended learning environments.

References

  • Agrawal, R., & Srikant, R. (1995). Mining sequential patterns. In Proceedings of the eleventh international conference on data engineering (pp. 3–14). IEEE.

  • Artino, A. R. (2009). Think, feel, act: Motivational and emotional influences on military students’ online academic success. Journal of Computing in Higher Education,21(2), 146–166. https://doi.org/10.1007/s12528-009-9020-9.

  • Ausburn, L. J. (2004). Course design elements most valued by adult learners in blended online education environments: An American perspective. Educational Media International,41(4), 327–337.

  • Azevedo, R. (2005). Using hypermedia as a metacognitive tool for enhancing student learning? The role of self-regulated learning. Educational Psychologist,40(4), 199–209.

  • Azevedo, R., Cromley, J. G., Winters, F. I., Moos, D. C., & Greene, J. A. (2005). Adaptive human scaffolding facilitates adolescents’ self-regulated learning with hypermedia. Instructional Science,33(5–6), 381–412.

  • Azevedo, R., & Hadwin, A. F. (2005). Scaffolding self-regulated learning and metacognition—Implications for the design of computer-based scaffolds. Instructional Science,33(5), 367–379.

  • Azevedo, R., Johnson, A., Chauncey, A., & Burkett, C. (2010). Self-regulated learning with MetaTutor: Advancing the science of learning with MetaCognitive tools. In M. Khine & I. Saleh (Eds.), New science of learning (pp. 225–247). Berlin: Springer.

  • Bandura, A. (1989). Human agency in social cognitive theory. American Psychologist,44(9), 1175.

  • Bannert, M. (2009). Promoting self-regulated learning through prompts. Zeitschrift für Pädagogische Psychologie,23(2), 139–145.

  • Barnard, L., Lan, W. Y., To, Y. M., Paton, V. O., & Lai, S.-L. (2009). Measuring self-regulation in online and blended learning environments. The Internet and Higher Education,12(1), 1–6.

  • Belfiore, P. J., & Hornyak, R. (1998). Operant theory and application to self-monitoring in adolescents. In D. Schunk & B. Zimmerman (Eds.), Self-regulated learning: From teaching to self-reflective practice (pp. 184–202). New York: Guilford.

  • Bennet, S., Harper, B., & Hedberg, J. (2002). Designing real life cases to support authentic design activities. Australasian Journal of Educational Technology,18(1), 1–12.

  • Benson, P. (2013). Teaching and researching: Autonomy in language learning. Abingdon: Routledge.

  • Bersin, J. (2003). What works in blended learning. Learning circuits, July. Retrieved 19 October 2003 from http://www.learningcircuits.org/2003/jul2003/bersin.htm.

  • Biemiller, A., Shany, M., Inglis, A., & Meichenbaum, D. (1998). Factors influencing children’s acquisition and demonstration of self-regulation on academic tasks. In D. Schunk & B. Zimmerman (Eds.), Self-regulated learning: From teaching to self-reflective practice (pp. 203–224). London: The Guilford Press.

  • Boekaerts, M., & Corno, L. (2005). Self-regulation in the classroom: A perspective on assessment and intervention. Applied Psychology,54(2), 199–231.

  • Boelens, R., Van Laer, S., De Wever, B., & Elen, J. (2015). Blended learning in adult education: Towards a definition of blended learning. Retrieved from http://www.iwt-alo.be/

  • Bol, L., & Garner, J. K. (2011). Challenges in supporting self-regulation in distance education environments. Journal of Computing in Higher Education,23(2–3), 104–123.

  • Bonk, C. J., & Graham, C. R. (2012). The handbook of blended learning: Global perspectives, local designs. New York: Wiley.

  • Boud, D., Keogh, R., & Walker, D. (2013). Reflection: Turning experience into learning. Abingdon: Routledge.

  • Bransford, J. D., Vye, N., Kinzer, C., & Risko, V. (1990). Teaching thinking and content knowledge: Toward an integrated approach. Dimensions of Thinking and Cognitive Instruction,1, 381–413.

  • Brookfield, S. (1986). Understanding and facilitating adult learning: A comprehensive analysis of principles and effective practices. London: McGraw-Hill Education.

  • Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher,18(1), 32–42.

  • Butler, D. L. (1998). The strategic content learning approach to promoting self-regulated learning: A report of three studies. Journal of Educational Psychology,90(4), 682.

  • Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research,65(3), 245–281. https://doi.org/10.2307/1170684.

  • Caffarella, R., & Merriam, S. B. (2000). Linking the individual learner to the context of adult learning. In A. Wilson & E. Hayes (Eds.), Handbook of adult and continuing education (pp. 55–70). San Francisco, CA: Jossey-Bass.

  • Cennamo, K. S., Ross, J. D., & Rogers, C. S. (2002). Evolution of a web-enhanced course: Incorporating strategies for self-regulation. Educause Quarterly,25(1), 28–33.

  • Cicchetti, D. V. (1994). Guidelines, criteria, and rules of thumb for evaluating normed and standardized assessment instruments in psychology. Psychological Assessment,6(4), 284.

  • Collins, A., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the crafts of reading, writing, and mathematics. Knowing, Learning, and Instruction: Essays in Honor of Robert Glaser,18, 32–42.

  • Collis, B., Bruijstens, H., & van Veen, J. K. D. (2003). Course redesign for blended learning: Modern optics for technical professionals. International Journal of Continuing Engineering Education and Life Long Learning,13(1–2), 22–38.

  • Connolly, C., Murphy, E., & Moore, S. (2007). Second chance learners, supporting adults learning computer programming. Paper presented at the international conference on engineering education–ICEE.

  • Cordova, D. I., & Lepper, M. R. (1996). Intrinsic motivation and the process of learning: Beneficial effects of contextualization, personalization, and choice. Journal of Educational Psychology,88(4), 715.

  • Corno, L. (1995). Comments on Winne: Analytic and systemic research are both needed. Educational Psychologist,30(4), 201–206.

  • Dabbagh, N., & Kitsantas, A. (2004). Supporting self-regulation in student-centered web-based learning environments. International Journal on E-Learning,3(1), 40–47.

  • De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis schemes to analyze transcripts of online asynchronous discussion groups: A review. Computers & Education,46(1), 6–28.

  • Devedžić, V. (2006). Semantic web and education (Integrated series in information systems, Vol. 12). Berlin: Springer.

  • Dewey, J. (1958). Experience and nature. North Chelmsford: Courier Corporation.

  • Driscoll, M. (2002). Blended learning: Let’s get beyond the hype. E-learning,1, 1–4.

  • Edelson, D. C., Gordin, D. N., & Pea, R. D. (1999). Addressing the challenges of inquiry-based learning through technology and curriculum design. Journal of the Learning Sciences,8(3–4), 391–450.

  • Edelson, D. C., Pea, R. D., & Gomez, L. (1996). Constructivism in the collaborator. In B. G. Wilson (Ed.), Constructivist learning environments: Case studies in instructional design (pp. 151–164). Englewood Cliffs: Educational Technology Publications.

  • Endedijk, M. D., Brekelmans, M., Sleegers, P., & Vermunt, J. D. (2016). Measuring students’ self-regulated learning in professional education: Bridging the gap between event and aptitude measurements. Quality & Quantity, 50(5), 2141–2164.

  • Farrall, S. (2007). Desistance studies versus cognitive-behavioural therapies: Which offers most hope for the long term. In R. Canton & D. Hancock (Eds.), Dictionary of probation and offender management (Vol. 178). Cullompton: Willan Publishing.

  • Gabadinho, A., Ritschard, G., Mueller, N. S., & Studer, M. (2011). Analyzing and visualizing state sequences in R with TraMineR. Journal of Statistical Software,40(4), 1–37.

  • Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative potential in higher education. The Internet and Higher Education,7(2), 95–105. https://doi.org/10.1016/j.iheduc.2004.02.001.

  • Garrison, D. R., & Vaughan, N. D. (2008). Blended learning in higher education: Framework, principles, and guidelines. New York: Wiley.

  • Garza, R. (2009). Latino and white high school students’ perceptions of caring behaviors are we culturally responsive to our students? Urban Education,44(3), 297–321.

  • Graham, C. R. (2006). Blended learning systems. In C. J. Bonk & C. R. Graham (Eds.), The handbook of blended learning: Global perspectives, local designs. San Francisco: Pfeiffer.

  • Graham, S., Harris, K. R., & Troia, G. A. (1998). Writing and self-regulation: Cases from the self-regulated strategy development model. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning: From teaching to self-reflective practice (pp. 20–41). New York: Guilford Press.

  • Gray, S. J. (1988). Towards a theory of cultural influence on the development of accounting systems internationally. Abacus,24(1), 1–15.

  • Greene, J. A., & Azevedo, R. (2007). A theoretical review of Winne and Hadwin’s model of self-regulated learning: New perspectives and directions. Review of Educational Research,77(3), 334–372. https://doi.org/10.3102/003465430303953.

  • Grimmett, P. P., & Neufeld, J. (1994). Teacher development and the struggle for authenticity: Professional growth and restructuring in the context of change. New York: Teachers College Press.

  • Hadwin, A. F., Nesbit, J. C., Jamieson-Noel, D., Code, J., & Winne, P. H. (2007). Examining trace data to explore self-regulated learning. Metacognition and Learning,2(2–3), 107–124.

  • Hansman, C. (2008). Adult learning in communities of practice. Communities of Practice,1, 293–310.

  • Harley, J. M., Bouchet, F., Hussain, M. S., Azevedo, R., & Calvo, R. (2015). A multi-componential analysis of emotions during complex learning with an intelligent multi-agent system. Computers in Human Behavior,48, 615–625.

  • Harrison, M. (2003). Blended learning in practice. Brighton: Epic Group PLC.

  • Hatton, N., & Smith, D. (1995). Reflection in teacher education: Towards definition and implementation. Teaching and Teacher Education,11(1), 33–49.

  • Herrington, J. (2005). Authentic learning environments in higher education. Hershey: IGI Global.

  • Herrington, J., Oliver, R., & Reeves, T. C. (2003). Patterns of engagement in authentic online learning environments. Australasian Journal of Educational Technology, 19(1).

  • Hiemstra, R. (1993). Three underdeveloped models for adult learning. New Directions for Adult and Continuing Education,1993(57), 37–46.

  • Hillman, D. C., Willis, D. J., & Gunawardena, C. N. (1994). Learner-interface interaction in distance education: An extension of contemporary models and strategies for practitioners. American Journal of Distance Education,8(2), 30–42.

  • Hofer, B., Yu, S., & Pintrich, P. (1998). Teaching college students to be self-regulated learners. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning: From teaching to self-reflective practice (pp. 57–83). New York: The Guilford Press.

  • Hooper, S. (1992). Cooperative learning and computer-based instruction. Educational Technology Research and Development,40(3), 21–38.

  • House, R. (2002). Clocking in column. The Spokesman-Review, January 8. Retrieved from http://www.spokesman.com.

  • Ifenthaler, D. (2012). Determining the effectiveness of prompts for self-regulated learning in problem-solving scenarios. Educational Technology & Society,15(1), 38–52.

  • Järvelä, S., Järvenoja, H., & Malmberg, J. (2012). How elementary school students’ motivation is connected to self-regulation. Educational Research and Evaluation,18(1), 65–84.

  • Jorgensen, D. L. (1989). Participant observation. New York: Wiley.

  • Keller, J. M. (2010). Motivational design for learning and performance. Boston, MA: Springer.

  • Knowles, M. S., Holton, E. F., & Swanson, R. A. (2014). The adult learner: The definitive classic in adult education and human resource development. Abingdon: Routledge.

  • Kolodner, J. L., Owensby, J. N., & Guzdial, M. (2004). Case-based learning aids. Handbook of Research on Educational Communications and Technology,2, 829–861.

  • Lajoie, S. P. (2005). Extending the scaffolding metaphor. Instructional Science,33(5–6), 541–557.

  • Laurillard, D. (1987). Computers and the emancipation of students: Giving control to the learner. Instructional Science,16(1), 3–18.

  • Lebow, D. G., & Wager, W. W. (1994). Authentic activity as a model for appropriate learning activity: Implications for emerging instructional technologies. Canadian Journal of Educational Communication,23(3), 231–244.

  • Ley, K., & Young, D. B. (2001). Instructional principles for self-regulation. Educational Technology Research and Development,49(2), 93–103.

  • Lin, B., & Hsieh, C.-T. (2001). Web-based teaching and learner control: A research review. Computers & Education,37(3), 377–386.

  • Linn, R. L., Baker, E. L., & Dunbar, S. B. (1991). Complex, performance-based assessment: Expectations and validation criteria. Educational Researcher,20(8), 15–21.

  • Lynch, R., & Dembo, M. (2004a). The relationship between self-regulation and online learning in a blended learning context. The International Review of Research in Open and Distributed Learning, 5(2). Athabasca University Press. Retrieved 3 January 2018 from https://www.learntechlib.org/p/49426/.

  • Lynch, R., & Dembo, M. (2004b). The relationship between self-regulation and online learning in a blended learning context. The International Review of Research in Open and Distance Learning,5(2), 1–16.

  • Manlove, S., Lazonder, A. W., & de Jong, T. (2007). Software scaffolds to promote regulation during scientific inquiry learning. Metacognition and Learning,2(2–3), 141–155.

  • Mann, K., Gordon, J., & MacLeod, A. (2009). Reflection and reflective practice in health professions education: A systematic review. Advances in Health Sciences Education,14(4), 595–621.

  • Martinez, M. (2002). Designing learning objects to personalize learning. In D. A. Wiley (Ed.), The instructional use of learning objects (pp. 151–171). Bloomington: Agency for Instructional Technology.

  • Merrill, M. D. (2002). First principles of instruction. Educational Technology Research and Development,50(3), 43–59. https://doi.org/10.1007/Bf02505024.

  • Milheim, W. D., & Martin, B. L. (1991). Theoretical bases for the use of learner control: Three different perspectives. Journal of Computer-Based Instruction,18(3), 99–105.

  • Moon, J. (1999). Reflection in learning and professional development. Abingdon: Routledge.

  • Moore, M. G. (1989). Editorial: Three types of interaction. The American Journal of Distance Education,3(2), 1–6.

  • Munby, H., & Russell, T. (1992). Frames of reflection: An introduction. In T. Russell & H. Munby (Eds.), Teachers and teaching: From classroom to reflection (pp. 1–8). London: The Falmer Press.

  • Nietfeld, J. L., Cao, L., & Osborne, J. W. (2006). The effect of distributed monitoring exercises and feedback on performance, monitoring accuracy, and self-efficacy. Metacognition and Learning,1(2), 159–179.

  • Nordlund, M., Bonfanti, S., & Strandh, M. (2015). Second chance education matters! Income trajectories of poorly educated non-Nordics in Sweden. Journal of Education and Work,28(5), 528–550.

  • Oliver, M., & Trigwell, K. (2005). Can ‘blended learning’ be redeemed? E-learning,2(1), 17–26. https://doi.org/10.2304/elea.2005.2.1.17.

  • Orey, M. (2002a). Definition of blended learning. University of Georgia. Retrieved 21 February 2003.

  • Orey, M. (2002b). One year of online blended learning: Lessons learned. In Annual meeting of the eastern educational research association, Sarasota, FL.

  • Perry, N. E., & Winne, P. H. (2006). Learning from learning kits: gStudy traces of students’ self-regulated engagements with computerized content. Educational Psychology Review,18(3), 211–228.

  • Petraglia, J. (1998). Reality by design: The rhetoric and technology of authenticity in education. Abingdon: Routledge.

  • Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review,16(4), 385–407.

  • Pintrich, P. R., Smith, D. A., García, T., & McKeachie, W. J. (1993). Reliability and predictive validity of the Motivated Strategies for Learning Questionnaire (MSLQ). Educational and Psychological Measurement,53(3), 801–813.

  • Pressley, M., El-Dinary, P., Wharton-McDonald, R., & Brown, R. (1998). Transactional instruction of comprehension strategies in the elementary grades. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning: From research to self-reflective practice (pp. 42–56). New York: Guilford.

  • Puntambekar, S., & Hubscher, R. (2005). Tools for scaffolding students in a complex learning environment: What have we gained and what have we missed? Educational Psychologist,40(1), 1–12.

  • Reay, J. (2001). Blended learning: A fusion for the future. Knowledge Management Review,4, 6.

  • Reeves, T. C., & Okey, J. R. (1996). Alternative assessment for constructivist learning environments. In B. G. Wilson (Ed.), Constructivist learning environments: Case studies in instructional design (pp. 191–202). Englewood Cliffs, NJ: Educational Technology Publications.

  • Reeves, T. C., & Reeves, P. M. (1997). Effective dimensions of interactive learning on the World Wide Web. In B. H. Khan (Ed.), Web-based instruction (pp. 59–66). Englewood Cliffs, NJ: Educational Technology Publications.

  • Reigeluth, C. M. (1999). What is instructional-design theory and how is it changing. Instructional-Design Theories and Models: A New Paradigm of Instructional Theory,2, 5–29.

  • Reiser, B. J. (2004). Scaffolding complex learning: The mechanisms of structuring and problematizing student work. The Journal of the Learning Sciences,13(3), 273–304.

  • Rooney, J. E. (2003). Knowledge infusion. Association Management,55, 26–32.

  • Rossett, A. (2002). The ASTD e-learning handbook: Best practices, strategies, and case studies for an emerging field. New York: McGraw-Hill.

  • Roth, W.-M., & Bowen, G. M. (1995). Knowing and interacting: A study of culture, practices, and resources in a grade 8 open-inquiry science classroom guided by a cognitive apprenticeship metaphor. Cognition and Instruction,13(1), 73–128.

  • Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (2001). Methodological issues in the content analysis of computer conference transcripts. International Journal of Artificial Intelligence In Education (IJAIED), 12, 8–22.

  • Sands, P. (2002). Inside outside, upside downside. Strategies 8(6). Retrieved March 31, 2004 from http://www.uwsa.edu/ttt/articles/sands2.htm.

  • Sansone, C., Fraughton, T., Zachary, J. L., Butner, J., & Heiner, C. (2011). Self-regulation of motivation when learning online: The importance of who, why and how. Educational Technology Research and Development,59(2), 199–212. https://doi.org/10.1007/s11423-011-9193-6.

  • Scheiter, K., & Gerjets, P. (2007). Learner control in hypermedia environments. Educational Psychology Review,19(3), 285–307.

  • Schön, D. A. (1987). Educating the reflective practitioner: Toward a new design for teaching and learning in the professions. San Francisco: Jossey-Bass.

  • Schraw, G., Crippen, K. J., & Hartley, K. (2006). Promoting self-regulation in science education: Metacognition as part of a broader perspective on learning. Research in Science Education,36(1–2), 111–139.

  • Schunk, D. H. (1998). Teaching elementary students to self-regulate practice of mathematical skills with modeling. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning: From teaching to self-reflective practice. New York: Guilford.

  • Schunk, D. H., & Zimmerman, B. J. (1994). Self-regulation of learning and performance: Issues and educational applications. New Jersey: Lawrence Erlbaum Associates Inc.

  • Sharma, S., Dick, G., Chin, W. W., & Land, L. (2007). Self-regulation and E-learning. Paper presented at the ECIS.

  • Sims, R., & Hedberg, J. (1995). Dimensions of learner control a reappraisal for interactive multimedia instruction. In J. Lee (Ed.), First international workshop on intelligence and multimodality in multimedia interfaces: research and applications. Human Communication Research Centre, University of Edinburgh, Scotland.

  • Singh, H., et al. (2001). A white paper: Achieving success with blended learning. Centra Software,1, 1–11.

  • Smith, G. G., & Kurthen, H. (2007). Front-stage and back-stage in hybrid e-learning face-to-face courses. International Journal on E-Learning,6(3), 455–474.

  • Spanjers, I. A., Könings, K. D., Leppink, J., Verstegen, D. M., de Jong, N., Czabanowska, K., et al. (2015). The promised land of blended learning: Quizzes as a moderator. Educational Research Review. https://doi.org/10.1016/j.edurev.2015.05.001.

  • Sutton, L. A. (2001). The principle of vicarious interaction in computer-mediated communications. International Journal of Educational Telecommunications,7(3), 223–242.

  • Swanson, H. L., & Lussier, C. M. (2001). A selective synthesis of the experimental literature on dynamic assessment. Review of Educational Research,71(2), 321–363.

  • Thiede, K. W., Anderson, M., & Therriault, D. (2003). Accuracy of metacognitive monitoring affects learning of texts. Journal of Educational Psychology,95(1), 66.

  • Thiede, K. W., & Dunlosky, J. (1994). Delaying students’ metacognitive monitoring improves their accuracy in predicting their recognition performance. Journal of Educational Psychology,86(2), 290.

  • Thomson, I. (2002). Thomson job impact study: The next generation of corporate learning. Retrieved 7 July 2003.

  • Tough, A. (1978). Major learning efforts: Recent research and future directions. Adult Education Quarterly,28(4), 250–263.

  • Van Laer, S., & Elen, J. (2017). In search of attributes that support self-regulation in blended learning environments. Education and Information Technologies, 22(4), 1395–1454.

  • van Merriënboer, J. J. G., & Kirschner, P. A. (2001). Three worlds of instructional design: State of the art and future directions. Instructional Science,29(4–5), 429–441.

  • Veenman, M. V., Van Hout-Wolters, B. H., & Afflerbach, P. (2006). Metacognition and learning: Conceptual and methodological considerations. Metacognition and Learning,1(1), 3–14.

  • Vermunt, J. D., & Vermetten, Y. J. (2004). Patterns in student learning: Relationships between learning strategies, conceptions of learning, and learning orientations. Educational Psychology Review,16(4), 359–384.

  • Ward, J., & LaBranche, G. A. (2003). Blended learning: The convergence of e-learning and meetings. Franchising World,35, 22–24.

  • Weinstein, C. E., Zimmerman, S., & Palmer, D. (1988). Assessing learning strategies: The design and development of the LASSI. In C. E. Weinstein, E. T. Goetz, & P. A. Alexander (Eds.), Learning and study strategies: Issues in assessment, instruction, and evaluation (pp. 25–40). San Diego: Academic Press.

  • Wesiak, G., Steiner, C. M., Moore, A., Dagger, D., Power, G., Berthold, M., et al. (2014). Iterative augmentation of a medical training simulator: Effects of affective metacognitive scaffolding. Computers & Education,76, 13–29. https://doi.org/10.1016/j.compedu.2014.03.004.

  • Whitelock, D., & Jelfs, A. (2003). Editorial for special issue on blended learning: Blending the issues and concerns of staff and students. Journal of Educational Media,28, 99–100.

  • Wiggins, G. P. (1993). Assessing student performance: Exploring the purpose and limits of testing. New York: Jossey-Bass.

  • Williams, M. D. (1993). A comprehensive review of learner-control: The role of learner characteristics. In M. R. Simonson (Ed.), Proceedings of the annual conference of the AECT. Washington, DC: AECT.

  • Wilson, S., Liber, O., Johnson, M. W., Beauvoir, P., Sharples, P., & Milligan, C. D. (2007). Personal learning environments: Challenging the dominant design of educational systems. Journal of E-Learning and Knowledge Society,3(2), 27–38.

  • Winne, P. H. (1982). Minimizing the black box problem to enhance the validity of theories about instructional effects. Instructional Science,11(1), 13–28. https://doi.org/10.1007/Bf00120978.

  • Winne, P. H. (1985). Steps toward promoting cognitive achievements. The Elementary School Journal,85(5), 673–693. https://doi.org/10.1086/461429.

  • Winne, P. H. (1995). Inherent details in self-regulated learning. Educational Psychologist,30(4), 173–187. https://doi.org/10.1207/s15326985ep3004_2.

  • Winne, P. H. (1996). A metacognitive view of individual differences in self-regulated learning. Learning and Individual Differences,8(4), 327–353. https://doi.org/10.1016/S1041-6080(96)90022-9.

  • Winne, P. H. (2006). Handbook of educational psychology. Abingdon: Psychology Press.

  • Winne, P. H. (2015). What is the state of the art in self-, co-and socially shared regulation in CSCL? Computers in Human Behavior,52, 628–631.

  • Winne, P. (2016). Self-regulated learning. SFU Educational Review,1(1), 1.

  • Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated learning. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Metacognition in educational theory and practice (pp. 277–304). Mahwah, NJ: Erlbaum.

  • Winne, P. H., & Hadwin, A. (2013). nStudy: Tracing and supporting self-regulated learning in the internet. In R. Azevedo & V. Aleven (Eds.), International handbook of metacognition and learning technologies (Vol. 28, pp. 293–308). New York: Springer.

  • Winne, P. H., & Marx, R. W. (1989). A cognitive-processing analysis of motivation within classroom tasks. Research on Motivation in Education,3, 223–257.

  • Winne, P. H., Nesbit, J. C., Kumar, V., Hadwin, A. F., Lajoie, S. P., Azevedo, R., et al. (2006). Supporting self-regulated learning with gStudy software: The learning kit project. Technology Instruction Cognition and Learning,3(1/2), 105.

  • Winne, P. H., & Perry, N. E. (2000). Measuring self-regulated learning. In M. Boekaerts, P. R. Pintrich, & M. Zeidner (Eds.), Handbook of self regulation. Orlando: Academic Press.

  • Winne, P. H., & Stockley, D. B. (1998). Computing technologies as sites for developing self-regulated learning. In D. H. Schunk & B. J. Zimmerman (Eds.), Self-regulated learning: From teaching to self-reflective practice (pp. 106–136). New York: Guilford Press.

  • Woo, Y., & Reeves, T. C. (2007). Meaningful interaction in web-based learning: A social constructivist interpretation. The Internet and Higher Education,10(1), 15–25.

  • Wood, E., Woloshyn, V., & Willoughby, T. (1995). Cognitive strategy instruction for middle and high schools. Brookline: Brookline Books.

  • Young, M. F. (1993). Instructional design for situated learning. Educational Technology Research and Development,41(1), 43–58.

  • Young, J. R. (2001). “Hybrid” teaching seeks to end the divide between traditional and online instruction. The Chronicle of Higher Education,48, A33.

  • Zaki, M. J. (2001). SPADE: An efficient algorithm for mining frequent sequences. Machine Learning,42(1–2), 31–60.

  • Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal,45(1), 166–183.

  • Zimmerman, B. J., & Schunk, D. H. (2001). Self-regulated learning and academic achievement: Theoretical perspectives. Abingdon: Routledge.

Funding

We would like to acknowledge the support of the project “Adult Learners Online”, funded by the Agency for Science and Technology (Project Number: SBO 140029), which made this research possible.

Author information

Corresponding author

Correspondence to Stijn Van Laer.

Appendices

Appendix 1: Questions per Attribute

The framework comprises seven attributes, each with a main question and a set of sub-questions.

Authenticity

Main question: Does the learning environment contain authentic real-world relevance?

Sub-questions:

  • Is an authentic context provided that reflects the way the knowledge will be used in real life?
  • Are authentic activities provided?
  • Is there access to expert performances and the modelling of processes?
  • Are there multiple roles and perspectives provided?
  • Is there support for collaborative construction of knowledge?
  • Is articulation provided to enable tacit knowledge to be made explicit?
  • Is authentic assessment of learning provided within the tasks?

Personalization

Main question: Does the learning environment contain personalization?

Sub-questions:

  • Is the personalization name-recognized?
  • Is the personalization self-described?
  • Is the personalization cognitive-based?

Learner-control

Main question: Does the learning environment allow learner control?

Sub-questions:

  • Is control of pacing allowed?
  • Is control of content allowed?
  • Is control of learning activities allowed?
  • Is control of content sequence allowed?

Scaffolding

Main question: Does the learning environment scaffold support?

Sub-questions:

  • Is support tailored to the learner through continuous monitoring?
  • Does the support fade over time?
  • Is there a transfer of responsibilities over time?

Interaction

Main question: Does the learning environment entail interaction?

Sub-questions:

  • Is learner-content interaction facilitated?
  • Is learner-instructor interaction facilitated?
  • Is learner–learner interaction facilitated?
  • Is learner-interface interaction facilitated?
  • Is vicarious interaction facilitated?

Reflection cues

Main question: Does the learning environment contain reflection cues?

Sub-questions:

  • Does the reflection-for-action approach apply?
  • Does the reflection-in-action approach apply?
  • Does the reflection-on-action approach apply?

Calibration cues

Main question: Does the learning environment contain calibration cues?

Sub-questions:

  • Is a strategy applied to guide learners to delay metacognitive monitoring?
  • Is a strategy applied for the provision of forms that guide students to summarize content?
  • Are timed alerts given that guide students to summarize content?
  • Is a strategy applied for helping learners review the ‘right’ information?
  • Is a strategy applied for effective practice tests that provide students with records of their performance on past tests as well as items (or tasks) on those tests?
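As an illustration only, the checklist above can be encoded as a simple data structure for later scoring. The sketch below is a hypothetical convenience in Python (the appendix describes a paper-based scoring manual, not code); the attribute names and question wording follow the table, while the structure itself is an assumption.

```python
# Hypothetical encoding of (part of) the Appendix 1 framework: a dict
# mapping each attribute to its list of sub-questions. Not the authors'
# instrument; truncated attributes are marked with comments.
FRAMEWORK = {
    "Authenticity": [
        "Is an authentic context provided that reflects the way the "
        "knowledge will be used in real life?",
        "Are authentic activities provided?",
        # ... remaining authenticity sub-questions from the table above
    ],
    "Personalization": [
        "Is the personalization name-recognized?",
        "Is the personalization self-described?",
        "Is the personalization cognitive-based?",
    ],
    "Learner-control": [
        "Is control of pacing allowed?",
        "Is control of content allowed?",
        "Is control of learning activities allowed?",
        "Is control of content sequence allowed?",
    ],
    # ... Scaffolding, Interaction, Reflection cues, Calibration cues
}
```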

Appendix 2: Manual for Scoring Attributes

2.1 Authenticity

The use of the word authentic is open to interpretation. A substantial number of attempts have been made to define the concept transparently (see e.g., Bennet et al. 2002; Herrington 2005; Wesiak et al. 2014). Definitions range from real-world relevance (Wesiak et al. 2014), being needed in real-life situations (Sansone et al. 2011), and being of importance to the learner’s later professional life (Grimmett and Neufeld 1994), to models that focus on applying conceptual knowledge or skills such as critical thinking or problem solving (Young 1993). Based on their literature review, Van Laer and Elen (2017) defined authenticity as the real-world relevance (to both the learner’s professional and personal life) of, on the one hand, the learning environment (e.g., Herrington et al. 2003; Petraglia 1998; Roth and Bowen 1995) and, on the other hand, the task (e.g., Merrill 2002; Reigeluth 1999; van Merriënboer and Kirschner 2001). Guiding questions for identifying authenticity in learning environments and learning tasks are:

  1. Authentic context: Is an authentic context provided that reflects the way the knowledge will be used in real life? In designing online learning environments with authentic contexts, it is not enough simply to provide suitable examples from real-world situations to illustrate the concept or issue being taught. The context needs to be all-embracing, to provide the purpose and motivation for learning, and to provide a sustained and complex learning environment that can be explored at length (e.g., Brown et al. 1989; Reeves and Reeves 1997).

  2. Authentic activities: Are authentic activities provided? The learning environment needs to provide ill-defined activities that have real-world relevance and that present a single complex task to be completed over a sustained period of time, rather than a series of shorter disconnected examples (e.g., Bransford et al. 1990; Lebow and Wager 1994).

  3. Expert performance: Is there access to expert performances and the modelling of processes? To provide expert performances, the environment needs to offer access to expert thinking and the modelling of processes, access to learners at various levels of expertise, and access to the social periphery or the observation of real-life episodes as they occur (Collins et al. 1989).

  4. Multiple roles: Are there multiple roles and perspectives provided? For students to be able to investigate the learning environment from more than a single perspective, it is important to enable and encourage them to explore different perspectives on the topics from various points of view and to ‘criss-cross’ the learning environment repeatedly (Collins et al. 1989).

  5. Collaborative knowledge construction: Is there support for collaborative construction of knowledge? The opportunity for users to collaborate is an important design element, particularly for students who may be learning at a distance. Consequently, tasks need to be addressed to a group rather than an individual, and appropriate means of communication need to be established. Collaboration can be encouraged through appropriate tasks and communication technology (e.g., discussion boards, chats, email, debates) (e.g., Hooper 1992).

  6. Tacit knowledge made explicit: Is articulation provided to enable tacit knowledge to be made explicit? To provide opportunities for articulation, the tasks need to incorporate inherent opportunities to articulate, collaborative groups to enable articulation, and the public presentation of argument to enable defence of the position (e.g., Edelson et al. 1996).

  7. Authentic assessment: Is authentic assessment of learning provided within the tasks? To provide integrated and authentic assessment of student learning, the learning environment needs to give students the opportunity to be effective performers with acquired knowledge and to craft polished performances or products in collaboration with others. It also requires the assessment to be seamlessly integrated with the activity and to provide appropriate criteria for scoring varied products (e.g., Linn et al. 1991; Reeves and Okey 1996; Wiggins 1993).

2.2 Personalization

Personalization is often described as non-homogeneous experiences related directly to the learner (Wilson et al. 2007), associated with characters and objects of inherent interest to the learner, and connected with topics of high interest value (Cordova and Lepper 1996). In line with these views, and based on their literature review, Van Laer and Elen (2017) defined personalization as the modification of the learning environment to the inherent needs of each individual learner. Three major questions are raised by the current literature on the use of personalized learning environments (Devedžić 2006; Martinez 2002):

  1. Name-recognition: Is the personalization name-recognized? This type of personalization aims at acknowledging the learner as an individual. For example, the learner’s name can appear in the instruction, or previously collected and stored activities or accomplishments can later be presented when appropriate.

  2. Self-described: Is the personalization self-described? Self-described personalization enables learners (using questionnaires, surveys, registration forms, and comments) to describe preferences and common attributes. For example, learners may take a pre-course quiz to identify existing skills, preferences, or past experiences. Afterwards, options and instructional experiences appear based on the learner-provided answers.

  3. Cognition-based: Is the personalization cognitive-based? Cognition-based personalization uses information about cognitive processes, strategies, and abilities to deliver content specifically targeted at specific (cognitively defined) types of learners. For example, learners may choose an audio option because they prefer hearing text rather than reading it, or a learner may prefer the presentation of content in a linear fashion rather than an unsequenced presentation with hyperlinks.

2.3 Learner-Control

Learner-control refers to the amount of control learners have over support in blended learning environments. Different researchers identify different kinds of learner-control, varying from freedom of task-selection by the learner (Artino 2009) and control over learning sequences (sequence control) (Lin and Hsieh 2001), to decisions on which contents to receive (selection or content control), decisions on how specific content should be displayed (representation control), and control over the pace of information presentation (Scheiter and Gerjets 2007). Based on their literature review, Van Laer and Elen (2017) defined learner-control inclusively, building on these different kinds: learner-control concerns whether or not learners have control over the pacing, content, learning activities, and content sequence. Four major questions (Williams 1993) arise when describing learner-control in learning environments:

  1. Control over pacing: Is control of pacing allowed (Sims and Hedberg 1995)? These traces suggest that the learners have control over the speed at which instructional materials are presented; pacing control concerns the speed and time at which content is presented.

  2. Control over content: Is control of content allowed (Milheim and Martin 1991)? These traces suggest that the learner is permitted to skip over certain instructional units. This option generally refers to the selection of topics or objectives associated with a specific lesson, although it does not extend to a choice of which content items are displayed. This component of learner-control does not focus on the micro level of interaction, in which the learner must make certain choices in response to questions or problems. Therefore, while the learner has control over the content selected for study, the actual presentation of that content has generally remained instructor-driven. There thus appear to be two levels of content control: one where the learner chooses a module of study, and one where the presentation and associated display elements are also under learner control.

  3. Control over learning activities: Is control of learning activities allowed (Laurillard 1987)? This includes options for the student to see examples, do exercises, receive information, consult a glossary, ask for more explanation, and take a quiz.

  4. Control over content sequence: Is control of content sequence allowed? This includes provisions for the student to skip forward or backward by a chosen amount or to retrace a route through the material, and options to control when to view features such as content indexes or content maps. Sequence control refers to the order in which the content is viewed and is often defined in terms of being able to move to and fro among content items, such as those described by Gray (1988).

2.4 Scaffolding

Many different approaches to scaffolding have emerged from design research on interactive learning environments, and a variety of design guidelines or principles have been proposed (Edelson et al. 1999; Kolodner et al. 2004). Based on their literature review, Van Laer and Elen (2017) define scaffolding as changes in the task that allow learners to accomplish tasks that would otherwise be out of their reach (Reiser 2004). This definition is reflected in three major questions (Puntambekar and Hubscher 2005):

  1. Contingency: Is support tailored to the learner through continuous monitoring? The support must be adapted to the current level of the learner’s performance and should be either at the same or a slightly higher level. A tool for contingency is the use of diagnostic strategies: to provide tailored support, one must first determine the learner’s current level of competence. Many authors have acknowledged the importance of diagnosis in relation to scaffolding (e.g., Garza 2009; Lajoie 2005; Swanson and Lussier 2001).

  2. Fading over time: Does the support fade over time? Fading depends upon the learner’s level of development and competence. Support fades when its level and/or amount decreases over time.

  3. Transfer of responsibility: Is there a transfer of responsibilities over time? Responsibility for the performance of a task is gradually transferred to the learner. Responsibility can refer both to cognitive and metacognitive activities and to the learner’s affect. The responsibility for learning is transferred as the student takes increasing learner control.

2.5 Interaction

The nature of interaction in various forms of learning environments has been defined in a variety of ways, based upon the participants’ level of involvement in a specific learning opportunity and the objects of interaction, such as other participants or content materials. The nature of interaction also depends upon the context in which it occurs: in a face-to-face situation or at a distance. Van Laer and Elen (2017) describe interaction as the involvement of learners with elements in the learning environment. Five major interaction-related questions are taken into account (Woo and Reeves 2007):

  1. Learner-content interaction: Is learner-content interaction facilitated (Hiemstra 1993)? The first type of interaction is between the learner and the content or subject of study. It is often a one-way communication with a subject expert, intended to help learners in their study of the subject.

  2. Learner-instructor interaction: Is learner-instructor interaction facilitated (Moore 1989)? The second type of interaction is between the learner and the expert who prepared the subject material, or some other expert acting as an instructor.

  3. Learner-learner interaction: Is learner–learner interaction facilitated (Moore 1989)? The third form of interaction is inter-learner interaction, between one learner and other learners, alone or in group settings, with or without the real-time presence of an instructor.

  4. Learner-interface interaction: Is learner-interface interaction facilitated (Hillman et al. 1994)? The fourth type of interaction is between the learner and the tools needed to perform the required task.

  5. Vicarious interaction: Is vicarious interaction facilitated (Sutton 2001)? This final type of interaction takes place when a student actively observes and processes both sides of a direct interaction between two other students, or between another student and the instructor.

2.6 Reflection-Cues

Many different definitions of reflection have been proposed over time. Dewey (1958) defined reflection as “active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusion to which it tends” (p. 9). Moon (1999) describes reflection as “a form of mental processing with a purpose and/or anticipated outcome that is applied to relatively complex or unstructured ideas for which there is not an obvious solution” (p. 23). Boud et al. (2013) define reflection as “a generic term for those intellectual and affective activities in which individuals engage to explore their experiences in order to lead to a new understanding and appreciation” (p. 19). All three definitions emphasize purposeful critical analysis of knowledge and experience in order to achieve deeper meaning and understanding. Van Laer and Elen (2017) accordingly define reflection cues as prompts that aim to activate learners’ purposeful critical analysis of knowledge and experience, in order to achieve deeper meaning and understanding. This definition is operationalized via three major questions (Farrall 2007; Mann et al. 2009):

  1. Reflection-for-action: Does the reflection-for-action approach apply (Farrall 2007)? This type differs from the other two in that it is proactive in nature. For example, the instructor asks the learner about his or her personal expectations of an upcoming task.

  2. Reflection-in-action: Does the reflection-in-action approach apply (Farrall 2007; Schön 1987)? This type of reflection takes place while learners are performing a task. Reflective cues are given during task performance, prompting the learner to consider whether he or she needs to alter, amend, or change what he or she is doing in order to adjust to changing circumstances, get back into balance, attend accurately, and so on. Learners must check with themselves that they are on the right track and, if not, whether there is a better way. For example, an instructor asks learners to review the actions they are undertaking.

  3. Reflection-on-action: Does the reflection-on-action approach apply (Farrall 2007)? Munby and Russell (1992) describe it succinctly as the “systematic and deliberate thinking back over one’s actions”. Another definition involves thinking back on what one has done to discover how knowing-in-action might have contributed to an unexpected outcome (Hatton and Smith 1995). For example, an instructor asks the learner about his or her experiences regarding a task that has just been completed.

2.7 Calibration Cues

Calibration is defined as the match between learners’ perceptions of their performance and their actual performance, and between their perceived and actual use of study tactics (Bol and Garner 2011). Calibration concerns, on the one hand, the deviation of a learner’s judgment from fact, introducing notions of bias and accuracy, and, on the other hand, metric issues regarding the validity of cues’ contributions to judgments and the grain size of cues (Azevedo and Hadwin 2005). Van Laer and Elen (2017) define calibration cues as triggers for learners to test their perceptions of performance against their actual performance and their perceived use of study tactics against their actual use of study tactics. In identifying calibration cues we focus on five major questions (Nietfeld et al. 2006; Thiede and Dunlosky 1994):

  1. Cues for delayed metacognitive monitoring: Is a strategy applied to guide learners to delay metacognitive monitoring (Thiede and Dunlosky 1994)? This strategy is based on a phenomenon labelled the ‘delayed judgement of learning effect’, which shows improved judgments after a learning delay, similar to the improved performance associated with sessions distributed over time. For example, learners might first be asked to highlight a text and at a later time evaluate the highlighted content in terms of how well it is understood, how easily it can be retrieved, and how it relates to the learning objective; they are asked to evaluate previously made judgements.

  2. Forms for summarizing: Is a strategy applied for the provision of forms that guide students to summarize content? Summarizing information improves calibration accuracy, and it is suggested that summaries are more effective when forms and guidelines are provided (Wood et al. 1995). For example, an instructor gives the learners the task of summarizing a specific content component and reviewing it using a correction key.

  3. Timed alerts: Are timed alerts given that guide students to summarize content? Thiede et al. (2003) state that summarizing information after a delay improves calibration accuracy.

  4. Review of the ‘right’ information: Is a strategy applied for helping learners review the ‘right’ information (Bol and Garner 2011)? Learners have a tendency to select ‘almost learned’ or more interesting content for restudy. If students rated test items on judgement of learning and interest, they could be provided with feedback indicating that selecting content for restudy based on interest and minimal challenge may not be the best choice. For example, an instructor advises the learners to select exercises that are challenging for them.

  5. Effective practice tests: Is a strategy applied for effective practice tests that provide students with records of their performance on past tests as well as of items (or tasks) on those tests (Bol and Garner 2011)? Learners should be aware of the change in behaviour they should make; informing them of the mistakes they have already made may direct further attempts. For example, an instructor gives the results of the previous test as a guideline for the completion of the next test.

Appendix 3: Overview of Blended Learning Environments Described

See Fig. 6.

Fig. 6: Overview of the six blended learning environments and their total mean scores, described via the framework of the seven attributes that support self-regulation.
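The total mean score in Fig. 6 can be read as an aggregate over the seven attributes. Below is a minimal sketch of one way such a score could be computed, assuming each sub-question is answered 0 (absent) or 1 (present); the binary coding, function names, and example values are assumptions for illustration, not the authors’ published scoring procedure.

```python
# Minimal scoring sketch (not the authors' instrument): an attribute score
# is the mean of its binary sub-question answers, and a course's total mean
# score averages its attribute scores.
from statistics import mean

def attribute_score(answers: list[int]) -> float:
    """Mean of binary (0/1) sub-question answers for one attribute."""
    return mean(answers)

def total_mean_score(per_attribute: dict[str, list[int]]) -> float:
    """Average of the attribute scores for one blended learning environment."""
    return mean(attribute_score(a) for a in per_attribute.values())

# Hypothetical course: strong on interaction, weak on calibration cues.
course = {
    "Interaction": [1, 1, 1, 1, 0],       # 4 of 5 sub-questions present
    "Calibration cues": [0, 0, 1, 0, 0],  # 1 of 5 sub-questions present
}
print(round(total_mean_score(course), 2))  # 0.5
```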

Appendix 4: Variables Traced per School

School A

Content

1. Course module viewed (p < .05)
2. Course searched
3. Course viewed (p < .05)
4. List of modules viewed
5. User logged in in course

Content-related information

6. Content posted (p < .05)
7. Discussion made (p < .05)
8. Discussion viewed (p < .05)
9. Enrolled on discussion (p < .05)
10. Message made (p < .05)
11. Message modified (p < .05)
12. Note created
13. Note removed
14. Post made
15. Subscription made on discussion
16. Subscription removed

Tasks and assignments

17. Assignment made (p < .05)
18. Assignment saved (p < .05)
19. Assignment sent (p < .05)
20. File uploaded (p < .05)
21. Submissions made
22. Test attempt viewed (p < .05)
23. Test made (p < .05)
24. Test started (p < .05)
25. Test viewed
26. There is an uploaded file
27. User preserved submission

Scores and results

28. Score overview viewed
29. Status of assignment viewed (p < .05)
30. Submission form consulted (p < .05)
31. Summary test attempts viewed (p < .05)
32. Test attempt reviewed (p < .05)
33. Test checked
34. User score (p < .05)

School B

Content

1. Course module viewed (p < .05)
2. Course viewed (p < .05)
3. Feedback viewed
4. List of modules viewed
5. SCORM started (p < .05)
6. User logged in in course

Content-related information

7. Discussion created (p < .05)
8. Discussion viewed (p < .05)
9. Note created
10. Note removed
11. Post made
12. Subscription made on discussion
13. Subscription removed
14. User profile viewed

Tasks and assignments

15. Assignment made (p < .05)
16. Assignment saved (p < .05)
17. Assignment sent (p < .05)
18. File uploaded (p < .05)
19. Test viewed
20. There is an uploaded file
21. User preserved submission

Scores and results

22. Score report viewed (p < .05)
23. Status assignment viewed (p < .05)
24. Submission form viewed (p < .05)
25. Test checked
26. User score (p < .05)
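The abstract notes that log files were analysed for the frequency, diversity, and sequence of events. As a rough illustration of what such per-learner measures could look like for the traced variables above, here is a sketch in Python; the log format is an assumption, and the original analysis relied on dedicated sequence-analysis methods (cf. Gabadinho et al. 2011; Zaki 2001) rather than this simplification.

```python
# Toy per-learner measures over a time-ordered event log (assumed format).
from collections import Counter

# A learner's log as an ordered list of event labels from Appendix 4.
log = [
    "Course viewed", "Course module viewed", "Test started",
    "Test made", "Test attempt viewed", "Course module viewed",
]

frequency = Counter(log)           # how often each event type occurs
diversity = len(frequency)         # number of distinct event types used
bigrams = list(zip(log, log[1:]))  # crude sequence view: adjacent event pairs

print(frequency["Course module viewed"])  # 2
print(diversity)                          # 5
print(bigrams[0])                         # ('Course viewed', 'Course module viewed')
```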

Cite this article

Van Laer, S., Elen, J. Adults’ Self-Regulatory Behaviour Profiles in Blended Learning Environments and Their Implications for Design. Tech Know Learn 25, 509–539 (2020). https://doi.org/10.1007/s10758-017-9351-y
