When Students Get Stuck: Adaptive Remote Labs as a Way to Support Students in Practical Engineering Education

  • Anja Hawlitschek
  • Till Krenz
  • Sebastian Zug
Chapter

Abstract

In computer science, dropout rates are high due to students' heterogeneous prior knowledge and skills. In this paper, we consider an adaptive remote laboratory as one approach to addressing this challenge at the course level. A remotely controlled laboratory allows each learner to work at their own pace and in their own learning space. As an adaptive learning environment, it is tailored to the needs of the learner. Investigating the underlying causes of dropout is a precondition for providing suitable forms of adaptivity. Therefore, we analyzed cognitive and motivational differences between students who drop out and students who persist. Additionally, we analyzed user behavior, i.e., a pattern of user-system interaction that might go hand in hand with dropout: the probability of error streaks. Our results indicate that students in the dropout group had a significantly higher probability of getting stuck than their fellow students. On the one hand, students in the dropout group reported significantly higher extraneous cognitive load, indicating difficulties in understanding the task and in applying an adequate procedure while solving it. On the other hand, there were problems in the process of programming: students in the dropout group had a significantly higher probability of falling into error streaks. In the article, we describe practical implications for both types of getting stuck. Based on our findings, we consider the monitoring and analysis of error streaks to be a particularly promising basis for designing adaptive instructional interventions in courses in which students have to write program code.
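To make the idea of monitoring error streaks concrete, the following is a minimal sketch, not the authors' implementation: it assumes a log of compile/run attempts per student (the Attempt record, its fields, and the streak threshold are all hypothetical) and flags learners whose longest run of consecutive failed attempts reaches that threshold, which could then trigger an adaptive intervention such as a hint or tutor notification.

```python
# Hypothetical sketch of error-streak detection from a submission log.
# The data model (student_id, timestamp, success) and the threshold of 5
# are illustrative assumptions, not values reported in the chapter.

from collections import defaultdict
from typing import Iterable, NamedTuple


class Attempt(NamedTuple):
    student_id: str
    timestamp: float   # seconds since session start
    success: bool      # True if the submission compiled/ran without error


def longest_error_streak(attempts: Iterable[Attempt]) -> dict:
    """Return the longest run of consecutive failed attempts per student."""
    current = defaultdict(int)
    longest = defaultdict(int)
    for a in sorted(attempts, key=lambda a: (a.student_id, a.timestamp)):
        current[a.student_id] = 0 if a.success else current[a.student_id] + 1
        longest[a.student_id] = max(longest[a.student_id], current[a.student_id])
    return dict(longest)


def flag_students(attempts: Iterable[Attempt], threshold: int = 5) -> list:
    """Students whose longest error streak reaches the (assumed) threshold."""
    return [s for s, n in longest_error_streak(attempts).items() if n >= threshold]


if __name__ == "__main__":
    log = [
        Attempt("s01", 10, False), Attempt("s01", 55, False),
        Attempt("s01", 90, False), Attempt("s01", 140, False),
        Attempt("s01", 200, False),
        Attempt("s02", 12, False), Attempt("s02", 60, True),
    ]
    print(flag_students(log))  # -> ['s01']
```

In a remote laboratory, such a check could run on each submission so that a student who keeps hitting the same compiler or runtime errors is noticed while they are still working, rather than after they have already disengaged.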

Keywords

Adaptivity · Dropout · Remote laboratory · Cognitive load · Error streaks

Acknowledgments

This work was partially supported by the German Federal Ministry of Education and Research (BMBF, Funding number: 16DHL1033).

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Magdeburg-Stendal University of Applied Sciences, Magdeburg, Germany
  2. Otto-von-Guericke-University Magdeburg, Magdeburg, Germany
