Sensors for Seamless Learning

  • Marcus Specht (email author)
  • Limbu Bibeg Hang
  • Jan Schneider Barnes
Part of the Lecture Notes in Educational Technology book series (LNET)


This chapter highlights the role of sensors in supporting seamless learning experiences. The first part introduces the relation between sensor tracking of learning activities and research on real-time feedback in educational situations, and gives an overview of the kinds of sensor data that have been used for educational purposes in the literature. The second part introduces the link between sensor data and educational interventions, in particular the role of building expert models from real-world tracking of experts. The third part illustrates how educational AR applications have used sensor data for different forms of learning support; the authors present 15 design patterns, implemented in different educational AR applications, that build on this analysis of sensor tracking. For future AR applications, the authors propose that using sensors to build expert performance models is essential for a variety of educational interventions.



Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  • Marcus Specht (1), email author
  • Limbu Bibeg Hang (1, 2)
  • Jan Schneider Barnes (1, 3)
  1. Center for Education and Learning, TU Delft, Delft, Netherlands
  2. Welten Institute, Open Universiteit Netherlands, Heerlen, Netherlands
  3. DIPF German Institute for Pedagogical Research, Frankfurt, Germany
