Abstract
The use of sensors to support learning research and practice is not new, whether in the context of wearable technology, context-aware technology, ubiquitous systems or other areas. Nevertheless, the proliferation of sensing technology has driven the field of learning technology to develop tools and methods that can generate and leverage sensor-based analytics (SBA) to support complex learning processes. SBA fulfills the vision of integrating multiple sources of information, coming from different channels, to strengthen learning systems’ desired features (e.g., adaptation, affect detection) and augment learners’ abilities (e.g., through embodied interaction and cognition). In addition, it offers a promising avenue for improving research measurement in the field. In this chapter, we present how SBA has advanced learning technology through the lens of the qualities it offers, the objectives it intends to meet and the challenges it inevitably raises. Through three case study examples, we showcase how those advancements are reflected in contemporary Multimodal Learning Analytics (MMLA) research. The chapter concludes with a discussion of the role of SBA and a future research agenda that depicts how the lessons learned from the challenges encountered in MMLA can help us further improve the adoption of SBA for learning technology research and practice.
Notes
- 1.
analytics that utilize sensor data.
- 2.
What is Learning Analytics?: https://www.solaresearch.org/about/what-is-learning-analytics/.
- 3.
- 4.
For example, the Norwegian National Committees for Research Ethics provide general principles for conducting research: https://www.forskningsetikk.no/en/guidelines/general-guidelines/.
- 5.
The Fitness Calculator is an established tool that helps you to estimate your fitness level: https://www.ntnu.edu/cerg/vo2max#accurate.
Appendix: Categorization of Challenges and Opportunities, Across the Three Case Studies
| | Methodological | Practical | Ethical |
|---|---|---|---|
| Case study 1: Challenges | The trade-off between accuracy and authenticity. This is a controlled (lab) study. Lab studies tend to focus on the “accuracy” of the data collection and accept the low ecological validity (authenticity) of the results | Less impactful on practical adoption. There are concerns about how such treatment can occur in real-life settings, making it less impactful on adoption in practice | Less focus on potential ethical and privacy challenges. The researchers follow clinical research standards and procedures, with time-tested protocols to ensure research ethics; yet there is no focus on the larger ethical implications of the technology used |
| Case study 1: Opportunities | Proxies and markers. Controlled environments can help us develop powerful computational models that can be used in “out of lab” contexts and further validated to account for the lack of authenticity | First step: controlled-environment studies can generate significant research insights and can support discussions about adoption by addressing various observed practical challenges | Do we really need sensor data? Controlled studies can help us investigate the extent to which mainstream LA can serve as proxies; this allows us to develop sensor-free models with very high performance (sometimes even outperforming sensor-based models) and to optimize when sensor data are available |
| Case study 2: Challenges | Complexity of the measurements and their interpretation. One of the most important challenges concerns the measurement of the multifaceted notion of the “learning experience”. The learning experience cannot be portrayed by a single index, and its monitoring has to be context-specific | The pendulum of universality and context. Developing “universal” markers for the domain-specific goals someone should have is a common approach in other disciplines (e.g., VO2max and HR are universal yet context-dependent on resting vs. exercising); in learning this remains a challenge that we see in our studies but shy away from developing further | The paradox of participants’ high interest in the technology and low interest in data privacy. Using wearable sensors is probably a new experience for many participants, and researchers must patiently answer all their questions. Interestingly, considerable time was needed to explain the details of the wearable devices, yet few to no questions were asked about privacy of the collected data (all information was given in the consent form) |
| Case study 2: Opportunities | Safeguarding learners’ SBA. Methodologies and practices that safeguard SBA (at least at the level we safeguard mainstream log data) will allow us to boost use and adoption<br><br>Move to non-invasive/data-sensitive SBA (as much as possible). Affective information coming from cameras, EEG and other invasive or data-sensitive streams can be transferred, through proxies, to modalities such as EDA and HR. These modalities are not without challenges and controversy, but the risk is reduced while the benefit stays at the same level<br><br>Development of group-level SBA. Defining SBA at the group level would help develop group-level affordances and capabilities to support collaborative learning and awareness (e.g., team engagement or alignment in this case study) | Working towards a transparent “risk-benefit” assessment of the use of SBA. Employing SBA in education has inherent ethical consequences; however, these arise both from using and from denying the use of SBA (if there is a proven benefit). For example, how ethical is it to deny the use of SBA for learning if their significant advantage for learners is evidenced? | |
| Case study 3: Challenges | Development of functionalities that leverage SBA (as we do with mainstream analytics). Development of models and ontologies of meaningful SBA, and design of interfaces and system logic that utilises and accounts for SBA (e.g., feedback and adaptive support) | Space matters more for SBA. A properly designed room, or “spot”, makes it much easier for researchers and stakeholders to utilize SBA; however, allocating such space is rarely possible in most educational contexts | Significant concerns among stakeholders, even when no personal data are involved. Utilizing SBA can lead to significant concerns among stakeholders. Even when sensors are employed without capturing personal data (e.g., used for interaction only), this may serve as a barrier to future advancement and adoption of SBA in education |
| Case study 3: Opportunities | Leveraging the various capacities of sensors. Sensors should not be used only as an interaction modality or as an extra measurement, but need to enrich systems’ functionalities<br><br>Development of SBA support for near real-time use. Designing and developing affordances that leverage SBA’s temporality in a momentary and continuous manner (e.g., agents, dashboards, reports to the instructor) has been identified from both researchers’ observations and participants’ comments | Transparent, trustworthy and accountable SBA. Future research needs to devise frameworks and guidelines for SBA in education that allow researchers and practitioners to employ SBA in a transparent, trustworthy and accountable manner, resulting in support and adoption from stakeholders | |
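The opportunity of moving to less data-sensitive modalities mentions deriving affective proxies from streams such as EDA. As a purely illustrative sketch (the function name and metric are assumptions, not taken from the chapter), a minimal "positive change" style arousal proxy over a window of EDA samples could look like:

```python
from typing import List

def eda_positive_change(samples: List[float]) -> float:
    """Fraction of successive EDA samples that increase.

    A crude, hypothetical arousal proxy: counts positive first
    differences and normalises by the number of transitions.
    """
    if len(samples) < 2:
        return 0.0
    rises = sum(1 for a, b in zip(samples, samples[1:]) if b > a)
    return rises / (len(samples) - 1)

# A monotonically rising signal yields 1.0; a flat signal yields 0.0
print(eda_positive_change([0.1, 0.2, 0.3, 0.4]))  # -> 1.0
print(eda_positive_change([0.2, 0.2, 0.2]))       # -> 0.0
```

In practice such a proxy would be computed over sliding windows of a cleaned, artifact-free EDA signal, and validated against ground-truth annotations before any use in a learning system.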
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Giannakos, M., Cukurova, M., Papavlasopoulou, S. (2022). Sensor-Based Analytics in Education: Lessons Learned from Research in Multimodal Learning Analytics. In: Giannakos, M., Spikol, D., Di Mitri, D., Sharma, K., Ochoa, X., Hammad, R. (eds) The Multimodal Learning Analytics Handbook. Springer, Cham. https://doi.org/10.1007/978-3-031-08076-0_13
DOI: https://doi.org/10.1007/978-3-031-08076-0_13
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-08075-3
Online ISBN: 978-3-031-08076-0
eBook Packages: Computer Science, Computer Science (R0)