Integrating Multimodal Learning Analytics and Inclusive Learning Support Systems for People of All Ages

  • Kaori Tamura
  • Min Lu
  • Shin’ichi Konomi (corresponding author)
  • Kohei Hatano
  • Miyuki Inaba
  • Misato Oi
  • Tsuyoshi Okamoto
  • Fumiya Okubo
  • Atsushi Shimada
  • Jingyun Wang
  • Masanori Yamada
  • Yuki Yamada
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11577)

Abstract

Extended learning environments that combine systems for collecting learning analytics data with systems for supporting learners can benefit education for people of all ages. As a first step towards building such environments, we developed a multimodal learning analytics system based on eye tracking and EEG measurement, and an inclusive dual-tablet user interface designed for elderly learners. Multimodal learning analytics can help identify where and how learners with varied backgrounds experience difficulty in the learning process: the eye tracker reveals where learners direct their attention, while EEG signals provide clues for estimating their mental states during those gazes. We developed a system that measures these multimodal responses simultaneously and are working to integrate the resulting information to uncover learning problems. The dual-tablet user interface, with simplified visual layers and more intuitive operations, was designed to reduce the physical and mental load on elderly learners. A prototype was built on a cross-platform framework and is being refined through iterative formative evaluations with elderly participants to improve the usability of the interface design. We propose a system architecture that applies multimodal learning analytics and this user-friendly design for elderly learners, coupling learning analytics “in the wild” with learning analytics in controlled laboratory environments.
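
Aligning the two data streams on a shared clock is the key step in the kind of integration described above. The Python sketch below is purely illustrative and is not the authors' implementation: it assumes gaze fixations and EEG samples have already been recorded with timestamps on a common time base, and shows how an EEG epoch could be extracted for each fixation interval. All names (extract_fixation_epochs, the tuple layout of fixations) are hypothetical.

```python
import numpy as np

def extract_fixation_epochs(fixations, eeg_samples, eeg_times):
    """Pair each gaze fixation with the EEG samples recorded during it.

    fixations   : list of (start_s, end_s, x, y) tuples on a shared clock
    eeg_samples : array of shape (n_samples, n_channels)
    eeg_times   : array of shape (n_samples,) with per-sample timestamps

    Returns a list of (fixation, eeg_epoch) pairs. Hypothetical helper,
    not the system described in the paper.
    """
    epochs = []
    for start_s, end_s, x, y in fixations:
        # Select EEG samples whose timestamps fall inside the fixation window
        mask = (eeg_times >= start_s) & (eeg_times <= end_s)
        if mask.any():
            epochs.append(((start_s, end_s, x, y), eeg_samples[mask]))
    return epochs

if __name__ == "__main__":
    # Toy data: 4-channel EEG at 250 Hz for 10 s, and two fixations
    rng = np.random.default_rng(0)
    eeg_times = np.arange(0.0, 10.0, 1.0 / 250.0)
    eeg_samples = rng.normal(size=(eeg_times.size, 4))
    fixations = [(1.2, 1.6, 512, 300), (4.0, 4.8, 640, 420)]

    for fix, epoch in extract_fixation_epochs(fixations, eeg_samples, eeg_times):
        print(f"fixation {fix[:2]} -> {epoch.shape[0]} EEG samples")
```

In a real setup the two devices would need a shared or synchronized clock (or an explicit offset estimate) before such windowing is meaningful; the epochs could then feed whatever mental-state estimation the analysis calls for.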

Keywords

All-age learning · Learning support systems · Learning analytics · Multimodal sensing · Inclusive design

Acknowledgement

This work was supported by JST Mirai Grant Number 17-171024547, Japan.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Kaori Tamura (1)
  • Min Lu (1)
  • Shin’ichi Konomi (1) (corresponding author)
  • Kohei Hatano (1)
  • Miyuki Inaba (1)
  • Misato Oi (2)
  • Tsuyoshi Okamoto (1)
  • Fumiya Okubo (3)
  • Atsushi Shimada (4)
  • Jingyun Wang (5)
  • Masanori Yamada (1)
  • Yuki Yamada (1)
  1. Faculty of Arts and Science, Kyushu University, Fukuoka, Japan
  2. Innovation Center for Educational Resource, Kyushu University, Fukuoka, Japan
  3. Faculty of Business Administration, Takachiho University, Suginami-ku, Japan
  4. Faculty of Information Science and Electrical Engineering, Kyushu University, Fukuoka, Japan
  5. Research Institute for Information Technology, Kyushu University, Fukuoka, Japan
