AttentiveLearner2: A Multimodal Approach for Improving MOOC Learning on Mobile Devices

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10331)

Abstract

We propose AttentiveLearner2, a multimodal mobile learning system for MOOCs that runs on unmodified smartphones. AttentiveLearner2 uses the front and back cameras of a smartphone as two complementary, fine-grained real-time feedback channels: the back camera monitors learners’ photoplethysmography (PPG) signals and the front camera tracks their facial expressions during MOOC learning. By analyzing these PPG signals and facial expressions, AttentiveLearner2 implicitly infers learners’ affective and cognitive states during learning. In a 26-participant user study, we found it feasible to detect six types of emotions during learning from the collected PPG signals and facial expressions, and that the two modalities complement each other.
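The abstract does not detail the sensing pipeline, but camera-based PPG in prior mobile systems typically works as follows: with a fingertip covering the back camera lens, the mean red-channel intensity of each frame rises and falls with blood volume, and heart rate can be recovered from the peaks of that waveform. The sketch below illustrates this general idea only; the function names, the 30 fps frame rate, and the 0.7–3.5 Hz band are illustrative assumptions, not the authors’ actual pipeline.

```python
# A minimal, hypothetical sketch of camera-based PPG heart-rate estimation,
# assuming the fingertip covers the back camera lens. Not the authors' code.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def ppg_from_frames(frames):
    """Reduce each RGB frame (H x W x 3, uint8) to one PPG sample:
    the mean red-channel intensity, which pulses with blood volume."""
    return np.array([frame[..., 0].mean() for frame in frames])

def estimate_heart_rate(ppg, fps=30.0):
    """Band-pass the raw PPG around plausible heart rates (0.7-3.5 Hz,
    i.e. 42-210 BPM) and count peaks to estimate beats per minute."""
    nyquist = fps / 2.0
    b, a = butter(2, [0.7 / nyquist, 3.5 / nyquist], btype="band")
    filtered = filtfilt(b, a, ppg - ppg.mean())
    # Require peaks at least 0.3 s apart (caps the estimate near 200 BPM).
    peaks, _ = find_peaks(filtered, distance=int(0.3 * fps))
    duration_s = len(ppg) / fps
    return 60.0 * len(peaks) / duration_s

# Example with synthetic frames whose red channel pulses at 1.2 Hz (72 BPM):
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
red = (128 + 10 * np.sin(2 * np.pi * 1.2 * t)).astype(np.uint8)
frames = [np.full((8, 8, 3), r, dtype=np.uint8) for r in red]
print(f"Estimated HR: {estimate_heart_rate(ppg_from_frames(frames), fps):.0f} BPM")
```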

Keywords

Mobile learning · Intelligent tutoring systems · Massive open online courses · Multimodal interaction

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

Computer Science and LRDC, University of Pittsburgh, Pittsburgh, USA
