Chapter

Artificial Intelligence in Education

Lecture Notes in Computer Science, vol. 6738, pp. 131–138

Affect Detection from Multichannel Physiology during Learning Sessions with AutoTutor

  • M. S. Hussain (National ICT Australia (NICTA); School of Electrical and Information Engineering, University of Sydney)
  • Omar AlZoubi (School of Electrical and Information Engineering, University of Sydney)
  • Rafael A. Calvo (School of Electrical and Information Engineering, University of Sydney)
  • Sidney K. D’Mello (Institute for Intelligent Systems, University of Memphis)


Abstract

It is widely acknowledged that learners experience a variety of emotions while interacting with Intelligent Tutoring Systems (ITS); hence, detecting and responding to these emotions might improve learning outcomes. This study uses machine learning techniques to detect learners’ affective states from multichannel physiological signals (heart activity, respiration, facial muscle activity, and skin conductivity) during tutorial interactions with AutoTutor, an ITS with conversational dialogues. Immediately after each tutorial session, learners self-reported the affective states they had experienced during their sessions with AutoTutor, both as discrete emotions and as degrees of valence and arousal, via a retrospective judgment protocol. In addition to mapping the discrete learning-centered emotions (e.g., confusion, frustration) onto a dimensional valence/arousal space, we developed and validated an automatic affect classifier using physiological signals. Results indicate that the classifier was moderately successful at detecting naturally occurring emotions during the AutoTutor sessions.
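To make the general shape of such a pipeline concrete, the sketch below shows, in Python with scikit-learn (a library not mentioned in the paper), how per-window physiological features paired with self-reported affect labels could be evaluated with cross-validation. The feature set, synthetic data, SVM classifier, and fold count are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only -- NOT the authors' pipeline. Assumes features have
# already been extracted from windows of heart activity, respiration, facial
# EMG, and skin-conductance signals, and paired with retrospective self-reports.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per signal window, with columns such as
# mean heart rate, heart-rate variability, respiration rate, EMG power, and
# skin-conductance level (all values synthetic here).
X = rng.normal(size=(200, 5))

# Hypothetical labels: self-reported learning-centered emotions per window.
emotions = np.array(["confusion", "frustration", "boredom", "engagement"])
y = rng.choice(emotions, size=200)

# Standardize features, fit an RBF-kernel SVM, and estimate accuracy
# with 10-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=10)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```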

Keywords

Affective computing · Emotion · AutoTutor · Multichannel physiology · Learning interaction · Self-reports