Towards Knowledge-Based Affective Interaction: Situational Interpretation of Affect

Abstract

Human-computer interaction in a variety of applications could benefit if systems could accurately analyze and respond to their users’ affect. Although a great deal of research has been conducted on affect recognition, very little of this work has considered what information is appropriate to extract in specific situations. Towards understanding how specific applications such as affective tutoring and affective entertainment could benefit, we present two experiments. In the first experiment, we found that students’ facial expressions, together with their body actions, gave little information about their internal emotion per se, but served as useful features for predicting their self-reported “true” mental state. In the second experiment, we found significant differences between the facial expressions and self-reported affective states of viewers watching a movie sequence. Our results suggest that the noisy relationship between observable gestures and underlying affect must be accounted for when designing affective computing applications.