Affective Game Dialogues
Natural game input devices, such as Microsoft's Kinect or Sony's PlayStation Move, have become increasingly popular and allow a direct mapping of player performance onto actions in the game world. Games have been developed that enable players to interact with their avatars and other game objects via gestures and/or voice input. However, current technologies and systems do not tap into the full potential of affective approaches. Affect in games can be harnessed as a supportive and easy-to-use input method.
This paper proposes a design approach that utilizes facial expressions as an explicit input method in game dialogues. Our concept allows players to interact with Non-Player Characters (NPCs) by portraying specific basic emotions. As in adventure games, the player may choose between different dialogue options, which are displayed in textual form. The possible answers are coded so that they can be selected by distinct facial expressions; the player may, for example, choose to act aggressively towards an NPC by expressing anger. In contrast to traditional in-game dialogue systems, where players make their decisions solely by selecting text, the proposed approach includes an affective component to reduce misunderstanding of the provided information.
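The mapping described above can be sketched in a few lines: each dialogue option is tagged with the basic emotion that selects it, and the classifier's confidence scores for the player's current expression are resolved against those tags. This is a minimal illustrative sketch, not the authors' implementation; the option texts, the `select_option` helper, and the confidence threshold are all hypothetical.

```python
# Hypothetical sketch of emotion-coded dialogue options. The option texts,
# threshold, and classifier output format are illustrative assumptions.
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class DialogueOption:
    text: str
    emotion: str  # basic emotion that selects this option


OPTIONS = [
    DialogueOption("Get out of my way!", "anger"),
    DialogueOption("It's good to see you again.", "happiness"),
    DialogueOption("I... I don't know what to say.", "sadness"),
]


def select_option(scores: Dict[str, float],
                  threshold: float = 0.6) -> Optional[DialogueOption]:
    """Return the option whose emotion has the highest classifier
    confidence above the threshold, or None if the expression is
    too ambiguous to count as an explicit selection."""
    best = max(scores, key=scores.get)
    if scores[best] < threshold:
        return None  # keep listening for a clearer expression
    return next((o for o in OPTIONS if o.emotion == best), None)


# Example: per-emotion confidence scores from a face-tracking component
choice = select_option({"anger": 0.82, "happiness": 0.10, "sadness": 0.05})
print(choice.text)  # → Get out of my way!
```

A confidence threshold of this kind is one way to keep incidental expressions from triggering a dialogue choice, so that only deliberately portrayed emotions act as explicit input.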
A comparative study was conducted that compared our interaction design with a traditional approach (selection of options via mouse) in order to identify possible differences and benefits in terms of User Experience (UX). Results indicate that the use of explicit facial expressions in the context of game dialogue is quite promising.
Keywords: Facial Expression · Multiagent System · Basic Emotion · Game Design