Affective Game Dialogues

Using Affect as an Explicit Input Method in Game Dialogue Systems
  • Michael Lankes
  • Thomas Mirlacher
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7168)


Natural game input devices, such as Microsoft’s Kinect or Sony’s PlayStation Move, have become increasingly popular and allow player performance to be mapped directly onto actions in the game world. Games have been developed that enable players to interact with their avatars and other game objects via gestures and/or voice input. However, current technologies and systems do not tap into the full potential of affective approaches. Affect in games can be harnessed as a supportive and easy-to-use input method.

This paper proposes a design approach that utilizes facial expressions as an explicit input method in game dialogues. Our concept allows players to interact with Non-Player Characters (NPCs) by portraying specific basic emotions. As in adventure games, the player may choose between different dialogue options, which are displayed in textual form. The possible answers are coded so that each can be selected by a distinct facial expression. The player may, for example, choose to act aggressively towards an NPC by expressing anger. In contrast to traditional game dialogue systems, where players make their decisions solely by selecting textual information, the proposed approach adds an affective component to reduce misunderstandings of the provided information.
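The mapping described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the emotion labels, dialogue lines, and the function name are assumptions, and the facial-expression detection step is taken as given.

```python
# Hypothetical sketch of the proposed interaction: each dialogue option
# is coded to one basic emotion, and the player's detected facial
# expression selects the corresponding answer.

DIALOGUE_OPTIONS = {
    "anger":   "Get out of my way!",          # aggressive reply
    "joy":     "It's great to see you again.",  # friendly reply
    "sadness": "I wish things had gone differently.",  # mournful reply
}

def select_option(detected_emotion: str) -> str:
    """Return the dialogue line coded for the detected expression,
    or a neutral fallback when the expression is not recognized."""
    return DIALOGUE_OPTIONS.get(detected_emotion, "...")
```

A detector (e.g. a face-tracking library) would supply `detected_emotion`; the dictionary then plays the role of the textual option list shown to the player.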

A comparative study was conducted that compared our interaction design with a traditional approach (selection of options via mouse) in order to identify differences and benefits with regard to User Experience (UX). Results indicate that the use of explicit facial expressions in the context of game dialogue is quite promising.







Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Michael Lankes (1)
  • Thomas Mirlacher (2)
  1. University of Applied Sciences Upper Austria, Austria
  2. IRIT, Toulouse, France
