Multimodal Sensing in Affective Gaming

  • Irene Kotsia
  • Stefanos Zafeiriou
  • George Goudelis
  • Ioannis Patras
  • Kostas Karpouzis
Part of the Socio-Affective Computing book series (SAC, volume 4)


A typical gaming scenario, as developed over the past 20 years, involves a player interacting with a game through a specialized input device, such as a joystick, a mouse, a keyboard or a proprietary game controller. Recent technological advances have enabled more elaborate approaches in which the player interacts with the game using body pose, facial expressions, actions and even physiological signals. The future lies in ‘affective gaming’, that is, games ‘intelligent’ enough not only to extract the player’s commands conveyed by speech and gestures, but also to infer behavioural cues and emotional states, and to adjust the game narrative accordingly so as to ensure a more realistic and satisfying player experience. In this chapter, we review the area of affective gaming by describing existing approaches and discussing recent technological advances. More precisely, we first elaborate on the different sources of affect information in games and then proceed to issues such as the affective evaluation of players and affective interaction in games. We summarize existing commercial affective gaming applications and introduce new gaming scenarios. We outline some of the most important problems that must be tackled to create more realistic and efficient interactions between players and games, and conclude by highlighting the challenges such systems must overcome.
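The adaptation loop the abstract describes — estimate the player’s emotional state, then adjust the game accordingly — can be sketched in a few lines. The sketch below is purely illustrative and not taken from the chapter: it assumes an affect estimate expressed as valence/arousal values in [−1, 1] (in the spirit of dimensional models of affect), and the names `AffectEstimate` and `adapt_difficulty` are hypothetical.

```python
# Illustrative sketch of an affect-driven game adaptation loop.
# Assumption: some upstream sensing pipeline (face, body, physiology)
# produces valence/arousal estimates in [-1, 1]. Names are hypothetical.

from dataclasses import dataclass


@dataclass
class AffectEstimate:
    valence: float  # displeasure (-1) .. pleasure (+1)
    arousal: float  # calm (-1) .. excitement (+1)


def adapt_difficulty(current: float, affect: AffectEstimate) -> float:
    """Nudge a difficulty level in [0, 1]: ease off when the player
    appears frustrated (negative valence, high arousal) and ramp up
    when the player appears bored (low arousal)."""
    if affect.valence < -0.3 and affect.arousal > 0.3:   # frustrated
        current -= 0.1
    elif affect.arousal < -0.3:                          # bored
        current += 0.1
    return min(1.0, max(0.0, current))


# One tick of the loop: a frustrated player eases the difficulty down.
difficulty = adapt_difficulty(0.5, AffectEstimate(valence=-0.6, arousal=0.7))
print(difficulty)
```

A real system would of course smooth noisy affect estimates over time and adapt richer variables than a scalar difficulty (pacing, narrative branching), but the closed sense-infer-adapt loop is the core idea the chapter surveys.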


Keywords: Facial expression · Haptic device · Facial expression recognition · Brain-computer interface · Wearable device



This work has been supported by the Action “Supporting Postdoctoral Researchers” of the Operational Program “Education and Lifelong Learning” (Action’s Beneficiary: General Secretariat for Research and Technology), co-financed by the European Social Fund (ESF) and the Greek State, and by the FP7 Technology-enhanced Learning project “Siren: Social games for conflIct REsolution based on natural iNteraction” (Contract no.: 258453). KK and GG have been supported by European Union (European Social Fund ESF) and Greek national funds through the Operational Program “Education and Lifelong Learning” of the National Strategic Reference Framework (NSRF)—Research Funding Program “Thalis - Interdisciplinary Research in Affective Computing for Biological Activity Recognition in Assistive Environments”.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Irene Kotsia (1, 2)
  • Stefanos Zafeiriou (3)
  • George Goudelis (4)
  • Ioannis Patras (1)
  • Kostas Karpouzis (5)
  1. School of Electronic Engineering and Computer Science, Queen Mary University of London, London, UK
  2. Electronics Laboratory, Department of Physics, University of Patras, Patras, Greece
  3. Department of Computing, Imperial College London, London, UK
  4. Image, Video and Multimedia Systems Lab, National Technical University of Athens, Zographou, Greece
  5. Institute of Communication and Computer Systems, National Technical University of Athens, Athens, Greece
