
Multimodal Sensing in Affective Gaming

Chapter in: Emotion in Games

Part of the book series: Socio-Affective Computing (SAC, volume 4)

Abstract

A typical gaming scenario, as it has developed over the past 20 years, involves a player interacting with a game through a specialized input device, such as a joystick, a mouse, a keyboard or a proprietary game controller. Recent technological advances have enabled more elaborate approaches in which the player interacts with the game using body pose, facial expressions, actions and even physiological signals. The future lies in ‘affective gaming’: games ‘intelligent’ enough not only to extract the player’s commands from speech and gestures, but also to extract behavioural cues and emotional states, and to adjust the game narrative accordingly, ensuring a more realistic and satisfying player experience. In this chapter, we review the area of affective gaming by describing existing approaches and discussing recent technological advances. More precisely, we first elaborate on the different sources of affect information in games and then address issues such as the affective evaluation of players and affective interaction in games. We summarize existing commercial affective gaming applications and introduce new gaming scenarios. We outline some of the most important problems that must be tackled to create more realistic and efficient interactions between players and games, and conclude by highlighting the challenges such systems must overcome.
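The sense-fuse-adapt loop the abstract describes can be sketched minimally as follows. This is a hypothetical illustration, not the chapter's method: the modality names, the valence/arousal representation (a common affect model), the fixed fusion weights and the difficulty thresholds are all assumptions chosen for clarity.

```python
from dataclasses import dataclass

@dataclass
class AffectEstimate:
    valence: float  # -1 (negative) .. 1 (positive)
    arousal: float  # -1 (calm) .. 1 (excited)

def fuse_modalities(face: AffectEstimate, body: AffectEstimate,
                    physio: AffectEstimate,
                    weights=(0.4, 0.3, 0.3)) -> AffectEstimate:
    """Weighted late fusion of per-modality valence/arousal estimates."""
    estimates = (face, body, physio)
    return AffectEstimate(
        valence=sum(w * e.valence for w, e in zip(weights, estimates)),
        arousal=sum(w * e.arousal for w, e in zip(weights, estimates)),
    )

def adapt_difficulty(current: float, affect: AffectEstimate) -> float:
    """Map the fused affect state to a game-side adjustment:
    frustration (negative valence, high arousal) eases the game,
    boredom (low arousal) makes it harder; result is clamped to [0, 1]."""
    if affect.valence < -0.3 and affect.arousal > 0.3:   # frustrated
        current -= 0.1
    elif affect.arousal < -0.3:                          # bored
        current += 0.1
    return max(0.0, min(1.0, current))
```

In a real system each `AffectEstimate` would come from a per-modality recognizer (facial expression analysis, body-motion features, physiological signals), and the adaptation step could drive narrative branching rather than a scalar difficulty.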


Notes

  1. Database available at http://institutedigitalgames.com/PED/


Acknowledgements

This work has been supported by the Action “Supporting Postdoctoral Researchers” of the Operational Program “Education and Lifelong Learning” (Action’s Beneficiary: General Secretariat for Research and Technology), co-financed by the European Social Fund (ESF) and the Greek State, and by the FP7 Technology-enhanced Learning project “Siren: Social games for conflIct REsolution based on natural iNteraction” (Contract no.: 258453). KK and GG have been supported by European Union (European Social Fund ESF) and Greek national funds through the Operational Program “Education and Lifelong Learning” of the National Strategic Reference Framework (NSRF)—Research Funding Program “Thalis - Interdisciplinary Research in Affective Computing for Biological Activity Recognition in Assistive Environments”.

Author information

Correspondence to Irene Kotsia.


Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Kotsia, I., Zafeiriou, S., Goudelis, G., Patras, I., Karpouzis, K. (2016). Multimodal Sensing in Affective Gaming. In: Karpouzis, K., Yannakakis, G. (eds) Emotion in Games. Socio-Affective Computing, vol 4. Springer, Cham. https://doi.org/10.1007/978-3-319-41316-7_4

