Abstract
A typical gaming scenario, as it has developed over the past 20 years, involves a player interacting with a game through a specialized input device such as a joystick, a mouse, a keyboard or a proprietary game controller. Recent technological advances have enabled more elaborate approaches in which the player interacts with the game using body pose, facial expressions, actions and even physiological signals. The future lies in ‘affective gaming’, that is, games ‘intelligent’ enough not only to extract the player’s commands from speech and gestures, but also to pick up behavioural cues and emotional states, and to adjust the game narrative accordingly in order to ensure a more realistic and satisfying player experience. In this chapter, we review the area of affective gaming by describing existing approaches and discussing recent technological advances. More precisely, we first elaborate on the different sources of affect information in games and then address issues such as the affective evaluation of players and affective interaction in games. We summarize existing commercial affective gaming applications and introduce new gaming scenarios. We outline some of the most important problems that must be tackled to create more realistic and efficient interactions between players and games, and conclude by highlighting the challenges such systems must overcome.
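The sense-then-adapt loop the abstract describes can be sketched in a few lines. The following is a minimal, illustrative example only: the cue names (`heart_rate_bpm`, `smile_intensity`), the thresholds, and the fusion rule are all hypothetical assumptions, not a method from the chapter.

```python
# Toy sketch of an affective-adaptation loop: estimate the player's state
# from sensed cues, then nudge a game parameter to keep the player engaged.
# All thresholds and cue names here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class AffectEstimate:
    arousal: float   # 0.0 (calm) .. 1.0 (excited)
    valence: float   # -1.0 (negative) .. 1.0 (positive)


def estimate_affect(heart_rate_bpm: float, smile_intensity: float) -> AffectEstimate:
    """Toy fusion of one physiological cue and one visual cue."""
    # Map resting-to-elevated heart rate onto [0, 1] arousal.
    arousal = min(max((heart_rate_bpm - 60.0) / 60.0, 0.0), 1.0)
    # smile_intensity is assumed to lie in [0, 1]; rescale to [-1, 1] valence.
    valence = 2.0 * smile_intensity - 1.0
    return AffectEstimate(arousal, valence)


def adapt_difficulty(current: float, affect: AffectEstimate) -> float:
    """Ease off when the player seems frustrated; ramp up when bored."""
    if affect.arousal > 0.8 and affect.valence < 0.0:   # high arousal, negative
        current -= 0.1
    elif affect.arousal < 0.2:                          # low arousal: boredom
        current += 0.1
    return min(max(current, 0.0), 1.0)


# A calm player (near-resting heart rate) is read as bored, so the game
# difficulty is increased for the next interval.
affect = estimate_affect(heart_rate_bpm=62.0, smile_intensity=0.5)
difficulty = adapt_difficulty(0.5, affect)
```

Real systems replace each toy function with a trained recognizer over multimodal input and a model of the individual player, but the closed loop — sense, estimate, adapt — is the same.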
Acknowledgements
This work has been supported by the Action “Supporting Postdoctoral Researchers” of the Operational Program “Education and Lifelong Learning” (Action’s Beneficiary: General Secretariat for Research and Technology), co-financed by the European Social Fund (ESF) and the Greek State, and by the FP7 Technology-enhanced Learning project “Siren: Social games for conflIct REsolution based on natural iNteraction” (Contract no.: 258453). KK and GG have been supported by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program “Education and Lifelong Learning” of the National Strategic Reference Framework (NSRF), Research Funding Program “Thalis - Interdisciplinary Research in Affective Computing for Biological Activity Recognition in Assistive Environments”.
© 2016 Springer International Publishing Switzerland
Cite this chapter
Kotsia, I., Zafeiriou, S., Goudelis, G., Patras, I., Karpouzis, K. (2016). Multimodal Sensing in Affective Gaming. In: Karpouzis, K., Yannakakis, G. (eds) Emotion in Games. Socio-Affective Computing, vol 4. Springer, Cham. https://doi.org/10.1007/978-3-319-41316-7_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-41314-3
Online ISBN: 978-3-319-41316-7
eBook Packages: Biomedical and Life Sciences (R0)