Affective Computing in Games

  • Benjamin Guthier
  • Ralf Dörner
  • Hector P. Martinez
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9970)


Being able to automatically recognize and interpret the affective state of the player can have various benefits in a serious game. The difficulty and pace of a learning game could be adapted, or the quality of the interaction between the player and the game could be improved – to name just two examples. This chapter gives an introduction to affective computing with the goal of helping developers incorporate the player's affective data into their games. Suitable psychological models of emotion and personality are described, and a range of sensors as well as methods for recognizing affect are discussed in detail. The chapter ends with a number of examples in which human affect is utilized in serious games.


Keywords: Affective computing · Serious game · Emotion · Affect detection · Sensors · Physiological data · Facial expressions · Speech


  155. 155.
    Scherer, K.R.: What are emotions? And how can they be measured? Soc. Sci. Inf. 44(4), 695–729 (2005)CrossRefGoogle Scholar
  156. 156.
    Schuller, B., Rigoll, G., Lang, M.: Speech emotion recognition combining acoustic features and linguistic information in a hybrid support vector machine-belief network architecture. In: IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), vol. 1, p. I-577 (2004)Google Scholar
  157. 157.
    Schuller, B., Lang, M., Rigoll, G.: Robust acoustic speech emotion recognition by ensembles of classifiers. Fortschritte der Akustik 31(1), 329 (2005)Google Scholar
  158. 158.
    Schuller, B., Valster, M., Eyben, F., Cowie, R., Pantic, M.: AVCE 2012: the continuous audio/visual emotion challenge. In: Proceedings of the 14th ACM International Conference on Multimodal Interaction, pp. 449–456 (2012)Google Scholar
  159. 159.
    Setz, C., Arnrich, B., Schumm, J., La Marca, R., Troster, G., Ehlert, U.: Discriminating stress from cognitive load using a wearable EDA device. IEEE Trans. Inf. Technol. Biomed. 14(2), 410–417 (2010)CrossRefGoogle Scholar
  160. 160.
    Shen, L., Wang, M., Shen, R.: Affective e-learning: using emotional data to improve learning in pervasive learning environment. J. Educ. Technol. Soc. 12(2), 176–189 (2009)Google Scholar
  161. 161.
    Shergill, G.S., Sarrafzadeh, A., Diegel, O., Shekar, A.: Computerized sales assistants: the application of computer technology to measure consumer interest-a conceptual framework. J. Electron. Commer. Res. 9(2), 176–191 (2008)Google Scholar
  162. 162.
    Shi, Y., Ruiz, N., Taib, R., Choi, E., Chen, F.: Galvanic skin response (GSR) as an index of cognitive load. In: Proceedings of CHI 2007 Extended Abstracts on Human Factors in Computing Systems, pp. 2651–2656 (2007)Google Scholar
  163. 163.
    Shivhare, S.N., Khethawat, S.: Emotion detection from text. Comput. Sci. Inf. Technol. 5, 371–377 (2012)Google Scholar
  164. 164.
    Strapparava, C., Valitutti, A., et al.: WordNet Affect: an affective extension of WordNet. In: Proceedings of LREC, vol. 4, pp. 1083–1086 (2004)Google Scholar
  165. 165.
    Sun, F.-T., Kuo, C., Cheng, H.-T., Buthpitiya, S., Collins, P., Griss, M.: Activity-aware mental stress detection using physiological sensors. In: Gris, M., Yang, G. (eds.) MobiCASE 2010. LNICSSITE, vol. 76, pp. 211–230. Springer, Heidelberg (2012). doi: 10.1007/978-3-642-29336-8_12 CrossRefGoogle Scholar
  166. 166.
    Sun, Y., Hu, S., Azorin-Peris, V., Kalawsky, R., Greenwald, S.: Noncontact imaging photoplethysmography to effectively access pulse rate variability. J. Biomed. Optics 18(6), 1–9 (2013). Article 061205Google Scholar
  167. 167.
    Teixeira, T., Wedel, M., Pieters, R.: Emotion-induced engagement in internet video advertisements. J. Mark. Res. 49(2), 144–159 (2012)CrossRefGoogle Scholar
  168. 168.
    Thought Technology Ltd.Procomp infiniti system (2016). Accessed 26 May 2016
  169. 169.
    Tian, Y., Kanade, T., Cohn, J.F.: Facial expression recognition. In: Li, S.Z., Jain, A.K. (eds.) Handbook of Face Recognition, pp. 487–519. Springer, London (2011)CrossRefGoogle Scholar
  170. 170.
    Tiller, W.A., McCraty, R., Atkinson, M.: Cardiac coherence: a new, noninvasive measure of autonomic nervous system order. Altern. Ther. Health Med. 2(1), 52–65 (1996)Google Scholar
  171. O’Toole, A.J., Harms, J., Snow, S.L., Hurst, D.R., Pappas, M.R., Ayyad, J.H., Abdi, H.: A video database of moving faces and people. IEEE Trans. Pattern Anal. Mach. Intell. 27(5), 812–816 (2005)
  172. Trejo, L.J., Knuth, K., Prado, R., Rosipal, R., Kubitz, K., Kochavi, R., Matthews, B., Zhang, Y.: EEG-based estimation of mental fatigue: convergent evidence for a three-state model. In: Schmorrow, D.D., Reeves, L.M. (eds.) FAC 2007. LNCS (LNAI), vol. 4565, pp. 201–211. Springer, Heidelberg (2007). doi: 10.1007/978-3-540-73216-7_23
  173. Valstar, M., Pantic, M.: Induced disgust, happiness, surprise: an addition to the MMI facial expression database. In: Proceedings of International Workshop on EMOTION (satellite of LREC): Corpora for Research on Emotion and Affect, p. 65 (2010)
  174. Valstar, M.F., Gunes, H., Pantic, M.: How to distinguish posed from spontaneous smiles using geometric features. In: Proceedings of the International Conference on Multimodal Interfaces, pp. 38–45 (2007)
  175. Vasu, V., Heneghan, C., Arumugam, T., Sezer, S.: Signal processing methods for non-contact cardiac detection using Doppler radar. In: 2010 IEEE Workshop on Signal Processing Systems (SIPS), pp. 368–373 (2010)
  176. Vi, C., Subramanian, S.: Detecting error-related negativity for interaction design. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 493–502 (2012)
  177. Wache, J.: The secret language of our body: affect and personality recognition using physiological signals. In: Proceedings of the 16th International Conference on Multimodal Interaction, pp. 389–393 (2014)
  178. Watson, D., Tellegen, A.: Toward a consensual structure of mood. Psychol. Bull. 98(2), 219 (1985)
  179. Weigert, A.J.: Mixed Emotions: Certain Steps Toward Understanding Ambivalence. SUNY Press, New York (1991)
  180. Westerink, J.H.D.M., Van Den Broek, E.L., Schut, M.H., Van Herk, J., Tuinenbreijer, K.: Computing emotion awareness through galvanic skin response and facial electromyography. In: Probing Experience, pp. 149–162. Springer (2008)
  181. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann, Burlington (2005)
  182. Xu, J., Wang, Y., Chen, F., Choi, H., Li, G., Chen, S., Hussain, S.: Pupillary response based cognitive workload index under luminance and emotional changes. In: Proceedings of CHI 2011 Extended Abstracts on Human Factors in Computing Systems, pp. 1627–1632 (2011)
  183. Yik, M., Russell, J.A., Steiger, J.H.: A 12-point circumplex structure of core affect. Emotion 11(4), 705 (2011)
  184. Zeng, Z., Pantic, M., Roisman, G., Huang, T.S.: A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell. 31(1), 39–58 (2009)
  185. Zhou, F., Qu, X., Jiao, J.R., Helander, M.G.: Emotion prediction from physiological signals: a comparison study between visual and auditory elicitors. Interact. Comput. 26(3), 285–302 (2014)

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Benjamin Guthier (1, email author)
  • Ralf Dörner (2)
  • Hector P. Martinez (3)

  1. Department of Computer Science IV, University of Mannheim, Mannheim, Germany
  2. Department Design, Computer Science, Media, RheinMain University of Applied Sciences, Wiesbaden, Germany
  3. Center for Computer Games Research, IT University of Copenhagen, Copenhagen S, Denmark