Abstract
The present work reports the results of perceptual experiments aimed at investigating whether some of the basic emotions are perceptually privileged, and whether the cultural environment and the perceptual mode play a role in this preference. To this aim, Italian subjects were asked to assess emotional stimuli extracted from Italian and American English movies in a single mode (either video or audio alone) and in the combined audio/video mode. Results showed that anger, fear, and sadness are better perceived than surprise and happiness in both cultural environments (the perception of irony, by contrast, depends strongly on the language), that emotional information is affected by the communication mode, and that language plays a role in assessing emotional information. Implications for the implementation of emotionally colored interactive systems are discussed.
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Esposito, A., Riviello, M.T., Bourbakis, N. (2009). Cultural Specific Effects on the Recognition of Basic Emotions: A Study on Italian Subjects. In: Holzinger, A., Miesenberger, K. (eds) HCI and Usability for e-Inclusion. USAB 2009. Lecture Notes in Computer Science, vol 5889. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-10308-7_9
Print ISBN: 978-3-642-10307-0
Online ISBN: 978-3-642-10308-7