Abstract
Humans express and perceive emotional states in a multi-modal fashion, through facial and acoustic expressions, gesture, and posture. Our task as AI researchers is to give computers the ability to communicate with users while taking their emotions into account. When recognizing a subject’s emotions, it is critically important to be aware of the emotional context. Thanks to advances in mobile technology, collecting such contextual data is now feasible. In this paper, the authors describe a first step toward extracting insightful emotional information using a cloud-based Big Data infrastructure. Relevant aspects of emotion recognition and the challenges of multi-modal emotion recognition are also discussed.
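To illustrate the multi-modal recognition idea sketched in the abstract, the following is a minimal, hypothetical example of decision-level (late) fusion: each modality classifier (e.g. face, voice) outputs a probability distribution over emotion labels, and the distributions are combined with a weighted average. The label set, weights, and probability values are illustrative assumptions, not the method of the paper.

```python
# Hypothetical late-fusion sketch for multi-modal emotion recognition.
# Each modality yields a probability distribution over emotion labels;
# distributions are merged via a normalized weighted average.

EMOTIONS = ["anger", "fear", "happiness", "sadness", "surprise", "disgust"]

def fuse_late(predictions, weights):
    """Weighted average of per-modality emotion distributions.

    predictions: dict mapping modality -> list of probabilities over EMOTIONS
    weights:     dict mapping modality -> non-negative weight
    """
    total = sum(weights[m] for m in predictions)
    fused = [0.0] * len(EMOTIONS)
    for modality, probs in predictions.items():
        w = weights[modality] / total  # normalize so weights sum to 1
        for i, p in enumerate(probs):
            fused[i] += w * p
    return fused

# Example: face and voice classifiers disagree in confidence but
# agree on the dominant emotion.
predictions = {
    "face":  [0.05, 0.05, 0.70, 0.05, 0.10, 0.05],
    "voice": [0.10, 0.10, 0.50, 0.20, 0.05, 0.05],
}
weights = {"face": 0.6, "voice": 0.4}

fused = fuse_late(predictions, weights)
label = EMOTIONS[fused.index(max(fused))]  # -> "happiness"
```

Late fusion like this is one common baseline; feature-level fusion and model-level fusion (e.g. the multi-stream HMMs cited in the references) trade robustness for tighter cross-modal coupling.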
© 2015 Springer International Publishing Switzerland
About this paper
Cite this paper
Baimbetov, Y., Khalil, I., Steinbauer, M., Anderst-Kotsis, G. (2015). Using Big Data for Emotionally Intelligent Mobile Services through Multi-Modal Emotion Recognition. In: Geissbühler, A., Demongeot, J., Mokhtari, M., Abdulrazak, B., Aloulou, H. (eds) Inclusive Smart Cities and e-Health. ICOST 2015. Lecture Notes in Computer Science, vol 9102. Springer, Cham. https://doi.org/10.1007/978-3-319-19312-0_11
DOI: https://doi.org/10.1007/978-3-319-19312-0_11
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-19311-3
Online ISBN: 978-3-319-19312-0
eBook Packages: Computer Science (R0)