Using Big Data for Emotionally Intelligent Mobile Services through Multi-Modal Emotion Recognition

  • Conference paper
  • First Online:
Inclusive Smart Cities and e-Health (ICOST 2015)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 9102)

Abstract

Humans express and perceive emotional states in a multi-modal fashion, through facial and acoustic expressions, gestures, and posture. Our task as AI researchers is to give computers the ability to communicate with users while taking their emotions into account. When recognizing a subject’s emotions, awareness of the emotional context is crucial. Thanks to advances in mobile technology, collecting such contextual data has become feasible. In this paper, the authors describe a first step towards extracting insightful emotional information using a cloud-based Big Data infrastructure. Relevant aspects of emotion recognition and the challenges that come with multi-modal emotion recognition are also discussed.
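
The full text is not included on this page, but to make the abstract’s notion of multi-modal emotion recognition more concrete, the sketch below illustrates one common approach: decision-level (late) fusion, where per-modality emotion scores (for example, from a facial and an acoustic classifier) are combined by a weighted average. This is an illustrative assumption, not the authors’ method; the modality names, weights, and emotion labels are hypothetical.

# Minimal sketch of decision-level (late) fusion for multi-modal emotion
# recognition. NOT the method from the paper: modality names, weights,
# and emotion labels are illustrative assumptions.

EMOTIONS = ["anger", "happiness", "sadness", "neutral"]

# Per-modality confidence weights (hypothetical; in practice these would
# be learned or tuned on validation data).
MODALITY_WEIGHTS = {"face": 0.5, "voice": 0.3, "text": 0.2}


def fuse(scores_by_modality: dict[str, dict[str, float]]) -> dict[str, float]:
    """Weighted average of per-modality emotion probabilities.

    Missing modalities (e.g. no camera frame available) are skipped and
    the remaining weights are renormalised.
    """
    fused = {e: 0.0 for e in EMOTIONS}
    total_weight = 0.0
    for modality, scores in scores_by_modality.items():
        w = MODALITY_WEIGHTS.get(modality, 0.0)
        if w == 0.0:
            continue
        total_weight += w
        for emotion in EMOTIONS:
            fused[emotion] += w * scores.get(emotion, 0.0)
    if total_weight == 0.0:
        return fused
    return {e: v / total_weight for e, v in fused.items()}


if __name__ == "__main__":
    # Hypothetical per-modality classifier outputs for one observation.
    observation = {
        "face": {"anger": 0.10, "happiness": 0.70, "sadness": 0.05, "neutral": 0.15},
        "voice": {"anger": 0.20, "happiness": 0.50, "sadness": 0.10, "neutral": 0.20},
        "text": {"anger": 0.05, "happiness": 0.40, "sadness": 0.05, "neutral": 0.50},
    }
    fused = fuse(observation)
    print(max(fused, key=fused.get), fused)

In practice the weights would be learned or tuned rather than fixed, and richer fusion strategies (such as feature-level fusion) are also used; this sketch only shows the simplest variant.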

Author information

Corresponding author

Correspondence to Matthias Steinbauer.

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Baimbetov, Y., Khalil, I., Steinbauer, M., Anderst-Kotsis, G. (2015). Using Big Data for Emotionally Intelligent Mobile Services through Multi-Modal Emotion Recognition. In: Geissbühler, A., Demongeot, J., Mokhtari, M., Abdulrazak, B., Aloulou, H. (eds) Inclusive Smart Cities and e-Health. ICOST 2015. Lecture Notes in Computer Science, vol 9102. Springer, Cham. https://doi.org/10.1007/978-3-319-19312-0_11

  • DOI: https://doi.org/10.1007/978-3-319-19312-0_11

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-19311-3

  • Online ISBN: 978-3-319-19312-0

  • eBook Packages: Computer Science, Computer Science (R0)
