Estimate Emotion Method to Use Biological, Symbolic Information: Preliminary Experiment

  • Yuhei Ikeda
  • Midori Sugaya
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9743)

Abstract

Imagine the day when a robot comforts you when you feel sad. To achieve the ability to estimate emotions and feelings, a great deal of work has been done in the field of artificial intelligence [1] and in robot engineering focused on human-robot communication, especially as it applies to therapy [2, 3]. Generally, estimating people's emotions relies on expressed information such as facial expressions, eye-gaze direction, and behaviors that are observable by the robot [4, 5, 6]. However, this information is not always suitable, because some people do not express their emotions through observable cues. In such cases it is difficult to estimate emotion, however sophisticated the analysis technologies may be. The main idea of our proposal is to use biological information to estimate a person's actual emotion. Preliminary experiments show that the suggested method outperforms the traditional method for people who do not express their emotions directly.
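
As a rough illustration of this idea (a sketch, not the authors' published method), the following Python fragment maps two biological signals onto Russell's circumplex model of affect [12]: pNN50 [18], computed from heart-rate RR intervals such as those from the Arduino heartbeat sensor shield [21], serves as a relaxation (valence) proxy, while a 0-100 attention score from a brain-wave sensor, as used in [10, 11], serves as an arousal proxy. The scaling, thresholds, and quadrant labels are illustrative assumptions only.

```python
# A minimal sketch (not the authors' published method) of mapping biological
# signals onto Russell's circumplex model of affect [12]. Assumptions:
# RR intervals (ms) come from a heartbeat sensor such as [21]; pNN50 [18] is
# used as a proxy for relaxation (valence); the attention score (0-100, as in
# [10, 11]) is used as an arousal proxy; all scaling and labels are
# illustrative placeholders, not validated values.

from typing import List, Tuple

def pnn50(rr_intervals_ms: List[float]) -> float:
    """Fraction of successive RR-interval differences exceeding 50 ms."""
    diffs = [abs(b - a) for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    if not diffs:
        return 0.0
    return sum(d > 50 for d in diffs) / len(diffs)

def estimate_affect(rr_intervals_ms: List[float],
                    attention: float) -> Tuple[float, float, str]:
    """Map pNN50 (valence proxy) and attention (arousal proxy, 0-100)
    to a quadrant of the valence-arousal plane."""
    valence = 2.0 * pnn50(rr_intervals_ms) - 1.0  # [-1, 1]; high pNN50 = relaxed
    arousal = attention / 50.0 - 1.0              # [-1, 1]
    if arousal >= 0:
        label = "excited" if valence >= 0 else "stressed"
    else:
        label = "relaxed" if valence >= 0 else "depressed"
    return valence, arousal, label

if __name__ == "__main__":
    rr = [812, 790, 845, 880, 798, 760, 815]  # synthetic RR intervals in ms
    print(estimate_affect(rr, attention=72))
```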

Keywords

Estimate emotion · Robotics application · Biological information · Estimation · Feeling

Notes

Acknowledgement

We would like to thank the Tateishi Science Foundation and MEXT/JSPS KAKENHI Grant Number 15K00105 for the grants that made it possible to complete this study.

References

  1. Muehlhauser, L., Helm, L.: Intelligence explosion and machine ethics. In: Eden, A.H., Moor, J.H., Søraker, J.H., Steinhart, E. (eds.) Singularity Hypotheses: A Scientific and Philosophical Assessment, pp. 101–126. Springer, Heidelberg (2012)
  2. Weingartz, S.: Robotising Dementia Care? A Qualitative Analysis on Technological Mediations of a Therapeutic Robot Entering the Lifeworld of Danish Nursing Homes. MA thesis, European Studies of Science, Society and Technology (ESST), Cambridge (2011)
  3. Ekman, P.: Universals and cultural differences in facial expressions of emotions. In: Cole, J. (ed.) Nebraska Symposium on Motivation, pp. 207–282 (1972)
  4. Traver, V.J., del Pobil, A.P., Pérez-Francisco, M.: Making service robots human-safe. In: Proceedings of the 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2000), pp. 696–701. IEEE (2000)
  5. Song, W.-K., et al.: Visual servoing for a user's mouth with effective intention reading in a wheelchair-based robotic arm. In: Proceedings of the 2001 IEEE International Conference on Robotics and Automation (ICRA 2001), pp. 3662–3667. IEEE (2001)
  6. Ono, K., Miyamichi, J., Yamaguchi, T.: Intelligent robot system using "model of knowledge, emotion and intention" and "information sharing architecture". In: Proceedings of the 2001 IEEE International Symposium on Computational Intelligence in Robotics and Automation, pp. 498–501. IEEE (2001)
  7. Hiraki, N.: Zibun no kimochi wo kichinto <tsutaeru> gizyutsu (Techniques for Communicating One's Feelings Properly). PHP Kenkyuzyo (2007)
  8. Ohkura, M., et al.: Measurement of "wakuwaku" feeling generated by interactive systems using biological signals. In: Proceedings of the KANSEI Engineering and Emotion Research International Conference, pp. 2293–2301 (2010)
  9. Minamitani, H.: Fatigue and stress. J. Soc. Biomechanisms 21(2), 58–64 (1997)
  10. Navalyal, G.U., Gavas, R.D.: A dynamic attention assessment and enhancement tool using computer graphics. Hum.-Cent. Comput. Inf. Sci. 4(1), 1–7 (2014)
  11. Chu, K.-Y., Wong, C.Y.: Player's attention and meditation level of input devices on mobile gaming. In: 2014 3rd International Conference on User Science and Engineering (i-USEr), pp. 13–17. IEEE (2014)
  12. Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161–1178 (1980)
  13. Grimm, M., et al.: Primitives-based evaluation and estimation of emotions in speech. Speech Commun. 49(10), 787–800 (2007)
  14. Sakamatsu, H., et al.: Proposal of self-feedback interface by MMD model based on detecting emotions using biosensors. In: 2015 Information Processing Society of Japan, pp. 602–605 (2015)
  15. Kawazoe, J., et al.: momo!: mood modeling and visualization based on vital information. In: IPSJ Ubiquitous Computing Systems (UBI), pp. 79–86 (2007)
  16. Hayashi, M., Miyashita, H., Okada, K.: A mapping method for KANSEI information utilizing physiological information in virtual reality space. In: Groupware and Network Services (GN), pp. 25–30 (2008)
  17.
  18. Mietus, J.E., et al.: The pNNx files: re-examining a widely used heart rate variability measure. Heart 88(4), 378–380 (2002)
  19. Moscato, F., et al.: Continuous monitoring of cardiac rhythms in left ventricular assist device patients. Artif. Organs 38(3), 191–198 (2014)
  20. Image Sensing Technology | Products | OMRON Electronic Components Web (2015). https://www.omron.com/ecb/products/mobile/
  21. Arduino Heartbeat Sensor Shield Kit A.P. Shield 05, Tokyo Devices (2015). https://tokyodevices.jp/items/3
  22. RAPIRO: official site (2015). http://www.rapiro.com/ja/

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. College of Engineering, Shibaura Institute of Technology, Tokyo, Japan
