
Empathetic robot evaluation through emotion estimation analysis and facial expression synchronization from biological information

  • Original Article
  • Published in: Artificial Life and Robotics (2021)

Abstract

Empathy is an important factor in human communication. For a robot to respond with a matching emotion in human–robot communication, it must be able to understand human feelings. In this study, we therefore aimed to improve the human impression of a robot by having it display human-like facial expressions that change in real time, synchronized with human biological information. We first measured and estimated human emotions using an emotion estimation method based on biological information (brain waves and heartbeats). Three emotion estimation methods were proposed and evaluated in a preliminary experiment; the method based on the emotional value in each cycle yielded the highest impression rating and was therefore used in the second experiment. We then developed a robot that shows expressions in two patterns: (1) synchronized emotion (the same emotion the subject conveyed) and (2) inverse emotion (the opposite of the subject's emotion). The subjects evaluated the robot's expression in both patterns using the semantic differential (SD) method while their biological information was measured with the emotion estimation method selected in the preliminary experiment. The SD evaluation and the biological information showed that when a human experienced happiness and the robot synchronized and expressed the same emotion, intimacy between human and robot increased. These results indicate that the impression created by the robot's expression can be improved using biological information.
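The abstract describes a two-stage pipeline: estimate the subject's emotion from biological signals (brain waves and heartbeats), then have the robot display either the same expression (synchronized) or the opposite one (inverse). The Python sketch below illustrates one way such a pipeline could be structured. The signal normalization, the 0.5 thresholds, the four emotion labels, and the inverse-emotion pairing are illustrative assumptions; this is not the paper's "emotional value in each cycle" method.

from dataclasses import dataclass

@dataclass
class BioSample:
    arousal: float   # e.g., normalized EEG-derived arousal index, 0..1 (assumed)
    valence: float   # e.g., normalized heartbeat-derived valence, 0..1 (assumed)

def estimate_emotion(sample: BioSample) -> str:
    """Map an arousal/valence pair to one of four emotion quadrants (illustrative)."""
    high_arousal = sample.arousal >= 0.5
    positive_valence = sample.valence >= 0.5
    if high_arousal and positive_valence:
        return "happiness"    # high arousal, positive valence
    if high_arousal:
        return "anger"        # high arousal, negative valence
    if positive_valence:
        return "relaxation"   # low arousal, positive valence
    return "sadness"          # low arousal, negative valence

# Assumed inverse pairing: the diagonally opposite quadrant.
INVERSE = {"happiness": "sadness", "sadness": "happiness",
           "anger": "relaxation", "relaxation": "anger"}

def robot_expression(sample: BioSample, pattern: str) -> str:
    """Pick the robot's facial expression for the given pattern."""
    emotion = estimate_emotion(sample)
    return emotion if pattern == "synchronized" else INVERSE[emotion]

if __name__ == "__main__":
    s = BioSample(arousal=0.8, valence=0.7)
    print(robot_expression(s, "synchronized"))  # happiness
    print(robot_expression(s, "inverse"))       # sadness

In the paper's terms, the "synchronized" pattern conveys the same emotion estimated from the subject's biological information, while the "inverse" pattern conveys the opposite; how opposites are paired here is an assumption made for the sketch.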




Author information


Correspondence to Muhammad Nur Adilin Mohd Anuardi.



Cite this article

Sripian, P., Mohd Anuardi, M.N.A., Kajihara, Y. et al. Empathetic robot evaluation through emotion estimation analysis and facial expression synchronization from biological information. Artif Life Robotics 26, 379–389 (2021). https://doi.org/10.1007/s10015-021-00696-w

