“How Is His/Her Mood”: A Question That a Companion Robot May Be Able to Answer

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9979)


Mood, as one of the human affects, plays a vital role in human-human interaction, especially due to its long-lasting effects. In this paper, we introduce an approach in which a companion robot capable of mood detection is employed to detect and report the mood state of a person to his/her partner, preparing the partner for upcoming encounters. Such a companion robot could be used at home or at work to improve the interaction experience for couples, partners, family members, etc. We implemented the proposed approach using a vision-based mood-detection method and tested it in an experiment and a follow-up study. Descriptive and statistical analyses were performed on the gathered data. The results show that this type of information can have a positive impact on partners' interactions.


Emotion · Facial expressions · HRI · Social robot · Mood



The first author would like to thank her friends in the ARIS and Mobile Robot Labs at the School of ECE, University of Tehran, as well as her colleagues at FANAP Company, for their kind help and participation in these experiments. Furthermore, she would like to thank Dr. Leila Kashani for her constructive review of and feedback on the manuscript. This work was supported by national funds through Fundação para a Ciência e a Tecnologia (FCT) with reference UID/CEC/50021/2013.



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Advanced Robotics and Intelligent Systems Lab, School of Electrical and Computer Engineering, College of Engineering, University of Tehran, Tehran, Iran
  2. INESC-ID and Instituto Superior Técnico, University of Lisbon, Porto Salvo, Portugal
  3. Intelligent Systems Research Institute, SKKU, Suwon, South Korea
  4. Center for Integrated Computer Systems Research, Faculty of Computer Science, University of British Columbia, Vancouver, Canada
