
Emotion Modelling for Social Robotics Applications: A Review

Published in: Journal of Bionic Engineering

Abstract

Today's robots are moving beyond constrained industrial environments into unexplored, unstructured settings, opening extensive real-world applications as service and social robots. Alongside these new physical frontiers, they must also face human ones. This implies that human-robot interaction must be considered from the beginning of the design process: a robot should be able to recognize users' emotions and, to a certain extent, react and "behave" appropriately. Such a capability could play a fundamental role in the integration of robots into society, yet it is still far from being achieved. Over the past decade, several attempts have been made to deploy automata in applications outside industry, but very few have tried to incorporate the emotional state of users into the robot's behavioural model, since doing so raises questions such as: how should human emotions be modelled to correctly represent a user's state of mind? Which sensing modalities and classification methods are most feasible for obtaining this knowledge? And for which applications is such sensitivity most suitable? In this context, this paper provides a general overview of recent attempts to enable robots to recognize human emotions and interact appropriately.
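One family of emotion models the review surveys represents affect as a point in a continuous valence-arousal ("circumplex") space rather than as a discrete label. The sketch below illustrates, under stated assumptions, how a continuous estimate fused from a robot's sensors might be mapped back to a discrete emotion; the coordinates and the `classify` function are hypothetical, chosen only to make the idea concrete, not taken from any system discussed in the paper.

```python
import math

# Hypothetical positions of a few basic emotions in a valence-arousal
# plane (both axes in [-1, 1]). Real systems would calibrate these
# anchors empirically; the values here are illustrative only.
EMOTION_COORDS = {
    "happy":   ( 0.8,  0.5),
    "angry":   (-0.6,  0.7),
    "sad":     (-0.7, -0.4),
    "relaxed": ( 0.6, -0.5),
}

def classify(valence: float, arousal: float) -> str:
    """Map a continuous (valence, arousal) estimate -- e.g. one fused
    from facial-expression and physiological sensing -- to the nearest
    discrete emotion anchor by Euclidean distance."""
    return min(
        EMOTION_COORDS,
        key=lambda label: math.dist((valence, arousal), EMOTION_COORDS[label]),
    )

print(classify(0.7, 0.6))    # high valence, high arousal -> "happy"
print(classify(-0.7, -0.5))  # low valence, low arousal -> "sad"
```

The appeal of this representation, as the dimensional approaches in the review suggest, is that a robot's behaviour can vary smoothly with the continuous estimate while discrete labels remain available whenever a categorical decision is needed.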



Author information

Correspondence to Filippo Cavallo.


About this article


Cite this article

Cavallo, F., Semeraro, F., Fiorini, L. et al. Emotion Modelling for Social Robotics Applications: A Review. J Bionic Eng 15, 185–203 (2018). https://doi.org/10.1007/s42235-018-0015-y
