Humanoid robot heads for human-robot interaction: A review

  • Review
  • Published in: Science China Technological Sciences

Abstract

The humanoid robot head plays an important role in emotional expression during human-robot interaction (HRI). Such robots are emerging in industrial manufacturing, business reception, entertainment, teaching assistance, and tour guiding. In recent years, significant progress has been made in the field of humanoid robots; nevertheless, humanoid robots that can interact with humans naturally and comfortably are still lacking. This review presents a comprehensive survey of state-of-the-art technologies for humanoid robot heads over the last three decades, covering mechanical structures, actuators and sensors, anthropomorphic behavior control, emotional expression, and human-robot interaction. Finally, current challenges and possible future directions are discussed.
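To illustrate the kind of emotional-expression control the review surveys, the following is a minimal sketch, not taken from the paper, of how a head controller might map an emotional state in Russell's valence-arousal (circumplex) space to facial actuator set-points by blending prototype expressions. All class names, prototype poses, and coordinates here are hypothetical, chosen only for demonstration.

import math
from dataclasses import dataclass

# Hypothetical illustration: map a (valence, arousal) emotion state to
# normalized commands for a few facial degrees of freedom.

@dataclass
class FacialPose:
    brow_raise: float    # 0 = neutral, 1 = fully raised
    mouth_corner: float  # -1 = frown, +1 = smile
    eye_open: float      # 0 = closed, 1 = wide open

# Prototype poses at assumed points of the valence-arousal plane.
PROTOTYPES = {
    "happy":   ((0.9, 0.5),   FacialPose(0.4, 1.0, 0.7)),
    "angry":   ((-0.8, 0.7),  FacialPose(0.0, -0.8, 0.9)),
    "sad":     ((-0.7, -0.5), FacialPose(0.2, -0.6, 0.4)),
    "relaxed": ((0.6, -0.4),  FacialPose(0.1, 0.3, 0.5)),
}

def blend_pose(valence: float, arousal: float) -> FacialPose:
    """Inverse-distance blend of prototype poses in valence-arousal space."""
    weights, total = {}, 0.0
    for name, ((v, a), _) in PROTOTYPES.items():
        d = math.hypot(valence - v, arousal - a)
        w = 1.0 / (d + 1e-6)   # closer prototypes dominate the blend
        weights[name] = w
        total += w
    brow = sum(w * PROTOTYPES[n][1].brow_raise for n, w in weights.items()) / total
    mouth = sum(w * PROTOTYPES[n][1].mouth_corner for n, w in weights.items()) / total
    eye = sum(w * PROTOTYPES[n][1].eye_open for n, w in weights.items()) / total
    return FacialPose(brow, mouth, eye)

if __name__ == "__main__":
    # A mildly positive, calm state should land between "relaxed" and "happy".
    print(blend_pose(valence=0.5, arousal=-0.2))

Inverse-distance blending keeps the face transitioning smoothly as the emotional state moves through the plane, a common motivation for using dimensional emotion models to drive expressive robot heads.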


Author information

Corresponding authors

Correspondence to Yi Li, YanBiao Li or Minoru Hashimoto.

Additional information

This work was supported by the Zhejiang Provincial Natural Science Foundation of China (Grant Nos. LY22E050019 and LGG21E050015), the Ningbo Public Welfare Research Program Foundation of China (Grant No. 2023S066), the National Natural Science Foundation of China (Grant No. U21A20122), and the JSPS Grant-in-Aid for Scientific Research (C) (Grant No. JP22K04010).

Supporting Information

The supporting information is available online at tech.scichina.com and link.springer.com. The supporting materials are published as submitted, without typesetting or editing. The responsibility for scientific accuracy and content remains entirely with the authors.



About this article


Cite this article

Li, Y., Zhu, L., Zhang, Z. et al. Humanoid robot heads for human-robot interaction: A review. Sci. China Technol. Sci. 67, 357–379 (2024). https://doi.org/10.1007/s11431-023-2493-y

