International Journal of Social Robotics, Volume 8, Issue 2, pp 183–192

Development of a Socially Interactive System with Whole-Body Movements for BHR-4

  • Gan Ma
  • Junyao Gao
  • Zhangguo Yu
  • Xuechao Chen
  • Qiang Huang
  • Yunhui Liu

Abstract

Humans have long communicated with one another through voice, facial expressions, and body movements. A humanoid robot that interacts in the natural, human-like manner people are accustomed to is more readily accepted. To date, however, most existing humanoid robots have had difficulty interacting with humans in such a human-like way. This study addresses this issue by developing a socially interactive system that enhances the natural communication ability of a humanoid robot. The system, implemented on the android robot BHR-4, provides hearing, voice conversation, and facial and body emotional expression capabilities. A full-body social motion planner is then presented, whose objective is to control the whole-body motion of the robot in a manner similar to that of humans. Finally, human-robot interaction experiments are conducted with the robot in an indoor environment. The socially interactive system is expected to enhance the natural communication ability of an android robot. The experimental results show that combining verbal behavior with both facial expressions and body movements is better than verbal behavior alone, verbal behavior with facial expressions only, or verbal behavior with body movements only.
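The abstract outlines a pipeline that couples hearing, voice conversation, facial expression, and whole-body motion. The following minimal Python sketch illustrates how such an interaction loop could be organized; all function names, emotion labels, and stub behaviors are hypothetical placeholders for illustration, not the actual BHR-4 implementation.

# Minimal sketch of a social interaction loop that combines speech with
# facial and whole-body expression. All components are hypothetical stubs
# standing in for the speech, face, and motion subsystems described above.
import threading
from dataclasses import dataclass

@dataclass
class Response:
    text: str       # reply to be spoken
    emotion: str    # emotion label driving face and body motion

def recognize_speech() -> str:
    """Stub for the hearing / speech-recognition subsystem."""
    return "Hello, robot!"

def plan_response(utterance: str) -> Response:
    """Stub dialogue manager: map the utterance to a reply and an emotion."""
    if "hello" in utterance.lower():
        return Response(text="Hello! Nice to meet you.", emotion="happiness")
    return Response(text="I see.", emotion="neutral")

def speak(text: str) -> None:
    print(f"[voice] {text}")

def show_facial_expression(emotion: str) -> None:
    print(f"[face] showing '{emotion}' expression")

def play_body_motion(emotion: str) -> None:
    print(f"[body] playing whole-body motion for '{emotion}'")

def interaction_step() -> None:
    utterance = recognize_speech()
    response = plan_response(utterance)
    # Verbal behavior, facial expression, and body movement are issued
    # together, since the study finds their combination most effective.
    workers = [
        threading.Thread(target=speak, args=(response.text,)),
        threading.Thread(target=show_facial_expression, args=(response.emotion,)),
        threading.Thread(target=play_body_motion, args=(response.emotion,)),
    ]
    for w in workers:
        w.start()
    for w in workers:
        w.join()

if __name__ == "__main__":
    interaction_step()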

Keywords

Human–robot interaction · Whole-body movement · Social robot · Humanoid robot

Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  • Gan Ma (1, 2)
  • Junyao Gao (1)
  • Zhangguo Yu (1)
  • Xuechao Chen (1)
  • Qiang Huang (1)
  • Yunhui Liu (1, 3)

  1. The Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing, China
  2. Humanoid Robotics Institute, Waseda University, Tokyo, Japan
  3. The Department of Mechanical and Automation Engineering, The Chinese University of Hong Kong, Hong Kong
