Romansy 16, pp. 255–262

Mechanical Design of Emotion Expression Humanoid Robot WE-4RII

  • Kazuko Itoh
  • Hiroyasu Miwa
  • Massimiliano Zecca
  • Hideaki Takanobu
  • Stefano Roccella
  • Maria Chiara Carrozza
  • Paolo Dario
  • Atsuo Takanishi
Part of the CISM Courses and Lectures book series (CISM, volume 487)

Abstract

Personal robots are expected to become common in the future, and they will need to take part in joint work and community life with humans. We have therefore been developing new mechanisms and functions for a humanoid robot that can express emotions and communicate naturally with humans. In this paper, we present the mechanical design of the Emotion Expression Humanoid Robot WE-4RII, which was developed by integrating the Humanoid Robot Hands RCH-1 into the previous version, WE-4R. The robot has four of the five human senses for detecting external stimuli (visual, tactile, auditory, and olfactory) and 59 degrees of freedom (DOFs) for expressing motion and emotion. It is capable of expressing seven basic emotional patterns.
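The pipeline the abstract describes (external stimuli sensed through four modalities, mapped to one of seven basic emotional patterns) can be sketched in code. The sketch below is purely illustrative and not the authors' control architecture: the paper does not enumerate the seven patterns here, so this example assumes Ekman's six basic emotions plus a neutral state, and the `select_emotion` rules are hypothetical placeholders for the robot's actual emotion model.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Emotion(Enum):
    """Seven basic emotional patterns (assumed set: Ekman's six plus neutral)."""
    HAPPINESS = auto()
    ANGER = auto()
    DISGUST = auto()
    FEAR = auto()
    SADNESS = auto()
    SURPRISE = auto()
    NEUTRAL = auto()

@dataclass
class Stimulus:
    """One external stimulus from the robot's four sensory modalities."""
    modality: str     # "visual", "tactile", "auditory", or "olfactory"
    valence: float    # -1.0 (unpleasant) .. +1.0 (pleasant)
    intensity: float  # 0.0 .. 1.0

def select_emotion(s: Stimulus) -> Emotion:
    """Map a stimulus to an emotional pattern (illustrative rules only)."""
    if s.intensity < 0.2:
        return Emotion.NEUTRAL
    if s.intensity > 0.8:
        # Strong stimuli: startle response, positive or negative.
        return Emotion.SURPRISE if s.valence >= 0 else Emotion.FEAR
    if s.valence > 0.3:
        return Emotion.HAPPINESS
    if s.valence < -0.3:
        # Unpleasant smells map to disgust; other unpleasant stimuli to anger.
        return Emotion.DISGUST if s.modality == "olfactory" else Emotion.ANGER
    return Emotion.SADNESS
```

In the real robot, the selected pattern would then drive the facial-expression and body-motion DOFs; here it simply names the target state.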

Keywords

Facial Expression, Recognition Rate, Humanoid Robot, Drive Mechanism, Average Recognition Rate


Copyright information

© CISM, Udine 2006

Authors and Affiliations

  • Kazuko Itoh (1, 2)
  • Hiroyasu Miwa (3, 4, 5)
  • Massimiliano Zecca (2, 4)
  • Hideaki Takanobu (2, 5, 6)
  • Stefano Roccella (7)
  • Maria Chiara Carrozza (2, 7)
  • Paolo Dario (2, 7)
  • Atsuo Takanishi (1, 2, 4, 5, 8)

  1. Department of Mechanical Engineering, Waseda University, Tokyo, Japan
  2. RoboCasa, Tokyo, Japan
  3. Digital Human Research Center, National Institute of Advanced Industrial Science and Technology (AIST), Tokyo, Japan
  4. Institute for Biomedical Engineering, ASMeW, Waseda University, Tokyo, Japan
  5. Humanoid Robotics Institute (HRI), Waseda University, Tokyo, Japan
  6. Department of Mechanical Systems Engineering, Kogakuin University, Tokyo, Japan
  7. ARTS Lab, Scuola Superiore Sant'Anna, Pontedera, Italy
  8. Advanced Research Institute for Science and Engineering, Waseda University, Tokyo, Japan
