The Huggable Robot Probo, a Multi-disciplinary Research Platform

  • Kristof Goris
  • Jelle Saldien
  • Innes Vanderniepen
  • Dirk Lefeber
Part of the Communications in Computer and Information Science book series (CCIS, volume 33)

Abstract

The concept of the huggable robot Probo grew out of the desire to improve the living conditions of children in a hospital environment. These children need distraction and a great deal of information. In this paper we present the concept of this new robot. The robot will be employed in hospitals as a tele-interface for entertainment, communication and medical assistance. To communicate according to social rules, the robot needs the ability to show facial expressions. Using a well-defined set of Action Units (AUs), it is possible to express some basic emotions. A prototype of the robot's head, capable of showing these basic emotions, is presented. To express these emotions, an emotional interface was developed: each emotion, represented as a vector in a 2D emotion space, is mapped to the degrees of freedom (DOF) of the robot. A graphical user interface to control both the virtual and the real prototype is also presented.
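
The mapping from a point in the 2D emotion space to the robot's degrees of freedom can be sketched in a few lines of code. The sketch below is only an illustration of the idea: the basic-emotion poses, the joint names and the linear blending between neighbouring emotions are assumptions made for this example, not the mapping actually implemented in Probo.

    import math

    # Illustrative (not Probo's actual) neutral pose and basic-emotion poses for
    # a few facial DOF (angles in degrees).
    NEUTRAL = {"eyebrow": 0.0, "eyelid": 0.0, "ear": 0.0}
    BASIC_EMOTION_POSES = {
        "happy":     {"eyebrow": 10.0,  "eyelid": 5.0,   "ear": 20.0},
        "surprised": {"eyebrow": 25.0,  "eyelid": 20.0,  "ear": 10.0},
        "sad":       {"eyebrow": -15.0, "eyelid": -10.0, "ear": -20.0},
        "tired":     {"eyebrow": -5.0,  "eyelid": -20.0, "ear": -10.0},
    }
    # Direction of each basic emotion in the valence-arousal plane (degrees).
    EMOTION_ANGLES = {"happy": 0.0, "surprised": 90.0, "sad": 180.0, "tired": 270.0}

    def emotion_to_dof(valence, arousal):
        """Map an emotion vector (valence, arousal) in [-1, 1] to DOF set-points.

        The vector's angle selects the two nearest basic emotions, which are
        blended linearly; the vector's length scales the pose away from neutral.
        """
        angle = math.degrees(math.atan2(arousal, valence)) % 360.0
        intensity = min(1.0, math.hypot(valence, arousal))

        ordered = sorted(EMOTION_ANGLES.items(), key=lambda kv: kv[1])
        for i, (name, a) in enumerate(ordered):
            nxt_name, nxt_a = ordered[(i + 1) % len(ordered)]
            span = (nxt_a - a) % 360.0 or 360.0
            offset = (angle - a) % 360.0
            if offset <= span:
                w = offset / span  # blend weight toward the next basic emotion
                pose = {
                    d: (1 - w) * BASIC_EMOTION_POSES[name][d]
                       + w * BASIC_EMOTION_POSES[nxt_name][d]
                    for d in NEUTRAL
                }
                break

        # Scale the blended pose by the emotion's intensity, relative to neutral.
        return {d: NEUTRAL[d] + intensity * pose[d] for d in NEUTRAL}

    # Example: a mildly positive, calm emotional state.
    print(emotion_to_dof(0.6, 0.2))

A graphical interface such as the one mentioned above could then drive a function of this kind, sending the resulting set-points either to the virtual model or to the real head.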

Keywords

Emotional interface · Human-robot interaction · Huggable robot · Multi-disciplinary research platform



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Kristof Goris (1)
  • Jelle Saldien (1)
  • Innes Vanderniepen (1)
  • Dirk Lefeber (1)
  1. Robotics & Multibody Mechanics Research Group, Vrije Universiteit Brussel, Brussels, Belgium
