Body Model Transition by Tool Grasping During Motor Babbling Using Deep Learning and RNN

  • Kuniyuki Takahashi
  • Hadi Tjandra
  • Tetsuya Ogata
  • Shigeki Sugano
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9886)

Abstract

We propose a method of tool use that captures, within a single model, the transition of a robot's body model from not grasping to grasping a tool. In our previous research, we proposed a tool-body assimilation model in which a robot autonomously learns tool functions through experiences of motor babbling, using a deep neural network (DNN) and a recurrent neural network (RNN). However, the robot started its motion already holding the tool. In real-life situations, a robot must decide whether to grasp a tool (handling) or to move it without grasping (manipulating). To achieve this, the robot performs motor babbling without the tool pre-attached to its hand, executing each motion twice: once handling the tool and once manipulating it without grasping. To evaluate the model, we have the robot generate motions when shown the initial and target states. As a result, the robot generated the correct motions, including the decision of whether to grasp.
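The generation experiment described above can be illustrated with a minimal sketch: a recurrent network receives the current sensory state (image features plus joint angles) and predicts the next one, and a motion is generated closed-loop by feeding predictions back in. This is not the authors' implementation; the plain Elman-style RNN, the dimensions, and the random weights standing in for a trained network are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): image features from a deep
# autoencoder, arm joint angles, and the RNN hidden state.
IMG_FEAT, JOINTS, HIDDEN = 10, 7, 32
D = IMG_FEAT + JOINTS  # full sensory state vector

def rnn_step(x, h, Wx, Wh, Wy):
    """One step of a plain (Elman) RNN: returns the new hidden state
    and a prediction of the next sensory state."""
    h_new = np.tanh(x @ Wx + h @ Wh)
    y = h_new @ Wy
    return h_new, y

# Randomly initialised weights stand in for a trained network.
Wx = rng.normal(0.0, 0.1, (D, HIDDEN))
Wh = rng.normal(0.0, 0.1, (HIDDEN, HIDDEN))
Wy = rng.normal(0.0, 0.1, (HIDDEN, D))

# Closed-loop generation: show the initial state once, then feed the
# network's own predictions back in to roll out a motion trajectory.
x = rng.normal(size=D)   # initial image features + joint angles
h = np.zeros(HIDDEN)
trajectory = []
for _ in range(20):
    h, x = rnn_step(x, h, Wx, Wh, Wy)
    trajectory.append(x)

trajectory = np.asarray(trajectory)
print(trajectory.shape)  # (20, 17)
```

In the paper's setting the target state would additionally condition the rollout, and the predicted joint angles would drive the robot at each step; the sketch only shows the core predict-and-feed-back loop.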

Keywords

Grasping · Recurrent neural network · Deep neural network

Notes

Acknowledgment

This work has been supported by JSPS Grant-in-Aid for Scientific Research 15J12683; the Program for Leading Graduate Schools, “Graduate Program for Embodiment Informatics” of the Ministry of Education, Culture, Sports, Science, and Technology; JSPS Grant-in-Aid for Scientific Research (S) (2522005); “Fundamental Study for Intelligent Machine to Coexist with Nature,” Research Institute for Science and Engineering, Waseda University; MEXT Grant-in-Aid for Scientific Research (A) 15H01710; and MEXT Grant-in-Aid for Scientific Research on Innovative Areas “Constructive Developmental Science” (24119003).


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Kuniyuki Takahashi (1, 2)
  • Hadi Tjandra (1)
  • Tetsuya Ogata (3)
  • Shigeki Sugano (1)
  1. Graduate School of Creative Science and Engineering, Waseda University, Tokyo, Japan
  2. Research Fellow of the Japan Society for the Promotion of Science (JSPS Research Fellow), Tokyo, Japan
  3. Graduate School of Fundamental Science and Engineering, Waseda University, Tokyo, Japan
