Chapter

Neural Information Processing

Volume 4985 of the series Lecture Notes in Computer Science, pp 214-221

From Biologically Realistic Imitation to Robot Teaching Via Human Motor Learning

  • Erhan Oztop (JST, ICORP, Computational Brain Project; ATR Computational Neuroscience Laboratories)
  • Jan Babic (ATR Computational Neuroscience Laboratories; Department of Automation, Biocybernetics and Robotics, Jozef Stefan Institute)
  • Joshua Hale (JST, ICORP, Computational Brain Project; ATR Computational Neuroscience Laboratories)
  • Gordon Cheng (JST, ICORP, Computational Brain Project; ATR Computational Neuroscience Laboratories)
  • Mitsuo Kawato (JST, ICORP, Computational Brain Project; ATR Computational Neuroscience Laboratories)


Abstract

Understanding the mechanisms of imitation is a complex task in both the human sciences and robotics. On the one hand, one can use engineering techniques to build systems that analyze observed motion, map it onto their own body, and produce the motor commands needed to achieve the inferred motion. On the other hand, one can model the neural circuits involved in action observation and production in minute detail and hope that imitation will emerge as a property of the system. However, if the goal is to build robots capable of skillful actions, midway solutions appear more appropriate. In this direction, we first introduce a conceptually biologically realistic neural network that can learn to imitate hand postures, either with the help of a teacher or by self-observation. We then move to a paradigm we have recently proposed, in which robot skill synthesis is achieved by exploiting the human capacity to learn novel control tasks.
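
To make the self-observation route to imitation concrete, the following Python sketch shows one way such learning could be set up: the robot samples its own hand postures, observes the resulting visual features, and fits a small network mapping features back to joint angles, which it can then apply to features observed on a demonstrator's hand. This is only an illustration under stated assumptions and not the model described in the chapter; the forward "vision" function, the feature and network dimensions, and the shared feature space between robot and demonstrator are all hypothetical choices.

    # Minimal self-observation sketch (illustrative only, not the chapter's model).
    import numpy as np

    rng = np.random.default_rng(0)
    N_JOINTS, N_FEATURES, N_HIDDEN = 5, 8, 32

    # Hypothetical stand-in for cameras + feature extraction: an unknown smooth
    # map from the robot's joint angles to visual features of its hand.
    W_vis = rng.normal(size=(N_JOINTS, N_FEATURES))
    def observe(joints):
        return np.tanh(joints @ W_vis)

    # Self-observation data: random self-generated postures and their appearance.
    joints_train = rng.uniform(-1.0, 1.0, size=(5000, N_JOINTS))
    features_train = observe(joints_train)

    # Two-layer network mapping visual features to joint angles, trained by
    # gradient descent on the squared error of the reproduced posture.
    W1 = rng.normal(scale=0.1, size=(N_FEATURES, N_HIDDEN))
    W2 = rng.normal(scale=0.1, size=(N_HIDDEN, N_JOINTS))
    lr = 0.05
    for epoch in range(2000):
        h = np.tanh(features_train @ W1)      # hidden activations
        pred = h @ W2                         # predicted joint angles
        err = pred - joints_train             # posture reproduction error
        grad_W2 = h.T @ err / len(err)
        grad_W1 = features_train.T @ (err @ W2.T * (1 - h ** 2)) / len(err)
        W2 -= lr * grad_W2
        W1 -= lr * grad_W1

    # "Imitation": features observed on a demonstrator's hand (assumed to live in
    # the same feature space) are mapped to the robot's own joint angles.
    demo_joints = rng.uniform(-1.0, 1.0, size=(1, N_JOINTS))
    demo_features = observe(demo_joints)
    imitated = np.tanh(demo_features @ W1) @ W2
    print("demonstrated:", np.round(demo_joints, 2))
    print("imitated:    ", np.round(imitated, 2))

In this toy setting, self-observation supplies the paired visual and motor data without any teacher; a teacher-driven variant would instead provide the target joint angles externally for each observed posture.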