Unsupervised Learning of Sensory Primitives from Optical Flow Fields

  • Oswald Berthold
  • Verena V. Hafner
Conference paper

DOI: 10.1007/978-3-319-08864-8_18

Part of the Lecture Notes in Computer Science book series (LNCS, volume 8575)
Cite this paper as:
Berthold O., Hafner V.V. (2014) Unsupervised Learning of Sensory Primitives from Optical Flow Fields. In: del Pobil A.P., Chinellato E., Martinez-Martin E., Hallam J., Cervera E., Morales A. (eds) From Animals to Animats 13. SAB 2014. Lecture Notes in Computer Science, vol 8575. Springer, Cham

Abstract

Adaptive behaviour of animats largely depends on the processing of their sensory information. In this paper, we examine the estimation of robot egomotion from visual input by unsupervised online learning. The input is a sparse optical flow field constructed from discrete motion detectors. The global flow field properties depend on the robot's motion, the spatial distribution of the motion detectors with respect to the robot body, and the visual environment. We show how online linear Principal Component Analysis can be applied to this problem to enable a robot to continuously adapt to a changing environment.
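The core idea in the abstract — extracting dominant egomotion components from a sparse flow field by online linear PCA — can be sketched with Oja's rule, a standard neural online PCA update. The detector count, noise level, and learning rate below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sparse flow field: n_detectors discrete motion detectors,
# each producing a 2-D flow vector, flattened into one sample vector.
n_detectors = 8
dim = 2 * n_detectors

# Assume a single egomotion (e.g. forward translation) induces a fixed
# global flow pattern; samples are scaled copies of it plus sensor noise.
pattern = rng.standard_normal(dim)
pattern /= np.linalg.norm(pattern)
speeds = 3.0 * rng.standard_normal((5000, 1))
samples = speeds * pattern + 0.1 * rng.standard_normal((5000, dim))

# Online linear PCA via Oja's rule: w <- w + eta * y * (x - y * w),
# where y = w . x. The weight vector converges (up to sign) to the
# first principal component of the input stream.
w = rng.standard_normal(dim)
w /= np.linalg.norm(w)
eta = 0.01
for x in samples:
    y = w @ x
    w += eta * y * (x - y * w)

# Alignment of the learned component with the true flow pattern.
alignment = abs(w @ pattern) / np.linalg.norm(w)
print(alignment)
```

Because the update is purely incremental, the component tracks slow changes in the flow statistics — the continuous adaptation to a changing environment that the abstract describes; extending to several components would use a deflation scheme such as Sanger's generalized Hebbian algorithm.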

Keywords

adaptive behaviour · source separation · feature learning · neural network · optical flow · primitives · redundancy · representation learning · sensor array · unsupervised · vision


Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Oswald Berthold (1)
  • Verena V. Hafner (1)

  1. Cognitive Robotics Group, Dept. of Computer Science, Humboldt-Universität zu Berlin, Berlin, Germany
