A Bio-Inspired Model for Visual Collision Avoidance on a Hexapod Walking Robot

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9793)


Abstract

While navigating its environment, an autonomous mobile robot must actively avoid collisions with obstacles. Flying insects perform this behavioural task with ease, relying mainly on information provided by their visual system. Here we implement, on the hexapod walking robot platform HECTOR, a bio-inspired collision avoidance algorithm based on the extraction of nearness information from visual motion. The algorithm allows HECTOR to navigate cluttered environments while actively avoiding obstacles.
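The nearness extraction described above builds on correlation-type elementary motion detectors (EMDs) of the kind found in fly vision. As a rough illustration only (not the paper's actual implementation, whose parameters and filter stages are not given here), a minimal Hassenstein-Reichardt correlator over two neighbouring photoreceptor signals can be sketched as:

```python
import numpy as np

def lowpass(signal, alpha=0.1):
    """First-order discrete low-pass filter (the EMD's delay stage)."""
    out = np.zeros(len(signal))
    for t in range(1, len(signal)):
        out[t] = alpha * signal[t] + (1.0 - alpha) * out[t - 1]
    return out

def emd_response(left, right, alpha=0.1):
    """Hassenstein-Reichardt correlator: each arm multiplies the
    delayed signal of one receptor with the undelayed signal of its
    neighbour; subtracting the mirror arm makes the output's sign
    encode motion direction and its magnitude scale with image speed."""
    return lowpass(left, alpha) * right - lowpass(right, alpha) * left

# A periodic brightness pattern moving from the left receptor towards
# the right one: the right receptor sees a delayed copy of the left.
t = np.arange(200)
left = (np.sin(2 * np.pi * t / 50) > 0).astype(float)
right = (np.sin(2 * np.pi * (t - 5) / 50) > 0).astype(float)

response = emd_response(left, right)
# time-averaged response is positive for left-to-right motion,
# negative if the stimulus direction is reversed
print(np.mean(response))
```

In the model class the paper draws on, the nearness of an obstacle is then inferred from the magnitude of such local motion responses during translation, since closer objects generate faster retinal image motion.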


Keywords: Biorobotics · Bio-inspired vision · Collision avoidance · Optic flow · Elementary motion detector



This work has been supported by the DFG Center of Excellence Cognitive Interaction TEChnology (CITEC, EXC 277) within the EICCI-project. We thank Dr. Wolfgang Stürzl for kindly providing us with a dataset of a laser scanned outdoor environment.



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Biomechatronics, Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), University of Bielefeld, Bielefeld, Germany
  2. Department of Neurobiology and Center of Excellence ‘Cognitive Interaction Technology’ (CITEC), University of Bielefeld, Bielefeld, Germany
  3. Embedded Systems and Biomechatronics Group, Faculty of Engineering and Mathematics, University of Applied Sciences, Bielefeld, Germany
