Biological Cybernetics, Volume 95, Issue 5, pp 413–430

Local visual homing by matched-filter descent in image distances

  • Ralf Möller
  • Andrew Vardy
Original Paper

Abstract

In natural images, the distance measure between two images taken at different locations rises smoothly with increasing distance between the locations. This fact can be exploited for local visual homing where the task is to reach a goal location that is characterized by a snapshot image: descending in the image distance will lead the agent to the goal location. To compute an estimate of the spatial gradient in the distance measure, its value must be sampled at three noncollinear points. An animal or robot would have to insert exploratory movements into its home trajectory to collect these samples. Here we suggest a method based on the matched-filter concept that allows one to estimate the gradient without exploratory movements. Two matched filters – optical flow fields resulting from translatory movements in the horizontal plane – are used to predict two images in perpendicular directions from the current location. We investigate the relation to differential flow methods applied to the local homing problem and show that the matched-filter approach produces reliable homing behavior on image databases. Two alternative methods that only require a single matched filter are suggested. The matched-filter concept is also applied to derive a home-vector equation for a Fourier-based parameter method.
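The gradient-descent scheme described in the abstract can be sketched in code. The following is a minimal Python illustration, not the authors' implementation: it assumes panoramic grayscale images stored as 2D arrays whose columns sample azimuth uniformly over [0, 2π), approximates the matched filter by the horizontal flow a unit translation induces in such an image, and predicts the two displaced views with a first-order warp. All function names and the RMS distance measure are illustrative choices.

```python
import numpy as np

def image_distance(a, b):
    # Root-mean-square pixel difference between two images.
    return np.sqrt(np.mean((np.asarray(a, float) - np.asarray(b, float)) ** 2))

def translational_flow(height, width, direction):
    # Matched filter (illustrative): horizontal image flow caused by a
    # unit translation in the given azimuthal direction (radians), for a
    # panoramic image whose columns sample azimuth uniformly on [0, 2*pi).
    theta = np.linspace(0.0, 2.0 * np.pi, width, endpoint=False)
    return np.tile(np.sin(theta - direction), (height, 1))

def predict_image(img, flow, step=1.0):
    # First-order prediction of the view after a small translation:
    # each pixel moves horizontally by step * flow, approximated via
    # the local horizontal intensity gradient.
    img = np.asarray(img, float)
    di = np.gradient(img, axis=1)
    return img + step * flow * di

def home_vector(current, snapshot, step=1.0):
    # Estimate the spatial gradient of the image distance without
    # exploratory movements: predict the images seen after small steps
    # in two perpendicular directions, then descend the gradient.
    h, w = np.asarray(current).shape
    d0 = image_distance(current, snapshot)
    grad = []
    for direction in (0.0, np.pi / 2.0):
        flow = translational_flow(h, w, direction)
        pred = predict_image(current, flow, step)
        grad.append((image_distance(pred, snapshot) - d0) / step)
    return -np.array(grad)  # points downhill in the distance measure
```

In a homing loop the agent would repeatedly capture the current view, compute `home_vector(current, snapshot)`, and take a small step along it; the abstract's claim is that the smooth rise of the distance measure with spatial distance makes this descent converge on the goal.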



Copyright information

© Springer-Verlag 2006

Authors and Affiliations

  • Ralf Möller, Computer Engineering, Faculty of Technology, Bielefeld University, Bielefeld, Germany
  • Andrew Vardy, Computer Science/Engineering and Applied Science, Memorial University of Newfoundland, St. John's, Canada
