
Local visual homing by matched-filter descent in image distances

  • Original Paper
  • Journal: Biological Cybernetics

Abstract

In natural images, the distance measure between two images taken at different locations rises smoothly with increasing distance between the locations. This fact can be exploited for local visual homing where the task is to reach a goal location that is characterized by a snapshot image: descending in the image distance will lead the agent to the goal location. To compute an estimate of the spatial gradient in the distance measure, its value must be sampled at three noncollinear points. An animal or robot would have to insert exploratory movements into its home trajectory to collect these samples. Here we suggest a method based on the matched-filter concept that allows one to estimate the gradient without exploratory movements. Two matched filters – optical flow fields resulting from translatory movements in the horizontal plane – are used to predict two images in perpendicular directions from the current location. We investigate the relation to differential flow methods applied to the local homing problem and show that the matched-filter approach produces reliable homing behavior on image databases. Two alternative methods that only require a single matched filter are suggested. The matched-filter concept is also applied to derive a home-vector equation for a Fourier-based parameter method.
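To make the gradient-descent idea concrete, the sketch below illustrates the matched-filter scheme in Python. It is a minimal illustration under stated assumptions, not the authors' implementation: the panoramic image is a float array indexed by (elevation, azimuth), all scene points are assumed to lie at unit distance when building the flow templates (equal-distance assumption), the warp is nearest-neighbour and first order, and the image distance is the sum of squared differences; all function names and parameters are hypothetical.

```python
import numpy as np

def translation_flow(width, height, psi, delta):
    """Matched filter: optic-flow field induced by a small translation of
    length `delta` in horizontal direction `psi` (radians), assuming all
    scene points lie at unit distance (equal-distance assumption)."""
    theta = np.linspace(0.0, 2.0 * np.pi, width, endpoint=False)  # azimuth
    phi = np.linspace(-np.pi / 4, np.pi / 4, height)              # elevation
    TH, PH = np.meshgrid(theta, phi)
    d_theta = delta * np.sin(TH - psi) / np.cos(PH)  # azimuthal image shift
    d_phi = delta * np.sin(PH) * np.cos(TH - psi)    # elevational image shift
    return d_theta, d_phi

def predict_image(image, d_theta, d_phi):
    """Predict the panoramic image after the small test translation by
    warping the current image against the flow (nearest-neighbour)."""
    h, w = image.shape
    TH, PH = np.meshgrid(np.arange(w), np.arange(h))
    src_t = np.round(TH - d_theta * w / (2.0 * np.pi)).astype(int) % w
    src_p = np.clip(np.round(PH - d_phi * h / (np.pi / 2)).astype(int), 0, h - 1)
    return image[src_p, src_t]

def home_vector(current, snapshot, delta=0.05):
    """Estimate the home vector as the negative gradient of the SSD image
    distance, using two matched filters (perpendicular test translations)
    in place of exploratory movements."""
    h, w = current.shape
    d0 = np.sum((current - snapshot) ** 2)
    grad = []
    for psi in (0.0, np.pi / 2):                 # x and y test directions
        flow = translation_flow(w, h, psi, delta)
        pred = predict_image(current, *flow)
        grad.append((np.sum((pred - snapshot) ** 2) - d0) / delta)
    return -np.array(grad)                        # descend in image distance
```

Given two panoramic images `current` and `snapshot` of the same shape, `home_vector(current, snapshot)` returns a two-component vector in the agent's frame pointing in the direction of decreasing image distance; repeating the estimate after each movement step traces the descent toward the goal location.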





Cite this article

Möller, R., Vardy, A. Local visual homing by matched-filter descent in image distances. Biol Cybern 95, 413–430 (2006). https://doi.org/10.1007/s00422-006-0095-3


Keywords

Navigation