Monocular Obstacle Detection

Chapter in the book The DelFly

Abstract

This chapter deals with monocular obstacle detection. The visual cue of optic flow is used to determine the time-to-impact to obstacles in the environment. Since the flapping-wing motion hampers the determination of optic flow, a complementary “appearance variation cue” is studied. Combining these visual cues significantly improves detection results.
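The chapter's full detection pipeline is described in the main text; as a rough, non-authoritative illustration of the two ingredients named above, the Python sketch below estimates time-to-impact from the divergence of sparse Lucas-Kanade feature flow [2, 17, 20] and computes a simple Shannon-entropy [19] proxy for the appearance variation cue [6, 7]. The function names, all parameter values, the centroid-as-focus-of-expansion shortcut, and the use of a grey-level histogram instead of the chapter's texton-based distribution are assumptions made for this sketch, not the chapter's implementation.

    import cv2
    import numpy as np

    def time_to_impact_from_flow(prev_gray, gray):
        # Track sparse corners between two greyscale frames with pyramidal
        # Lucas-Kanade optic flow [2, 17, 20].
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=8)
        if pts is None or len(pts) < 10:
            return None
        nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        ok = status.ravel() == 1
        p0, p1 = pts.reshape(-1, 2)[ok], nxt.reshape(-1, 2)[ok]
        if len(p0) < 10:
            return None
        # Fit a pure-divergence flow model u = s * (x - x_c), using the feature
        # centroid as a crude stand-in for the focus of expansion.
        r = p0 - p0.mean(axis=0)                  # positions relative to centroid
        u = p1 - p0                               # flow vectors (pixels per frame)
        s = float((r * u).sum() / (r * r).sum())  # least-squares divergence rate
        # When approaching a roughly fronto-parallel surface, time-to-impact is
        # approximately 1 / s, expressed here in frames.
        return 1.0 / s if s > 1e-6 else float("inf")

    def appearance_variation(gray, bins=32):
        # Shannon entropy [19] of the grey-level histogram: a crude proxy for
        # the texton-based appearance variation cue of [6, 7].
        hist, _ = np.histogram(gray, bins=bins, range=(0, 256))
        q = hist / hist.sum()
        q = q[q > 0]
        return float(-(q * np.log2(q)).sum())

One simple way to combine such cues would be to raise an obstacle warning when either the estimated time-to-impact falls below a threshold or the appearance variation cue indicates a nearby obstacle; the chapter's actual combination scheme and all parameter settings are given in the main text (see note 1).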


Notes

  1. All parameter settings are given in Sect. 8.2.2.

  2. http://www.bene-guido.eu/guido/.

  3. In fact, using the Delta method, the following formula was obtained in [1]: \(E[H(\hat{p})] \sim H(p) - \frac{n-1}{2s}\). A small numerical illustration is sketched after these notes.

  4. Note that the appearance variation cue does not directly depend on the time-to-impact, but on the distances to the obstacles in view. However, under the assumption of a constant approach velocity \(v\), the time-to-impact to the imminent obstacle at distance \(d\) is \(\tau = d / v\), so the two quantities are linearly related. For this reason, both methods can be compared on the time-to-impact classification task.
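As a quick sanity check of the bias formula in note 3, the short simulation below compares the mean of the maximum-likelihood (“plug-in”) entropy estimate with \(H(p) - \frac{n-1}{2s}\). It is a minimal sketch, assuming that \(n\) denotes the number of histogram bins, \(s\) the number of samples, and that entropies are measured in nats; the precise meaning of these symbols is fixed in the chapter text, and the chosen values of \(n\), \(s\), and the uniform test distribution are illustrative only.

    import numpy as np

    rng = np.random.default_rng(0)
    n, s, trials = 16, 200, 2000             # bins, samples per trial, number of trials
    p = np.full(n, 1.0 / n)                   # true (uniform) distribution
    H_true = -(p * np.log(p)).sum()           # true entropy in nats: log(n)

    estimates = []
    for _ in range(trials):
        counts = rng.multinomial(s, p)        # draw s samples and bin them
        q = counts[counts > 0] / s            # plug-in (maximum-likelihood) estimate of p
        estimates.append(-(q * np.log(q)).sum())

    print("mean plug-in entropy estimate:", np.mean(estimates))
    print("H(p) - (n-1)/(2s)            :", H_true - (n - 1) / (2 * s))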

References

  1. G.P. Basharin, On a statistical estimate for the entropy of a sequence of independent random variables. Theor. Probab. Appl. 4(3), 333–336 (1959)

  2. J.-Y. Bouguet, Pyramidal implementation of the Lucas Kanade feature tracker: description of the algorithm (2000)

  3. V. Bruce, P.R. Green, M.A. Georgeson, Visual Perception: Physiology, Psychology and Ecology (Psychology Press, Routledge, 2003)

  4. A. Bruhn, J. Weickert, C. Feddern, T. Kohlberger, C. Schnörr, Real-time optic flow computation with variational methods, in CAIP 2003, LNCS 2756 (2003), pp. 222–229

  5. J. Conroy, G. Gremillion, B. Ranganathan, J.S. Humbert, Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Auton. Robot. 27(3), 189–198 (2009)

  6. G.C.H.E. de Croon, E. de Weerdt, C. de Wagter, B.D.W. Remes, R. Ruijsink, The appearance variation cue for obstacle avoidance, in ROBIO (2010)

  7. G.C.H.E. de Croon, E. de Weerdt, C. de Wagter, B.D.W. Remes, R. Ruijsink, The appearance variation cue for obstacle avoidance. IEEE Trans. Robot. 28(2), 529–534 (2012)

  8. G.C.H.E. de Croon, C. de Wagter, B.D.W. Remes, R. Ruijsink, Sub-sampling: real-time vision for micro air vehicles. Robot. Auton. Syst. 60(2), 167–181 (2012)

  9. T. Fawcett, An introduction to ROC analysis. Pattern Recogn. Lett. 27, 861–874 (2006)

  10. D.J. Fleet, A.D. Jepson, Stability of phase information. IEEE Trans. Pattern Anal. Mach. Intell. 15(12), 1253–1268 (1993)

  11. T. Gautama, M.M. Van Hulle, A phase-based approach to the estimation of the optical flow field using spatial filtering. IEEE Trans. Neural Netw. 13(5), 1127–1136 (2002)

  12. B.K.P. Horn, B.G. Schunck, Determining optical flow. Artif. Intell. 17, 185–203 (1981)

  13. B.M. Jedynak, S.M. Khudanpur, Maximum likelihood set for estimating a probability mass function. Neural Comput. 17(7), 1508–1530 (2005)

  14. L. Kaufman, Sight and Mind (Oxford University Press, New York, 1974)

  15. T. Kohonen, Self-Organizing Maps (Springer, Berlin, 2001)

  16. H.C. Longuet-Higgins, K. Prazdny, The interpretation of a moving retinal image. Proc. R. Soc. Lond. B 208, 385–397 (1980)

  17. B.D. Lucas, T. Kanade, An iterative image registration technique with an application to stereo vision, in Proceedings of the Image Understanding Workshop (1981), pp. 121–130

  18. T. Schuermann, Bias analysis in entropy estimation. J. Phys. A Math. Gen. 37, 295–301 (2004)

  19. C.E. Shannon, A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423, 623–656 (1948)

  20. J. Shi, C. Tomasi, Good features to track, in CVPR (1994)

  21. R.T. Surdick, E.T. Davis, R.A. King, G.M. Corso, A. Shapiro, L. Hodges, K. Elliot, Relevant cues for the visual perception of depth: is where you see it where it is? in Human Factors and Ergonomics Society Annual Meeting Proceedings, Visual Performance (1994), pp. 1305–1309

  22. N. Takeda, M. Watanabe, K. Onoguchi, Moving obstacle detection using residual error of FOE estimation, in IROS (1996), pp. 1642–1647

  23. L.J.P. van der Maaten, An introduction to dimensionality reduction using Matlab. Technical Report MICC 07-07, Maastricht University, The Netherlands (2007)

  24. M. Varma, A. Zisserman, Texture classification: are filter banks necessary? in CVPR 2003, vol. 2 (2003), pp. 691–698

  25. T. Zhang, H. Wu, A. Borst, K. Kühnlenz, M. Buss, An FPGA implementation of insect-inspired motion detector for high-speed vision systems, in 2008 IEEE International Conference on Robotics and Automation, Pasadena, CA, USA (2008), pp. 335–340

Acknowledgments

This chapter is partly based on [6, 7].

Author information

Corresponding author

Correspondence to G. C. H. E. de Croon.


Copyright information

© 2016 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

de Croon, G.C.H.E., Perçin, M., Remes, B.D.W., Ruijsink, R., De Wagter, C. (2016). Monocular Obstacle Detection. In: The DelFly. Springer, Dordrecht. https://doi.org/10.1007/978-94-017-9208-0_8

  • DOI: https://doi.org/10.1007/978-94-017-9208-0_8

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-017-9207-3

  • Online ISBN: 978-94-017-9208-0

  • eBook Packages: Engineering, Engineering (R0)
