Optic Flow to Steer and Avoid Collisions in 3D

  • Jean-Christophe Zufferey
  • Antoine Beyeler
  • Dario Floreano
Chapter

Abstract

Optic flow is believed to be the main source of information allowing insects to control their flight. Some researchers have tried to apply this paradigm to small unmanned aerial vehicles (UAVs). So far, however, none has demonstrated fully autonomous flight of a free-flying system without relying on other cues such as GPS and/or some sort of orientation sensor (IMU, horizon detector, etc.). Returning to the reactive approach suggested by Gibson (direct perception) and Braitenberg (direct connection from sensors to actuators), this chapter discusses how a few optic flow signals can be directly mapped into control commands for steering an aircraft in cluttered environments. The implementation of the proposed control strategy on a 10-g airplane flying autonomously in an office-sized room demonstrates how this approach can yield ultra-light autopilots.
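The Braitenberg-style idea sketched in the abstract — wiring a few optic-flow signals directly to control commands — can be illustrated with a minimal sketch. The function name, the four-region flow layout, and the gains below are illustrative assumptions for this page, not the authors' actual controller:

```python
def steer_from_optic_flow(of_left, of_right, of_top, of_bottom,
                          k_yaw=1.0, k_pitch=1.0):
    """Hypothetical reactive mapping: four lateral optic-flow magnitudes
    (e.g. in rad/s) are combined into rudder and elevator commands in
    [-1, 1]. At constant forward speed, higher translational flow on one
    side indicates a nearer surface, so the aircraft steers away from it.
    Gains k_yaw and k_pitch are assumed, not taken from the chapter."""
    yaw = k_yaw * (of_left - of_right)      # turn away from the closer side
    pitch = k_pitch * (of_bottom - of_top)  # pitch away from ground/ceiling

    def clamp(x):
        return max(-1.0, min(1.0, x))

    return clamp(yaw), clamp(pitch)


# Example: stronger flow on the left suggests an obstacle there,
# so the sketch commands a turn to the right (positive yaw here).
yaw_cmd, pitch_cmd = steer_from_optic_flow(0.75, 0.25, 0.1, 0.1)
```

Note that this direct sensor-to-actuator mapping involves no state estimation or world model, which is what keeps such autopilots light enough for a 10-g platform.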

Keywords

Optic Flow Unmanned Aerial Vehicle Collision Avoidance Inertial Measurement Unit Vision Sensor 

Notes

Acknowledgements

We wish to thank Jean-Daniel Nicoud, Adam Klaptocz and André Guignard for their significant contributions in the development of the hardware and the electronics of the MC2. Many thanks also to Tim Stirling for his help in improving this manuscript. This project is funded by EPFL and by the Swiss National Science Foundation, grant number 200021-105545/1.

References

  1. Barber, D., Griffiths, S., McLain, T., Beard, R.: Autonomous landing of miniature aerial vehicles. AIAA Infotech@Aerospace (2005)
  2. Barron, J., Fleet, D., Beauchemin, S.: Performance of optical flow techniques. International Journal of Computer Vision 12(1), 43–77 (1994)
  3. Barrows, G., Chahl, J., Srinivasan, M.: Biomimetic visual sensing and flight control. Bristol Conference on UAV Systems (2002)
  4. Barrows, G., Neely, C., Miller, K.: Optic flow sensors for MAV navigation. In: T.J. Mueller (ed.) Fixed and Flapping Wing Aerodynamics for Micro Air Vehicle Applications, Progress in Astronautics and Aeronautics, vol. 195, pp. 557–574. AIAA (2001)
  5. Beyeler, A., Zufferey, J., Floreano, D.: 3D vision-based navigation for indoor microflyers. IEEE International Conference on Robotics and Automation (ICRA'07) (2007)
  6. Beyeler, A., Zufferey, J., Floreano, D.: Vision-based control of near-obstacle flight. Autonomous Robots, in press (2010)
  7. Borst, A., Bahde, S.: Spatio-temporal integration of motion. Naturwissenschaften 75, 265–267 (1988)
  8. Borst, A., Egelhaaf, M., Seung, H.S.: Two-dimensional motion perception in flies. Neural Computation 5(6), 856–868 (1993)
  9. Braitenberg, V.: Vehicles – Experiments in Synthetic Psychology. The MIT Press, Cambridge, MA (1984)
  10. Camus, T.: Calculating time-to-contact using real-time quantized optical flow. Tech. Rep. 5609, National Institute of Standards and Technology NISTIR (1995)
  11. Chahl, J., Srinivasan, M., Zhang, H.: Landing strategies in honeybees and applications to uninhabited airborne vehicles. The International Journal of Robotics Research 23(2), 101–110 (2004)
  12. Chapman, R.: The Insects: Structure and Function, 4th edn. Cambridge University Press (1998)
  13. Collett, T., Land, M.: Visual control of flight behavior in the hoverfly, Syritta pipiens. Journal of Comparative Physiology 99, 1–66 (1975)
  14. Coombs, D., Herman, M., Hong, T., Nashman, M.: Real-time obstacle avoidance using central flow divergence and peripheral flow. International Conference on Computer Vision, pp. 276–283 (1995)
  15. Duchon, A., Warren, W.H., Kaelbling, L.: Ecological robotics. Adaptive Behavior 6, 473–507 (1998)
  16. Egelhaaf, M., Kern, R., Krapp, H., Kretzberg, J., Kurtz, R., Warzechna, A.: Neural encoding of behaviourally relevant visual-motion information in the fly. Trends in Neurosciences 25(2), 96–102 (2002)
  17. Fermüller, C., Aloimonos, Y.: Primates, bees, and UGVs (unmanned ground vehicles) in motion. In: M. Srinivisan, S. Venkatesh (eds.) From Living Eyes to Seeing Machines, pp. 199–225. Oxford University Press (1997)
  18. Floreano, D., Mondada, F.: Automatic creation of an autonomous agent: Genetic evolution of a neural-network driven robot. From Animals to Animats 3, 421–430 (1994)
  19. Franceschini, N., Pichon, J., Blanes, C.: From insect vision to robot vision. Philosophical Transactions of the Royal Society B 337, 283–294 (1992)
  20. Gibson, J.: The Perception of the Visual World. Houghton Mifflin, Boston (1950)
  21. Gibson, J.: The Ecological Approach to Visual Perception. Houghton Mifflin, Boston (1979)
  22. Green, W., Oh, P., Barrows, G.: Flying insect inspired vision for autonomous aerial robot maneuvers in near-earth environments. Proceedings of the IEEE International Conference on Robotics and Automation, vol. 3, pp. 2347–2352 (2004)
  23. Griffiths, S., Saunders, J., Curtis, A., McLain, T., Beard, R.: Obstacle and Terrain Avoidance for Miniature Aerial Vehicles, Intelligent Systems, Control and Automation: Science and Engineering, vol. 33, chap. I.7, pp. 213–244. Springer (2007)
  24. Hassenstein, B., Reichardt, W.: Systemtheoretische Analyse der Zeit-, Reihenfolgen- und Vorzeichenauswertung bei der Bewegungsperzeption des Rüsselkäfers Chlorophanus. Zeitschrift für Naturforschung 11b, 513–524 (1956)
  25. Hildreth, E.: The Measurement of Visual Motion. MIT, Cambridge (1984)
  26. Horn, B.: Robot Vision. MIT Press (1986)
  27. Horridge, A.: Insects which turn and look. Endeavour 1, 7–17 (1977)
  28. Hrabar, S., Sukhatme, G.S., Corke, P., Usher, K., Roberts, J.: Combined optic-flow and stereo-based navigation of urban canyons for UAV. IEEE International Conference on Intelligent Robots and Systems, pp. 3309–3316. IEEE (2005)
  29. Kern, R., van Hateren, J., Egelhaaf, M.: Representation of behaviourally relevant information by blowfly motion-sensitive visual interneurons requires precise compensatory head movements. Journal of Experimental Biology 206, 1251–1260 (2006)
  30. Koenderink, J., van Doorn, A.: Facts on optic flow. Biological Cybernetics 56, 247–254 (1987)
  31. Krapp, H.: Neuronal matched filters for optic flow processing in flying insects. In: M. Lappe (ed.) Neuronal Processing of Optic Flow, pp. 93–120. Academic Press, San Diego (2000)
  32. Krapp, H., Hengstenberg, B., Hengstenberg, R.: Dendritic structure and receptive-field organization of optic flow processing interneurons in the fly. Journal of Neurophysiology 79, 1902–1917 (1998)
  33. Krapp, H., Hengstenberg, R.: Estimation of self-motion by optic flow processing in single visual interneurons. Nature 384, 463–466 (1996)
  34. Land, M.: Visual acuity in insects. Annual Review of Entomology 42, 147–177 (1997)
  35. Lichtensteiger, L., Eggenberger, P.: Evolving the morphology of a compound eye on a robot. Proceedings of the Third European Workshop on Advanced Mobile Robots (Eurobot '99), pp. 127–134 (1999)
  36. Mallot, H.: Computational Vision: Information Processing in Perception and Visual Behavior. The MIT Press (2000)
  37. Marr, D.: Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. W.H. Freeman and Company, New York (1982)
  38. Mura, F., Franceschini, N.: Visual control of altitude and speed in a flying agent. From Animals to Animats III, pp. 91–99. MIT Press (1994)
  39. Muratet, L., Doncieux, S., Brière, Y., Meyer, J.: A contribution to vision-based autonomous helicopter flight in urban environments. Robotics and Autonomous Systems 50(4), 195–209 (2005)
  40. Nalbach, G.: The halteres of the blowfly Calliphora. I. Kinematics and dynamics. Journal of Comparative Physiology A 173(3), 293–300 (1993)
  41. Nelson, R., Aloimonos, Y.: Obstacle avoidance using flow field divergence. IEEE Transactions on Pattern Analysis and Machine Intelligence 11(10), 1102–1106 (1989)
  42. Neumann, T., Bülthoff, H.: Behavior-oriented vision for biomimetic flight control. Proceedings of the EPSRC/BBSRC International Workshop on Biologically Inspired Robotics, pp. 196–203 (2002)
  43. Ruffier, F., Franceschini, N.: Optic flow regulation: the key to aircraft automatic guidance. Robotics and Autonomous Systems 50(4), 177–194 (2005)
  44. Santos-Victor, J., Sandini, G., Curotto, F., Garibaldi, S.: Divergent stereo for robot navigation: A step forward to a robotic bee. International Journal of Computer Vision 14, 159–177 (1995)
  45. Scherer, S., Singh, S., Chamberlain, L., Saripalli, S.: Flying fast and low among obstacles. Proceedings of the 2007 IEEE Conference on Robotics and Automation, pp. 2023–2029 (2007)
  46. Schilstra, C., van Hateren, J.: Stabilizing gaze in flying blowflies. Nature 395, 654 (1998)
  47. Schuppe, H., Hengstenberg, R.: Optical properties of the ocelli of Calliphora erythrocephala and their role in the dorsal light response. Journal of Comparative Physiology A 173, 143–149 (1993)
  48. Serres, J., Ruffier, F., Viollet, S., Franceschini, N.: Toward optic flow regulation for wall-following and centring behaviours. International Journal of Advanced Robotic Systems 3(27), 147–154 (2006)
  49. Sobel, E.: The locust's use of motion parallax to measure distance. Journal of Comparative Physiology A 167, 579–588 (1990)
  50. Sobey, P.: Active navigation with a monocular robot. Biological Cybernetics 71, 433–440 (1994)
  51. Srinivasan, M.: An image-interpolation technique for the computation of optic flow and egomotion. Biological Cybernetics 71, 401–416 (1994)
  52. Srinivasan, M., Chahl, J., Nagle, M., Zhang, S.: Embodying natural vision into machines. In: M. Srinivasan, S. Venkatesh (eds.) From Living Eyes to Seeing Machines, pp. 249–265 (1997)
  53. Srinivasan, M., Chahl, J., Weber, K., Venkatesh, S., Zhang, H.: Robot navigation inspired by principles of insect vision. In: A. Zelinsky (ed.) Field and Service Robotics, pp. 12–16. Springer-Verlag (1998)
  54. Srinivasan, M., Lehrer, M., Kirchner, W., Zhang, S.: Range perception through apparent image speed in freely-flying honeybees. Visual Neuroscience 6, 519–535 (1991)
  55. Srinivasan, M., Zhang, S., Chahl, J., Barth, E., Venkatesh, S.: How honeybees make grazing landings on flat surfaces. Biological Cybernetics 83, 171–183 (2000)
  56. Stevens, B., Lewis, F.: Aircraft Control and Simulation, 2nd edn. Wiley (2003)
  57. Strausfeld, N.: Atlas of an Insect Brain. Springer (1976)
  58. Tammero, L., Dickinson, M.: The influence of visual landscape on the free flight behavior of the fruit fly Drosophila melanogaster. The Journal of Experimental Biology 205, 327–343 (2002)
  59. Taylor, G., Krapp, H.: Sensory systems and flight stability: What do insects measure and why. Advances in Insect Physiology 34, 231–316 (2008)
  60. Thakoor, S., Morookian, J., Chahl, J., Hine, B., Zornetzer, S.: BEES: Exploring Mars with bioinspired technologies. Computer 37(9), 38–47 (2004)
  61. Wagner, H.: Flight performance and visual control of flight of the free-flying housefly (Musca domestica L.). I. Organization of the flight motor. Philosophical Transactions of the Royal Society B 312, 527–551 (1986)
  62. Weber, K., Venkatesh, S., Srinivasan, M.: Insect inspired behaviours for the autonomous control of mobile robots. In: M.V. Srinivasan, S. Venkatesh (eds.) From Living Eyes to Seeing Machines, pp. 226–248. Oxford University Press (1997)
  63. Wehner, R.: Matched filters – neural models of the external world. Journal of Comparative Physiology A 161, 511–531 (1987)
  64. Whiteside, T., Samuel, G.: Blur zone. Nature 225, 94–95 (1970)
  65. Zeil, J., Boeddeker, N., Hemmi, J.: Vision and the organization of behaviour. Current Biology 18(8), 320–323 (2008)
  66. Zufferey, J.C.: Bio-inspired Flying Robots: Experimental Synthesis of Autonomous Indoor Flyers. EPFL/CRC Press (2008)
  67. Zufferey, J.C., Floreano, D.: Fly-inspired visual steering of an ultralight indoor aircraft. IEEE Transactions on Robotics 22, 137–146 (2006)
  68. Zufferey, J.C., Klaptocz, A., Beyeler, A., Nicoud, J.D., Floreano, D.: A 10-gram vision-based flying robot. Advanced Robotics, Journal of the Robotics Society of Japan 21(14), 1671–1684 (2007)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Jean-Christophe Zufferey (1)
  • Antoine Beyeler (1)
  • Dario Floreano (1)

  1. Laboratory of Intelligent Systems, EPFL, Lausanne, Switzerland
