Toward an Autonomous Lunar Landing Based on Low-Speed Optic Flow Sensors

  • Guillaume Sabiron
  • Paul Chavent
  • Laurent Burlion
  • Erwan Kervendal
  • Eric Bornschlegl
  • Patrick Fabiani
  • Thibaut Raharijaona
  • Franck Ruffier


Over the last few decades, interest has returned to the challenging task of autonomous lunar landing. Soft landing of payloads on the lunar surface requires new means of ensuring a safe descent that meets stringent final conditions and aerospace constraints in terms of mass, cost and computational resources. In this paper, a two-part approach is presented. First, a biomimetic method inspired by the neuronal and sensory systems of flying insects is proposed as a solution for safe lunar landing. To design an autopilot relying only on optic flow (OF) and inertial measurements, an estimation method based on a two-sensor setup is introduced: these sensors make it possible to accurately estimate the orientation of the velocity vector, which is needed to control the lander’s pitch in a quasi-optimal way with respect to fuel consumption. Secondly, a new low-speed Visual Motion Sensor (VMS) inspired by insects’ visual systems is presented; it performs local 1-D angular speed measurements ranging from 1.5°/s to 25°/s and weighs only 2.8 g. It was tested under free-flying outdoor conditions over various fields onboard an 80 kg unmanned helicopter. These preliminary results show that, despite the complex disturbances encountered, the measured optic flow closely matched the ground-truth optic flow.
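The two measurement principles summarized above can be sketched in a few lines. The first function illustrates the time-of-travel scheme typical of insect-inspired 1-D motion sensors (a contrast feature crossing two neighbouring photoreceptors yields an angular speed); the second illustrates how, under simplifying assumptions of flat ground and planar motion, two optic-flow measurements taken at different gaze angles determine the orientation of the velocity vector. Both functions are illustrative assumptions by this sketch, not the authors' implementation: the function names, the gaze-angle convention (angles measured from the local vertical) and the flat-ground flow model `omega_i = V*sin(phi_i - theta)*cos(phi_i)/h` are chosen here for the example.

```python
import math

def angular_speed_deg(delta_phi_deg, delta_t_s):
    """Time-of-travel scheme: a contrast feature seen by one photoreceptor
    and, delta_t_s seconds later, by a neighbour separated by delta_phi_deg
    corresponds to a 1-D angular speed of delta_phi_deg / delta_t_s (deg/s)."""
    return delta_phi_deg / delta_t_s

def velocity_angle_from_two_of(omega1, omega2, phi1, phi2):
    """Recover the orientation theta (rad, from the local vertical) of the
    velocity vector from two optic-flow measurements omega1, omega2 (rad/s)
    taken at gaze angles phi1, phi2 (rad, from the local vertical).
    Assumes flat ground and planar motion, so that
        omega_i = V * sin(phi_i - theta) * cos(phi_i) / h,
    where V is the speed and h the height; V and h cancel in the ratio."""
    r = omega1 * math.cos(phi2) / (omega2 * math.cos(phi1))
    return math.atan2(math.sin(phi1) - r * math.sin(phi2),
                      math.cos(phi1) - r * math.cos(phi2))

# Synthetic check: a lander at h = 100 m, V = 50 m/s, velocity 0.3 rad off
# the vertical, observed by one downward-looking sensor and one at 0.8 rad.
h, V, theta = 100.0, 50.0, 0.3
phi1, phi2 = 0.0, 0.8
w1 = V * math.sin(phi1 - theta) * math.cos(phi1) / h
w2 = V * math.sin(phi2 - theta) * math.cos(phi2) / h
theta_hat = velocity_angle_from_two_of(w1, w2, phi1, phi2)
```

Note that the ratio of the two flow measurements eliminates both the (unknown) speed and the (unknown) height, which is why two sensors suffice to estimate the velocity direction even without range information.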







Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Guillaume Sabiron 1, 2
  • Paul Chavent 2
  • Laurent Burlion 2
  • Erwan Kervendal 3
  • Eric Bornschlegl 4
  • Patrick Fabiani 2
  • Thibaut Raharijaona 1
  • Franck Ruffier 1

  1. Biorobotic Dept. of ISM, CNRS, ISM UMR 7287, Aix-Marseille Université, Marseille cedex 09, France
  2. The French Aerospace Lab, ONERA, Toulouse, France
  3. Astrium Satellites, Toulouse, France
  4. European Space Agency ESTEC, Noordwijk, The Netherlands
