Toward an Autonomous Lunar Landing Based on Low-Speed Optic Flow Sensors

  • Conference paper

Abstract

Over the last few decades, interest has returned to the challenging task of autonomous lunar landing. Soft landing of payloads on the lunar surface requires new means of ensuring a safe descent that satisfies stringent final conditions while respecting aerospace constraints on mass, cost and computational resources. In this paper, a two-part approach is presented. First, a biomimetic method inspired by the neural and sensory systems of flying insects is presented as a solution for safe lunar landing. To design an autopilot relying only on optic flow (OF) and inertial measurements, an estimation method based on a two-sensor setup is introduced: these sensors make it possible to accurately estimate the orientation of the velocity vector, which is required to control the lander’s pitch in a quasi-optimal way with respect to fuel consumption. Second, a new low-speed Visual Motion Sensor (VMS), inspired by insects’ visual systems, is presented; it performs local 1-D angular speed measurements ranging from 1.5°/s to 25°/s and weighs only 2.8 g. It was tested under free-flying outdoor conditions over various fields onboard an 80 kg unmanned helicopter. These preliminary results show that, despite the complex disturbances encountered, the measured optic flow closely matched the ground-truth optic flow.
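As an illustrative aside (a simplified planar sketch, not the method described in the paper), the two-sensor idea can be pictured with the standard optic-flow relation over flat terrain, ω = (V/D) sin θ, where D is the distance to the ground along the gaze direction and θ is the angle between the gaze direction and the velocity vector. With two sensors gazing at known angles below the local horizontal, the ratio of their two measurements depends only on the orientation of the velocity vector, so that orientation can be recovered without knowing the speed or the altitude. In the Python sketch below, the function names (optic_flow, estimate_gamma) and all numerical values are hypothetical.

# Simplified planar illustration (hypothetical, not the authors' algorithm).
# Assumed model over flat terrain: omega_i = (V * sin(alpha_i) / h) * sin(alpha_i - gamma),
# where alpha_i is the gaze angle below the local horizontal, gamma is the angle of the
# velocity vector below the horizontal, V is the speed and h is the local height.

import numpy as np
from scipy.optimize import brentq


def optic_flow(alpha, gamma, v_over_h):
    """Optic flow (rad/s) seen along gaze direction alpha for a velocity at angle gamma."""
    return v_over_h * np.sin(alpha) * np.sin(alpha - gamma)


def estimate_gamma(omega1, omega2, alpha1, alpha2):
    """Recover the velocity-vector orientation gamma from the ratio of two OF measurements."""
    ratio = omega1 / omega2  # the unknown V/h cancels out in this ratio

    def f(g):
        return np.sin(alpha1) * np.sin(alpha1 - g) - ratio * np.sin(alpha2) * np.sin(alpha2 - g)

    # gamma must stay below both gaze angles so that both measured flows are positive
    return brentq(f, -np.pi / 2 + 1e-6, min(alpha1, alpha2) - 1e-6)


if __name__ == "__main__":
    alpha1, alpha2 = np.radians(90.0), np.radians(45.0)  # nadir-looking and 45 deg forward-looking
    gamma_true, v_over_h = np.radians(20.0), 0.05        # hypothetical descent geometry
    w1 = optic_flow(alpha1, gamma_true, v_over_h)
    w2 = optic_flow(alpha2, gamma_true, v_over_h)
    print(np.degrees(estimate_gamma(w1, w2, alpha1, alpha2)))  # ~20.0 deg

In practice the gaze angles would be expressed in the inertial frame using the pitch angle provided by the inertial measurements, consistent with the OF-plus-inertial setup described in the abstract.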

Author information

Correspondence to Guillaume Sabiron.

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Sabiron, G. et al. (2013). Toward an Autonomous Lunar Landing Based on Low-Speed Optic Flow Sensors. In: Chu, Q., Mulder, B., Choukroun, D., van Kampen, EJ., de Visser, C., Looye, G. (eds) Advances in Aerospace Guidance, Navigation and Control. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-38253-6_39

  • DOI: https://doi.org/10.1007/978-3-642-38253-6_39

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-38252-9

  • Online ISBN: 978-3-642-38253-6

  • eBook Packages: Engineering, Engineering (R0)
