
Asteroids, pp. 221–246

Bio-inspired Landing Approaches and Their Potential Use on Extraterrestrial Bodies

Chapter

Abstract

Landing on asteroids and other extraterrestrial bodies is a critical stage of future exploration missions. A safe, soft landing on an asteroid will be required even though the task is far more difficult than on Earth, owing to the small size, irregular shape, and variable surface properties of asteroids, as well as to the low gravity and negligible atmospheric drag experienced by the spacecraft. Optical guidance and navigation for autonomous landing on small celestial bodies have been studied in recent years, with a focus on closed-loop guidance, navigation, and control (GNC) systems (De Lafontaine 1992; Kawaguchi et al. 1999).
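The chapter title and keywords point to bio-inspired, optic-flow-based landing strategies of the kind developed in the works cited in the reference list (e.g. Ruffier and Franceschini 2005; Izzo and de Croon 2011). As a purely illustrative sketch that is not taken from the chapter, the short Python simulation below shows the core idea in one dimension: a feedback loop that holds the optic-flow divergence (descent speed divided by height above ground level, i.e. the inverse of the time-to-contact) at a constant set point makes the descent speed decay in proportion to height, so the vehicle arrives near the surface at a very low speed without ever measuring height or velocity explicitly. All function names, gains, and numerical values are assumptions chosen for illustration only.

# Minimal sketch (not from the chapter): closed-loop regulation of the
# optic-flow divergence omega = v / h during a 1-D vertical descent.
# Holding omega constant forces the descent speed to shrink with height,
# yielding a soft arrival near the surface. Gravity, gains, and the
# set point below are illustrative assumptions, not values from the text.

def constant_flow_descent(h0=100.0, v0=10.0,
                          omega_ref=0.05,   # optic-flow set point, 1/s (assumed)
                          g=0.005,          # local gravity, m/s^2 (assumed)
                          kp=50.0,          # feedback gain, m/s^2 per 1/s (assumed)
                          dt=0.02):
    """Simulate the closed loop down to 1 m AGL; return (time, final descent speed)."""
    h, v, t = h0, v0, 0.0              # height [m], descent speed [m/s], time [s]
    while h > 1.0:                     # below ~1 m a terminal phase would take over
        omega = v / h                  # 'measured' optic-flow divergence
        thrust = g + kp * (omega - omega_ref)   # brake harder when the flow is too high
        thrust = max(thrust, 0.0)      # thrusters cannot push the lander downward
        v += (g - thrust) * dt         # vertical dynamics (positive v = descending)
        h -= v * dt
        t += dt
    return t, v

if __name__ == "__main__":
    t_end, v_end = constant_flow_descent()
    print(f"reached 1 m AGL after {t_end:.0f} s, descending at {v_end:.3f} m/s")

With the assumed values, the simulated lander reaches 1 m above ground level descending at roughly the set point times the remaining height, a few centimetres per second, which illustrates why keeping the perceived optic flow constant is attractive when neither height nor velocity is directly available.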

Keywords

Global Positioning System; Optic Flow; Pitch Angle; Unmanned Helicopter; Above Ground Level
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

References

  1. Barrows, G., Neely, C.: Mixed-mode VLSI optic flow sensors for in-flight control of a Micro Air Vehicle. In: SPIE: Critical Technologies for the Future of Computing, San Diego, CA, USA, vol. 4109, pp. 52–63 (2000)
  2. Beyeler, A., Zufferey, J., Floreano, D.: optiPilot: control of take-off and landing using optic flow. In: European Micro Aerial Vehicle Conference, vol. 27, pp. 201–2019 (2009)
  3. Blanes, C.: Appareil visuel élémentaire pour la navigation à vue d'un robot mobile autonome. Master's thesis (DEA) in Neurosciences, advisor: N. Franceschini, Univ. Aix-Marseille II, France (1986)
  4. Blanes, C.: Guidage visuel d'un robot mobile autonome d'inspiration bionique. Ph.D. thesis, INP Grenoble, France (1991)
  5. Braun, R., Manning, R.: Mars exploration entry, descent and landing challenges. In: Proceedings of the IEEE Aerospace Conference, Big Sky, Montana (2006)
  6. Cheng, Y., Ansar, A.: Landmark based position estimation for pinpoint landing on Mars. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), pp. 1573–1578 (2005)
  7. Collett, T.S., Land, M.F.: Visual control of flight behaviour in the hoverfly Syritta pipiens. Journal of Comparative Physiology A: Neuroethology, Sensory, Neural, and Behavioral Physiology 99(1), 1–66 (1975)
  8. Conroy, J., Gremillion, G., Ranganathan, B., Humbert, J.: Implementation of wide-field integration of optic flow for autonomous quadrotor navigation. Autonomous Robots 27, 189–198 (2009)
  9. De Lafontaine, J.: Autonomous spacecraft navigation and control for comet landing. Journal of Guidance, Control, and Dynamics 15(3), 567–576 (1992)
  10. Dubois-Matra, O., Parkes, S., Dunstan, M.: Testing and validation of planetary vision-based navigation systems with PANGU. In: Proceedings of the 21st International Symposium on Space Flight Dynamics (ISSFD), Toulouse, France (2009)
  11. Expert, F., Viollet, S., Ruffier, F.: Outdoor field performances of insect-based visual motion sensors. Journal of Field Robotics 28(4), 529–541 (2011)
  12. Expert, F., Roubieu, F.L., Ruffier, F.: Interpolation based "time of travel" scheme in a visual motion sensor using a small 2D retina. In: Proceedings of the IEEE Sensors Conference, Taipei, Taiwan, pp. 2231–2234 (2012)
  13. Flandin, G., Polle, B., Frapard, B., Vidal, P., Philippe, C., Voirin, T.: Vision based navigation for planetary exploration. In: Proceedings of the 32nd Annual AAS Rocky Mountain Guidance and Control Conference (2009)
  14. Franceschini, N.: Early processing of colour and motion in a mosaic visual system. Neurosci. Res. Suppl. 2, 17–49 (1985)
  15. Franceschini, N.: De la mouche au robot: reconstruire pour mieux comprendre. In: Bloch, V. (ed.) Cerveaux et Machines, pp. 247–270 (1999)
  16. Franceschini, N., Chagneux, R.: Repetitive scanning in the fly compound eye. In: Göttingen Neurobiology Report, vol. 2, p. 279. Thieme (1997)
  17. Franceschini, N., Riehle, A., Le Nestour, A.: Directionally selective motion detection by insect neurons. In: Facets of Vision, pp. 360–390. Springer (1989)
  18. Franceschini, N., Pichon, J.M., Blanes, C.: Real time visuomotor control: from flies to robots. In: Proceedings of the IEEE Conference on Advanced Robotics (ICAR 1991), Pisa, Italy, pp. 931–935 (1991)
  19. Franceschini, N., Pichon, J.M., Blanes, C.: From insect vision to robot vision. Philosophical Transactions of the Royal Society B: Biological Sciences 337(1281), 283–294 (1992)
  20. Franceschini, N., Pichon, J.M., Blanes, C.: Bionics of visuo-motor control. In: Gomi, T. (ed.) Evolutionary Robotics: From Intelligent Robots to Artificial Life, pp. 49–67. AAI Books, Ottawa (1997)
  21. Franceschini, N., Ruffier, F., Serres, J.: A bio-inspired flying robot sheds light on insect piloting abilities. Current Biology 17(4), 329–335 (2007)
  22. Franceschini, N., Ruffier, F., Serres, J.: Obstacle avoidance and speed control in insects and micro-aerial vehicles. Acta Futura 3(4), 15–34 (2009)
  23. Franceschini, N., Ruffier, F., Serres, J.: Biomimetic optic flow sensors and autopilots for MAV guidance. In: Encyclopedia of Aerospace Engineering, p. E309 (2010)
  24. Frapard, B., Champetier, C., Kemble, S., Parkinson, B., Strandmoe, S., Lang, M.: Vision-based GNC design for the LEDA mission. In: Proceedings of the 3rd International ESA Conference on Spacecraft GNC, Noordwijk, The Netherlands, pp. 411–421 (1996)
  25. Frapard, B., Polle, B., Flandin, G., Bernard, P., Vétel, C., Sembely, X., Mancuso, S.: Navigation for planetary approach and landing. In: Proceedings of the 5th International ESA Conference on Spacecraft GNC, Rome, Italy (2002)
  26. Garratt, M., Chahl, J.: Vision-based terrain following for an unmanned rotorcraft. Journal of Field Robotics 25, 284–301 (2008)
  27. Green, W., Oh, P., Barrows, G.: Flying insect inspired vision for autonomous aerial robot maneuvers in near-Earth environments. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), vol. 1, pp. 2347–2352 (2004)
  28. Griffiths, S., Saunders, J., Curtis, A., Barber, B., McLain, T., Beard, R.: Maximizing miniature aerial vehicles. IEEE Robotics & Automation Magazine 13, 34–43 (2006)
  29. Heisenberg, M., Wolf, R.: Vision in Drosophila. Springer, New York (1984)
  30. Herisse, B., Hamel, T., Mahony, R., Russotto, F.X.: Landing a VTOL unmanned aerial vehicle on a moving platform using optical flow. IEEE Transactions on Robotics 28(1), 77–89 (2012)
  31. Hrabar, S., Sukhatme, G., Corke, P., Usher, K., Roberts, J.: Combined optic-flow and stereo-based navigation of urban canyons for a UAV. In: Proceedings of the International Conference on Intelligent Robots and Systems (IROS), pp. 3309–3316 (2005)
  32. Indiveri, G., Kramer, J., Koch, C.: System implementations of analog VLSI velocity sensors. IEEE Micro 16(5), 40–49 (1996)
  33. Izzo, D., de Croon, G.: Landing with time-to-contact and ventral optic flow estimates. Journal of Guidance, Control, and Dynamics 35(4), 1362–1367 (2011)
  34. Izzo, D., Weiss, N., Seidl, T.: Constant-optic-flow lunar landing: optimality and guidance. Journal of Guidance, Control, and Dynamics 34, 1383–1395 (2011)
  35. Janschek, K., Tchernykh, V., Beck, M.: Performance analysis for visual planetary landing navigation using optical flow and DEM matching. In: Proceedings of the AIAA Guidance, Navigation and Control Conference and Exhibit (2006)
  36. Jean-Marius, T., Strandmoe, S.E.: Integrated vision and navigation for a planetary lander. Technical report, Aérospatiale, Espace et Défense, Les Mureaux, France; ESA/ESTEC (1998)
  37. Kawaguchi, J., Hashimoto, T., Misu, T., Sawai, S.: An autonomous optical guidance and navigation around asteroids. Acta Astronautica 44(5), 267–280 (1999)
  38. Kendoul, F., Nonami, K., Fantoni, I., Lozano, R.: An adaptive vision-based autopilot for mini flying machines guidance, navigation and control. Autonomous Robots 27, 165–188 (2009)
  39. Kerhuel, L., Viollet, S., Franceschini, N.: The VODKA sensor: a bio-inspired hyperacute optical position sensing device. IEEE Sensors Journal 12(2), 315–324 (2012)
  40. Kirschfeld, K., Franceschini, N.: Optische Eigenschaften der Ommatidien im Komplexauge von Musca. Kybernetik 5, 47–52 (1968)
  41. Koenderink, J., van Doorn, A.: Facts on optic flow. Biological Cybernetics 56, 247–254 (1987)
  42. Landolt, O., Mitros, A.: Visual sensor with resolution enhancement by mechanical vibrations. Autonomous Robots 11(3), 233–239 (2001)
  43. Mahony, R., Corke, P., Hamel, T.: A dynamic image-based visual servo control using centroid and optic flow features. Journal of Dynamic Systems, Measurement, and Control 130(1), 1–12 (2008)
  44. Moeckel, R., Liu, S.C.: Motion detection circuits for a time-to-travel algorithm. In: IEEE International Symposium on Circuits and Systems (ISCAS), New Orleans, LA, USA, pp. 3079–3082 (2007)
  45. Mourikis, A.I., Trawny, N., Roumeliotis, S.I., Johnson, A.E., Ansar, A., Matthies, L.: Vision-aided inertial navigation for spacecraft entry, descent, and landing. IEEE Transactions on Robotics 25(2), 264–280 (2009)
  46. Netter, T., Franceschini, N.: A robotic aircraft that follows terrain using a neuromorphic eye. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2002), vol. 1, pp. 129–134 (2002)
  47. Orfanidis, S.J.: Introduction to Signal Processing. Prentice-Hall, Upper Saddle River (1995)
  48. Parkes, S., Dunstan, M., Matthews, D., Martin, I., Silva, V.: LIDAR-based GNC for planetary landing: simulation with PANGU. In: Proceedings of DASIA (Data Systems in Aerospace), pp. 18.1–18.12 (2003)
  49. Parkes, S., Martin, I., Dunstan, M., Matthews, D.: Planet surface simulation with PANGU. In: Proceedings of the 8th International Conference on Space Operations (SpaceOps) (2004)
  50. Parkes, S.M., Silva, V.: GNC sensors for planetary landers: a review. In: Proceedings of DASIA (Data Systems in Aerospace), pp. 1–9 (2002)
  51. Pichon, J., Blanes, C., Franceschini, N.: Visual guidance of a mobile robot equipped with a network of self-motion sensors. In: Mobile Robots IV, SPIE, vol. 1195, pp. 44–53 (1989)
  52. Roubieu, F., Expert, F., Boyron, M., Fuschlock, B., Viollet, S., Ruffier, F.: A novel 1-gram insect based device measuring visual motion along optical directions. In: Proceedings of the IEEE Sensors Conference, Limerick, Ireland, pp. 687–690 (2011)
  53. Roubieu, F.L., Serres, J., Franceschini, N., Ruffier, F., Viollet, S.: A fully-autonomous hovercraft inspired by bees: wall-following and speed control in straight and tapered corridors. In: IEEE International Conference on Robotics and Biomimetics (ROBIO), Guangzhou, China (2012)
  54. Roumeliotis, S., Johnson, A., Montgomery, J.: Augmenting inertial navigation with image-based motion estimation. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), vol. 4, pp. 4326–4333 (2002)
  55. Ruffier, F.: Pilote automatique biomimétique. Système générique inspiré du contrôle visuomoteur des insectes pour: le décollage, le suivi de terrain, la réaction au vent et l'atterrissage automatiques d'un micro-aéronef. Ph.D. thesis, INP Grenoble, France (2004)
  56. Ruffier, F., Expert, F.: Visual motion sensing onboard a 50-g helicopter flying freely under complex VICON-lighting conditions. In: Proceedings of the International Conference on Complex Medical Engineering, Kobe, Japan, pp. 634–639 (2012)
  57. Ruffier, F., Franceschini, N.: OCTAVE, a bioinspired visuo-motor control system for the guidance of micro-air vehicles. In: Rodriguez-Vazquez, A., Abbott, D., Carmona, R. (eds.) Proceedings of the SPIE Conference on Bioengineered and Bioinspired Systems, Maspalomas, Spain, vol. 5119, pp. 1–12 (2003)
  58. Ruffier, F., Franceschini, N.: Visually guided micro-aerial vehicle: automatic take off, terrain following, landing and wind reaction. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2004), Coimbra, Portugal (2004)
  59. Ruffier, F., Franceschini, N.: Optic flow regulation: the key to aircraft automatic guidance. Robotics and Autonomous Systems 50, 177–194 (2005)
  60. Ruffier, F., Viollet, S., Amic, S., Franceschini, N.: Bio-inspired optical flow circuits for the visual guidance of micro-air vehicles. In: Proceedings of the IEEE International Symposium on Circuits and Systems (ISCAS), Bangkok, Thailand, vol. 3, pp. 846–849 (2003)
  61. Schilstra, C., van Hateren, J.H.: Blowfly flight and optic flow. I. Thorax kinematics and flight dynamics. Journal of Experimental Biology 202(Pt. 11), 1481–1490 (1999)
  62. Shang, Y., Palmer, P.: The dynamic motion estimation of a lunar lander. In: Proceedings of the 21st International Symposium on Space Flight Dynamics (ISSFD), Toulouse, France (2009)
  63. Strandmoe, S., Jean-Marius, T., Trinh, S.: Toward a vision based autonomous planetary lander. AIAA Paper 99–4154 (1999)
  64. Tammero, L.F., Dickinson, M.H.: The influence of visual landscape on the free flight behavior of the fruit fly Drosophila melanogaster. Journal of Experimental Biology 205, 327–343 (2002)
  65. Tchernykh, V., Beck, M., Janschek, K.: An embedded optical flow processor for visual navigation using optical correlator technology. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, pp. 67–72 (2006)
  66. Trawny, N., Mourikis, A.I., Roumeliotis, S.I., Johnson, A.E., Montgomery, J.: Vision-aided inertial navigation for pin-point landing using observations of mapped landmarks. Journal of Field Robotics 24, 357–378 (2007)
  67. Valette, F., Ruffier, F., Viollet, S., Seidl, T.: Biomimetic optic flow sensing applied to a lunar landing scenario. In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2010), Anchorage, Alaska, pp. 2253–2260 (2010)
  68. Viollet, S., Franceschini, N.: Biologically-inspired visual scanning sensor for stabilization and tracking. In: Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, vol. 1, pp. 204–209 (1999a)
  69. Viollet, S., Franceschini, N.: Visual servo system based on a biologically inspired scanning sensor. In: Sensor Fusion and Decentralized Control in Robotics II, SPIE, vol. 3839, pp. 144–155 (1999b)
  70. Viollet, S., Franceschini, N.: Super-accurate visual control of an aerial minirobot. In: Autonomous Minirobots for Research and Edutainment (AMiRE), Paderborn, Germany, pp. 215–224. Heinz Nixdorf Institute (2001)
  71. Wagner, H.: Flight performance and visual control of flight of the free-flying housefly (Musca domestica L.) I. Organization of the flight motor. Philosophical Transactions of the Royal Society of London, Series B, Biological Sciences 312, 527–551 (1986)
  72. Watanabe, Y., Fabiani, P., Le Besnerais, G.: Simultaneous visual target tracking and navigation in a GPS-denied environment. In: Proceedings of the International Conference on Advanced Robotics (ICAR), Munich, Germany, pp. 1–6 (2009)
  73. Watanabe, Y., Lesire, C., Piquereau, A., Fabiani, P., Sanfourche, M., Le Besnerais, G.: The ONERA ReSSAC unmanned autonomous helicopter: visual air-to-ground target tracking in an urban environment. In: Proceedings of the American Helicopter Society 66th Annual Forum, Phoenix, AZ, USA (2010)

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  1. Aix-Marseille University, Marseille, France
  2. French Aerospace Laboratory (ONERA), Toulouse, France
