
Numerical Simulation of Visually Guided Landing Based on a Honeybee Motion Model

  • A. Khamukhin

Abstract

We verified, by numerical simulation, the validity of a bio-inspired strategy for visually guided landing and its mathematical model proposed by M.V. Srinivasan et al. We studied the influence of the temporal discretization and of the maintained optical-flow value on the landing duration and its outcome (from smooth to crash). An algorithm for landing simulation was developed that takes into account the accepted assumptions of the model. Two formulas (sine and tangent) were derived to calculate the distance and the speed of the flying robot so that the optical flow remains constant at the given time steps. A limitation was found in the value of the optical flow itself: above a threshold value, the strategy leads to a hard touchdown or a crash (at near-zero distance the speed is not close to zero). It was shown that the threshold value of the optical flow decreases with increasing time step for both formulas. However, the sine formula has a significantly lower optical-flow threshold than the tangent formula. It was found that, at equal values of the optical flow, landing occurs faster with the sine formula; nevertheless, smooth landing with it ends at lower threshold values of the optical flow than with the tangent formula. As a result, by using a larger value of the optical flow, a faster smooth landing can be achieved with the tangent formula.
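The threshold behaviour described above can be illustrated with a minimal sketch of a constant-optical-flow descent. The simple explicit-Euler update used here, d_{k+1} = d_k − Ω·d_k·Δt, is an assumption for illustration only, not the paper's sine or tangent formulas: holding the optical flow (image expansion rate) Ω constant commands a speed proportional to the remaining distance, and when Ω·Δt becomes too large the discrete step overshoots the surface, reproducing the hard-touchdown regime.

```python
def simulate_landing(d0, omega, dt, d_stop=1e-3, max_steps=10**6):
    """Explicit-Euler sketch of a constant-optical-flow descent.

    Holding the optical flow at `omega` means commanding a speed
    v = omega * d, so the discrete update is d_{k+1} = d_k - omega*d_k*dt.
    This is an illustrative simplification, not the paper's sine/tangent
    formulas.  Returns (final distance, last commanded speed, step count).
    """
    d, v, steps = d0, omega * d0, 0
    while d > d_stop and steps < max_steps:
        v = omega * d      # speed that keeps the optical flow constant
        d -= v * dt        # one discrete time step
        steps += 1
    return d, v, steps

# Small omega*dt: commanded speed decays with distance -> smooth touchdown.
d, v, _ = simulate_landing(d0=2.0, omega=0.5, dt=0.01)
print(f"smooth: d={d:.4f}, v={v:.4f}")

# omega*dt > 1: the Euler step overshoots the surface, so the speed is
# far from zero at near-zero distance -- the crash regime.
d, v, _ = simulate_landing(d0=2.0, omega=50.0, dt=0.03)
print(f"crash:  d={d:.4f}, v={v:.4f}")
```

The sketch shows why the threshold decreases with the time step: the crash condition depends on the product Ω·Δt, so a coarser discretization tolerates only a smaller optical-flow value.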

Keywords

Bio-inspired landing · Optical flow · Numerical simulation · Computer vision · Flying robots


Notes

Acknowledgements

This work was funded within the framework of the Strategic Program on National Research Tomsk Polytechnic University Competitiveness Enhancement in the Group of Top Level World Research and Academic Institutions (Project VIU-ORPA-79/2018).

References

  1. Thurrowgood, S., Moore, R.J.D., Soccol, D., Knight, M., Srinivasan, M.V.: A biologically inspired, vision-based guidance system for automatic landing of a fixed-wing aircraft. J. Field Robot. 31(4), 699–727 (2014). https://doi.org/10.1002/rob.21527
  2. Zhang, H., Li, L., McCray, D.L., Scheiding, S., Naples, N.J., Gebhardt, A., Risse, S., Eberhardt, R., Tünnermann, A., Yi, A.Y.: Development of a low cost high precision three-layer 3D artificial compound eye. Opt. Express 21(19), 22232–22245 (2013). https://doi.org/10.1364/OE.21.022232
  3. Srinivasan, M.V., Lehrer, M., Zhang, S.W., Horridge, G.A.: How honeybees measure their distance from objects of unknown size. J. Comp. Physiol. A 165(5), 605–613 (1989). https://doi.org/10.1007/BF00610992
  4. Baird, E., Boeddeker, N., Ibbotson, M.R., Srinivasan, M.V.: A universal strategy for visually guided landing. Proc. Natl. Acad. Sci. 110(46), 18686–18691 (2013). https://doi.org/10.1073/pnas.1314311110
  5. Srinivasan, M.V., Lehrer, M., Kirchner, W.H., Zhang, S.W.: Range perception through apparent image speed in freely flying honeybees. Vis. Neurosci. 6(5), 519–535 (1991)
  6. Srinivasan, M.V., Zhang, S.W., Witney, K.: Visual discrimination of pattern orientation by honeybees: performance and implications for 'cortical' processing. Philos. Trans. Biol. Sci. 343(1304), 199–210 (1994)
  7. Zhang, S.W., Srinivasan, M.V.: Prior experience enhances pattern discrimination in insect vision. Nature 368, 330–332 (1994)
  8. Zhang, S.W., Srinivasan, M.V., Collett, T.: Convergent processing in honeybee vision: multiple channels for the recognition of shape. Proc. Natl. Acad. Sci. 92(7), 3029–3031 (1995)
  9. Srinivasan, M.V., Zhang, S.W., Lehrer, M., Collett, T.: Honeybee navigation en route to the goal: visual flight control and odometry. J. Exp. Biol. 199(Pt 1), 237–244 (1996)
  10. Srinivasan, M.V., Chahl, J.S., Zhang, S.W.: Robot navigation by visual dead-reckoning: inspiration from insects. Int. J. Pattern Recogn. Artif. Intell. 11(1), 35–47 (1997). https://doi.org/10.1142/S0218001497000032
  11. Srinivasan, M.V., Zhang, S.W.: Visual control of honeybee flight. In: Lehrer, M. (ed.) Orientation and Communication in Arthropods, pp. 95–113. Birkhäuser, Basel (1997). https://doi.org/10.1007/978-3-0348-8878-3
  12. Srinivasan, M.V., Zhang, S.W., Bidwell, J.: Visually mediated odometry in honeybees. J. Exp. Biol. 200(19), 2513–2522 (1997)
  13. Srinivasan, M.V., Chahl, J.S., Weber, K., Venkatesh, S., Nagle, M.G., Zhang, S.W.: Robot navigation inspired by principles of insect vision. In: Zelinsky, A. (ed.) Field and Service Robotics, pp. 12–16. Springer, London (1998). https://doi.org/10.1007/978-1-4471-1273-03
  14. Srinivasan, M.V., Zhang, S.W., Lehrer, M.: Honeybee navigation: odometry with monocular input. Animal Behav. 56(5), 1245–1259 (1998)
  15. Cheng, K., Srinivasan, M.V., Zhang, S.W.: Error is proportional to distance measured by honeybees: Weber's law in the odometer. Animal Cogn. 2(1), 11–16 (1999). https://doi.org/10.1007/s100710050020
  16. Srinivasan, M.V., Chahl, J.S., Weber, K., Venkatesh, S., Nagle, M.G., Zhang, S.W.: Robot navigation inspired by principles of insect vision. Robot. Auton. Syst. 26(2–3), 203–216 (1999). https://doi.org/10.1016/S0921-8890(98)00069-4
  17. Srinivasan, M.V., Zhang, S.W., Berry, J., Cheng, K., Zhu, H.: Honeybee navigation: linear perception of short distances travelled. J. Comp. Physiol. A 185(3), 239–245 (1999). https://doi.org/10.1007/s003590050383
  18. Srinivasan, M.V., Zhang, S.W., Altwein, M., Tautz, J.: Honeybee navigation: nature and calibration of the odometer. Science 287, 851–853 (2000). https://doi.org/10.1126/science.287.5454.851
  19. Srinivasan, M.V., Zhang, S.W.: Visual navigation in flying insects. Int. Rev. Neurobiol. 44, 67–92 (2000). https://doi.org/10.1016/S0074-7742(08)60738-2
  20. Srinivasan, M.V., Zhang, S.W., Chahl, J.S., Barth, E., Venkatesh, S.: How honeybees make grazing landings on flat surfaces. Biol. Cybern. 83(3), 171–183 (2000). https://doi.org/10.1007/s004220000162
  21. Esch, H., Zhang, S.W., Srinivasan, M.V., Tautz, J.: Honeybee dances communicate distances measured by optic flow. Nature 411, 581–583 (2001). https://doi.org/10.1038/35079072
  22. Si, A., Srinivasan, M.V., Zhang, S.W.: Honeybee navigation: properties of the visually driven 'odometer'. J. Exp. Biol. 206, 1265–1273 (2003). https://doi.org/10.1242/jeb.00236
  23. Chahl, J.S., Srinivasan, M.V., Zhang, S.W.: Landing strategies in honeybees and applications to uninhabited airborne vehicles. Int. J. Robot. Res. 23(2), 101–110 (2004). https://doi.org/10.1177/0278364904041320
  24. Srinivasan, M.V., Zhang, S.W.: Motion cues in insect vision and navigation. In: Chalupa, L., Werner, J. (eds.) The Visual Neurosciences, pp. 1193–1202. MIT Press, Cambridge (2004)
  25. Baird, E., Srinivasan, M.V., Zhang, S., Cowling, A.: Visual control of flight speed in honeybees. J. Exp. Biol. 208, 3895–3905 (2005). https://doi.org/10.1242/jeb.01818
  26. Srinivasan, M.V., Zhang, S.W., Reinhard, J.: Small brains, smart minds: vision, perception, navigation and 'cognition' in insects. In: Warrant, E., Nilsson, D. (eds.) Invertebrate Vision, pp. 462–493. Cambridge University Press, Cambridge (2006)
  27. Evangelista, C., Kraft, P., Dacke, M., Reinhard, J., Srinivasan, M.V.: The moment before touchdown: landing manoeuvres of the honeybee Apis mellifera. J. Exp. Biol. 213(2), 262–270 (2010). https://doi.org/10.1242/jeb.037465
  28. Srinivasan, M.V.: Honeybees as a model for the study of visually guided flight, navigation, and biologically inspired robotics. Physiol. Rev. 91(2), 413–460 (2011). https://doi.org/10.1152/physrev.00005.2010
  29. Vatani, N.N., Borges, P., Roberts, J.M., Srinivasan, M.V.: On the use of optical flow for scene change detection and description. J. Intell. Robot. Syst. 74(3–4), 817–846 (2014). https://doi.org/10.1007/s10846-013-9840-8
  30. Taylor, G.J., Paulk, A.C., Pearson, T.W.J., Moore, R.J., Stacey, J.A., Ball, D., Swinderen, B., Srinivasan, M.V.: Insects modify their behaviour depending on the feedback sensor used when walking on a trackball in virtual reality. J. Exp. Biol. 218, 3118–3127 (2015). https://doi.org/10.1242/jeb.125617
  31. Van De Poll, M.N., Zajaczkowski, E.L., Taylor, G.J., Srinivasan, M.V., Swinderen, B.: Using an abstract geometry in virtual reality to explore choice behaviour: visual flicker preferences in honeybees. J. Exp. Biol. 218, 3448–3460 (2015). https://doi.org/10.1242/jeb.125138
  32. Strydom, R., Singh, S.P.N., Srinivasan, M.V.: Biologically inspired interception: a comparison of pursuit and constant bearing strategies in the presence of sensorimotor delay. In: 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 2442–2448 (2015). https://doi.org/10.1109/ROBIO.2015.7419705
  33. Strydom, R., Thurrowgood, S., Denuelle, A., Srinivasan, M.V.: TCM: a vision-based algorithm for distinguishing between stationary and moving objects irrespective of depth contrast from a UAS. Int. J. Adv. Robot. Syst. 13(3), 1–17 (2016). https://doi.org/10.5772/62846
  34. Falanga, D., Zanchettin, A., Simovic, A., Delmerico, J., Scaramuzza, D.: Vision-based autonomous quadrotor landing on a moving platform. In: 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), pp. 200–207 (2017). https://doi.org/10.1109/SSRR.2017.8088164
  35. Zhang, Y., Yu, Y., Jia, S., Wang, X.: Autonomous landing on ground target of UAV by using image-based visual servo control. In: 2017 36th Chinese Control Conference (CCC), pp. 11204–11209 (2017). https://doi.org/10.23919/ChiCC.2017.8029145
  36. Khamukhin, A.: A simple algorithm for distance estimation without radar and stereo vision based on the bionic principle of bee eyes. In: IOP Conference Series: Materials Science and Engineering, vol. 177(1), pp. 1–8 (2017). https://doi.org/10.1088/1757-899X/177/1/012028

Copyright information

© Springer Nature B.V. 2018

Authors and Affiliations

  1. School of Computer Science and Robotics, National Research Tomsk Polytechnic University, Tomsk, Russia
