Numerical Simulation of Visually Guided Landing Based on a Honeybee Motion Model
We verified by numerical simulation the validity of a bio-inspired strategy for visually guided landing and of its mathematical model proposed by M.V. Srinivasan et al. We studied how the temporal discretization and the value of the maintained optical flow influence the landing duration and the landing outcome (from smooth to crash). An algorithm for landing simulation was developed taking into account the assumptions adopted in the model. Two formulas (sine and tangent) were derived to calculate the distance and the speed of the flying robot so that the optical flow stays constant over the given time steps. A limitation was found in the value of the optical flow itself: above a threshold value, the strategy leads to a hard touchdown or a crash (at near-zero distance the speed is not close to zero). It was shown that this threshold value of the optical flow decreases as the time step increases for both formulas. However, calculating the distance with the sine formula yields a significantly lower threshold value of the optical flow than calculating it with the tangent formula. It was also found that, for equal values of the optical flow, landing is faster with the sine formula; nevertheless, its smooth landing ends at lower threshold values of the optical flow than with the tangent formula. Consequently, by using a larger value of the optical flow, a faster smooth landing can be achieved with the tangent formula.
Keywords: Bio-inspired landing · Optical flow · Numerical simulation · Computer vision · Flying robots
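For readers who want to experiment with the discretization effect described above, the sketch below simulates a constant-optical-flow descent in discrete time. It is a hypothetical illustration only: it uses a plain Euler update with the commanded speed proportional to the remaining distance (v = ω·h), not the sine or tangent formulas derived in the paper, and the function name, parameter values, and the 0.1 m/s smoothness criterion are arbitrary choices made for the example. In this simplified model the distance shrinks by the factor (1 − ωΔt) per step, the discrete counterpart of the exponential descent implied by a constant optical flow, so setpoints approaching 1/Δt end in a hard touchdown and setpoints beyond it in a crash, which matches the qualitative trend reported above.

```python
# Minimal discrete-time sketch of a constant-optical-flow descent (illustrative
# only; a plain Euler update, not the sine/tangent formulas of the paper).

def simulate_landing(h0=10.0, omega=0.5, dt=0.1, h_stop=0.01, max_steps=100_000):
    """Descend from distance h0 [m] while trying to keep the optical flow
    omega = v / h [1/s] constant; return (steps, final_distance, final_speed)."""
    h = h0
    v = omega * h                      # initial commanded speed
    for k in range(max_steps):
        if h <= h_stop:                # close enough to the surface: stop
            return k, h, v
        v = omega * h                  # speed commanded from the remaining distance
        h -= v * dt                    # one control interval (Euler step)
        if h < 0.0:                    # overshot the surface at nonzero speed
            return k + 1, h, v         # -> crash in this simplified model
    return max_steps, h, v


if __name__ == "__main__":
    dt = 0.1
    for omega in (0.5, 5.0, 9.0, 11.0):            # setpoints around the 1/dt limit
        steps, h, v = simulate_landing(omega=omega, dt=dt)
        smooth = (h >= 0.0) and (v < 0.1)          # ad-hoc smoothness criterion
        print(f"omega={omega:5.1f} 1/s  dt={dt} s  steps={steps:6d}  "
              f"final h={h:8.4f} m  final v={v:7.4f} m/s  "
              f"-> {'smooth' if smooth else 'hard touchdown / crash'}")
```

With Δt = 0.1 s, setpoints of 0.5 and 5 1/s land smoothly in this sketch, 9 1/s reaches the surface with appreciable residual speed (hard touchdown), and 11 1/s, above 1/Δt, crashes.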
This work was funded within the framework of the realization of the Strategic Program on National Research Tomsk Polytechnic University Competitiveness Enhancement in the Group of Top-Level World Research and Academic Institutions (Project VIU-ORPA-79/2018).
- 2. Zhang, H., Li, L., McCray, D.L., Scheiding, S., Naples, N.J., Gebhardt, A., Risse, S., Eberhardt, R., Tünnermann, A., Yi, A.Y.: Development of a low cost high precision three-layer 3D artificial compound eye. Opt. Express 21(19), 22232–22245 (2013). https://doi.org/10.1364/OE.21.022232
- 9. Srinivasan, M.V., Zhang, S.W., Lehrer, M., Collett, T.: Honeybee navigation en route to the goal: Visual flight control and odometry. J. Exper. Biol. 199(Pt 1), 237–244 (1996)
- 12. Srinivasan, M.V., Zhang, S.W., Bidwell, J.: Visually mediated odometry in honeybees. J. Exper. Biol. 200(19), 2513–2522 (1997)
- 13. Srinivasan, M.V., Chahl, J.S., Weber, K., Venkatesh, S., Nagle, M.G., Zhang, S.W.: Robot navigation inspired by principles of insect vision. In: Zelinsky, A. (ed.) Field and Service Robotics, pp. 12–16. Springer, London (1998). https://doi.org/10.1007/978-1-4471-1273-0_3
- 24. Srinivasan, M.V., Zhang, S.W.: Motion cues in insect vision and navigation. In: Chalupa, L., Werner, J. (eds.) The Visual Neurosciences, pp. 1193–1202. MIT Press, Cambridge (2004)
- 26. Srinivasan, M.V., Zhang, S.W., Reinhard, J.: Small brains, smart minds: Vision, perception, navigation and 'cognition' in insects. In: Warrant, E., Nilsson, D. (eds.) Invertebrate Vision, pp. 462–493. Cambridge University Press, Cambridge (2006)
- 30. Taylor, G.J., Paulk, A.C., Pearson, T.W.J., Moore, R.J., Stacey, J.A., Ball, D., van Swinderen, B., Srinivasan, M.V.: Insects modify their behaviour depending on the feedback sensor used when walking on a trackball in virtual reality. J. Exper. Biol. 218, 3118–3127 (2015). https://doi.org/10.1242/jeb.125617
- 32. Strydom, R., Singh, S.P.N., Srinivasan, M.V.: Biologically inspired interception: A comparison of pursuit and constant bearing strategies in the presence of sensorimotor delay. In: 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO), pp. 2442–2448 (2015). https://doi.org/10.1109/ROBIO.2015.7419705
- 34. Falanga, D., Zanchettin, A., Simovic, A., Delmerico, J., Scaramuzza, D.: Vision-based autonomous quadrotor landing on a moving platform. In: 2017 IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), pp. 200–207 (2017). https://doi.org/10.1109/SSRR.2017.8088164
- 35. Zhang, Y., Yu, Y., Jia, S., Wang, X.: Autonomous landing on ground target of UAV by using image-based visual servo control. In: 2017 36th Chinese Control Conference (CCC), pp. 11204–11209 (2017). https://doi.org/10.23919/ChiCC.2017.8029145