Realtime Edge Based Visual Inertial Odometry for MAV Teleoperation in Indoor Environments

Abstract

A working solution for control and teleoperation of Micro Aerial Vehicles (MAVs) using a frontal camera and an inertial measurement unit as the sole sensors is presented. The system extends an edge-based visual odometry algorithm to integrate inertial measurements. A mixed tightly-loosely coupled approach is used, taking advantage of each sensor in this minimalistic setup while keeping complexity low. The system runs entirely on board a MAV, providing a semidense output that is more useful for navigation than the sparse maps generated by most feature-based systems. To the best of the authors' knowledge, this is the first semidense VO method running fully on board a MAV for vision-in-the-loop control. An extensive evaluation of the method is presented using the EuRoC MAV dataset, which specifically targets MAV navigation in realistic situations. Some of the practical issues of teleoperation are also addressed, in particular how data is transmitted and presented to the user. Finally, real-life experiments illustrate the performance of the complete system and the teleoperation interface.
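The loosely coupled side of a mixed fusion scheme like the one summarized above can be illustrated with a minimal sketch: the IMU gyroscope propagates an orientation estimate at high rate, and the slower visual odometry estimate corrects the accumulated drift via a complementary filter. This is an illustrative simplification only, not the paper's actual formulation; the function names, the single-axis (yaw) state, and the blending gain `k = 0.98` are assumptions chosen for clarity.

```python
def imu_predict(theta_prev, gyro_z, dt):
    """Propagate yaw by integrating the gyroscope rate (rad/s) over dt (s)."""
    return theta_prev + gyro_z * dt

def fuse_orientation(theta_vo, theta_imu, k=0.98):
    """Complementary filter: trust the IMU at high frequency (gain k),
    and let the visual odometry estimate slowly pull out gyro drift."""
    return k * theta_imu + (1.0 - k) * theta_vo

# One fusion step: predict with the gyro, then correct with the VO estimate.
theta = 0.0
theta = imu_predict(theta, gyro_z=0.1, dt=0.01)           # IMU prediction
theta = fuse_orientation(theta_vo=0.0008, theta_imu=theta)  # VO correction
```

In a tightly coupled stage, by contrast, raw inertial terms would enter the visual alignment cost directly rather than being blended at the pose level; the filter above only captures the loosely coupled half of the design.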



Author information

Correspondence to Juan José Tarrio.

Electronic supplementary material

Below is the link to the electronic supplementary material.

(MP4 51.7 MB)

About this article

Cite this article

Tarrio, J.J., Pedre, S.: Realtime Edge Based Visual Inertial Odometry for MAV Teleoperation in Indoor Environments. J. Intell. Robot. Syst. 90, 235–252 (2018). https://doi.org/10.1007/s10846-017-0670-y

Keywords

  • Micro aerial vehicles
  • Edge-based visual odometry
  • Teleoperation
  • Embedded processing