
Autonomous Robots, Volume 42, Issue 6, pp 1263–1280

Autonomous navigation of micro aerial vehicles using high-rate and low-cost sensors

  • Angel Santamaria-Navarro
  • Giuseppe Loianno
  • Joan Solà
  • Vijay Kumar
  • Juan Andrade-Cetto

Abstract

The combination of visual and inertial sensors for state estimation has recently attracted wide attention in the robotics community, especially in the aerial robotics field, due to the lightweight and complementary characteristics of the sensor data. However, most state estimation systems based on visual-inertial sensing impose severe processing requirements, which in many cases make them impractical. In this paper, we propose a simple, low-cost and high-rate method for state estimation enabling autonomous flight of micro aerial vehicles with a low computational burden. The proposed state estimator fuses observations from an inertial measurement unit, an optical-flow smart camera and a time-of-flight range sensor. The smart camera provides optical flow measurements at rates of up to 200 Hz, relieving the main processor of the computational bottleneck of image processing. To the best of our knowledge, this is the first example of extending the use of these smart cameras from hovering-like motions to odometry estimation, producing estimates that remain usable during flight times of several minutes. To validate the simplest algorithmic solution, we investigate the performance of two Kalman filters, in extended and error-state flavors, together with a large number of algorithm modifications advocated in earlier literature on visual-inertial odometry, showing that their impact on filter performance is minimal. To close the control loop, a non-linear controller operating on the special Euclidean group SE(3) drives a quadrotor platform in 3D space based on the estimated vehicle state, guaranteeing asymptotic stability of 3D position and heading. All estimation and control tasks are solved on board and in real time on a limited computational unit.
The proposed approach is validated through simulations and experimental results, which include comparisons with ground-truth data provided by a motion capture system. For the benefit of the community, we make the source code public.
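The fusion scheme summarized above can be sketched as a conventional Kalman filter in which the IMU drives a high-rate prediction step while the optical-flow and range measurements provide corrections. The reduced state [position, velocity], the constant noise values and all matrix shapes below are illustrative assumptions for the sketch, not the authors' actual filter (which also estimates attitude and IMU biases):

```python
# Minimal sketch of IMU + optical-flow + time-of-flight fusion.
# State, rates and noise levels are assumptions, not the paper's filter.
import numpy as np

class SimpleFusionKF:
    def __init__(self, dt=0.005):          # 200 Hz prediction rate
        self.dt = dt
        self.x = np.zeros(6)               # state: position (3) + velocity (3)
        self.P = np.eye(6) * 0.1           # state covariance
        self.F = np.eye(6)                 # constant-velocity transition model
        self.F[0:3, 3:6] = np.eye(3) * dt
        self.Q = np.eye(6) * 1e-4          # process noise (assumed)

    def predict(self, accel):
        """Propagate with an IMU acceleration measurement (world frame)."""
        self.x = self.F @ self.x
        self.x[3:6] += accel * self.dt     # acceleration enters as an input
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, H, R):
        """Generic Kalman correction for a linear measurement z = H x + noise."""
        y = z - H @ self.x                       # innovation
        S = H @ self.P @ H.T + R                 # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P

    def update_flow_velocity(self, v_xy):
        """Optical-flow smart camera: horizontal velocity correction."""
        H = np.zeros((2, 6)); H[0, 3] = H[1, 4] = 1.0
        self.update(v_xy, H, np.eye(2) * 1e-2)

    def update_range_height(self, z_height):
        """Time-of-flight range sensor: height correction."""
        H = np.zeros((1, 6)); H[0, 2] = 1.0
        self.update(np.array([z_height]), H, np.eye(1) * 1e-3)
```

In a hover at one meter, repeated range corrections pull the estimated height toward 1 m while the flow corrections keep the horizontal velocity estimate from drifting; the error-state variant discussed in the paper differs in how the attitude error is parametrized, not in this overall predict/update structure.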
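The position loop of a geometric controller on SE(3), in the spirit of the non-linear controller mentioned in the abstract, can likewise be sketched in a few lines. The gains, mass, and the omission of acceleration feed-forward and of the inner attitude/moment loop are simplifying assumptions of this sketch:

```python
# Sketch of the position loop of a geometric SE(3) controller:
# map position/velocity errors to a collective thrust and a desired attitude.
# Gains and mass are placeholder values, not the paper's.
import numpy as np

def se3_position_control(p, v, R, p_des, v_des, yaw_des,
                         kx=4.0, kv=2.0, m=0.5, g=9.81):
    """Return (collective thrust, desired rotation matrix)."""
    e3 = np.array([0.0, 0.0, 1.0])
    ex = p - p_des                     # position error
    ev = v - v_des                     # velocity error
    # desired force in the world frame (feed-forward acceleration omitted)
    f_des = -kx * ex - kv * ev + m * g * e3
    thrust = f_des @ (R @ e3)          # project onto current body z-axis
    # desired attitude: body z aligns with f_des, heading follows yaw_des
    b3 = f_des / np.linalg.norm(f_des)
    c1 = np.array([np.cos(yaw_des), np.sin(yaw_des), 0.0])
    b2 = np.cross(b3, c1); b2 /= np.linalg.norm(b2)
    b1 = np.cross(b2, b3)
    R_des = np.column_stack((b1, b2, b3))
    return thrust, R_des
```

At hover with zero errors this reduces to thrust = mg and an identity desired attitude; an inner loop (not shown) would then track R_des with body moments.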

Keywords

Micro aerial vehicles · Vision for robotics · Localization

Supplementary material

Supplementary material 1 (mp4 9320 KB)


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2017

Authors and Affiliations

  1. Institut de Robòtica i Informàtica Industrial, CSIC-UPC, Barcelona, Spain
  2. GRASP Lab, University of Pennsylvania, Philadelphia, USA
