Journal of Intelligent & Robotic Systems, Volume 82, Issue 2, pp 277–299

Planar-Based Visual Inertial Navigation: Observability Analysis and Motion Estimation

  • Ghazaleh Panahandeh
  • Seth Hutchinson
  • Peter Händel
  • Magnus Jansson

Abstract

In this paper, we address the problem of ego-motion estimation by fusing visual and inertial information. The hardware consists of an inertial measurement unit (IMU) and a monocular camera. The camera provides visual observations in the form of features on a horizontal plane. By incorporating the geometric constraint that the features lie on this plane into the visual and inertial data, we propose a novel closed-form measurement model for this system. Our first contribution is an observability analysis of the proposed planar-based visual inertial navigation system (VINS). In particular, we prove that the system has only three unobservable states, corresponding to global translations parallel to the plane and rotation around the gravity vector. Hence, compared to a general VINS, an advantage of using features on the horizontal plane is that the vertical translation along the normal of the plane becomes observable. As the second contribution, we present a state-space formulation for pose estimation in the analyzed system and solve it via a modified unscented Kalman filter (UKF). Finally, the findings of the theoretical analysis and the 6-DoF motion estimation are validated both in simulations and with experimental data.
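The paper's estimator is a modified UKF. At the heart of any UKF is the unscented transform: a Gaussian state is represented by deterministically chosen sigma points, which are pushed through a nonlinear measurement function and re-combined into a predicted measurement mean and covariance. The sketch below illustrates that generic mechanism only; the measurement function `h`, the state layout `[tilt, yaw, height]`, and all numerical values are hypothetical stand-ins for the paper's planar-feature model, not the authors' actual formulation.

```python
import numpy as np

def unscented_transform(mu, P, f, alpha=1.0, beta=2.0, kappa=0.0):
    """Propagate a Gaussian N(mu, P) through a nonlinear function f
    using the standard scaled sigma-point construction."""
    n = mu.size
    lam = alpha**2 * (n + kappa) - n
    # Columns of the scaled Cholesky factor give the sigma-point offsets.
    S = np.linalg.cholesky((n + lam) * P)
    sigmas = [mu] + [mu + S[:, i] for i in range(n)] \
                  + [mu - S[:, i] for i in range(n)]
    # Mean and covariance weights (they sum to 1 for the mean).
    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigmas])      # transformed sigma points
    y_mean = Wm @ Y
    diff = Y - y_mean
    y_cov = (Wc[:, None] * diff).T @ diff
    return y_mean, y_cov

# Toy measurement: distance to a planar feature along a tilted camera
# ray, a hypothetical stand-in for a planar geometric constraint.
h = lambda x: np.array([x[2] / max(np.cos(x[0]), 1e-6)])

mu = np.array([0.1, 0.0, 1.5])        # [tilt, yaw, height], illustrative
P = np.diag([0.01, 0.01, 0.04])       # state covariance, illustrative
z_mean, z_cov = unscented_transform(mu, P, h)
```

Because the sigma points capture second-order effects of the nonlinearity, the predicted mean and covariance are generally more accurate than an EKF's first-order linearization, which is the usual motivation for choosing a UKF in visual-inertial estimation.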

Keywords

Visual-inertial navigation · Motion estimation · Observability analysis



Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  • Ghazaleh Panahandeh (1)
  • Seth Hutchinson (2)
  • Peter Händel (1)
  • Magnus Jansson (1)

  1. ACCESS Linnaeus Centre, School of Electrical Engineering, KTH Royal Institute of Technology, Stockholm, Sweden
  2. University of Illinois at Urbana-Champaign, Urbana, USA
