A Spline-Based Trajectory Representation for Sensor Fusion and Rolling Shutter Cameras

Abstract

The use of multiple sensors for ego-motion estimation is an approach often used to provide more accurate and robust results. However, when representing ego-motion as a discrete series of poses, fusing information from unsynchronized sensors is not straightforward. The framework described in this paper aims to provide a unified solution for solving ego-motion estimation problems involving high-rate unsynchronized devices. Instead of a discrete-time pose representation, we present a continuous-time formulation that makes use of cumulative cubic B-splines parameterized in the Lie algebra of the group \(\mathbb{SE}(3)\). This trajectory representation has several advantages for sensor fusion: (1) it has local control, which enables sliding-window implementations; (2) it is \(C^2\) continuous, allowing predictions of inertial measurements; (3) it closely matches torque-minimal motions; (4) it has no singularities when representing rotations; (5) it easily handles measurements from multiple sensors arriving at different times when timestamps are available; and (6) it deals with rolling shutter cameras naturally. We apply this continuous-time framework to visual–inertial simultaneous localization and mapping and show that it can also be used to calibrate the entire system.
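The core construction from the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it evaluates a cumulative cubic B-spline over four consecutive \(\mathbb{SE}(3)\) control poses using the standard uniform cumulative basis matrix. The names `C_TILDE` and `spline_pose` are ours, and for brevity it uses SciPy's dense `expm`/`logm` instead of the closed-form SE(3) exponential and logarithm maps a real system would use.

```python
import numpy as np
from scipy.linalg import expm, logm

# Cumulative basis matrix for a uniform cubic B-spline: row j holds the
# polynomial coefficients of the cumulative basis function B~_j(u), so
# that B~(u) = C_TILDE @ [1, u, u^2, u^3] with u in [0, 1).
C_TILDE = np.array([[6.0, 0.0,  0.0,  0.0],
                    [5.0, 3.0, -3.0,  1.0],
                    [1.0, 3.0,  3.0, -2.0],
                    [0.0, 0.0,  0.0,  1.0]]) / 6.0


def spline_pose(control_poses, u):
    """Evaluate the cumulative cubic B-spline at parameter u in [0, 1),
    given four consecutive 4x4 control poses in SE(3):

        T(u) = T_0 * prod_{j=1..3} exp( B~_j(u) * log(T_{j-1}^{-1} T_j) )
    """
    b = C_TILDE @ np.array([1.0, u, u**2, u**3])
    pose = control_poses[0].copy()
    for j in range(1, 4):
        # Relative twist between neighbouring control poses (Lie algebra).
        rel = np.linalg.inv(control_poses[j - 1]) @ control_poses[j]
        omega = np.real(logm(rel))
        pose = pose @ expm(b[j] * omega)
    return pose
```

As a quick sanity check of the local-control formulation: with control poses that are pure translations 0, 1, 2, 3 along x, the cumulative basis functions sum to \(1 + u\), so `spline_pose(..., 0.5)` returns a pose with x-translation 1.5 and an identity rotation.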

References

  • Agarwal, S., & Mierle, K. (2012). Ceres solver: Tutorial & reference. California: Google Inc.

  • Alahi, A., Ortiz, R. & Vandergheynst, P. (2012). FREAK: Fast retina keypoint. In Conference on Computer Vision and Pattern Recognition.

  • Anderson, S. & Barfoot, T. D. (2013). Towards relative continuous-time SLAM. In IEEE Conference on Robotics and Automation.

  • Baker, S., Bennett, E. P., Kang, S. B. & Szeliski, R. (2010). Removing rolling shutter wobble. In Conference on Computer Vision and Pattern Recognition.

  • Bibby, C. & Reid, I. (2010). A hybrid SLAM representation for dynamic marine environments. In International Conference on Robotics and Automation.

  • de Boor, C. (1972). On calculating with B-splines. Journal of Approximation Theory, 6, 50–62.

  • Comport, A. I., Malis, E. & Rives, P. (2007). Accurate quadrifocal tracking for robust 3D visual odometry. In International Conference on Robotics and Automation.

  • Cox, M. G. (1972). The numerical evaluation of B-splines. IMA Journal of Applied Mathematics, 10(2), 134–149.

  • Crouch, P., Kun, G., & Leite, F. S. (1999). The De Casteljau algorithm on Lie groups and spheres. Journal of Dynamical and Control Systems, 5(3), 397–429.

  • Dam, E. B., Koch, M. & Lillholm, M. (1998). Quaternions, interpolation and animation. Technical Report DIKU-TR-98/5, University of Copenhagen, Department of Computer Science.

  • Davison, A. J. (2003). Real-time simultaneous localisation and mapping with a single camera. In International Conference on Computer Vision.

  • Furgale, P., Barfoot, T.D. & Sibley, G. (2012). Continuous-time batch estimation using temporal basis functions. In International Conference on Robotics and Automation.

  • Hedborg, J., Forssen, P., Felsberg, M. & Ringaby, E. (2012). Rolling shutter bundle adjustment. In Conference on Computer Vision and Pattern Recognition.

  • Jia, C. & Evans, B. L. (2012). Probabilistic 3-D motion estimation for rolling shutter video rectification from visual and inertial measurements. In International Workshop on Multimedia Signal Processing.

  • Jones, E., Vedaldi, A. & Soatto, S. (2007). Inertial structure from motion with autocalibration. In ICCV Workshop on Dynamical Vision.

  • Kelly, J., & Sukhatme, G. S. (2010). Visual-inertial sensor fusion: Localization, mapping and sensor-to-sensor self-calibration. International Journal of Robotics Research, 30(1), 56–79.

  • Kim, M. J., Kim, M. S. & Shin, S. (1995a). A \(C^2\)-continuous B-spline quaternion curve interpolating a given sequence of solid orientations. In Computer Animation, pp. 72–81.

  • Kim, M. J., Kim, M. S. & Shin, S. (1995b). A general construction scheme for unit quaternion curves with simple high order derivatives. In SIGGRAPH, pp. 369–376.

  • Klein, G. & Murray, D. (2007). Parallel tracking and mapping for small AR workspaces. In International Symposium on Mixed and Augmented Reality.

  • Klein, G. & Murray, D. (2008). Improving the agility of keyframe-based SLAM. In European Conference on Computer Vision.

  • Klein, G. & Murray, D. (2009). Parallel tracking and mapping on a camera phone. In International Symposium on Mixed and Augmented Reality.

  • Lovegrove, S., Patron-Perez, A. & Sibley, G. (2013). Spline fusion: A continuous-time representation for visual-inertial fusion with application to rolling shutter cameras. In British Machine Vision Conference.

  • Martull, S., Martorell, M. P. & Fukui, K. (2012). Realistic CG stereo image dataset with ground truth disparity maps. In ICPR workshop.

  • Mei, C., Sibley, G., Cummins, M., Newman, P., & Reid, I. (2010). RSLAM: A system for large-scale mapping in constant-time using stereo. International Journal of Computer Vision, 94, 1–17.

  • Meingast, M., Geyer, C. & Sastry, S. (2005). Geometric models for rolling shutter cameras. In OmniVis Workshop.

  • Mirzaei, F. M., & Roumeliotis, S. I. (2008). A Kalman filter-based algorithm for IMU-camera calibration: Observability analysis and performance evaluation. IEEE Transactions on Robotics, 24(5), 1143–1156.

  • Montiel, J., Civera, J. & Davison, A. J. (2006). Unified inverse depth parametrization for monocular SLAM. In Robotics: Science and Systems.

  • Newcombe, R. A., Lovegrove, S. J. & Davison, A. J. (2011). DTAM: Dense tracking and mapping in real-time. In International Conference on Computer Vision.

  • Nuetzi, G., Weiss, S., Scaramuzza, D. & Siegwart, R. (2010). Fusion of IMU and vision for absolute scale estimation in monocular SLAM. In International Conference on Unmanned Aerial Vehicles.

  • Pietzsch, T. (2008). Efficient feature parameterisation for visual SLAM using inverse depth bundles. In British Machine Vision Conference.

  • Qin, K. (2000). General matrix representations for B-splines. The Visual Computer, 16(3–4), 177–186.

  • Shoemake, K. (1985). Animating rotation with quaternion curves. In SIGGRAPH, pp. 245–254.

  • Shoemake, K. (1987). Quaternion calculus and fast animation. In SIGGRAPH Course Notes.

  • Strasdat, H. (2012). Local accuracy and global consistency for efficient visual SLAM. Ph.D. thesis, Imperial College London.

  • Strasdat, H., Montiel, J. M. M. & Davison, A. (2010). Scale drift-aware large scale monocular SLAM. In Robotics: Science and Systems.

  • Strasdat, H., Davison, A. J., Montiel, J. M. M. & Konolige, K. (2011). Double window optimisation for constant time visual slam. In International Conference on Computer Vision.

Acknowledgments

This work was made possible by generous support from NSF MRI grant 1337722, Toyota Motor Engineering & Manufacturing North America, Inc., and Google, Inc.

Author information

Corresponding author

Correspondence to Alonso Patron-Perez.

Additional information

Communicated by Tilo Burghardt, Majid Mirmehdi, Walterio Mayol, Dima Damen.

About this article

Cite this article

Patron-Perez, A., Lovegrove, S. & Sibley, G. A Spline-Based Trajectory Representation for Sensor Fusion and Rolling Shutter Cameras. Int J Comput Vis 113, 208–219 (2015). https://doi.org/10.1007/s11263-015-0811-3

Keywords

  • Sensor fusion
  • Visual–inertial
  • SLAM
  • Rolling shutter
  • Calibration