
Visual-Inertial 2D Feature Tracking based on an Affine Photometric Model

Chapter in: Developments in Medical Image Processing and Computational Vision

Abstract

The robust tracking of point features throughout an image sequence is a fundamental stage in many computer vision algorithms (e.g. visual modelling and object tracking). In most cases, this tracking is realised by a feature detection step followed by a re-identification of the same feature point based on some variant of a template matching algorithm. Without any auxiliary knowledge about the movement of the camera, current tracking techniques are robust only for relatively moderate frame-to-frame feature displacements. This paper presents a framework for a visual-inertial feature tracking scheme in which images and measurements of an inertial measurement unit (IMU) are fused in order to allow a wider range of camera movements. The inertial measurements are used to estimate the visual appearance of a feature's local neighbourhood based on an affine photometric warping model.
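
To make the appearance prediction concrete, the following is a minimal Python sketch, assuming an affine geometric warp of the feature's local neighbourhood combined with a photometric gain and bias; the function name, the warp parametrisation and all symbols are illustrative and not taken from the chapter.

    # Minimal sketch (illustrative, not the chapter's implementation): predict the
    # appearance of a feature's local neighbourhood with an affine photometric model.
    # The 2x2 matrix A and offset d describe the geometric warp (e.g. initialised
    # from an IMU-predicted inter-frame rotation); gain and bias model the
    # photometric change of the patch intensities.
    import numpy as np

    def warp_patch_affine_photometric(image, centre, half_size, A, d, gain, bias):
        """Sample a warped patch around `centre` (x, y) from a grayscale image.

        Patch coordinates u in [-half_size, half_size]^2 are mapped to image
        coordinates p = centre + A @ u + d, sampled bilinearly, and the
        intensities are transformed as gain * I(p) + bias.
        """
        h, w = image.shape
        coords = np.arange(-half_size, half_size + 1, dtype=float)
        uy, ux = np.meshgrid(coords, coords, indexing="ij")
        # Geometric part of the warp: p = centre + A u + d
        px = centre[0] + A[0, 0] * ux + A[0, 1] * uy + d[0]
        py = centre[1] + A[1, 0] * ux + A[1, 1] * uy + d[1]
        # Bilinear interpolation (coordinates clipped to the image border)
        x0 = np.clip(np.floor(px).astype(int), 0, w - 2)
        y0 = np.clip(np.floor(py).astype(int), 0, h - 2)
        fx, fy = px - x0, py - y0
        patch = ((1 - fx) * (1 - fy) * image[y0, x0]
                 + fx * (1 - fy) * image[y0, x0 + 1]
                 + (1 - fx) * fy * image[y0 + 1, x0]
                 + fx * fy * image[y0 + 1, x0 + 1])
        # Photometric part of the model: gain * I + bias
        return gain * patch + bias

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        img = rng.random((120, 160))
        # Small rotation and scale change as a stand-in for an IMU-predicted warp
        theta, scale = np.deg2rad(5.0), 1.02
        A = scale * np.array([[np.cos(theta), -np.sin(theta)],
                              [np.sin(theta),  np.cos(theta)]])
        patch = warp_patch_affine_photometric(img, centre=(80.0, 60.0), half_size=7,
                                              A=A, d=np.zeros(2), gain=1.1, bias=0.02)
        print(patch.shape)  # (15, 15)

In a visual-inertial tracker of this kind, A and d would typically be initialised from the camera motion predicted by the IMU, so that the subsequent template matching only has to refine a small residual displacement even for large frame-to-frame movements.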


Notes

  1. MEMS: micro-electromechanical systems.

  2. The different sensor types are indicated by subscript indices on the quantities in the corresponding equations.

  3. \(m_{des}\) denotes the magnitude of the earth's magnetic field (e.g. 48 \(\mu T\) in Western Europe).

  4. For brevity: \(s\alpha = \sin(\alpha)\) and \(c\beta = \cos(\beta)\).

  5. For this, a simple first-order Taylor expansion of the minimisation term is used (see the sketch following these notes).

  6. Here, a successfully tracked feature is one which is not rejected based on the error threshold \(e_{limit}\) (see the sketch following these notes).
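
The linearisation referred to in note 5 can be illustrated with a generic KLT-style objective; the warp parametrisation and the photometric gain and bias used below are assumptions for illustration, not the chapter's exact notation. With a template \(T\), the current image \(I\), a warp \(W(x;\, p)\) over the feature's neighbourhood \(\mathcal{N}\), gain \(\lambda\) and bias \(\delta\), the minimisation term

\[ E(\Delta p) = \sum_{x \in \mathcal{N}} \big[ \lambda\, I(W(x;\, p + \Delta p)) + \delta - T(x) \big]^2 \]

is approximated by its first-order Taylor expansion around the current parameters \(p\),

\[ E(\Delta p) \approx \sum_{x \in \mathcal{N}} \Big[ \lambda\, I(W(x;\, p)) + \lambda\, \nabla I\, \frac{\partial W}{\partial p}\, \Delta p + \delta - T(x) \Big]^2, \]

which is quadratic in \(\Delta p\) and therefore yields a linear system for the parameter update in each iteration. In the sense of note 6, a feature would then be kept only if the residual after convergence satisfies \(E \leq e_{limit}\), and rejected otherwise.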


Author information

Correspondence to Dominik Aufderheide.


Copyright information

© 2015 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Aufderheide, D., Edwards, G., Krybus, W. (2015). Visual-Inertial 2D Feature Tracking based on an Affine Photometric Model. In: Tavares, J., Natal Jorge, R. (eds) Developments in Medical Image Processing and Computational Vision. Lecture Notes in Computational Vision and Biomechanics, vol 19. Springer, Cham. https://doi.org/10.1007/978-3-319-13407-9_18


  • DOI: https://doi.org/10.1007/978-3-319-13407-9_18

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-13406-2

  • Online ISBN: 978-3-319-13407-9

  • eBook Packages: Engineering
