Feature Detection and Tracking

Part of the Undergraduate Topics in Computer Science book series (UTICS)

Abstract

This chapter describes the detection of keypoints and the definition of descriptors for them; together, a keypoint and its descriptor define a feature. The examples given are SIFT, SURF, and ORB, and we introduce BRIEF and FAST as the building blocks of ORB. We discuss the invariance properties of features in general, and of the given examples in particular. The chapter also discusses three ways of tracking features: the KLT tracker, the particle filter, and the Kalman filter.
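
As a quick illustration of the detection-and-tracking pipeline outlined above, the following is a minimal sketch (not taken from the chapter) that detects ORB keypoints in one frame and tracks them into the next frame with a pyramidal Lucas-Kanade (KLT) tracker. It assumes OpenCV's Python bindings (cv2), NumPy, and two hypothetical consecutive images frame1.png and frame2.png.

# Minimal sketch: ORB keypoint detection followed by KLT tracking.
# Assumes OpenCV (cv2) and NumPy are installed; frame1.png and frame2.png
# are hypothetical consecutive frames of a video sequence.
import cv2
import numpy as np

frame1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
frame2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

# ORB combines FAST keypoint detection with BRIEF-style binary descriptors.
orb = cv2.ORB_create(nfeatures=500)
keypoints, descriptors = orb.detectAndCompute(frame1, None)

# Convert the keypoints to the (N, 1, 2) float32 array expected by the tracker.
points1 = np.float32([kp.pt for kp in keypoints]).reshape(-1, 1, 2)

# Track the keypoints into the second frame with the pyramidal
# Lucas-Kanade (KLT) tracker.
points2, status, err = cv2.calcOpticalFlowPyrLK(
    frame1, frame2, points1, None, winSize=(21, 21), maxLevel=3)

# Keep only the points for which tracking succeeded.
ok = status.flatten() == 1
good1, good2 = points1[ok], points2[ok]
print("Tracked", len(good2), "of", len(points1), "keypoints")

The same skeleton works with SIFT or SURF keypoints by swapping the detector, provided the corresponding modules are available in the installed OpenCV build.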

Keywords

  • Kalman Filter
  • Particle Filter
  • Correspondence Problem
  • Keypoint Detector
  • Optic Flow Vector

These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

Notes

  1. The described generation of 3D flow vectors has been published in [J.A. Sanchez, R. Klette, and E. Destefanis. Estimating 3D flow for driver assistance applications. In Proc. Pacific-Rim Symposium Image Video Technology, LNCS 5414, pp. 237–248, 2009].

  2. See [Z. Song and R. Klette. Robustness of point feature detection. In Proc. Computer Analysis Images Patterns, LNCS 8048, pp. 91–99, 2013].

  3. See [Y. Zeng and R. Klette. Multi-run 3D streetside reconstruction from a vehicle. In Proc. Computer Analysis Images Patterns, LNCS 8047, pp. 580–588, 2013].

  4. The presentation follows the Lucas–Kanade tracker introduction by T. Svoboda on cmp.felk.cvut.cz/cmp/courses/Y33ROV/Y33ROV_ZS20082009/Lectures/Motion/klt.pdf.

  5. We use a (practically acceptable) approximation of the Hessian: instead of mixed second-order derivatives, we apply products of the first-order derivatives; a brief sketch of this approximation follows these notes.

  6. A particle filter for lane detection was suggested in [S. Sehestedt, S. Kodagoda, A. Alempijevic, and G. Dissanayake. Efficient lane detection and tracking in urban environments. In Proc. European Conf. Mobile Robots, pp. 126–131, 2007].

  7. This is the retinal point where the images of lines parallel to the translational motion meet, assuming also a corresponding direction of gaze.
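
As a side note to Note 5: the sketch below, in notation of my own choosing rather than the book's, shows the replacement commonly used in Lucas–Kanade-style trackers, in which second-order derivatives of the image I are replaced by products of its first-order derivatives I_x and I_y (the chapter's exact variant may differ in detail).

% Exact Hessian of the image I, built from second-order derivatives:
H = \begin{pmatrix} I_{xx} & I_{xy} \\ I_{xy} & I_{yy} \end{pmatrix}

% Commonly used approximation via products of first-order derivatives
% (the structure-tensor / Gauss-Newton form); in particular the mixed
% derivative I_{xy} is replaced by the product I_x I_y:
H \approx \begin{pmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{pmatrix}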

Copyright information

© 2014 Springer-Verlag London

About this chapter

Cite this chapter

Klette, R. (2014). Feature Detection and Tracking. In: Concise Computer Vision. Undergraduate Topics in Computer Science. Springer, London. https://doi.org/10.1007/978-1-4471-6320-6_9

  • DOI: https://doi.org/10.1007/978-1-4471-6320-6_9

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-6319-0

  • Online ISBN: 978-1-4471-6320-6

  • eBook Packages: Computer Science, Computer Science (R0)