
Real Time Semi-dense Point Tracking

  • Matthieu Garrigues
  • Antoine Manzanera
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7324)

Abstract

This paper presents a new algorithm for tracking a large number of points in a video sequence in real time. We propose a fast keypoint detector, used to create new particles, and an associated multiscale descriptor (feature) used to match particles from one frame to the next. The tracking algorithm updates, for each particle, a set of appearance and kinematic states that are temporally filtered. It is robust to hand-held camera accelerations thanks to a coarse-to-fine estimation of the dominant motion. Each step is designed to reach the maximal level of data parallelism, in order to target the most common parallel platforms. Using a graphics processing unit, our current implementation handles 10,000 points per frame at 55 frames per second on 640×480 videos.
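
The following sketch makes the per-frame loop described above concrete: detect salient points to spawn new particles, match every existing particle near its motion-compensated prediction, and temporally filter its appearance and kinematic states. It is only an illustration under simplifying assumptions, not the authors' GPU implementation: the saliency measure, the single-scale patch descriptor, the filtering constant, and the helper names (Particle, detect_keypoints, patch_descriptor, spawn_particles, track_frame) are all stand-ins, and the coarse-to-fine dominant motion is assumed to be estimated elsewhere and passed in as a 2-vector.

    # Minimal, illustrative sketch of a semi-dense point tracker (not the paper's code).
    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class Particle:
        pos: np.ndarray    # filtered (x, y) position
        vel: np.ndarray    # filtered (vx, vy) velocity
        desc: np.ndarray   # filtered appearance descriptor

    def detect_keypoints(gray, max_points=10000, border=4):
        """Toy detector: keep the pixels with the largest gradient magnitude
        (a stand-in for the paper's fast keypoint detector, no non-maximum suppression)."""
        gy, gx = np.gradient(gray.astype(np.float32))
        saliency = gx**2 + gy**2
        ys, xs = np.unravel_index(np.argsort(saliency, axis=None)[::-1], saliency.shape)
        pts = [(x, y) for x, y in zip(xs, ys)
               if border <= x < gray.shape[1] - border and border <= y < gray.shape[0] - border]
        return np.array(pts[:max_points], dtype=np.float32)

    def patch_descriptor(gray, x, y, r=4):
        """Toy descriptor: a mean/variance-normalized local patch
        (a single-scale stand-in for the paper's multiscale descriptor)."""
        p = gray[int(y)-r:int(y)+r+1, int(x)-r:int(x)+r+1].astype(np.float32).ravel()
        return (p - p.mean()) / (p.std() + 1e-6)

    def spawn_particles(gray, max_points=10000):
        """Create a new particle on every detected keypoint."""
        return [Particle(pos=pt.copy(), vel=np.zeros(2, np.float32),
                         desc=patch_descriptor(gray, *pt))
                for pt in detect_keypoints(gray, max_points)]

    def track_frame(particles, gray, dominant_motion, search=8, alpha=0.5):
        """Match each particle near its motion-compensated prediction,
        then temporally filter its position, velocity and descriptor."""
        h, w = gray.shape
        kept = []
        for p in particles:
            pred = p.pos + p.vel + dominant_motion   # prediction compensated by dominant motion
            best, best_d = None, np.inf
            for dy in range(-search, search + 1, 2):          # coarse search grid
                for dx in range(-search, search + 1, 2):
                    x, y = pred[0] + dx, pred[1] + dy
                    if not (4 <= x < w - 5 and 4 <= y < h - 5):
                        continue
                    d = np.sum((patch_descriptor(gray, x, y) - p.desc) ** 2)
                    if d < best_d:
                        best, best_d = np.array([x, y], np.float32), d
            if best is None:
                continue                                      # particle left the frame: drop it
            p.vel = alpha * p.vel + (1 - alpha) * (best - p.pos)                   # kinematic filtering
            p.pos = best
            p.desc = alpha * p.desc + (1 - alpha) * patch_descriptor(gray, *best)  # appearance filtering
            kept.append(p)
        return kept

In the real system each particle is processed independently, which is the data parallelism the abstract refers to; the Python loops above only spell out the logic and would be far too slow for the reported 10,000 points at 55 frames per second.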

Keywords

Video Sequence, Graphics Processing Unit, Motion Estimation, Tracking Algorithm, Saliency Function



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Matthieu Garrigues¹
  • Antoine Manzanera¹
  1. ENSTA-ParisTech, Paris Cedex 15, France
