
Temporal Prediction and Spatial Regularization in Differential Optical Flow

  • Matthias Hoeffken
  • Daniel Oberhoff
  • Marina Kolesnik
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6915)

Abstract

In this paper we present an extension to the Bayesian formulation of multi-scale differential optical flow estimation by Simoncelli et al. [1]. We exploit the observation that optical flow is consistent across consecutive frames, so propagating information over time should improve the quality of the flow estimate. This propagation is realized by inserting additional Kalman filters that filter the flow over time while tracking the movement of each pixel. To stabilize these filters and the overall estimation, we insert a spatial regularization into the prediction step. Owing to the recursive nature of the filter, the regularization can fill in missing information over extended spatial regions. We benchmark our algorithm, which is implemented in the NVIDIA CUDA framework to exploit the processing power of modern graphics processing units (GPUs), against a state-of-the-art variational flow estimation algorithm that is also implemented in CUDA. The comparison shows that, while the variational method yields somewhat higher precision, our method is more than an order of magnitude faster and can therefore operate in real time on live video streams.
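
The following sketch (Python with NumPy/SciPy, not the authors' CUDA implementation) illustrates the two ingredients named above: per-pixel Kalman filters that propagate the flow estimate to the next frame by tracking each pixel's motion, and a spatial regularization applied inside the prediction step. The function names, the Gaussian smoother used as regularizer, the scalar per-pixel variances, and the parameters q and sigma are illustrative assumptions and do not appear in the paper.

# Illustrative sketch only -- not the authors' CUDA implementation.
# Per-pixel Kalman filters track the flow over time; a spatial
# regularization (here a Gaussian smoother, an assumption) is applied
# in the prediction step so confident neighbours can fill in missing data.
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def predict(flow, var, q=0.05, sigma=1.5):
    """Predict the flow field for the next frame.

    flow: (H, W, 2) current flow estimate (u, v)
    var:  (H, W)    per-pixel estimation variance
    q:    process noise added per time step (assumed value)
    sigma: strength of the spatial regularizer (assumed value)
    """
    H, W = var.shape
    yy, xx = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    # Warp the current estimate along itself: each pixel "tracks" its motion.
    coords = [yy - flow[..., 1], xx - flow[..., 0]]
    warped = np.stack(
        [map_coordinates(flow[..., c], coords, order=1, mode="nearest")
         for c in range(2)], axis=-1)
    var_w = map_coordinates(var, coords, order=1, mode="nearest")
    # Spatial regularization inside the prediction: smooth the flow and
    # inflate the variance to account for the prediction uncertainty.
    warped = gaussian_filter(warped, sigma=(sigma, sigma, 0))
    return warped, gaussian_filter(var_w, sigma) + q

def update(pred_flow, pred_var, meas_flow, meas_var):
    """Fuse predicted and measured flow with a per-pixel Kalman gain."""
    k = pred_var / (pred_var + meas_var)          # Kalman gain in [0, 1]
    flow = pred_flow + k[..., None] * (meas_flow - pred_flow)
    var = (1.0 - k) * pred_var
    return flow, var

In a complete pipeline, predict and update would alternate with the multi-scale differential measurement of [1], which would supply meas_flow together with its uncertainty meas_var for each frame.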

Keywords

Optical Flow, Motion Vector, Motion Estimation, Coarse Scale, Extended Algorithm

References

  1. Simoncelli, E.P.: Bayesian Multi-Scale Differential Optical Flow. In: Jähne, B., Haussecker, H., Geissler, P. (eds.) vol. 2, pp. 397–422. Academic Press, San Diego (1999)
  2. Lucas, B.D., Kanade, T.: An Iterative Image Registration Technique with an Application to Stereo Vision. In: Proceedings of the Imaging Understanding Workshop, pp. 121–130 (1981)
  3. Horn, B.K.P., Schunck, B.G.: Determining Optical Flow. Artificial Intelligence 17, 185–203 (1981)
  4. Baker, S., Scharstein, D., Lewis, J., Roth, S., Black, M., Szeliski, R.: A Database and Evaluation Methodology for Optical Flow. In: IEEE 11th International Conference on Computer Vision (ICCV 2007), pp. 1–8 (October 2007)
  5.
  6. Rannacher, J.: Realtime 3D Motion Estimation on Graphics Hardware. Master's thesis, Heidelberg University (2009)
  7. Werlberger, M., Trobin, W., Pock, T., Wedel, A., Cremers, D., Bischof, H.: Anisotropic Huber-L1 Optical Flow. In: Proceedings of the British Machine Vision Conference (BMVC) (September 2009)
  8. Ferrera, V.P., Wilson, H.R.: Perceived Direction of Moving Two-Dimensional Patterns. Vision Research 30(2), 273–287 (1990)
  9. Bayerl, P., Neumann, H.: A Fast Biologically Inspired Algorithm for Recurrent Motion Estimation. IEEE Transactions on Pattern Analysis and Machine Intelligence 29(2), 246–260 (2007)
  10. Chessa, M., Sabatini, S.P., Solari, F., Bisio, G.M.: A Recursive Approach to the Design of Adjustable Linear Models for Complex Motion Analysis. In: Proceedings of the Fourth IASTED International Conference on Signal Processing, Pattern Recognition, and Applications, pp. 33–38. ACTA Press, Anaheim (2007)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Matthias Hoeffken (1)
  • Daniel Oberhoff (1)
  • Marina Kolesnik (1)

  1. Fraunhofer Institute FIT, Schloss Birlinghoven, Sankt Augustin, Germany
