
Machine Vision and Applications

Volume 23, Issue 6, pp 1229–1242

A self-organization based optical flow estimator with GPU implementation

  • Manish P. Shiralkar
  • Robert J. Schalkoff
Original Paper

Abstract

This work describes a parallelizable optical flow field estimator based upon a modified batch version of the Self-Organizing Map (SOM). The estimator handles the ill-posedness of gradient-based motion estimation via a novel combination of regression and self-organization. The aperture problem is treated using an algebraic framework that partitions the motion estimates obtained from regression into two sets: one (set \({H_c}\)) containing motion estimates with high confidence and another (set \({H_p}\)) with low confidence. The self-organization step uses a specifically designed pair of sets: the training set \({Q = H_c}\) and the initial weights set \({W = H_c \cup H_p}\). It is shown that, with this choice of training and initial weights sets, the interpolation of flow vectors is achieved primarily through the regularization property of the SOM. Moreover, the computationally involved step of finding the winner unit in the SOM simplifies to indexing into a 2D array, making the algorithm highly scalable. To preserve flow discontinuities at occlusion boundaries, we design an anisotropic neighborhood function for the SOM that uses a novel distance measure based on the residual of the optical flow constraint equation. A multi-resolution (pyramidal) approach is used to estimate large motions. Because self-organization based motion estimation is computationally intensive, parallel processing on graphics processing units is used for speedup. Since the algorithm is data-parallel (indeed, datum-parallel), the estimator can be made real time given a sufficient number of computing cores. Using the ground truth available in the Middlebury database, error metrics such as average angular error and average endpoint error are computed and shown to be comparable with those of other leading techniques.
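
For concreteness, the NumPy fragment below is a minimal sketch of one batch-SOM interpolation pass as it can be read from the abstract; it is not the authors' GPU implementation. The function name, the parameters sigma and beta, the Gaussian spatial neighborhood, and the exact form of the residual-based weighting are assumptions. Only the ideas it encodes come from the abstract: the SOM lattice coincides with the pixel grid (so the winner search is plain 2D indexing), the training set is the high-confidence estimates \({H_c}\), and the neighborhood is made anisotropic using the optical flow constraint (OFC) residual.

    import numpy as np

    def som_flow_interpolation(flow0, confident, Ix, Iy, It, sigma=2.0, beta=5.0):
        """Illustrative sketch only (names and parameters are assumptions).

        flow0      : (H, W, 2) initial flow from local regression (H_c union H_p)
        confident  : (H, W)    1 where the regression estimate is reliable (H_c), else 0
        Ix, Iy, It : (H, W)    spatiotemporal image gradients
        Returns a regularized/interpolated flow field.
        """
        H, W = confident.shape
        rad = int(3 * sigma)
        num = np.zeros_like(flow0)
        den = np.zeros((H, W))

        # Because the lattice is the pixel grid, the winner for the training vector
        # at pixel p is the unit at p, so the batch update reduces to a
        # neighborhood-weighted average over nearby confident estimates.
        for dy in range(-rad, rad + 1):
            for dx in range(-rad, rad + 1):
                spatial = np.exp(-(dy * dy + dx * dx) / (2.0 * sigma * sigma))
                # np.roll gives wrap-around borders; kept for brevity in this sketch.
                src = np.roll(flow0, (dy, dx), axis=(0, 1))
                conf = np.roll(confident, (dy, dx), axis=(0, 1))
                # Anisotropic term: down-weight neighbors whose flow violates the
                # OFC at the center pixel, so averaging does not leak across
                # occlusion/motion boundaries.
                resid = Ix * src[..., 0] + Iy * src[..., 1] + It
                w = spatial * np.exp(-beta * resid * resid) * conf
                num += w[..., None] * src
                den += w

        out = num / np.maximum(den, 1e-8)[..., None]
        # Pixels with no confident neighbor in range keep their initial estimate.
        return np.where(den[..., None] > 1e-8, out, flow0)

The per-pixel accumulation above is naturally datum-parallel (one thread per pixel in a CUDA mapping), which is presumably what the reported GPU speedup exploits; that mapping is not shown here.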

Keywords

Optical flow · Parallel implementation · Batch SOM · Anisotropic neighborhood · Graphics processing unit (GPU)


References

  1. Baker, S., Scharstein, D., Lewis, J.P., Roth, S., Black, M.J., Szeliski, R.: A database and evaluation methodology for optical flow. In: Proceedings of the IEEE International Conference on Computer Vision (2007)
  2. Beauchemin, S.S., Barron, J.L.: The computation of optical flow. ACM Comput. Surv. 27, 433–467 (1995)
  3. Black, M.J., Anandan, P.: The robust estimation of multiple motions: parametric and piecewise-smooth flow fields. Comput. Vis. Image Underst. 63(1), 75–104 (1996)
  4. Cheng, Y.: Convergence and ordering of Kohonen's batch map. Neural Comput. 9(8), 1667–1676 (1997)
  5. Cottrell, M., Fort, J.C., Pagès, G.: Theoretical aspects of the SOM algorithm. Neurocomputing 21(1–3), 119–138 (1998)
  6. Cottrell, M., Hammer, B., Hasenfuss, A., Villmann, T.: Batch and median neural gas. Neural Netw. 19, 762–771 (2006)
  7. Heskes, T.: Self-organizing maps, vector quantization, and mixture modeling. IEEE Trans. Neural Netw. 12(6), 1299–1305 (2001)
  8. Horn, B.K.P., Schunck, B.G.: Determining optical flow. Artif. Intell. 17, 185–203 (1981)
  9. Kamitani, Y.: Image generation with MATLAB. http://www.cns.atr.jp/~kmtn/imagematlab/index.html
  10. Keys, R.: Cubic convolution interpolation for digital image processing. IEEE Trans. Acoust. Speech Signal Process. 29, 1153–1160 (1981)
  11. Kohonen, T.: The self-organizing map. Proc. IEEE 78(9), 1464–1480 (1990)
  12. Lucas, B.D., Kanade, T.: An iterative image registration technique with an application to stereo vision. In: Proceedings of the DARPA Image Understanding Workshop, pp. 121–130 (1981)
  13. Luttrell, S.P.: Self-organisation: a derivation from first principles of a class of learning algorithms. In: International Joint Conference on Neural Networks (1989)
  14. Martinetz, T.M., Berkovich, S.G., Schulten, K.J.: Neural-gas network for vector quantization and its application to time-series prediction. IEEE Trans. Neural Netw. 4(1), 558–569 (1993)
  15. Owens, J.D., Houston, M., Luebke, D., Green, S., Stone, J.E., Phillips, J.C.: GPU computing. Proc. IEEE 96(5), 879–899 (2008)
  16. Schalkoff, R.J., Warnekar, C.S.: A predictor–corrector approach to tracking 3-D objects using perspective-projected images. In: Proceedings of the IEEE Southeast Conference, Ft. Walton Beach, FL, pp. 371–374 (1982)
  17. Shiralkar, M.P.: A self-organization based optical flow estimator with GPU implementation. Ph.D. dissertation (2010)
  18. Shiralkar, M.P., Schalkoff, R.J.: Multiple-class spatiotemporal flow estimation using a modified neural gas algorithm. Opt. Eng. 48(1) (2009)
  19. Sun, D., Roth, S., Black, M.: Secrets of optical flow estimation and their principles. In: CVPR (2010)
  20. Wedel, A., Pock, T., Zach, C., Bischof, H., Cremers, D.: An improved algorithm for TV-L1 optical flow, pp. 23–45. Springer, Berlin (2009)

Copyright information

© Springer-Verlag 2011

Authors and Affiliations

  1. Clemson University, Clemson, USA
