A Combined Local-Global Match for Optical Flow

  • Yueran Zu
  • Wenzhong Tang
  • Xiuguo Bao (corresponding author)
  • Ke Gao
  • Mingdong Zhang
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 875)


Optical flow estimation remains an open problem in computer vision. Matching provides the initialization for the final flow estimate, so a good matching is important for flow quality. In this paper, a combined local-global matching method is proposed: a local matching method and a global method are integrated to trade off large displacements against the local consistency of the flow. Extensive experiments on the challenging MPI-Sintel dataset show that the proposed method is efficient and effective compared with the state of the art.
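The abstract only sketches the idea at a high level. As a rough illustration of what "combined local-global" can mean (in the classical sense of Bruhn et al., not necessarily the exact algorithm of this paper), the following minimal numpy sketch smooths the motion-tensor data term over a local window (the Lucas-Kanade-style local step) and embeds it in a Horn-Schunck-style global smoothness energy, minimized by pointwise Jacobi iterations. The function names, window size, and parameter values are illustrative assumptions.

```python
import numpy as np

def box3(a):
    """3x3 box filter with periodic boundaries (the local averaging step)."""
    out = np.zeros_like(a)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return out / 9.0

def clg_flow(I1, I2, alpha=0.1, iters=500):
    """Combined local-global flow sketch: a locally smoothed data term
    (structure tensor, as in Lucas-Kanade) inside a global Horn-Schunck
    smoothness energy, minimized by Jacobi iterations."""
    # Centered, periodic image derivatives.
    Ix = (np.roll(I1, -1, 1) - np.roll(I1, 1, 1)) / 2.0
    Iy = (np.roll(I1, -1, 0) - np.roll(I1, 1, 0)) / 2.0
    It = I2 - I1
    # Local step: average the motion-tensor entries over a 3x3 window.
    J11, J22 = box3(Ix * Ix), box3(Iy * Iy)
    J12 = box3(Ix * Iy)
    J13, J23 = box3(Ix * It), box3(Iy * It)
    # Global step: Jacobi iterations on the Euler-Lagrange equations,
    # coupling each pixel to its 4-neighbourhood average.
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)
    for _ in range(iters):
        ub = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
              + np.roll(u, 1, 1) + np.roll(u, -1, 1)) / 4.0
        vb = (np.roll(v, 1, 0) + np.roll(v, -1, 0)
              + np.roll(v, 1, 1) + np.roll(v, -1, 1)) / 4.0
        u = (alpha * ub - J12 * vb - J13) / (alpha + J11)
        v = (alpha * vb - J12 * u - J23) / (alpha + J22)
    return u, v
```

The local averaging makes the data term robust where a single pixel's constraint is ambiguous (the aperture problem), while the global smoothness term propagates flow into poorly textured regions; the paper's contribution lies in how these two components are balanced for large displacements.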


Keywords: Optical flow · Matching method · Non-local · Global



This work was supported by the National Natural Science Foundation of China (No. 51475025), Beijing Municipal Science and Technology Commission Project Z171100000117010, and the National Key Research and Development Plan (Nos. 2016YFB0801203, 2016YFB0801200).



Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  • Yueran Zu (1)
  • Wenzhong Tang (1)
  • Xiuguo Bao (2) (corresponding author)
  • Ke Gao (3)
  • Mingdong Zhang (2)
  1. School of Computer Science and Engineering, Beihang University, Beijing, China
  2. National Computer Network Emergency Response Technical Team / Coordination Center of China, Beijing, China
  3. Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
