Fast Optical Flow Using Dense Inverse Search

  • Till Kroeger
  • Radu Timofte
  • Dengxin Dai
  • Luc Van Gool
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9908)


Most recent works on optical flow extraction focus on accuracy and neglect time complexity. However, in real-life visual applications, such as tracking, activity detection, and recognition, time complexity is critical. We propose a solution with very low time complexity and competitive accuracy for the computation of dense optical flow. It consists of three parts: (1) inverse search for patch correspondences; (2) dense displacement field creation through patch aggregation along multiple scales; (3) variational refinement. At the core of our Dense Inverse Search-based method (DIS) is the efficient search of correspondences inspired by the inverse compositional image alignment proposed by Baker and Matthews (2001, 2004). DIS is competitive on standard optical flow benchmarks. DIS runs at 300 Hz up to 600 Hz on a single CPU core at 1024×436 resolution (42 Hz/46 Hz when including preprocessing: disk access, image re-scaling, and gradient computation; see Sect. 3.1 for details), reaching the temporal resolution of the human visual system. It is order(s) of magnitude faster than state-of-the-art methods in the same range of accuracy, making DIS ideal for real-time applications.
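The inverse-search idea at the core of DIS can be illustrated with a translation-only inverse compositional alignment in the spirit of Baker and Matthews: the template patch's gradients and 2×2 Hessian are computed once, so each Gauss-Newton iteration costs only a patch extraction and two dot products. The sketch below is an illustrative NumPy re-implementation under simplifying assumptions (a single patch, a pure translation warp, bilinear sampling), not the authors' optimized code; all function names are our own.

```python
import numpy as np

def sample_patch(image, x, y, h, w):
    """Bilinearly sample an h-by-w patch whose top-left corner sits at
    the (possibly subpixel) location (x, y)."""
    xi, yi = int(np.floor(x)), int(np.floor(y))
    ax, ay = x - xi, y - yi
    p = image[yi:yi + h + 1, xi:xi + w + 1].astype(np.float64)
    return ((1 - ay) * (1 - ax) * p[:h, :w]
            + (1 - ay) * ax * p[:h, 1:w + 1]
            + ay * (1 - ax) * p[1:h + 1, :w]
            + ay * ax * p[1:h + 1, 1:w + 1])

def inverse_search(template, image, x0, y0, u_init=(0.0, 0.0), iters=50):
    """Translation-only inverse compositional alignment.

    Finds (u, v) so that the image patch at (x0 + u, y0 + v) matches
    `template`. The template gradients and the Hessian are precomputed
    outside the loop -- the efficiency trick DIS builds on.
    """
    h, w = template.shape
    gy, gx = np.gradient(template.astype(np.float64))  # NumPy returns d/dy first
    H = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                  [np.sum(gx * gy), np.sum(gy * gy)]])
    Hinv = np.linalg.inv(H)
    u, v = u_init
    for _ in range(iters):
        patch = sample_patch(image, x0 + u, y0 + v, h, w)
        err = patch - template
        du, dv = Hinv @ np.array([np.sum(gx * err), np.sum(gy * err)])
        # Inverse compositional update for a pure translation warp.
        u, v = u - du, v - dv
        if du * du + dv * dv < 1e-8:
            break
    return u, v
```

In DIS this single-patch search is run over a grid of overlapping patches at each level of an image pyramid, initialized from the coarser level, and the per-patch displacements are then aggregated into a dense field before variational refinement.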


Keywords: Optical Flow · Large Displacement · Coarse Scale · Disk Access · Image Pyramid



This work was supported by ERC VarCity (#273940), SNF Tracking in the Wild (CRSII2_147693/1), and an NVIDIA GPU grant. We thank Richard Hartley for his pertinent input on this work.

Supplementary material

Supplementary material 1: 419976_1_En_29_MOESM1_ESM.pdf (5.6 MB)


  1. Baker, S., Matthews, I.: Equivalence and efficiency of image alignment algorithms. In: CVPR (2001)
  2. Baker, S., Matthews, I.: Lucas-Kanade 20 years on: a unifying framework. IJCV (2004)
  3. Benosman, R., Clercq, C., Lagorce, X., Ieng, S.H., Bartolozzi, C.: Event-based visual flow. IEEE Trans. Neural Netw. Learn. Syst. 25(2), 407–417 (2014)
  4. Horn, B.K., Schunck, B.G.: Determining optical flow. In: Proceedings of SPIE 0281, Techniques and Applications of Image Understanding (1981)
  5. Lucas, B.D., Kanade, T.: An iterative image registration technique with an application to stereo vision. IJCAI 81, 674–679 (1981)
  6. Black, M.J., Anandan, P.: The robust estimation of multiple motions: parametric and piecewise-smooth flow fields. CVIU (1996)
  7. Papenberg, N., Bruhn, A., Brox, T., Didas, S., Weickert, J.: Highly accurate optic flow computation with theoretically justified warping. IJCV 67(2), 141–158 (2006)
  8. Steinbrucker, F., Pock, T., Cremers, D.: Large displacement optical flow computation without warping. In: ICCV (2009)
  9. Brox, T., Malik, J.: Large displacement optical flow: descriptor matching in variational motion estimation. IEEE Trans. PAMI 33(3), 500–513 (2011)
  10. Braux-Zin, J., Dupont, R., Bartoli, A.: A general dense image matching framework combining direct and feature-based costs. In: ICCV (2013)
  11. Leordeanu, M., Zanfir, A., Sminchisescu, C.: Locally affine sparse-to-dense matching for motion and occlusion estimation. In: ICCV (2013)
  12. Timofte, R., Van Gool, L.: SparseFlow: sparse matching for small to large displacement optical flow. In: WACV, pp. 1100–1106, January 2015
  13. Kennedy, R., Taylor, C.J.: Optical flow with geometric occlusion estimation and fusion of multiple frames. In: Tai, X.-C., Bae, E., Chan, T.F., Lysaker, M. (eds.) EMMCVPR 2015. LNCS, vol. 8932, pp. 364–377. Springer, Heidelberg (2015). doi: 10.1007/978-3-319-14612-6_27
  14. Menze, M., Heipke, C., Geiger, A.: Discrete optimization for optical flow. In: Gall, J., Gehler, P., Leibe, B. (eds.) GCPR 2015. LNCS, vol. 9358, pp. 16–28. Springer, Heidelberg (2015). doi: 10.1007/978-3-319-24947-6_2
  15. Revaud, J., Weinzaepfel, P., Harchaoui, Z., Schmid, C.: EpicFlow: edge-preserving interpolation of correspondences for optical flow. In: CVPR (2015)
  16. Bailer, C., Taetz, B., Stricker, D.: Flow fields: dense correspondence fields for highly accurate large displacement optical flow estimation. In: ICCV (2015)
  17. Weinzaepfel, P., Revaud, J., Harchaoui, Z., Schmid, C.: DeepFlow: large displacement optical flow with deep matching. In: ICCV (2013)
  18. Wills, J., Agarwal, S., Belongie, S.: A feature-based approach for dense segmentation and estimation of large disparity motion. IJCV 68, 125–143 (2006)
  19. Wulff, J., Black, M.J.: Efficient sparse-to-dense optical flow estimation using a learned basis and layers. In: CVPR, pp. 120–130 (2015)
  20. Xu, L., Jia, J., Matsushita, Y.: Motion detail preserving optical flow estimation. IEEE Trans. PAMI 34(9), 1744–1757 (2012)
  21. Fischer, P., Dosovitskiy, A., Ilg, E., Häusser, P., Hazırbaş, C., Golkov, V., van der Smagt, P., Cremers, D., Brox, T.: FlowNet: learning optical flow with convolutional networks. In: ICCV (2015)
  22. Farnebäck, G.: Two-frame motion estimation based on polynomial expansion. In: Bigun, J., Gustavsson, T. (eds.) SCIA 2003. LNCS, vol. 2749, pp. 363–370. Springer, Heidelberg (2003). doi: 10.1007/3-540-45103-X_50
  23. Tao, M., Bai, J., Kohli, P., Paris, S.: SimpleFlow: a non-iterative, sublinear optical flow algorithm. In: Computer Graphics Forum, vol. 31, pp. 345–353 (2012)
  24. Bao, L., Yang, Q., Jin, H.: Fast edge-preserving PatchMatch for large displacement optical flow. IEEE Trans. Image Process. 23(12), 4996–5006 (2014)
  25. Plyer, A., Le Besnerais, G., Champagnat, F.: Massively parallel Lucas-Kanade optical flow for real-time video processing applications. J. Real-Time Image Proc. 11(4), 1–18 (2014)
  26. Handa, A., Newcombe, R.A., Angeli, A., Davison, A.J.: Real-time camera tracking: when is high frame-rate best? In: Fitzgibbon, A., Lazebnik, S., Perona, P., Sato, Y., Schmid, C. (eds.) ECCV 2012. LNCS, vol. 7578, pp. 222–235. Springer, Heidelberg (2012). doi: 10.1007/978-3-642-33786-4_17
  27. Dai, D., Kroeger, T., Timofte, R., Van Gool, L.: Metric imitation by manifold transfer for efficient vision applications. In: CVPR (2015)
  28. Srinivasan, N., Roberts, R., Dellaert, F.: High frame rate egomotion estimation. In: Chen, M., Leibe, B., Neumann, B. (eds.) ICVS 2013. LNCS, vol. 7963, pp. 183–192. Springer, Heidelberg (2013). doi: 10.1007/978-3-642-39402-7_19
  29. Barranco, F., Fermuller, C., Aloimonos, Y.: Contour motion estimation for asynchronous event-driven cameras. Proc. IEEE 102(10), 1537–1556 (2014)
  30. Fortun, D., Bouthemy, P., Kervrann, C.: Optical flow modeling and computation: a survey. Comput. Vis. Image Underst. 134, 1–21 (2015)
  31. Mikolajczyk, K., Tuytelaars, T., Schmid, C., Zisserman, A., Matas, J., Schaffalitzky, F., Kadir, T., Van Gool, L.: A comparison of affine region detectors. IJCV 65, 43–72 (2005)
  32. Tola, E., Lepetit, V., Fua, P.: A fast local descriptor for dense matching. In: CVPR (2008)
  33. Liu, C., Yuen, J., Torralba, A.: SIFT flow: dense correspondence across scenes and its applications. TPAMI 33(5), 978–994 (2011)
  34. Dalal, N., Triggs, B.: Histograms of oriented gradients for human detection. In: CVPR (2005)
  35. Lowe, D.G.: Distinctive image features from scale-invariant keypoints. IJCV 60(2), 91–110 (2004)
  36. Bay, H., Ess, A., Tuytelaars, T., Van Gool, L.: Speeded-up robust features (SURF). CVIU 110(3), 346–359 (2008)
  37. Barnes, C., Shechtman, E., Goldman, D.B., Finkelstein, A.: The generalized PatchMatch correspondence algorithm. In: Daniilidis, K., Maragos, P., Paragios, N. (eds.) ECCV 2010. LNCS, vol. 6313, pp. 29–43. Springer, Heidelberg (2010). doi: 10.1007/978-3-642-15558-1_3
  38. Gadot, D., Wolf, L.: PatchBatch: a batch augmented loss for optical flow. In: CVPR, June 2016
  39. Senst, T., Eiselein, V., Sikora, T.: Robust local optical flow for feature tracking. IEEE Trans. Circ. Syst. Video Technol. 22(9), 1377–1387 (2012)
  40. Heitz, F., Bouthemy, P.: Multimodal estimation of discontinuous optical flow using Markov random fields. TPAMI 15(12), 1217–1232 (1993)
  41. Szeliski, R., Zabih, R., Scharstein, D., Veksler, O., Kolmogorov, V., Agarwala, A., Tappen, M., Rother, C.: A comparative study of energy minimization methods for Markov random fields with smoothness-based priors. TPAMI 30(6), 1068–1080 (2008)
  42. Chen, Q., Koltun, V.: Full flow: optical flow estimation by global optimization over regular grids. In: CVPR, June 2016
  43. Pauwels, K., Tomasi, M., Alonso, J.D., Ros, E., Van Hulle, M.: A comparison of FPGA and GPU for real-time phase-based optical flow, stereo, and local image features. IEEE Trans. Comput. 61(7), 999–1012 (2012)
  44. Zach, C., Pock, T., Bischof, H.: A duality based approach for realtime TV-L1 optical flow. In: Annual Symposium of the German Association for Pattern Recognition (2007)
  45. Enkelmann, W.: Investigations of multigrid algorithms for the estimation of optical flow fields in image sequences. Comput. Vis. Graph. Image Process. 43, 150–177 (1988)
  46. Hu, Y., Song, R., Li, Y.: Efficient coarse-to-fine PatchMatch for large displacement optical flow. In: CVPR, June 2016
  47. Lichtsteiner, P., Posch, C., Delbruck, T.: A 128×128 120 dB 15 μs latency asynchronous temporal contrast vision sensor. IEEE J. Solid-State Circ. 43(2), 566–576 (2008)
  48. Butler, D.J., Wulff, J., Stanley, G.B., Black, M.J.: A naturalistic open source movie for optical flow evaluation. In: Fitzgibbon, A., Lazebnik, S., Perona, P., Sato, Y., Schmid, C. (eds.) ECCV 2012. LNCS, vol. 7577, pp. 611–625. Springer, Heidelberg (2012). doi: 10.1007/978-3-642-33783-3_44
  49. Geiger, A., Lenz, P., Stiller, C., Urtasun, R.: Vision meets robotics: the KITTI dataset. IJRR (2013)
  50. Klein, G., Murray, D.: Parallel tracking and mapping for small AR workspaces. In: ISMAR (2007)
  51. Forster, C., Pizzoli, M., Scaramuzza, D.: SVO: fast semi-direct monocular visual odometry. In: ICRA, pp. 15–22, May 2014
  52. Sun, D., Roth, S., Black, M.J.: Secrets of optical flow estimation and their principles. In: CVPR (2010)
  53. Zimmer, H., Bruhn, A., Weickert, J.: Optic flow in harmony. IJCV 93, 368–388 (2011)
  54. Brox, T., Bruhn, A., Papenberg, N., Weickert, J.: High accuracy optical flow estimation based on a theory for warping. In: Pajdla, T., Matas, J. (eds.) ECCV 2004. LNCS, vol. 3024, pp. 25–36. Springer, Heidelberg (2004). doi: 10.1007/978-3-540-24673-2_3
  55. Werlberger, M., Trobin, W., Pock, T., Wedel, A., Cremers, D., Bischof, H.: Anisotropic Huber-L1 optical flow. In: BMVC (2009)
  56. Bouguet, J.Y.: Pyramidal implementation of the affine Lucas Kanade feature tracker: description of the algorithm. Intel Corporation 5, 1–10 (2001)
  57. Yang, J., Li, H.: Dense, accurate optical flow estimation with piecewise parametric model. In: CVPR, pp. 1019–1027 (2015)

Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Till Kroeger (1)
  • Radu Timofte (1)
  • Dengxin Dai (1)
  • Luc Van Gool (1, 2)

  1. Computer Vision Laboratory, D-ITET, ETH Zurich, Zurich, Switzerland
  2. VISICS/iMinds, ESAT, KU Leuven, Leuven, Belgium
