BriefMatch: Dense Binary Feature Matching for Real-Time Optical Flow Estimation

  • Gabriel Eilertsen
  • Per-Erik Forssén
  • Jonas Unger
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10269)


Research in optical flow estimation has largely focused on achieving the best possible quality, with little regard to running time. Nevertheless, in a number of important applications speed is crucial. To address this problem we present BriefMatch, a real-time optical flow method suitable for live applications. The method combines binary features with the search strategy from PatchMatch in order to efficiently find a dense correspondence field between images. We show that the BRIEF descriptor provides better candidates (less outlier-prone) in shorter time than direct pixel comparisons and the Census transform. This allows us to achieve high-quality results from a simple filtering of the initially matched candidates. Currently, BriefMatch has the fastest running time on the Middlebury benchmark, while ranking highest among all methods that run in under 0.5 s.
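The abstract describes two ingredients: a binary descriptor (BRIEF) compared with the Hamming distance, and the PatchMatch search strategy (random initialisation, neighbour propagation, and random search) to find a dense correspondence field. A minimal CPU sketch of that combination follows; all function names and parameters are our own illustrative choices, not the paper's implementation, which targets the GPU:

```python
import numpy as np

def dense_brief(img, pairs, pad=3):
    """Dense BRIEF: one binary test vector per pixel.

    Each row of `pairs` is (y1, x1, y2, x2); bit k at pixel (y, x) is
    img[y + y1, x + x1] < img[y + y2, x + x2], evaluated with edge padding.
    """
    H, W = img.shape
    p = np.pad(img, pad, mode="edge")
    bits = np.empty((len(pairs), H, W), dtype=bool)
    for k, (y1, x1, y2, x2) in enumerate(pairs):
        a = p[pad + y1:pad + y1 + H, pad + x1:pad + x1 + W]
        b = p[pad + y2:pad + y2 + H, pad + x2:pad + x2 + W]
        bits[k] = a < b
    return bits

def patchmatch_flow(d1, d2, iters=4, search_radius=8, seed=0):
    """PatchMatch-style search minimising Hamming distance between descriptors.

    Returns an integer flow field where flow[y, x] = (u, v) maps pixel
    (y, x) in image 1 to (y + v, x + u) in image 2.
    """
    rng = np.random.default_rng(seed)
    _, H, W = d1.shape
    # Random initialisation of the correspondence field.
    flow = rng.integers(-search_radius, search_radius + 1, size=(H, W, 2))

    def cost(y, x, f):
        ty, tx = y + f[1], x + f[0]
        if not (0 <= ty < H and 0 <= tx < W):
            return np.inf
        # Hamming distance between the two binary descriptors.
        return np.count_nonzero(d1[:, y, x] != d2[:, ty, tx])

    for it in range(iters):
        step = 1 if it % 2 == 0 else -1        # alternate scan direction
        ys = range(H) if step == 1 else range(H - 1, -1, -1)
        xs = range(W) if step == 1 else range(W - 1, -1, -1)
        for y in ys:
            for x in xs:
                best = cost(y, x, flow[y, x])
                # Propagation: adopt a neighbour's flow if it matches better.
                for ny, nx in ((y - step, x), (y, x - step)):
                    if 0 <= ny < H and 0 <= nx < W:
                        c = cost(y, x, flow[ny, nx])
                        if c < best:
                            best, flow[y, x] = c, flow[ny, nx]
                # Random search within an exponentially shrinking window.
                r = search_radius
                while r >= 1:
                    cand = flow[y, x] + rng.integers(-r, r + 1, size=2)
                    c = cost(y, x, cand)
                    if c < best:
                        best, flow[y, x] = c, cand
                    r //= 2
    return flow
```

In a real-time setting the boolean vectors would be packed into machine words and compared with XOR plus popcount; the sketch keeps them as NumPy boolean arrays for clarity. The simple filtering of matched candidates mentioned in the abstract is also omitted here.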


Optical flow · Feature matching · Real-time computation



This project was funded by the Swedish Foundation for Strategic Research (SSF) through grants IIS11-0081 Virtual Photo Sets and RIT 15-0097 SymbiCloud, and by the Swedish Research Council through grants 2015-05180, 2014-5928 (LCMM) and 2014-6227 (EMC2).



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Gabriel Eilertsen¹ (corresponding author)
  • Per-Erik Forssén²
  • Jonas Unger¹

  1. Department of Science and Technology, Linköping University, Linköping, Sweden
  2. Department of Electrical Engineering, Linköping University, Linköping, Sweden
