Optical Flow in Onboard Applications: A Study on the Relationship Between Accuracy and Scene Texture

  • Naveen Onkarappa
  • Sujay M. Veerabhadrappa
  • Angel D. Sappa
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 221)

Abstract

Optical flow plays a major role in making advanced driver assistance systems (ADAS) a reality. ADAS applications are expected to perform reliably in a wide range of environments, since a vehicle may be driven on different kinds of roads, at different times of day, and in different seasons. In this work we study the relationship between optical flow and road type by analyzing optical flow accuracy on different road textures. Texture measures such as contrast, correlation, and homogeneity are evaluated for this purpose. Furthermore, the relation between the regularization weight and flow accuracy in the presence of different textures is analyzed. Additionally, we present a framework to generate synthetic sequences of different textures in ADAS scenarios together with ground-truth optical flow.
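
As an illustration of the two quantities the study relates, the sketch below (not the authors' implementation) computes Haralick-style GLCM texture measures (contrast, correlation, homogeneity) for a grayscale road image and the mean end-point error of an estimated flow field against ground truth. It assumes scikit-image (version 0.19 or later, for graycomatrix/graycoprops) and NumPy; the image and flow arrays in the usage example are hypothetical placeholders.

    # Minimal sketch, assuming scikit-image >= 0.19 and NumPy.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from skimage.util import img_as_ubyte


    def texture_measures(gray_image):
        """GLCM texture statistics: contrast, correlation, homogeneity."""
        img = img_as_ubyte(gray_image)
        # Co-occurrence matrix for a 1-pixel offset at 0, 45, 90 and 135 degrees,
        # averaged over directions to obtain a single value per measure.
        glcm = graycomatrix(img, distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=256, symmetric=True, normed=True)
        return {prop: graycoprops(glcm, prop).mean()
                for prop in ("contrast", "correlation", "homogeneity")}


    def endpoint_error(flow_est, flow_gt):
        """Mean end-point error between estimated and ground-truth flow (H x W x 2)."""
        diff = flow_est - flow_gt
        return np.sqrt((diff ** 2).sum(axis=-1)).mean()


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        frame = rng.random((120, 160))           # placeholder road image patch
        flow_gt = rng.normal(size=(120, 160, 2)) # placeholder ground-truth flow
        flow_est = flow_gt + 0.1 * rng.normal(size=flow_gt.shape)
        print(texture_measures(frame))
        print("EPE:", endpoint_error(flow_est, flow_gt))

Sweeping the regularization weight of a variational flow estimator and plotting the resulting end-point error against these texture statistics would mirror the kind of analysis described in the abstract.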

Keywords

Optical flow accuracy · Texture metrics · Ground-truth optical flow

Acknowledgments

This work has been partially supported by the Spanish Government under Research Program Consolider Ingenio 2010: MIPRCV (CSD2007-00018) and Project TIN2011-25606. Naveen Onkarappa is supported by an FI grant from AGAUR, Catalan Government. The authors would like to thank Oisin Mac Aodha for providing the Python code for raytracing with Maya.

Copyright information

© Springer India 2013

Authors and Affiliations

  • Naveen Onkarappa (1)
  • Sujay M. Veerabhadrappa (2)
  • Angel D. Sappa (1)

  1. Computer Vision Center, Bellaterra, Spain
  2. Department of Electrical and Electronics, PES Institute of Technology and Management, Shivamogga, India