Real-Time Embedded System for Rear-View Mirror Overtaking Car Monitoring

  • Javier Díaz
  • Eduardo Ros
  • Sonia Mota
  • Rodrigo Agis
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4017)

Abstract

The main goal of an overtaking monitoring system is the segmentation and tracking of the overtaking vehicle. This application can be addressed through an optic-flow-driven scheme. We focus on the rear-mirror visual field by placing a camera on top of the mirror. From the perspective of the host vehicle, the ego-motion optic-flow pattern is essentially unidirectional: all static objects and landmarks move backwards, while overtaking cars move forwards, towards the host vehicle. This well-structured motion scenario facilitates the segmentation of the regular motion patterns that correspond to the overtaking vehicle. Our approach is based on two main processing stages: first, the computation of optical flow using a novel superpipelined and fully parallelized architecture capable of extracting motion information at up to 148 frames per second at VGA resolution (640x480 pixels); second, a tracking stage based on motion-pattern analysis that provides an estimated position of the overtaking car. We analyze the system's performance and resource usage and show promising results on a bank of overtaking-car sequences.
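The core segmentation idea in the abstract, that background pixels flow backwards while the overtaking car flows forwards, can be sketched as a simple sign test on the horizontal flow component followed by a centroid estimate. This is a minimal illustration, not the paper's actual hardware pipeline; the function name, the zero threshold, and the `min_count` noise rejection are assumptions for the sketch.

```python
import numpy as np

def segment_overtaking(flow_u, min_count=50):
    """Segment an overtaking vehicle from the horizontal component of a
    dense optic-flow field (positive = motion towards the host vehicle,
    matching the rear-mirror geometry described in the abstract).

    Returns a boolean mask of forward-moving pixels and the (x, y)
    centroid of that region, or None if too few pixels qualify.
    """
    mask = flow_u > 0.0                 # forward-moving pixels only
    ys, xs = np.nonzero(mask)
    if xs.size < min_count:             # reject sparse flow noise
        return mask, None
    return mask, (float(xs.mean()), float(ys.mean()))

# Toy flow field: background drifts backwards (-1 px/frame),
# a 20x20 patch (the "overtaking car") moves forwards (+2 px/frame).
u = -np.ones((120, 160), dtype=np.float32)
u[40:60, 30:50] = 2.0
mask, centroid = segment_overtaking(u)
# centroid is the patch centre: (39.5, 49.5)
```

In the paper this estimate would feed the tracking stage, which smooths the position over time; here the single-frame centroid already localizes the toy vehicle.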

Keywords

Optical Flow, Motion Estimation, Optical Flow Field, Optical Flow Algorithm, Alert Signal



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Javier Díaz (1)
  • Eduardo Ros (1)
  • Sonia Mota (2)
  • Rodrigo Agis (1)
  1. Dep. Arquitectura y Tecnología de Computadores, Universidad de Granada, Spain
  2. Dep. Informática y Análisis Numérico, Universidad de Córdoba, Spain
