
Estimation and Prediction of the Vehicle’s Motion Based on Visual Odometry and Kalman Filter

  • Basam Musleh
  • David Martin
  • Arturo de la Escalera
  • Domingo Miguel Guinea
  • Maria Carmen Garcia-Alegre
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7517)

Abstract

The movement of the vehicle is useful information for different applications, such as driver assistance systems or autonomous vehicles. This information can be obtained by different methods, for instance by using a GPS or by means of visual odometry. However, there are situations where these methods do not work correctly. For example, there are areas in urban environments where the GPS signal is not available, such as tunnels or streets with high buildings. On the other hand, computer vision algorithms are affected by outdoor environments, where the main source of difficulties is the variation in lighting conditions. This paper presents a method to estimate and predict the movement of the vehicle based on visual odometry and a Kalman filter. The Kalman filter allows both filtering and prediction of vehicle motion, using the results of the visual odometry estimation.
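A minimal sketch of this filtering/prediction loop is given below. It assumes a constant-velocity planar motion model in which visual odometry supplies noisy position measurements; the state layout, frame period and noise matrices are illustrative assumptions, not values taken from the paper.

    # Minimal sketch (not the authors' exact formulation): a constant-velocity
    # Kalman filter that smooths and predicts planar vehicle motion from noisy
    # visual-odometry measurements of position (x, z) in the road plane.
    import numpy as np

    dt = 0.1                                   # assumed frame period [s]
    F = np.array([[1, 0, dt, 0],               # state transition for [x, z, vx, vz]
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    H = np.array([[1, 0, 0, 0],                # visual odometry observes x and z only
                  [0, 1, 0, 0]], dtype=float)
    Q = np.eye(4) * 1e-3                       # assumed process noise
    R = np.eye(2) * 1e-2                       # assumed visual-odometry measurement noise

    x = np.zeros(4)                            # state estimate
    P = np.eye(4)                              # state covariance

    def kalman_step(x, P, z_vo):
        """One predict/update cycle for a VO position measurement z_vo = [x, z]."""
        # Prediction: propagate the state with the motion model.
        x_pred = F @ x
        P_pred = F @ P @ F.T + Q
        # Update: correct the prediction with the visual-odometry measurement.
        y = z_vo - H @ x_pred                  # innovation
        S = H @ P_pred @ H.T + R               # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)    # Kalman gain
        x_new = x_pred + K @ y
        P_new = (np.eye(4) - K @ H) @ P_pred
        return x_new, P_new

    # When visual odometry fails (e.g. under abrupt lighting changes), the
    # prediction step alone can propagate the motion estimate until
    # measurements become available again.

The design choice illustrated here is that the filter separates prediction (motion model only) from correction (visual-odometry measurement), so the predicted state remains available even when the vision-based estimate is temporarily unreliable.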

Keywords

Stereo vision · Visual odometry · Kalman filter



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Basam Musleh (1)
  • David Martin (1)
  • Arturo de la Escalera (1)
  • Domingo Miguel Guinea (1)
  • Maria Carmen Garcia-Alegre (2)
  1. Intelligent Systems Lab, University Carlos III, Leganes, Spain
  2. Center for Automation and Robotics (CAR), Spanish Council for Scientific Research (CSIC), Arganda del Rey, Spain
