Pattern Recognition and Image Analysis

Volume 26, Issue 1, pp. 109–113

3D pose estimation for articulated vehicles using Kalman-filter based tracking

  • C. Fuchs
  • F. Neuhaus
  • D. Paulus
Applied Problems

Abstract

Knowledge about the relative poses within a tractor/trailer combination is a vital prerequisite for kinematic modelling and trajectory estimation. For autonomous vehicles or driver assistance systems, for example, monitoring an attached passive trailer is crucial for operational safety. We propose a camera-based 3D pose estimation system built around a Kalman filter and evaluate it against previously published methods for the same problem.
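
The abstract names a Kalman filter as the core of the tracking stage. As a rough illustration of that filtering step only (not the authors' implementation, which estimates a full 3D pose from camera images), the sketch below tracks a single articulation angle with a constant-velocity model; the state layout, time step, measurement model, and noise covariances are assumed values.

    # Minimal sketch of a linear Kalman filter tracking a trailer's articulation
    # angle and angular rate. State layout, time step, measurement model, and
    # noise values are illustrative assumptions, not the authors' implementation.
    import numpy as np

    dt = 0.05                            # assumed camera frame interval (20 Hz)

    F = np.array([[1.0, dt],             # constant-velocity motion model
                  [0.0, 1.0]])
    H = np.array([[1.0, 0.0]])           # camera front end yields an angle only
    Q = np.diag([1e-4, 1e-3])            # process noise covariance (assumed)
    R = np.array([[1e-2]])               # measurement noise covariance (assumed)

    x = np.zeros((2, 1))                 # state: [articulation angle, angular rate]
    P = np.eye(2)                        # state covariance

    def predict(x, P):
        """Propagate state and covariance one time step ahead."""
        x = F @ x
        P = F @ P @ F.T + Q
        return x, P

    def update(x, P, z):
        """Correct the prediction with an angle measurement z (radians)."""
        y = np.array([[z]]) - H @ x      # innovation
        S = H @ P @ H.T + R              # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        return x, P

    # Example: feed noisy angle observations (hypothetical values) from the
    # camera-based pose detection front end.
    for z in [0.02, 0.05, 0.04, 0.08]:
        x, P = predict(x, P)
        x, P = update(x, P, z)
    print("filtered articulation angle [rad]:", float(x[0, 0]))

The same predict/update cycle generalizes to a full 3D pose state; only the state vector, motion model, and measurement function change.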

Keywords

Kalman filter, pose estimation, tracking

Copyright information

© Pleiades Publishing, Ltd. 2016

Authors and Affiliations

  1. Active Vision Group, Institute for Computational Visualistics, University of Koblenz-Landau, Koblenz, Germany
