Using Spatially Distributed Patterns for Multiple View Camera Calibration

  • Martin Grochulla
  • Thorsten Thormählen
  • Hans-Peter Seidel
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6930)

Abstract

This paper presents an approach to intrinsic and extrinsic camera parameter calibration from a series of photographs or from video. For reliable and accurate estimation of camera parameters, it is common to use specially designed calibration patterns. However, with a single pattern, a globally consistent calibration is possible only from positions and viewing directions where that pattern is visible. To overcome this limitation, the presented approach uses multiple coded patterns that can be distributed over a large area. A connection graph representing the patterns visible in multiple views is generated and used to estimate globally consistent camera parameters for the complete scene. The approach is evaluated on synthetic and real-world ground-truth examples. Furthermore, it is applied to calibrate the stereo cameras of a robotic head on a moving platform.
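The core idea of the connection graph can be illustrated with a small sketch: patterns are nodes, and two patterns are linked whenever some view observes both, so relative poses can be chained between them. A globally consistent calibration is then possible exactly when this graph is connected. The detection data below is hypothetical, assumed only for illustration.

```python
from collections import defaultdict, deque

# Hypothetical detections: view id -> set of coded-pattern ids visible in that view.
detections = {
    "view0": {"A", "B"},
    "view1": {"B", "C"},
    "view2": {"C", "D"},
    "view3": {"D"},
}

def build_connection_graph(detections):
    """Patterns are nodes; an edge links two patterns co-visible in some view."""
    graph = defaultdict(set)
    for patterns in detections.values():
        for p in patterns:
            graph[p]  # register the node even if it is seen alone
            for q in patterns:
                if p != q:
                    graph[p].add(q)
    return graph

def is_globally_consistent(graph):
    """Poses can be chained to every pattern iff the graph is connected."""
    if not graph:
        return True
    start = next(iter(graph))
    seen = {start}
    queue = deque([start])
    while queue:  # breadth-first traversal over co-visibility edges
        for neighbour in graph[queue.popleft()]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return len(seen) == len(graph)
```

In this toy configuration the chain A-B-C-D is connected, so a single consistent set of extrinsics could in principle be estimated; removing `view1` would split the graph into two components that cannot be registered to each other.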

Keywords

Augmented Reality · Camera Calibration · View Camera · Camera Parameter · Bundle Adjustment



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Martin Grochulla¹
  • Thorsten Thormählen¹
  • Hans-Peter Seidel¹

  1. Max-Planck-Institut für Informatik, Saarbrücken, Germany
