The Visual Computer, Volume 19, Issue 6, pp 405–416

Omnidirectional video

Special issue on computational video

Abstract

Omnidirectional video enables direct surround immersive viewing of a scene by warping the original image into the correct perspective for a given viewing direction. However, novel views from viewpoints off the camera path can be obtained only if we solve the three-dimensional motion and calibration problem. In this paper we address the case of a parabolic catadioptric camera, a paraboloidal mirror in front of an orthographic lens, and we introduce a new representation, called the circle space, for points and lines in such images. In this circle space, we formulate an epipolar constraint involving a 4×4 fundamental matrix. We prove that the intrinsic parameters can be inferred in closed form from the two-dimensional subspace of this new fundamental matrix, using two views if the intrinsic parameters are constant or three views if they vary. Three-dimensional motion and structure can then be estimated from the decomposition of the fundamental matrix.
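
To make the abstract's construction concrete, here is a minimal sketch in our own notation; it assumes one standard way of building a circle space and is not necessarily the paper's exact parameterization. A circle a(x² + y²) + bx + cy + d = 0 is identified with its homogeneous coefficient vector (a, b, c, d), and an image point is lifted so that incidence becomes linear:

\[
  q = (x, y) \;\longmapsto\; \hat{q} = \bigl(x^2 + y^2,\; x,\; y,\; 1\bigr)^{\top},
\]

so that q lies on the circle (a, b, c, d) exactly when (a, b, c, d)\,\hat{q} = 0. In such lifted coordinates the epipolar constraint between two parabolic catadioptric views becomes bilinear,

\[
  \hat{q}_2^{\top} F \, \hat{q}_1 = 0, \qquad F \in \mathbb{R}^{4 \times 4},
\]

where F is the 4×4 fundamental matrix referred to above; the closed-form recovery of the intrinsic parameters then operates on the two-dimensional subspace associated with F, as stated in the abstract.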

Keywords

Catadioptric cameras · Structure from motion · Pose estimation · Immersive walkthroughs



Copyright information

© Springer-Verlag 2003

Authors and Affiliations

GRASP Laboratory, University of Pennsylvania, Philadelphia, USA
