Coaxial Omnidirectional Stereopsis

  • Libor Spacek
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3024)


Catadioptric omnidirectional sensors, consisting of a camera and a mirror, can track objects even when their bearings change suddenly, usually due to the observer making a significant turn. There has been much debate concerning the relative merits of several possible shapes of mirrors to be used by such sensors.

This paper suggests that the conical mirror has some advantages over other shapes of mirrors. In particular, the projection beam from the central region of the image is reflected and distributed towards the horizon rather than back at the camera. Therefore a significant portion of the image resolution is not wasted.

A perspective projection unwarping of the conical mirror images is developed and demonstrated. This has hitherto been considered possible only with mirrors that possess single viewpoint geometry. The cone is viewed by a camera placed some distance away from the tip. Such an arrangement does not have single viewpoint geometry. However, its multiple viewpoints are shown to be dimensionally separable.
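The geometry behind such an unwarping can be sketched as a simple polar-to-panoramic resampling: each panorama column takes one azimuth, each panorama row one image radius. The function name, the nearest-neighbour sampling, and the fixed radius band below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def unwarp_panorama(img, center, r_min, r_max, width):
    """Nearest-neighbour polar unwarp of an omnidirectional mirror image
    into a cylindrical panorama (illustrative sketch only)."""
    height = r_max - r_min
    cx, cy = center
    # panorama column -> azimuth angle, panorama row -> image radius
    theta = 2.0 * np.pi * np.arange(width) / width
    r = r_min + np.arange(height)
    rr, tt = np.meshgrid(r, theta, indexing="ij")
    # source pixel coordinates on the circular mirror image
    x = np.clip(np.round(cx + rr * np.cos(tt)).astype(int), 0, img.shape[1] - 1)
    y = np.clip(np.round(cy + rr * np.sin(tt)).astype(int), 0, img.shape[0] - 1)
    return img[y, x]
```

A true perspective reprojection would additionally remap the radial coordinate nonlinearly according to the cone geometry; the linear row mapping above is the simplest placeholder.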

Once stereopsis has been solved, it is possible to project the points of interest to a new image through a (virtual) single viewpoint. Successful reconstruction of a single viewpoint image from a pair of images obtained via multiple viewpoints appears to validate the use of multiple viewpoint projections.

The omnidirectional stereo uses two catadioptric sensors. Each sensor consists of one conical mirror and one perspective camera. The sensors are in a coaxial arrangement along the vertical axis, facing up or down. This stereoscopic arrangement leads to very simple matching since the epipolar lines are the radial lines of identical orientations in both omnidirectional images.
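Because corresponding points lie on radial lines of the same orientation in both images, matching reduces to a one-dimensional search along each radius. A minimal sketch, assuming greyscale images and sum-of-absolute-differences (SAD) window matching (the paper's actual matching method is not reproduced here):

```python
import numpy as np

def radial_profile(img, center, theta, r_min, r_max):
    """Sample intensities along one radial epipolar line."""
    r = np.arange(r_min, r_max)
    x = np.round(center[0] + r * np.cos(theta)).astype(int)
    y = np.round(center[1] + r * np.sin(theta)).astype(int)
    return img[y, x].astype(float)

def match_radial(p_up, p_down, window=5):
    """1-D SAD matching of two radial profiles; for each interior
    position in p_up, return the best-matching position in p_down."""
    n = len(p_up)
    best = np.zeros(n, dtype=int)
    for i in range(window, n - window):
        seg = p_up[i - window:i + window + 1]
        costs = [np.abs(seg - p_down[j - window:j + window + 1]).sum()
                 for j in range(window, n - window)]
        best[i] = window + int(np.argmin(costs))
    return best
```

The radial disparity `best[i] - i` then feeds the triangulation; a practical matcher would add sub-pixel refinement and an ordering or uniqueness constraint.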

The stereopsis results on artificially generated scenes with known ground truth show that the error in the computed distance is proportional to the sum of the object's distance (as usual) and the distance of the camera from the mirror. The error is also inversely proportional to the radial image coordinate, i.e., the results are more accurate for points imaged nearer the rim of the circular mirror.
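The triangulation underlying such distance measurements can be sketched with an idealised coaxial model: each sensor is replaced by an effective viewpoint on the common vertical axis at height h_i, and the radial image coordinate of a world point at horizontal distance D and height z is taken proportional to D/(z - h_i). The symbols b (vertical baseline), f (scale factor), and this pinhole-on-axis simplification are assumptions for illustration only, not the paper's derivation:

```python
def radii(D, z, h1, h2, f):
    """Ideal radial image coordinates of a world point at horizontal
    distance D and height z, seen from axial viewpoints at h1 and h2."""
    return f * D / (z - h1), f * D / (z - h2)

def distance(r1, r2, b, f):
    """Triangulated horizontal distance from the two radial coordinates;
    b = h1 - h2 is the vertical baseline between the viewpoints."""
    return b * r1 * r2 / (f * (r1 - r2))
```

Perturbing r1 or r2 by a fraction of a pixel in this model gives a quick empirical feel for how quantisation error propagates into the recovered distance.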





Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Libor Spacek
  1. Department of Computer Science, University of Essex, Colchester, UK
