Reconstructing Illumination Environment by Omnidirectional Camera Calibration

  • Yong-Ho Hwang
  • Hyun-Ki Hong
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4304)


Abstract

This paper presents a novel approach to reconstructing the illumination environment through omnidirectional camera calibration. Camera positions are estimated by a method that takes the distribution of inliers into account. Light positions are then computed as the intersection points of rays cast from the estimated camera positions toward the corresponding points between two images. In addition, the method integrates synthetic objects into real photographs using distributed ray tracing and an HDR (High Dynamic Range) radiance map. Simulation results show that photo-realistic images can be synthesized in the reconstructed illumination environment.
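The abstract summarizes the light-estimation step without code. As a minimal sketch of that step, assuming the camera centers and the ray directions toward a pair of corresponding image points are already available from the calibration, the "intersection" can be taken as the midpoint of the shortest segment between the two rays, since noisy rays rarely meet exactly. The function `ray_ray_midpoint` and its inputs below are hypothetical names for illustration, not from the paper:

```python
import numpy as np

def ray_ray_midpoint(p1, d1, p2, d2, eps=1e-9):
    """Approximate the intersection of two 3D rays (p + t*d) as the
    midpoint of the shortest segment between them.

    p1, p2 : ray origins, e.g. estimated camera centers
    d1, d2 : ray directions toward a corresponding image point
    Returns None when the rays are near-parallel.
    """
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2   # Gram entries (a = c = 1 here)
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b                  # ~0 for (near-)parallel rays
    if abs(denom) < eps:
        return None
    s = (b * e - c * d) / denom            # parameter along ray 1
    t = (a * e - b * d) / denom            # parameter along ray 2
    q1, q2 = p1 + s * d1, p2 + t * d2      # closest points on each ray
    return 0.5 * (q1 + q2)

# Toy example: two camera centers, rays toward the same (hypothetical) light.
p1, d1 = np.array([0.0, 0.0, 0.0]), np.array([0.1, 0.2, 1.0])
p2, d2 = np.array([1.0, 0.0, 0.0]), np.array([-0.1, 0.2, 1.0])
print(ray_ray_midpoint(p1, d1, p2, d2))   # -> [0.5, 1.0, 5.0]
```

In practice one such midpoint would be computed per corresponding light feature, and candidates whose rays are near-parallel (i.e. with a poor baseline) would be discarded.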


Keywords: Camera Model · Camera Position · Real Photograph · Global Illumination · Scene Structure





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Yong-Ho Hwang¹
  • Hyun-Ki Hong¹

  1. Dept. of Image Eng., Graduate School of Advanced Imaging Science, Multimedia and Film, Chung-Ang Univ., Seoul, Korea
