
Inverse Polarization Raytracing: Estimating Surface Shapes of Transparent Objects

  • Katsushi Ikeuchi
  • Daisuke Miyazaki

We propose a novel method for estimating the surface shapes of transparent objects by analyzing the polarization state of light. Existing methods do not fully account for the reflection, refraction, and transmission of light occurring inside a transparent object. We employ a polarization raytracing method to compute both the path of the light and its polarization state. Our iterative method estimates the surface shape of the transparent object by minimizing the difference between the polarization data rendered by the polarization raytracer and the polarization data obtained from a real object.
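To make the render-compare-update loop concrete, the sketch below (not the authors' implementation) models only the front-surface specular reflection with the Fresnel equations and recovers a per-pixel zenith angle by matching the rendered degree of polarization (DOP) to a measurement. The full method instead traces internal reflections and transmissions with a polarization raytracer and minimizes the difference over the whole surface; the function names and the bisection update here are illustrative assumptions.

```python
import numpy as np

def fresnel_dop(theta_i, n=1.5):
    """DOP of unpolarized light specularly reflected at incidence angle
    theta_i (radians) off a dielectric of refractive index n (air -> glass).
    Front-surface reflection only; the paper's polarization raytracer also
    follows the light refracted into the object and reflected internally."""
    sin_t = np.clip(np.sin(theta_i) / n, -1.0, 1.0)
    theta_t = np.arcsin(sin_t)
    rs = (np.cos(theta_i) - n * np.cos(theta_t)) / (np.cos(theta_i) + n * np.cos(theta_t))
    rp = (n * np.cos(theta_i) - np.cos(theta_t)) / (n * np.cos(theta_i) + np.cos(theta_t))
    Rs, Rp = rs ** 2, rp ** 2
    return (Rs - Rp) / (Rs + Rp + 1e-12)

def estimate_zenith(dop_measured, n=1.5, iters=60):
    """Iteratively refine a per-pixel zenith angle so the rendered DOP
    matches the measurement.  Bisection on the monotonic branch below the
    Brewster angle; angles above Brewster map to the same DOP, which is the
    ambiguity the full iterative raytracing framework has to resolve."""
    dop_measured = np.asarray(dop_measured, dtype=float)
    lo = np.zeros_like(dop_measured)
    hi = np.full_like(dop_measured, np.arctan(n))  # Brewster angle, DOP = 1
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        too_low = fresnel_dop(mid, n) < dop_measured
        lo = np.where(too_low, mid, lo)
        hi = np.where(too_low, hi, mid)
    return 0.5 * (lo + hi)

# Example: recover incidence angles from simulated measurements.
theta_true = np.deg2rad(np.array([10.0, 30.0, 50.0]))
dop = fresnel_dop(theta_true)
print(np.rad2deg(estimate_zenith(dop)))  # approximately [10, 30, 50]
```

In the paper the per-pixel comparison is made against polarization images rendered by tracing every reflected and transmitted ray through the current shape estimate, so the surface is updated globally rather than pixel by pixel as in this simplified sketch.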

Keywords

Root mean square error, target object, front surface, polarization data, surface shape

Copyright information

© Springer Science+Business Media, LLC 2008

Authors and Affiliations

  • Katsushi Ikeuchi (1)
  • Daisuke Miyazaki (1)
  1. Institute of Industrial Science, The University of Tokyo, Meguro-ku, Japan
