CCCV 2017: Computer Vision, pp. 506–517

How Depth Estimation in Light Fields Can Benefit from Angular Super-Resolution?

  • Mandan Zhao
  • Gaochang Wu
  • Yebin Liu
  • Xiangyang Hao
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 771)

Abstract

With the development of consumer light field cameras, light field imaging has become a widely used method for capturing the 3D appearance of a scene. Depth estimation typically requires a light field that is densely sampled in the angular domain. However, there is an inherent trade-off between the angular and spatial resolution of a light field. Recently, several methods for novel view synthesis, or angular super-resolution, from a sparse set of views have been introduced. Unlike conventional approaches that optimize depth maps, these approaches focus on maximizing the quality of the synthesized views. In this paper, we investigate how depth estimation can benefit from such angular super-resolution methods. Specifically, we compare the quality of depth estimated from the original sparsely sampled light fields against depth estimated from the reconstructed densely sampled light fields. Experimental results evaluate the improvement in the depth maps obtained with different view synthesis approaches.
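The comparison described above hinges on measuring how much closer a depth map gets to ground truth once the light field has been angularly super-resolved. A minimal sketch of such an evaluation, assuming synthetic disparity maps and the common RMSE and bad-pixel metrics (the 0.07-pixel threshold is an illustrative choice, not one taken from the paper):

```python
import numpy as np

def depth_error_metrics(estimated, ground_truth, bad_pixel_thresh=0.07):
    """Return RMSE and bad-pixel ratio between two disparity/depth maps."""
    diff = np.abs(estimated - ground_truth)
    rmse = float(np.sqrt(np.mean(diff ** 2)))
    bad_ratio = float(np.mean(diff > bad_pixel_thresh))
    return rmse, bad_ratio

# Synthetic illustration: a smooth ground-truth disparity ramp and two
# noisy estimates -- the first mimicking depth from a sparsely sampled
# light field, the second (lower noise) mimicking depth from a densely
# super-resolved light field.
rng = np.random.default_rng(0)
gt = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
sparse_est = gt + rng.normal(0.0, 0.10, gt.shape)
dense_est = gt + rng.normal(0.0, 0.03, gt.shape)

rmse_sparse, bad_sparse = depth_error_metrics(sparse_est, gt)
rmse_dense, bad_dense = depth_error_metrics(dense_est, gt)
```

In an actual experiment, `sparse_est` and `dense_est` would come from a depth estimation algorithm run on the sparse and super-resolved light fields respectively; the metrics above then quantify the benefit of angular super-resolution.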

Keywords

Light field · Angular super-resolution · View synthesis · Depth estimation


Copyright information

© Springer Nature Singapore Pte Ltd. 2017

Authors and Affiliations

  • Mandan Zhao (1)
  • Gaochang Wu (2)
  • Yebin Liu (3)
  • Xiangyang Hao (1)
  1. Information Engineering University, Zhengzhou, China
  2. Northeastern University, Shenyang, China
  3. Tsinghua University, Beijing, China
