Depth estimation improvement in 3D integral imaging using an edge removal approach

  • José M. Sotoca
  • Pedro Latorre-Carmona
  • Hector Espinos-Morato
  • Filiberto Pla
  • Bahram Javidi
Original Article

Abstract

A new depth estimation method for 3D reconstruction in a synthetic aperture integral imaging framework is presented. The method removes the edges of objects in the elemental images when those objects are in focus. This strategy compensates for the noise that objects focused close to the cameras can introduce into the photo-consistency measure of objects at greater depths. In addition, a photo-consistency criterion that combines a defocus measure and a correspondence measure is applied, followed by a depth regularization step that smooths noisy depth estimates over object surfaces. The proposed method obtains consistent results for any type of object surface as well as very sharp boundaries. Experimental results show that our method reduces noise at object edges and improves the depth maps relative to the other methods included in the comparative analysis.
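
As a rough illustration of the pipeline outlined above, the following Python sketch performs plane-sweep depth estimation over a stack of elemental images: each candidate depth induces a per-camera shift, a per-pixel photo-consistency cost is evaluated, and pixels flagged as strong edges in the elemental images are excluded from that cost. The camera model, the variance-based cost, and all names (shift_to_depth, depth_map, f, pitch, edge_thr) are illustrative assumptions, not the exact defocus/correspondence criterion or regularization step of the paper.

```python
# Minimal sketch of plane-sweep depth estimation with an edge mask in a
# synthetic aperture integral imaging setup. Geometry, parameter values and
# the variance-based consistency score are assumptions for illustration only.
import numpy as np
from scipy import ndimage


def shift_to_depth(elemental, cam_xy, z, f, pitch):
    """Back-project one elemental image onto the plane at depth z.

    Under a pinhole model, the disparity between neighbouring cameras scales
    as f * pitch / z; cam_xy are integer camera indices on a regular grid.
    """
    dy = cam_xy[0] * f * pitch / z
    dx = cam_xy[1] * f * pitch / z
    return ndimage.shift(elemental, (dy, dx), order=1, mode="nearest")


def depth_map(elementals, cam_grid, depths, f=50.0, pitch=10.0, edge_thr=0.2):
    """Pick, per pixel, the depth with the lowest intensity variance across
    the re-projected elemental images, ignoring pixels flagged as edges."""
    h, w = elementals[0].shape
    best_cost = np.full((h, w), np.inf)
    best_depth = np.zeros((h, w))
    # Edge masks per elemental image (Sobel magnitude); edge pixels of objects
    # focused close to the cameras are excluded from the consistency measure.
    edges = [ndimage.sobel(e, 0) ** 2 + ndimage.sobel(e, 1) ** 2 > edge_thr ** 2
             for e in elementals]
    for z in depths:
        stack, valid = [], []
        for e, m, xy in zip(elementals, edges, cam_grid):
            stack.append(shift_to_depth(e, xy, z, f, pitch))
            valid.append(shift_to_depth((~m).astype(float), xy, z, f, pitch) > 0.5)
        stack, valid = np.array(stack), np.array(valid)
        n = np.maximum(valid.sum(0), 1)
        mean = (stack * valid).sum(0) / n
        cost = (((stack - mean) ** 2) * valid).sum(0) / n  # photo-consistency
        better = cost < best_cost
        best_cost[better], best_depth[better] = cost[better], z
    return best_depth
```

In the actual method, the variance term above would be replaced by the combined defocus and correspondence criterion, and the resulting depth map would then be smoothed by the regularization stage described in the abstract.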

Keywords

Integral imaging · Depth map · Edge removal · Defocus · Regularisation

Notes

Acknowledgements

This work was supported by the Spanish Ministry of Economy and Competitiveness (MINECO) under the Projects SEOSAT (ESP2013-48458-C4-3-P) and MTM2013-48371-C2-2-PDGI, by the Generalitat Valenciana through the Project PROMETEO-II-2014-062, and by the Universitat Jaume I through the Project UJIP11B2014-09. B. Javidi would like to acknowledge support under NSF/IIS-1422179 and ONR under N00014-17-1-2561.

Copyright information

© Springer-Verlag London Ltd., part of Springer Nature 2018

Authors and Affiliations

  1. Institute of New Imaging Technologies, Universitat Jaume I, Castellón de la Plana, Spain
  2. Electrical and Computer Engineering Department, University of Connecticut, CT, USA
