
Experimental Mechanics, Volume 58, Issue 5, pp 831–845

Full-Field Surface 3D Shape and Displacement Measurements Using an Unfocused Plenoptic Camera

  • B. Chen
  • B. Pan

Abstract

Full-field surface 3D shape and displacement measurements using a single commercial unfocused plenoptic camera (Lytro Illum) are reported in this work. Before measurement, the plenoptic camera is calibrated in two consecutive steps: lateral calibration and depth calibration. Each raw image of a checkerboard pattern recorded by the Lytro Illum is first decomposed into an array of sub-aperture images (SAIs), and the center sub-aperture images (CSAIs) at diverse poses are used in the lateral calibration to determine the intrinsic and extrinsic parameters. The parallax maps between the CSAI and the remaining SAIs at each pose are then determined, and the depth parameters are estimated from them in the depth calibration. Furthermore, a newly developed physics-based depth distortion model is established to correct the severe distortion of the depth field. For shape and deformation measurements, raw images of a test sample with speckle patterns prefabricated on its surface are captured by the Lytro Illum and decomposed into arrays of SAIs. The parallax maps between the CSAI and the target SAIs are obtained using subset-based digital image correlation. Based on the pre-computed intrinsic and depth parameters and the parallax maps, the full-field surface 3D shape and displacement of the test object are finally determined. The effectiveness and accuracy of the proposed approach are evaluated by a set of experiments involving shape reconstruction of a cylinder, in-plane and out-of-plane displacement measurements of a flat plate, and 3D full-field displacement measurements of a cantilever beam. The preliminary results indicate that the proposed method is a promising novel approach for full-field surface 3D shape and displacement measurements.
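As a concrete illustration of the final step described above, the sketch below back-projects a parallax (disparity) map into 3D coordinates under a simple pinhole model. The focal length, baseline between sub-aperture views, and principal point used here are hypothetical stand-ins for the intrinsic and depth parameters obtained from calibration; the paper's calibrated model additionally corrects depth distortion, which is omitted here.

```python
import numpy as np

def backproject(disparity, f=500.0, b=0.001, cx=320.0, cy=240.0):
    """Back-project a disparity map (pixels) into 3D points.

    f  : focal length in pixels (hypothetical value)
    b  : baseline between adjacent sub-aperture views in meters (hypothetical)
    cx, cy : principal point in pixels (hypothetical)
    """
    h, w = disparity.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = f * b / disparity          # depth from disparity: z = f*b/d
    x = (u - cx) * z / f           # lateral coordinates from the pinhole model
    y = (v - cy) * z / f
    return np.stack([x, y, z], axis=-1)
```

A uniform disparity map therefore maps to a fronto-parallel plane; spatially varying disparity yields the full surface shape.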

Keywords

Unfocused plenoptic camera · Digital image correlation · Surface 3D shape reconstruction · Full-field displacement measurement · Depth distortion model

Notes

Acknowledgements

This work is supported by the National Natural Science Foundation of China (Grants No. 11427802, 11632010), the Aeronautical Science Foundation of China (2016ZD51034), and State Key Laboratory of Traction Power of Southwest Jiaotong University (Grant No. TPL1607).

Glossary

CSAI

Center sub-aperture image. A raw image captured by a plenoptic camera can be decomposed into an array of sub-aperture images, which are equivalent to an array of images captured from slightly different viewpoints (i.e., with small parallaxes). The one at the center of the array is called the center sub-aperture image.

DIC

Digital image correlation, a widely used optical technique for surface profile and deformation measurements.

EPI

Epipolar image, a 2D slice of the 4D light field obtained by simultaneously fixing one angular and one spatial coordinate (either the horizontal pair or the vertical pair).

IC-GN algorithm

Inverse-compositional Gauss-Newton algorithm, an efficient algorithm for subset matching in DIC.
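To make the IC-GN idea concrete, the following is a minimal, didactic sketch for a translation-only warp: the steepest-descent images and Hessian are built once from the reference subset and reused in every iteration, which is what makes the inverse-compositional scheme efficient. Real subset-based DIC uses first- or second-order shape functions and a ZNSSD criterion; this simplified version is an assumption-laden illustration, not the algorithm used in the paper.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def icgn_translation(ref, tgt, p=(0.0, 0.0), iters=20):
    """Translation-only inverse-compositional Gauss-Newton alignment.

    Estimates p = (px, py) such that tgt(x + p) ~= ref(x).
    """
    gy, gx = np.gradient(ref)                        # reference gradients, computed once
    J = np.stack([gx.ravel(), gy.ravel()], axis=1)   # steepest-descent images
    Hinv = np.linalg.inv(J.T @ J)                    # constant Gauss-Newton Hessian
    yy, xx = np.mgrid[0:ref.shape[0], 0:ref.shape[1]].astype(float)
    p = np.array(p, float)
    for _ in range(iters):
        # Sample the target at the currently warped coordinates
        warped = map_coordinates(tgt, [yy + p[1], xx + p[0]],
                                 order=3, mode='nearest')
        err = (warped - ref).ravel()
        dp = Hinv @ (J.T @ err)
        p -= dp                                      # inverse-compositional update
        if np.linalg.norm(dp) < 1e-6:
            break
    return p
```

For a translation warp the inverse composition reduces to the simple subtraction `p -= dp`; the efficiency gain over forward-additive Gauss-Newton comes from never recomputing the gradients or Hessian.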

MLA

Microlens array, inserted between the main lens and the sensor of a plenoptic camera.

PIV

Particle image velocimetry, an optical method for measuring the velocity field of a flow.

RMS

Root mean square, a statistic defined as the square root of the mean of the squared values.

ROI

Region of interest, the region of an image selected for measurement; the profile or deformation is evaluated only within this region.

SAI

Sub-aperture image, an image extracted from the raw plenoptic image by fixing both angular coordinates.
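Once the raw image has been decoded into a 4D light field, extracting an SAI (or an EPI) is pure array slicing. The dimensions below are illustrative stand-ins, not the exact decoded resolution of the Lytro Illum.

```python
import numpy as np

# Hypothetical decoded 4D light field L[v, u, t, s]:
# (v, u) are angular coordinates, (t, s) are spatial coordinates.
V, U, T, S = 9, 9, 434, 625          # illustrative dimensions
lf = np.zeros((V, U, T, S))

# A sub-aperture image fixes both angular coordinates:
csai = lf[V // 2, U // 2]            # center SAI, shape (T, S)

# An epipolar image (EPI) fixes one angular and one spatial coordinate:
epi = lf[V // 2, :, T // 2, :]       # horizontal EPI, shape (U, S)
```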

SD

Standard deviation.

ZNSSD

Zero-mean normalized sum of squared differences, a correlation criterion for subset matching that is insensitive to linear (affine) changes in image intensity.
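A minimal sketch of the ZNSSD criterion between two subsets follows; the intensity-offset-and-scale invariance that motivates its use in DIC falls directly out of the zero-mean normalization.

```python
import numpy as np

def znssd(f, g):
    """Zero-mean normalized sum of squared differences between two subsets.

    Returns 0 for a perfect match and is invariant to affine intensity
    changes g -> a*g + b, which is why it is widely used in DIC.
    """
    fz = f - f.mean()
    gz = g - g.mean()
    fn = fz / np.sqrt((fz**2).sum())   # zero-mean, unit-norm reference subset
    gn = gz / np.sqrt((gz**2).sum())   # zero-mean, unit-norm target subset
    return ((fn - gn)**2).sum()
```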

Supplementary material

ESM 1: 11340_2018_383_MOESM1_ESM.pdf (PDF, 244 kb)


Copyright information

© Society for Experimental Mechanics 2018

Authors and Affiliations

  1. Institute of Solid Mechanics, Beihang University, Beijing, China
