Multispectral incoherent holography based on measurement of differential wavefront curvature

We propose a technique of multispectral incoherent holography. The differential wavefront curvature is measured, and the principle of Fourier transform spectrometry is applied to provide a set of spectral components of three-dimensional images and continuous spectra for spatially incoherent, polychromatic objects. This paper presents the mathematical formulation of the principle and the experimental results. Three-dimensional imaging properties are investigated based on an analytical impulse response function. The experimental and theoretical results agree well.

We previously developed a method of interferometric spectral imaging for three-dimensional (3D) objects illuminated by a natural light source [11]. Subsequently, we introduced the synthetic aperture technique to advance digital holographic 3D imaging spectrometry [12,13]. This method is also called the spherical-type method, because the fringe patterns recorded in the measured volume interferogram are arranged in the same way as spherical wavefronts propagating from the object. Variations of the method have been developed, including the hyperbolic-type method (H-type) [14] and rotated hyperbolic-type method [15]. Variations have also been extended to single-pixel imaging, in which an H-type volume interferogram can be measured directly without using a synthetic aperture technique [16]. Each variation has its own advantages; however, these methods generally have a long measurement time.
In this paper, we propose a method, called multispectral incoherent holography, that requires a simpler system and a shorter measurement time than the previous methods, and thus is better suited for unstable objects, such as biological samples. A set of spectral components of the 3D images and continuous spectra for spatially incoherent, polychromatic objects is obtained from the measured volume interferogram. Because the method is based on measuring the differential wavefront curvature, it is a generalization of Fresnel incoherent correlation holography [4] combined with Fourier transform spectrometry.
We present our experimental results and a mathematical analysis of the method. To characterize the spatial imaging properties in the lateral and depth directions and the spectral resolution, a new analytical solution of the impulse response function (IRF) under the paraxial approximation is derived. This IRF is defined over four-dimensional (4D) (x, y, z, ω) space. Based on this function, it is possible to estimate both the 3D spatial and the spectral resolutions.
In Sect. 2, we summarize the procedure for reconstructing the multispectral components of 3D images. This reconstruction process is based on the Wiener-Khinchin theorem and the propagation law of optical coherence. Section 3 is divided into two parts. The first part describes the experimental setup and the objects that we measured. The second part presents the experimental results that demonstrate the performance of the method for reconstructing multispectral images. Section 4 consists of two subsections. First, we derive the analytical IRF solution defined over 4D space. Next, we examine the validity of the derived IRF solution by comparing it with experimental results obtained using a monochromatic point source. A summary is given in Sect. 5. Part of this work has been presented elsewhere in the literature [17].

Description of the method
In this section, we present the concept of multispectral incoherent holography. We begin by introducing the measurement system, which measures the differential wavefront curvature between two split wavefronts: the optical field V1(ρ⊥, z0, t) reflected by the concave mirror and V2(ρ⊥, z0 + Z, t) reflected by the plane mirror. The system obtains a volume (3D) interferogram that records a 3D spatial correlation function. We then show the signal-processing procedure for spectral decomposition, which yields a set of complex incoherent holograms for different spectral components. The 3D image at each spectral component can be reconstructed from the corresponding complex incoherent hologram by applying the usual inverse propagation techniques. The 3D spatial correlation function, Γ12(ρ), of the optical field contains both the 3D spatial information and the spectral information of the polychromatic object.

Recovery of 3D images for many spectral components
For a stationary optical field, the spatial correlation function Γ12(ρ) recorded in the volume interferogram is expressed as a superposition of the cross-spectral density function, W12, in the form: where ω = ck is the angular frequency, c is the speed of light in free space, and k = 2π∕λ is the wavenumber for wavelength λ. Equation (2) is a special case of the Wiener-Khinchin theorem in which the temporal delay is set to zero. This equation means that, for a stationary optical field, spectral components of the optical fields at different frequencies are mutually uncorrelated. The cross-spectral density function on the right-hand side of Eq. (2) is defined as the cross-correlation of the monochromatic components U1 and U2 of V1 and V2: Under the paraxial approximation and the assumption z0 ≫ Z, this cross-spectral density function can be written as: where W(z0)12 is the cross-spectral density function defined over the observation plane z = z0. Substituting Eq. (4) into Eq. (2) gives the relationship between the spatial correlation function Γ12 and the cross-spectral density function across the observation plane: It is then clear that Eq. (6) may be inverted to express the cross-spectral density function across the observation plane as the Fourier transform of the spatial correlation function, In Eq. (7), the integral is taken over the actual extent of the interferogram with respect to Z.
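The Fourier-transform relation of Eq. (7) can be illustrated numerically. The following sketch (not the authors' code; the sampling grid is an assumed toy configuration) builds the correlation function of two mutually uncorrelated spectral lines along Z and recovers their wavelengths by a Fourier transform with respect to Z:

```python
import numpy as np

# Sketch of Eq. (7): the cross-spectral density follows from a Fourier
# transform of the spatial correlation function with respect to the
# path difference Z. Grid and line wavelengths are assumed values.

n, dz = 1024, 0.05e-6                 # samples and Z step [m] (assumed)
z = np.arange(n) * dz

# Correlation function of two mutually uncorrelated spectral lines
lams = [470.8e-9, 553.5e-9]
gamma12 = sum(np.exp(2j * np.pi * z / lam) for lam in lams)

# Fourier transform with respect to Z resolves the spectral components
spec = np.fft.fft(gamma12)
f = np.fft.fftfreq(n, d=dz)           # conjugate variable: wavenumber 1/lambda

top = np.argsort(np.abs(spec))[-2:]   # the two strongest spectral channels
lam_rec = sorted(1.0 / f[top])
print([round(l * 1e9, 1) for l in lam_rec])   # close to [470.8, 553.5] nm
```

The recovered wavelengths match the input lines to within the bin spacing set by the finite Z range, which previews the resolution analysis of Sect. 4.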
From Eq. (1), the recorded 3D interferogram includes two intensity distribution terms and two interference terms. These interference terms can be separated from the intensity distribution terms during the retrieval of the cross-spectral density functions, because the interference term Γ12(ρ) contains only positive-frequency spectral components, as shown by the integration region in Eq. (6), and Γ*12(ρ) contains only negative-frequency components. On the other hand, the intensity distribution terms Γ11 and Γ22 do not change rapidly within the volume interferogram. This means that the spectra of Γ11 and Γ22 appear close to the zero spatial-frequency region, separated from those of Γ12 and Γ*12. By choosing the positive-frequency components, we obtain the information of Γ12, separated from the other terms.
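The positive-frequency selection described above can be sketched as follows. In this toy model (assumed, not the authors' code), the intensity terms are constants, and keeping only positive frequencies of the real-valued interferogram recovers the complex interference term:

```python
import numpy as np

# Sketch (assumed toy model): isolate the interference term Gamma12(Z)
# from the recorded real-valued interferogram by keeping only the
# positive-frequency components. The intensity terms Gamma11 and Gamma22
# are modeled as constants, which sit at zero frequency.

n, dz = 2048, 0.05e-6
z = np.arange(n) * dz
lam = 553.5e-9

gamma12 = 0.5 * np.exp(2j * np.pi * z / lam)        # true interference term
interferogram = 1.0 + 0.8 + 2.0 * np.real(gamma12)  # Gamma11 + Gamma22 + fringes

spec = np.fft.fft(interferogram)
f = np.fft.fftfreq(n, d=dz)
spec[f <= 0] = 0.0                # discard negative frequencies and DC
recovered = np.fft.ifft(spec)     # complex Gamma12, free of intensity terms

err = np.mean(np.abs(recovered - gamma12))
print(err)   # small residual from finite sampling
```

This is the analytic-signal construction along Z; the residual comes only from finite sampling and spectral leakage.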
The cross-spectral density function, W(z0)12(ρ⊥; ω), in Eq. (5) can be expressed in terms of the spectral density function, S(rs; ω), of the measured object as (see Appendix 1): Here, the prefactor is a constant coefficient, rs = (rs⊥, zs) = (xs, ys, zs) is a point on the polychromatic object, the radius of the differential wavefront curvature is a function of zs, and m is the lateral magnification. Equation (7) allows us to retrieve the cross-spectral density function across the observation plane. This cross-spectral density function, which is expressed as Eq. (8), is equivalent to the complex incoherent hologram of a spectral component. Thus, the 3D image for each spectral component can be reconstructed from the complex incoherent hologram by applying the usual inverse propagation formula: Here, ⊗ stands for the convolution integral. Objects located across the plane z = zs are in focus, and other objects are defocused.
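The inverse propagation step can be sketched numerically. The code below (assumed parameters, not the authors' reconstruction code) treats the complex incoherent hologram of an on-axis point as a quadratic phase pattern and refocuses it with a paraxial angular-spectrum kernel, which is one standard way to implement the convolution-type propagation formula:

```python
import numpy as np

# Sketch: refocus a complex incoherent hologram of an on-axis point
# (a quadratic phase pattern) by paraxial angular-spectrum propagation.
# Grid, wavelength, and distance are assumed illustrative values.

def propagate(field, lam, dist, dx):
    """Paraxial (Fresnel) propagation of a 2D complex field by `dist`."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    fsq = fx[:, None] ** 2 + fx[None, :] ** 2
    kernel = np.exp(-1j * np.pi * lam * dist * fsq)
    return np.fft.ifft2(np.fft.fft2(field) * kernel)

n, dx, lam, zrec = 256, 2e-6, 553.5e-9, 4e-3
x = (np.arange(n) - n // 2) * dx
r2 = x[:, None] ** 2 + x[None, :] ** 2
hologram = np.exp(-1j * np.pi * r2 / (lam * zrec))   # converging quadratic phase

image = propagate(hologram, lam, zrec, dx)
peak = np.unravel_index(np.argmax(np.abs(image)), image.shape)
print(peak)   # a sharp focus near the grid center (n//2, n//2)
```

Applying the same kernel with the wrong distance leaves the point defocused, which is exactly the depth discrimination exploited in the experiments below.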

Experimental conditions
In this section, we describe the experiment in which we obtained a set of spectral components of 3D images. We used two mask screens, of the letter K and the number 2, as the measured objects. They were illuminated by incoherent light sources, a metal halide lamp (MHL) and a blue LED, so that the measured objects were planar polychromatic objects located at different depths (Fig. 2). Figure 3 shows the spectral profiles of the MHL and blue LED, which were measured separately by Fourier transform spectrometry; the spectral resolution was 61.09 cm⁻¹ and the spectral range was 3.13 × 10⁴ cm⁻¹. Figure 4 shows a photograph of the mask screens for K (0.8 mm × 0.8 mm) and 2 (0.7 mm × 0.8 mm). Other parameters are listed in Table 1. Figure 5 shows the intensity distribution of the volume interferogram along the optical path difference, Z, at a particular point in the (X, Y) plane. Figure 6 shows the continuous spectral profile over the observation plane, obtained by taking the Fourier transform of the intensity distribution in Fig. 5. The number of data points is 52, which covers the spectral range from 400 to 800 nm. The spectral resolution is limited by the step interval and the number of steps of the PZT.
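The dependence of the spectral range and resolution on the PZT scan can be made concrete with the standard Fourier transform spectrometry relations. The step and count below are assumed illustrative values, chosen only because they roughly reproduce the figures quoted above for the separate reference measurement:

```python
# Sketch: spectral range and resolution set by the sampling of the path
# difference. step and n_steps are assumed values, not the authors'
# actual scan parameters.

step = 159.7e-7          # sampling step of path difference [cm] (assumed)
n_steps = 1024           # number of sampling steps (assumed)

f_range = 1.0 / (2.0 * step)        # one-sided spectral range [cm^-1]
f_res = 1.0 / (n_steps * step)      # spectral resolution [cm^-1]
print(f_range, f_res)               # ~3.13e4 cm^-1 and ~61.1 cm^-1
```

Halving the step doubles the spectral range; doubling the number of steps halves the resolution interval, at the cost of measurement time.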

Experimental results
Figure 7a shows the phase distribution and Fig. 7b shows the absolute value of the complex incoherent hologram at λ = 553.5 nm. This phase distribution recorded only the wavefront shape of the optical field propagated from the K mask screen, because the contribution from the 2 mask screen is small at this spectral component. Figure 7c shows the phase distribution of the reconstructed image and Fig. 7d shows the in-focus image over the x-y plane, where the reconstruction distance is z = 4 mm. The images of K and 2 are separated clearly. Because the wavefront shape of the 2 mask screen, recorded in the phase distribution of the complex incoherent hologram, has been eliminated, this reconstruction distance specifies the in-focus plane of K. Figure 7e, f shows the intensity profiles along the x- and y-directions at the object position in Fig. 7d. From these intensity profiles, the size of the reconstructed object is 0.7 mm × 0.8 mm. The shape of the measured object was reconstructed, and the size of the letter K was close to the original size of the K in Fig. 4a. Figure 8a, b shows the intensity distributions over the x-z and y-z planes, and Fig. 8c shows the intensity profile along the z-direction at λ = 553.5 nm. The intensity peak is close to z = 4 mm, which is in agreement with the object position, z1 = 5 mm.
Similarly, Fig. 9a-f shows the complex incoherent hologram at λ = 470.8 nm and the results for the 2 mask screen reconstructed from it. Figure 9a shows the phase distribution and Fig. 9b shows the absolute value of the complex incoherent hologram. Because both objects have this spectral component, the wavefront shapes of the optical fields propagated from both objects are recorded. Figure 9c shows the phase distribution of the reconstructed image and Fig. 9d shows the in-focus image, where the reconstruction distance is z = −6 mm. From these results, the reconstructed images of the 2 and K mask screens were obtained. However, K was obviously blurred while 2 was focused, which means that the depth separation of the two objects can be distinguished. Figure 9e, f shows the intensity profiles along the x- and y-directions at the object position in Fig. 9d. From these intensity profiles, the size of the reconstructed object is 0.7 mm × 0.9 mm. The shape of the measured object was reconstructed. Figure 10a, b shows the intensity distributions over the x-z and y-z planes, and Fig. 10c shows the intensity profile along the z-direction at λ = 470.8 nm. The intensity peak is close to z = −6 mm, which is in agreement with the object position, z2 = −5 mm.
The measured polychromatic objects in this experiment were the K and 2 mask screens, which have different continuous spectra (Fig. 3). Figure 11a, b shows the reconstructed in-focus spectral images and the phase distributions of K and 2 at different spectral components. The object shapes of the K and 2 mask screens are clearly seen at the spectral peaks λ = 470.8 nm and λ = 553.5 nm. We also see changes in the object intensity and the shapes of K and 2 at λ = 470.8-499.5 nm, because this wavelength range covers spectral components of both the MHL and the blue LED, and the positions of the K and 2 mask screens are different. In contrast, we see only the object shape of K at λ = 525.1-602.3 nm, because this wavelength range covers only spectral components of the MHL. These results agree with the combined spectral profile of the MHL and blue LED (Fig. 6). Figure 12 shows the separated spectral profiles of the in-focus images of K and 2 at fixed points on the characters. The profiles are obtained by tracking the variation of intensity across the wavelength region, thereby yielding the continuous spectrum at a specific point on the 3D images. Our experimental results agree with the spectral profiles in Fig. 3 obtained separately.
Finally, we note that the measurement time of this experiment was 256 s for 256 frames. This is about 295 times shorter than in our previous work [16], which took 21 h, because the present method uses a single-axis PZT scan instead of a 3D scan performed by a single-axis PZT and two independent translation stages. The measurement time of the previous method is long because each stage must stop at every sampling point during the measurement.
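The quoted speed-up follows directly from the two measurement times:

```python
# Sketch: the speed-up quoted in the text, from 256 frames at one frame
# per second against the 21 h measurement of the previous 3D-scan method.

t_present = 256                 # present measurement time [s]
t_previous = 21 * 3600          # previous measurement time [s]
print(round(t_previous / t_present, 1))   # 295.3, i.e. "about 295 times"
```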

Comparison of imaging properties predicted by the IRF and experimental results
In this section, we derive an analytical solution of the 4D IRF of the present method. The derivation is performed under the paraxial approximation. To validate the IRF solution, the imaging properties predicted by the IRF were compared with the experimental results.

Mathematical analysis of 4D IRF
First, let us assume that the object to be measured is a monochromatic point source with angular frequency ωs = cks = 2πc∕λs, located at position rs = (xs, ys, zs), and that the spatial correlation function along the x-, y-, and z-axes is measured within baseline lengths lx, ly, and lz. We may write: where Sp(r; ω) is the spectral density function of the monochromatic point source located at rs with unit intensity, and the 3D window function, A(ρ), assigns the size of the volume interferogram. This window function takes unit value inside the measurement area and zero outside. From Eq. (7), the measured cross-spectral density function, denoted WM, across the observation plane may be expressed as the Fourier transform of the product of the spatial correlation function in Eq. (6) and the window function. The measured cross-spectral density function is then expressed as: where sinc x = (sin x)∕x. In this equation, subscript i indicates parameters used for the reconstruction, so that the angular frequency for reconstruction is ωi = cki = 2πc∕λi. In Eq. (15), the product of the coefficient lz∕2πc and the sinc function represents the spectral IRF, which is characterized by the limited baseline length, lz. Using Eqs. (8) and (13), we may rewrite Eq. (15) as: In this equation, the measured cross-spectral density function is expressed as a product of the spectral IRF, the 2D aperture function A(ρ⊥) = rect(X∕lx) rect(Y∕ly) that specifies the size of the complex incoherent hologram, and the quadratic phase factor of the monochromatic point source. From this cross-spectral density function, we reconstruct the 3D image of the monochromatic point source; this image corresponds to the IRF. Applying the inverse propagation formula in Eq. (12), the IRF, denoted h, is expressed as: where d²ρ⊥ = dX dY.
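The sinc-shaped spectral IRF can be checked numerically. The sketch below uses lz = 40.96 µm, an assumed value chosen because it reproduces the 244.14 cm⁻¹ resolution quoted in Sect. 4.2, and locates the first zero of the sinc at a wavenumber offset of 1∕lz:

```python
import numpy as np

# Sketch: the spectral IRF of Eq. (15) is a sinc set by the finite
# baseline l_z; its first zero fixes the spectral resolution 1/l_z.
# l_z below is an assumed value.

l_z = 40.96e-4                           # baseline length [cm]
df = np.linspace(0.0, 3.0 / l_z, 3001)   # wavenumber offset f_i - f_s [cm^-1]
irf = np.sinc(df * l_z)                  # np.sinc(x) = sin(pi x)/(pi x)

first_zero = df[np.where(np.diff(np.sign(irf)) != 0)[0][0]]
print(first_zero)   # ~244.1 cm^-1 = 1/l_z
```

Two spectral lines closer than this offset fall inside the same sinc lobe and cannot be resolved, which is the usual Fourier transform spectrometry limit.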
We define the 3D space vector ri = (xi, yi, zi) = (ri⊥, zi), which specifies the location of the reconstructed image at reconstruction frequency ωi. On substituting Eq. (16) into Eq. (17), we may rewrite the IRF, after a straightforward calculation, as: Here, we introduce the degree of focusing, M, as: In Eq. (19), M is defined as the ratio between the products of wavelength and differential curvature radius for the object and for the reconstructed image. The in-focus condition is realized when M = 1, because the quadratic phase factors in the integrations over X and Y on the right-hand side of Eq. (18) then vanish. For M ≠ 1, these integrations are carried out and lead to the following expression of the IRF [18]:
where the complex-conjugate terms marked (*) appear only if 0 < M < 1. The function F(a) is defined by: where C(a) and S(a) are the Fresnel integrals [19]. The function F(a) in Eq. (21) represents the complex amplitude of Fresnel diffraction by an infinite straight edge [20]. The two arguments of function F in Eq. (20) are expressed as: If we take the limit M → 1 in Eq. (20), the equation reduces to: This expression of the 4D IRF corresponds to the diffraction-limited in-focus image of the monochromatic point source. Because the optics and the signal processing in our system are linear, the output image, O, is generally expressed as the superposition integral of the input spectral density function and the IRF,
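The edge-diffraction function built from the Fresnel integrals can be evaluated numerically. The sketch below uses one common convention, F(a) = (1∕2 + C(a)) + i(1∕2 + S(a)); the exact normalization used in Eq. (21) is assumed here, and the quadrature helper is ours, not the authors':

```python
import numpy as np

# Sketch: Fresnel integrals C(a), S(a) by midpoint quadrature, and the
# straight-edge diffraction amplitude F(a) in one common convention
# (assumed normalization, for illustration only).

def fresnel_cs(a, n=20000):
    """Fresnel integrals C(a), S(a) with integrand cos/sin(pi t^2 / 2)."""
    t = (np.arange(n) + 0.5) * (a / n)      # midpoint sample points
    w = a / n                               # quadrature weight
    return (np.cos(np.pi * t ** 2 / 2).sum() * w,
            np.sin(np.pi * t ** 2 / 2).sum() * w)

def F(a):
    c, s = fresnel_cs(a)
    return (0.5 + c) + 1j * (0.5 + s)

# Classic straight-edge check: at the geometrical shadow boundary (a = 0)
# the intensity is 1/4 of the unobstructed value |F(inf)|^2 = 2,
# since C and S tend to 1/2.
print(abs(F(0.0)) ** 2 / 2.0)   # 0.25
```

The oscillation of F(a) with increasing a produces the familiar edge fringes, which is why the defocused IRF in Eq. (20) shows ringing for M ≠ 1.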

Comparison of imaging properties predicted by IRF and experimental results
We compare the 4D IRF in Eq. (20) with our experimental results. The measured object is a monochromatic point source with a wavelength of 632.8 nm, formed from He-Ne laser light guided by a single-mode optical fiber. This monochromatic point source is set close to the origin of the Cartesian coordinate system, so the 3D image obtained experimentally can be compared directly with the 3D point spread function. All the experimental parameters are shown in Table 2. The conditions assumed in the numerical calculation with the 4D IRF are the same as the experimental conditions. The experimental spectral profile is shown in Fig. 13. The spectral peak appears near 640 nm, and the spectral resolution is limited by the first zero of the sinc function on the right-hand side of Eq. (20). The spectral resolution is Δf = 1∕lz = 244.14 cm⁻¹, where f = k∕2π = 1∕λ is the wavenumber. In the wavelength region, the spectral resolution is written as Δλ = λ²Δf = λ²∕lz. For λ = 640 nm, we find Δλ = 9.77 nm. This value agrees with the intervals of the spectral channels around the peak, shown as circles in Fig. 13. These expressions of the spectral resolution are standard in Fourier transform spectrometry.
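The conversion from wavenumber to wavelength resolution can be checked directly. Evaluating Δλ = λ²Δf at the He-Ne wavelength (an assumed choice of evaluation point) gives a value consistent with the ~9.8 nm scale quoted above:

```python
# Sketch: converting the wavenumber resolution into a wavelength
# resolution via delta_lambda = lambda^2 * delta_f (from f = 1/lambda),
# evaluated at the He-Ne wavelength as an assumed reference point.

delta_f = 244.14            # spectral resolution [cm^-1]
lam = 632.8e-7              # wavelength [cm]
delta_lam_nm = lam ** 2 * delta_f * 1.0e7
print(round(delta_lam_nm, 2))   # ~9.78 nm
```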
The phase distributions of the complex incoherent holograms at the spectral peak, calculated using Eq. (16) and obtained experimentally, agree well (Fig. 14a, b). Figures 15, 16, 17 and 18 compare the results reconstructed from the 4D IRF with the experimental results. The in-focus image over the xi-yi plane at zi = 0 mm calculated from the analytical solution of the 4D IRF in Eq. (26) (Fig. 15a) and the corresponding image reconstructed from the complex incoherent hologram whose phase distribution is shown in Fig. 14b (Fig. 15b) agree well. These images are enlarged for detailed comparison. Figure 16 shows the intensity profiles along the x-axis in Fig. 15; the solid curve shows the experimental result and the dotted curve shows the theoretical result based on the 4D IRF. Figures 15 and 16 correspond to the intensity profile of a diffraction-limited image of a point source. The experimental and theoretical results both show that, for a hologram with a rectangular aperture, the 2D point spread function is represented by the second and third sinc functions of the IRF in Eq. (26). Figure 17a shows the intensity distribution over the xi-zi plane calculated from the analytical solution of the 4D IRF, and Fig. 17b shows the corresponding distribution obtained from the experimental complex incoherent hologram (Fig. 14b). Figure 18 compares the experimental intensity profile (solid curve) with the analytical solution of the 4D IRF (dotted curve) along the z-axis across the object position in Fig. 17. The peak positions and the distribution shapes agree well. We conclude from these results that the 4D IRF in Eq. (20) specifies the spectral resolution and 3D imaging properties of multispectral incoherent holography.

Conclusion
We presented experimental and theoretical studies of multispectral incoherent holography, which is based on measuring differential wavefront curvature. The experimental results showed that 3D spatial information at every spectral component of the measured object was acquired properly by this method. A paraxial IRF defined over the space-frequency domain was derived. Based on this IRF solution, we investigated the imaging properties of multispectral incoherent holography. By comparing the theoretical prediction of the 4D IRF and the experimental results, we have shown that the spectral resolution and 3D imaging properties observed experimentally agree well with the theoretical prediction.
The measurement time of the present method is considerably shorter than those of our previous methods. This simplified lensless optical system is expected to be useful in a wide range of applications, such as biological observations via spectrally resolved 3D images.