Annals of Biomedical Engineering, Volume 40, Issue 2, pp 251–262

Lensfree Optofluidic Microscopy and Tomography

Authors

  • Waheb Bishara
    • Electrical Engineering Department, University of California
  • Serhan O. Isikman
    • Electrical Engineering Department, University of California
  • Aydogan Ozcan
    • Electrical Engineering Department, University of California
    • Bioengineering Department, University of California
    • California NanoSystems Institute, University of California

DOI: 10.1007/s10439-011-0385-3

Cite this article as:
Bishara, W., Isikman, S.O. & Ozcan, A. Ann Biomed Eng (2012) 40: 251. doi:10.1007/s10439-011-0385-3

Abstract

Microfluidic devices aim at miniaturizing, automating, and lowering the cost of chemical and biological sample manipulation and detection, hence creating new opportunities for lab-on-a-chip platforms. Recently, optofluidic devices have also emerged where optics is used to enhance the functionality and the performance of microfluidic components in general. Lensfree imaging within microfluidic channels is one such optofluidic platform, and in this article, we focus on the holographic implementation of lensfree optofluidic microscopy and tomography, which might provide a simpler and more powerful solution for three-dimensional (3D) on-chip imaging. This lensfree optofluidic imaging platform utilizes partially coherent digital in-line holography to allow phase and amplitude imaging of specimens flowing through micro-channels, and takes advantage of the fluidic flow to achieve higher spatial resolution imaging compared to a stationary specimen on the same chip. In addition to this, 3D tomographic images of the same samples can also be reconstructed by capturing lensfree projection images of the samples at various illumination angles as a function of the fluidic flow. Based on lensfree digital holographic imaging, this optofluidic microscopy and tomography concept could be valuable especially for providing a compact, yet powerful toolset for lab-on-a-chip devices.

Keywords

Holography, Pixel super-resolution, On-chip imaging, Filtered back-projection, Diffraction

Introduction

The field of optofluidics has been growing steadily in recent years due to the great potential lying at the intersection between microfluidics and optics. The integration of microfluidic and optical techniques allows the manipulation of fluids and light in highly integrated, yet compact lab-on-a-chip platforms. Such optofluidic devices could enable sensitive optical interrogation of fluids for biological or chemical detection and sensing, as well as lead the way to tunable and reconfigurable micro-scale optical devices. Some of these recently introduced optofluidic technologies include tunable dye lasers, reconfigurable lenses, and flow cytometers,1,12,23,25,30 among others. One other major optofluidic platform that has been added to this emerging set of tools is optical microscopy.10,12,19,22,28,30 Its integration with microfluidic devices would allow the miniaturization and simplification of optical imaging instruments for applications in, e.g., telemedicine and global health.

Several attempts have been made to do away with lenses and other bulky optical components to achieve optical imaging within a compact and cost-effective platform. One such approach, termed the optofluidic microscope (OFM), is based on a slanted array of nano-apertures (e.g., <1 μm in diameter) placed at the bottom of a microfluidic channel.10,19,22,28 As an object flows over and in close proximity to these nano-apertures, light is collected from each aperture, and by digitally processing each aperture’s intensity signal (through time-to-space conversion), a microscopic image of the specimen within the micro-channel can be synthesized. The major contribution of this approach is to circumvent the pixel size limitation of digital sensors (e.g., CCD or CMOS chips) through the fabrication of sub-micron apertures, so that a lensfree spatial resolution finer than the pixel size can be achieved within the optofluidic chip.

In this article, however, we will focus on an alternative optofluidic microscopy approach, i.e., lensfree holographic implementation of optofluidic on-chip microscopy, which might provide a significantly simpler and more powerful solution for three-dimensional (3D) on-chip imaging. By using simple light sources such as Light Emitting Diodes (LEDs) and no other optical components in this holographic optofluidic microscope (HOM), the lensfree hologram of an object flowing within a microfluidic channel can be recorded. In fact, the flow of the object within the micro-channel is utilized for capturing multiple slightly different lensfree holograms of the same object, which allows lateral resolution enhancement using pixel super-resolution (PSR) algorithms. This is an important example of how combining microfluidics with optics can indeed enhance the performance and versatility of a lab-on-a-chip device. Centered on holographic optofluidic imaging, this article will be structured as follows: First, the principles behind lensfree in-line holographic imaging are introduced. Then, the integration of in-line holography and microfluidics is explained, along with PSR algorithms that enable enhancement in lateral resolution of the lensfree microscope by means of the flow inherent to micro-fluidic devices. Finally, we describe the multi-angle holographic implementation of optical tomographic imaging and its incorporation into optofluidics.

Partially Coherent Lensfree Digital In-Line Holography

The basic imaging technique underlying the devices covered in this article is partially coherent digital in-line holography, albeit in a configuration different from traditional coherent in-line holography systems.15 In digital in-line holographic imaging, in general, a portion of the illumination wavefront passing through the object to be imaged is scattered, while the remaining portion propagates unperturbed. The scattered portion, s(x,y,z), and the unperturbed portion, R(x,y,z), interfere at the surface of the digital image sensor, z = z0 (if they are sufficiently coherent with respect to each other), as a result of which a lensfree digital hologram is recorded, namely:
$$ \begin{aligned} I\left( x,y \right) &= \left| R\left( x,y,z_{0} \right) + s\left( x,y,z_{0} \right) \right|^{2} \\ &= \left| R\left( x,y,z_{0} \right) \right|^{2} + \left| s\left( x,y,z_{0} \right) \right|^{2} + R^{*}\left( x,y,z_{0} \right)s\left( x,y,z_{0} \right) + R\left( x,y,z_{0} \right)s^{*}\left( x,y,z_{0} \right). \end{aligned} $$
(1)
This hologram can be digitally propagated back to the position of the object by numerical free-space propagation, recovering a complex-valued image of the object. An illustration of the configuration used for in-line holography-based optofluidic devices described in this article is shown in Fig. 1.
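As an illustrative sketch (not the authors' code), the numerical free-space propagation step can be implemented with the angular spectrum method; the function name and the parameter values in the usage comment are our own assumptions.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dz, pixel_size):
    """Propagate a complex field by a distance dz (in meters) using the
    angular spectrum method; a negative dz back-propagates a recorded
    hologram toward the object plane."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)  # spatial frequencies [1/m]
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Keep only propagating components; evanescent waves are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = (2 * np.pi / wavelength) * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * dz) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative usage: back-propagate a hologram recorded ~1 mm from the
# object, with a 2.2 um pixel sensor and ~550 nm illumination.
# obj = angular_spectrum_propagate(np.sqrt(I), 550e-9, -1e-3, 2.2e-6)
```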
https://static-content.springer.com/image/art%3A10.1007%2Fs10439-011-0385-3/MediaObjects/10439_2011_385_Fig1_HTML.gif
Figure 1

The on-chip in-line holography geometry. The object to be imaged is placed close to the image sensor, and the illumination aperture is placed several centimeters away. If the object is placed in a microfluidic channel, then multiple shifted lensfree holograms can be captured and processed using PSR algorithms. If the source is rotated, then multiple illumination angles can be adopted to recover a tomographic 3D image of the object

The digital sensor-array (e.g., a CMOS chip) only records the electromagnetic field intensity, which means that the phase of the optical wavefront reaching the sensor is lost, unless it is encoded into intensity oscillations as practiced in holography. In off-axis holography, this is achieved by interfering the scattered object wavefronts with a tilted reference wavefront,9 which allows the recovery of the optical phase information of the object by filtering and shifting in the spatial frequency domain. In in-line holography, however, the loss of optical phase is more severe, and leads to artifacts referred to as the “twin image,” where in addition to the image of the object, there is a defocused image spatially overlapping with the object’s real image, degrading it. The real and twin images correspond to the last two interference terms of Eq. (1). To eliminate the twin image in digital in-line holography, phase retrieval can be performed. In the optofluidic microscopy and tomography techniques discussed in this article, phase retrieval is performed using an iterative algorithm,14,26,35 which propagates the optical fields back and forth between the object and the sensor planes, enforcing the known quantities/information at each iteration. More specifically, at the sensor plane, the known quantity is the measured amplitude of the optical field. At the object plane, however, what is imposed is the spatial support of the object, which can be digitally determined by, e.g., intensity thresholding the back-propagated image at the object plane (despite the overlapping twin image). After a few iterations (typically ~15–20), the phase of the field at the sensor plane can be retrieved, and the object image is then cleaned of the twin image artifact.
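A minimal sketch of this object-support phase retrieval loop is given below, reusing the hypothetical `angular_spectrum_propagate` from the previous sketch; the support mask, iteration count, and the unit-background replacement outside the support are our simplifying assumptions, and the published algorithm differs in its details.

```python
import numpy as np

def retrieve_phase(amplitude, support, wavelength, z2, pixel_size, n_iter=20):
    """Iteratively recover the phase of the field at the sensor plane.
    'amplitude' is the measured hologram amplitude; 'support' is a boolean
    object mask obtained, e.g., by thresholding the back-propagated image."""
    field = amplitude.astype(complex)  # initial guess: measured amplitude, zero phase
    for _ in range(n_iter):
        # Back-propagate to the object plane and enforce the object support:
        obj = angular_spectrum_propagate(field, wavelength, -z2, pixel_size)
        obj = np.where(support, obj, 1.0)  # assume unit background outside support
        # Forward-propagate and enforce the measured hologram amplitude:
        field = angular_spectrum_propagate(obj, wavelength, z2, pixel_size)
        field = amplitude * np.exp(1j * np.angle(field))
    # A final back-propagation yields the twin-image-suppressed object field.
    return angular_spectrum_propagate(field, wavelength, -z2, pixel_size)
```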

As alluded to earlier, the formation of a useful hologram requires sufficient coherence of the wavefront. The spatial coherence properties of the light can be tuned by filtering the illumination using, e.g., a sub-micron sized pinhole. Such a small pinhole, however, requires sensitive alignment of the light source with respect to the pinhole and reduces the illumination efficiency. In the on-chip HOM approach of this article, the object to be imaged is placed centimeters away from the light source, allowing the use of a much larger pinhole of, e.g., ~100 μm diameter while still achieving a sufficient spatial coherence diameter at the object plane.26 This large pinhole drastically simplifies the optical alignment and allows the use of simple LEDs as light sources owing to its significantly higher photon transmission.
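For a rough sense of scale (our own estimate, not a figure from the article), the van Cittert–Zernike theorem predicts a coherence diameter at the object plane of roughly
$$ D_{\text{coh}} \sim \frac{\lambda\, z_{1}}{D_{\text{pinhole}}} \approx \frac{(0.55\ \mu\text{m}) \times (5\ \text{cm})}{100\ \mu\text{m}} \approx 275\ \mu\text{m}, $$
which comfortably exceeds the size of typical cells and micro-organisms to be imaged.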

In this partially coherent on-chip holographic imaging scheme, the object to be imaged is placed much closer to the sensor-array than to the illumination pinhole (see Fig. 1), in contrast to traditional in-line holography schemes where the object is typically much closer to the illumination pinhole. This range of parameters allows holographic recording of the object waves, i.e., s(x,y,z) of Eq. (1), with unit magnification, and utilizes the entire active area of the sensor array as the imaging field-of-view (FOV). This provides a major advantage: a typical CMOS sensor chip has ~20–30 mm2 of active area, which constitutes a >20-fold increase in imaging FOV compared to the FOV of, e.g., a 10× objective lens. One disadvantage of this configuration, however, is that the spatial sampling of the lensfree holograms may not be adequate due to the pixel size of the sensor. In commercially available sensor-arrays, the pixel size is typically on the order of ~1.4–10 μm, leading to undersampling of the hologram in the unit magnification geometry shown in Fig. 1. The pixel size of future sensors can be expected to continue to shrink, driven by the large demand for these sensors in consumer electronics, especially cell phones. Meanwhile, optofluidics provides an elegant solution to this pixelation issue by computationally creating a pixel size that is much smaller than the physical pixel size of the sensor. The next section describes one such technique, in which multiple slightly shifted lower resolution holograms (captured during the fluidic flow of the specimen within the micro-channel) are used to synthesize a single high-resolution holographic image through PSR based digital processing.

Pixel Super-Resolution in Lensfree Holography

The lensfree holographic on-chip imaging technique described in the previous section is based on generating, recording, and processing of a digital hologram under unit magnification. Therefore, the spatial sampling frequency at the digital sensor plane plays a central role in determining the imaging resolution of this scheme. The current state of the art in digital sensor arrays (with typical pixel sizes of ~1.4–10 μm) may lead to undersampling of lensfree digital holograms, which in turn limits the spatial resolution of the resulting microscopic images. It has been previously shown that a digital sensor with, e.g., a 2.2-μm pixel pitch can provide a resolution of ~1.5–2 μm.27

To mitigate this limitation, there are computational methods that can achieve an effectively smaller pixel size than the physical pixel size of the sensor chip. One such technique, PSR, comes from the digital image processing community, where multiple slightly different lower resolution views of the same scene are computationally merged to generate a single high-resolution image of that scene.18,29 Note that integer-pixel shifts are redundant and offer no additional information about the image; only sub-pixel shifts contribute new samples.

In the on-chip in-line holography setup described in Fig. 1, the goal is to capture multiple shifted lower resolution holograms in order to generate a single high-resolution hologram, which in turn leads to a high resolution microscopic image through digital holographic reconstruction. In this geometry, there are multiple ways in which sub-pixel shifts of the holograms can be generated without dramatically increasing the experimental complexity of the device. It has been previously demonstrated that mechanically shifting the illumination source4 can easily generate the required sub-pixel shifts because of the demagnification in the hologram recording geometry between a shift of the source and a shift of the hologram (which is a manifestation of the large ratio between the z1 and z2 distances—refer to Fig. 1). Alternatively, utilizing multiple adjacent illumination sources sequentially gives similar results and allows further miniaturization of the lensfree microscope to a handheld and cost-effective unit.3 In both of these cases depicted in Bishara et al.,3,4 high-resolution holograms and microscopic images are synthesized, with a resolution (e.g., <1 μm) that is beyond reach using a single raw hologram without PSR. In this article, however, we will focus on another mechanism for hologram shifting, which involves the physical shifting of the object itself. Since flow is inherent to the operation of microfluidic devices, and owing to the simplicity of the lensfree holographic set-up, microfluidic devices can be readily inserted into the imaging path, and the fluidic flow can provide the required sub-pixel shifts of object holograms on the sensor chip.

Before invoking the PSR algorithm, the spatial shifts between different captured holograms must be calculated. This can be done directly from the raw digital holograms, without the need to know any of the parameters/dimensions of the experimental set-up. A simple way to estimate the shifts between different holograms is the iterative gradient method. In this method, the two holograms are first assumed to have a small enough shift such that one can be obtained from the other with a linear approximation. Given the two holograms I1 and I2 that are slightly shifted with respect to each other, i.e., I2(x,y) = I1(x + a, y + b), then I2 can be approximated as
$$ \tilde{I}_{2} \left( {x,y} \right) \approx I_{1} \left( {x,y} \right) + \left( {\frac{{\partial I_{1} }}{\partial x}} \right) \cdot a + \left( {\frac{{\partial I_{1} }}{\partial y}} \right) \cdot b. $$

The shifts a and b (along x and y, respectively) are found by minimizing the squared error between the linearly approximated \( \tilde{I}_{2} \) and the measured I2. As this error is a quadratic function of the shifts, the minimization problem reduces to solving an algebraic equation. After a first estimation is obtained, the image I1 can be interpolated to a grid shifted by the estimated a and b to bring it closer to I2, and the linear approximation can be reiterated in an attempt to reach a closer estimate of the real shifts. For this gradient shift estimation technique to work properly, it must be used in a region of the image where spatial aliasing, or under-sampling, is not severe, which is typically the central region of a lensfree hologram that exhibits low-frequency interference fringes. In addition, a small rotation between the two images can also be computed using a straightforward extension of this gradient estimation formulation.18
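A minimal sketch of this iterative gradient shift estimator is shown below, assuming small shifts and using `scipy.ndimage.shift` for the grid interpolation step; the iteration count is illustrative.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift

def estimate_shift(I1, I2, n_iter=5):
    """Estimate the sub-pixel shift (a, b) such that I2(x, y) ~ I1(x + a, y + b)
    by linearizing I2 around I1, solving the resulting 2x2 least-squares
    system, and iterating after interpolating I1 onto the shifted grid."""
    a_tot, b_tot = 0.0, 0.0
    I1w = I1.astype(float)
    for _ in range(n_iter):
        gy, gx = np.gradient(I1w)           # image derivatives along y and x
        e = I2.astype(float) - I1w          # residual to be explained by the shift
        # Normal equations of min ||e - gx*a - gy*b||^2 (a 2x2 linear solve):
        A = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                      [np.sum(gx * gy), np.sum(gy * gy)]])
        rhs = np.array([np.sum(gx * e), np.sum(gy * e)])
        a, b = np.linalg.solve(A, rhs)
        a_tot, b_tot = a_tot + a, b_tot + b
        # Interpolate I1 onto the current estimate and refine:
        I1w = nd_shift(I1.astype(float), (-b_tot, -a_tot), order=1)
    return a_tot, b_tot
```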

Once the shifts of all the raw holograms with respect to a reference hologram are computed, the PSR algorithm can then be invoked to find the high-resolution hologram. The idea is to find a high-resolution hologram which, when downsampled appropriately, recovers all the measured raw shifted holograms. If the distance between the downsampled computed images and raw measured images is defined as a quadratic function, then it can be straightforwardly minimized by means of the conjugate gradient descent or similar optimization algorithms. The total cost function to be minimized can then be written as
$$ C\left( \mathbf{Y} \right) = \frac{1}{2}\sum\limits_{k = 1}^{p} \sum\limits_{i = 1}^{M} \left( x_{k,i} - \tilde{x}_{k,i} \right)^{2} + \frac{\alpha }{2}\,{\mathbf{Y}}_{\text{fil}}^{\text{T}}\,{\mathbf{Y}}_{\text{fil}} , $$
where \( x_{k} \) and \( \tilde{x}_{k} \) are the measured holograms and those downsampled from the high-resolution hologram \( {\mathbf{Y}} \), respectively. The index i runs over all pixels of a given hologram, and \( {\mathbf{Y}}_{\text{fil}} \) denotes a high-pass filtered version of \( {\mathbf{Y}} \). The last term in the cost function penalizes very high-frequency components in \( {\mathbf{Y}} \) to suppress processing artifacts. The minimization of the cost function can be performed using standard methods, as it is a simple quadratic function of all the high-resolution pixel values. In the results discussed below, the conjugate gradient descent method was used.
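The sketch below minimizes a cost of this form by plain gradient descent (the article uses conjugate gradient), with box-average downsampling and a Laplacian high-pass filter standing in for the unspecified decimation and filtering operators; all parameter values are illustrative and would need tuning.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift, convolve

def psr_reconstruct(frames, shifts, L=4, alpha=0.01, lr=0.5, n_iter=200):
    """Synthesize a high-resolution hologram Y (upsampled by a factor L) from
    shifted low-resolution frames by gradient descent on the quadratic PSR
    cost. 'shifts' holds per-frame sub-pixel shifts (a, b) in LR pixel units."""
    ny, nx = frames[0].shape
    Y = np.kron(frames[0], np.ones((L, L)))                       # init by replication
    lap = np.array([[0, -1, 0], [-1, 4, -1], [0, -1, 0]], float)  # high-pass filter
    for _ in range(n_iter):
        grad = alpha * convolve(convolve(Y, lap), lap)  # gradient of regularizer
        for x_k, (a, b) in zip(frames, shifts):
            Yk = nd_shift(Y, (-b * L, -a * L), order=1)        # apply frame shift
            down = Yk.reshape(ny, L, nx, L).mean(axis=(1, 3))  # box-average D_k
            r = down - x_k                                     # data residual
            up = np.kron(r, np.ones((L, L))) / (L * L)         # adjoint of D_k
            grad += nd_shift(up, (b * L, a * L), order=1)      # adjoint of shift
        Y -= lr * grad  # conservative step size; tune per dataset
    return Y
```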

Holographic Optofluidic Microscopy (HOM)

In recent years, microfluidic devices have been the focus of much research due to their wide variety of potential uses in medicine and science, such as biological and chemical detection and manipulation of samples.33,37 Microfluidic devices have also shown potential in using the optical properties of fluids for creating miniaturized and tunable optical devices. Examples of such optofluidic devices include tunable lenses, lasers, waveguides, and switches.12,30

Optofluidic imaging has also emerged in several recent studies. One approach for achieving this has been the OFM.10,19,22,28 In this approach, a digital sensor is covered with an opaque layer with sub-micrometer apertures punched through it. If the holes are positioned appropriately and an object is made to controllably flow in close proximity to the aperture plane, then a microscopic image of the object can be obtained with a resolution smaller than the physical pixel size of the sensor chip.

In this article, however, we focus on a set of optofluidic on-chip imaging devices which utilize the lensfree holographic scheme described in “Partially Coherent Lensfree Digital In-Line Holography” section, along with multi-frame image-processing algorithms that take advantage of the flow of the object, to achieve high-resolution microscopy and tomographic imaging.

A schematic of the HOM is shown in Fig. 1. A microfluidic channel is placed directly atop a CMOS sensor with a pixel size of 2.2 μm and an FOV of 24 mm2. The channel can be 1–2 mm wide and, e.g., 80 μm high. Typically, microfluidic channels have smaller cross sections for better control of the flow, which restricts the size of the objects that can be imaged and makes the channel susceptible to clogging. In the HOM platform, however, the flow need not be uniform in speed or direction, so a larger channel cross section can be used. A light source with a wavelength of, e.g., 500–600 nm and a bandwidth of, e.g., ~10–15 nm (e.g., an LED) is placed approximately 5 cm away from the micro-channel and is filtered by a large aperture with a diameter of, e.g., 100 μm. The large pinhole allows efficient and robust coupling of the light source through the pinhole, while the 5 cm distance to the sample allows the wavefront to develop spatial coherence by the time it reaches the sample, despite being weakly coherent at the pinhole due to the limited spatial coherence of the source.

The flow within the channel can be controlled either by applying an electric field across the channel or by creating a pressure difference between the ends of the channel. Since the object can be tracked directly from its hologram without knowing the details of the flow, the flow need not be uniform. The requirements on the movement of the object to be imaged within the channel are that the hologram is not blurred in a single captured frame (due to motion) and that the object is rigid and does not rotate out of plane during its flow. The first requirement translates to the requirement that the object moves a distance much smaller than a sensor pixel length within the exposure time of a single frame of the sensor. The exposure time can always be made shorter by using a brighter light source, and does not pose a fundamental limitation. In the results shown below, the exposure time of a single frame was ~50 ms, and the objects were moving at a speed of 1–2 μm/s, satisfying the above requirement on flow speed. The sensor used in our imaging experiments reported below is a 5 megapixel sensor with a pixel size of 2.2 μm and an active area of 24 mm2, and was operated at 5 full-frames/s. Note, however, that over a smaller region of interest the same sensor chip can provide more than an order of magnitude faster frame rate, which could significantly increase the flow speed of the objects within the micro-channel without sacrificing spatial resolution.
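As a quick consistency check using the numbers above: with a flow speed of ~2 μm/s and an exposure of ~50 ms, the object moves
$$ \Delta x = v\, t_{\text{exp}} \approx (2\ \mu\text{m/s}) \times (0.05\ \text{s}) = 0.1\ \mu\text{m} \ll 2.2\ \mu\text{m}, $$
i.e., far less than one pixel per exposure, so each recorded hologram is effectively free of motion blur.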

As the object flows through the channel, its hologram shifts across the sensor plane. The sensor continuously captures frames, each containing a low-resolution lensfree hologram of the object. If a single hologram is processed following the algorithm described in “Partially Coherent Lensfree Digital In-Line Holography” section, it will recover a microscopic image of the object, but with a lower resolution of, e.g., 1.5–2.0 μm. When higher imaging resolution is desired, the PSR algorithm described in “Pixel Super-Resolution in Lensfree Holography” section can be employed. The shift estimation algorithm discussed above is used to track the movement of the object as it flows through the channel. In Fig. 2, the trajectory of the object is tracked for ~50 s, corresponding to ~250 consecutive frames. The microfluidic channel was placed at a small angle with respect to the sensor edges, so that the trajectory has components along both sensor axes (x and y). The flow in this particular case was driven by an applied voltage of ~2 V over the length of the channel, approximately 5 cm. The object imaged here is a Caenorhabditis elegans worm, a widely studied model organism in the life sciences, which was temporarily paralyzed using a solution containing Levamisole to keep it rigid during the imaging period. The velocity of the worm during the imaging period was not constant, as can be seen from the 3 highlighted sets of frames in Figs. 2a and 2b, each representing 3 s of imaging yet covering different distances. The angle estimation (Fig. 2b) also shows a slow rotation of the worm coinciding with the change of velocity. This irregular flow does not pose a challenge for HOM; it arises from the large cross section of our micro-channel and from surface defects and impurities in the channel, which was made of simple glass without additional treatment or passivation steps.
https://static-content.springer.com/image/art%3A10.1007%2Fs10439-011-0385-3/MediaObjects/10439_2011_385_Fig2_HTML.gif
Figure 2

As the object flows through a microfluidic channel (due to, e.g., electro-kinetic motion or a pressure gradient) lensfree in-line holograms are captured by a digital sensor-array. Acquisition of ~15 consecutive digital frames is sufficient to generate a high-resolution image, but for comparison, ~230 frames were captured during ~45 s. (a) 2D lateral shifts of consecutive frames with respect to the first one. (b) In-plane rotation angles of consecutive frames with respect to the first one. (c) Sub-pixel shifts of 3 sequences (A, B, and C; each composed of 15 consecutive frames) are shown. These three sequences of lensfree raw holograms are used independently to generate three different microscopic images of the same flowing specimen

It is also important to emphasize that it is not necessary to track the object for a long period of time. In fact, for the flow conditions described above, it was possible to achieve decent imaging performance with only 15 consecutive frames, chosen at an arbitrary point in the long sequence of frames (see Fig. 2). This corresponds to ~3 s of imaging time at five frames/s, but a faster sensor (or selection of a smaller FOV on the same chip) can scale down the imaging time significantly. Given any 15 consecutive frames, the shifts between the frames are estimated, and the integer-pixel part of the shifts is discarded. The remaining sub-pixel shifts are shown in Fig. 2c for three different sets of 15 frames each.

The PSR algorithm, as discussed earlier, combines a set of 15 shifted lower resolution holograms into a single high-resolution one. PSR processing of a set of low-resolution holograms of the worm shown in Figs. 3 and 4 took 2.5 s to converge running in Matlab on a 3.2-GHz PC. The code was also converted to run on a Graphics Processing Unit (GPU), which reduced the processing time by an order of magnitude to 0.25 s. In Fig. 3, the results of PSR are demonstrated and compared to simple interpolation of a single low-resolution hologram. The super-resolved (SR) hologram contains interference fringes that cannot be captured with the physical pixel size of the CMOS sensor.
https://static-content.springer.com/image/art%3A10.1007%2Fs10439-011-0385-3/MediaObjects/10439_2011_385_Fig3_HTML.jpg
Figure 3

(a) A super-resolved hologram of a C. elegans sample is shown, synthesized from 15 low-resolution frames, i.e., sequence A of Fig. 2. (b) An enlarged section of (a), where sub-pixel holographic oscillations are clearly visible. (c) An enlarged section of one of the raw holograms is shown, after interpolation

https://static-content.springer.com/image/art%3A10.1007%2Fs10439-011-0385-3/MediaObjects/10439_2011_385_Fig4_HTML.gif
Figure 4

HOM images of a C. elegans sample are shown. Without PSR, a single low-resolution hologram gives a lower resolution image of the object, as shown in the far left image. When using 15 consecutive holographic frames of sequence A (Fig. 2), a super-resolved hologram is synthesized to create higher-resolution amplitude and phase images of the same object. To demonstrate the robustness of this method, two additional sequences (B and C) were independently used to generate similar reconstructed images, all of which match the conventional microscope image of the same worm acquired under a 40× objective lens (NA: 0.65)

Figure 4 demonstrates the imaging capability of HOM. The microscopic image obtained from a single lensfree raw hologram is compared to the image obtained from PSR processing of 15 consecutive frames, and to an image obtained from a traditional bright-field benchtop microscope with a 40× objective lens (NA = 0.65). The PSR images show a clear enhancement of spatial resolution over the single-frame image, and compare well to the bright-field image obtained with the benchtop microscope. The different sequences (A, B, and C in Fig. 4) correspond to different sets of 15 consecutive frames taken from different parts of the larger set of frames (see Fig. 2), and they all compare well to each other. The holographic nature of this microscope also allows phase imaging in addition to amplitude imaging. Phase imaging is particularly useful for transparent biological samples, which do not absorb or scatter light strongly. Figure 5 shows HOM images of smaller objects, such as a Mulberry pollen grain and Giardia lamblia cysts, to demonstrate the robustness of the flow estimation and PSR algorithms. Using more than 15 frames as input for PSR did not yield dramatic improvement, as shown in Bishara et al.4
https://static-content.springer.com/image/art%3A10.1007%2Fs10439-011-0385-3/MediaObjects/10439_2011_385_Fig5_HTML.gif
Figure 5

HOM imaging results for G. lamblia cysts and Mulberry pollen, where single frame images and PSR images are compared to bright-field microscope images. The bright-field images are representative images and not images of the specimens shown in the HOM images

Holographic Optofluidic Tomography (HOT)

The optofluidic microscopy modalities discussed thus far lack the ability to perform sectional imaging of the specimen. While numerous systems have been demonstrated for optical sectioning of objects,5,6,11,13,16,17,34,38 these platforms rely on complex, lens-based architectures, which hamper their integration with microfluidic systems and lab-on-a-chip platforms. To address this need in optofluidics, an optofluidic tomographic microscope has recently been developed.21 In this section, we review this technology, which can perform 3D imaging of a specimen flowing within a microfluidic channel placed on a sensor chip.

The imaging principle of holographic optofluidic tomography (HOT) is similar to HOM in the sense that pixel SR images of flowing objects are obtained using lensfree on-chip holography. The key difference is that HOT relies on multi-angle illumination to record multiple lensfree views of objects, which can be used to obtain volumetric structural information. Accordingly, HOT utilizes partially coherent illumination (e.g., ~600 nm center wavelength with ~10 nm spectral bandwidth, filtered by an aperture of diameter ~0.1 mm) placed ~50 mm away from the sensor array to record digital in-line holograms of the sample. As described in the previous sections, PSR techniques are utilized to digitally generate high-resolution holograms of the objects. Since illumination is provided without any sensitive alignment with respect to either the pinhole or the sensor-array, the architecture of lensfree on-chip holography conveniently permits imaging of the flowing objects using multiple illumination angles, as shown in Fig. 1, which is the key to achieving tomographic optofluidic microscopy on a chip.

In HOT, multiple shifted projection holograms (e.g., 10–15 frames) are recorded at each illumination angle (spanning a range of, e.g., θ = −50°:50°) as the sample is driven through a micro-channel placed on the sensor array. These lower-resolution (LR) lensfree holograms are then digitally synthesized into a single SR hologram by using the above-mentioned PSR techniques to obtain a higher lateral resolution in the plane parallel to the sensor. These SR projection holograms are digitally reconstructed to obtain projection images of the same object for different viewing directions with a lateral imaging resolution of <1 μm. For weakly scattering objects that are not thicker than the depth-of-focus of the projection images, these lensfree projection images represent rectilinear summation of the object’s transmission function (e.g., scattering strength) along the direction of illumination. In other words, if scattering is not very strong inside an object whose thickness is <40–50 μm (i.e., the typical depth-of-focus of our lensfree projection images), the diffraction within the object can be ignored, and the projection assumption can be satisfied.32 Therefore, the 3D transmission function of the object can be computed using a filtered back-projection algorithm,31 where all the complex projection images (i.e., 51 SR images for θ = −50°:2°:50°) are used as input.
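For illustration, the following sketch runs a limited-angle filtered back-projection on a synthetic slice using scikit-image; in HOT the inputs are the complex lensfree projection images rather than a real-valued phantom, and the phantom and parameter values here are our own.

```python
import numpy as np
from skimage.transform import radon, iradon

# A small synthetic phantom stands in for one slice of the object's
# transmission function.
phantom = np.zeros((128, 128))
phantom[48:80, 56:72] = 1.0

angles = np.arange(-50.0, 51.0, 2.0)   # 51 views, matching theta = -50:2:50 deg
sinogram = radon(phantom, theta=angles, circle=False)

# Ramp-filtered back-projection; the unmeasured wedge (|theta| > 50 deg)
# elongates the reconstruction axially (see the discussion of the
# "missing wedge problem" below).
recon = iradon(sinogram, theta=angles, filter_name='ramp', circle=False)
```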

As a proof of concept for optofluidic tomography, sectional imaging of a wild-type C. elegans worm was demonstrated using HOT. To achieve that, holograms of the worm, flowing through a microfluidic channel, were acquired at various illumination angles spanning θ = −50°:50° in discrete increments of 2°. The main reason for not using larger illumination angles, e.g., 70–80°, is the degradation in pixel-response at large incidence angles. That is, digital sensors are designed for lens-based imaging systems, where incidence angles do not typically exceed 20–30°. Therefore, holograms recorded at illumination angles larger than ±50° exhibit significant artifacts, and including them in the back-projection process leads to spatial aberrations. For each illumination angle, ~15 holographic frames of the flowing object were recorded (in <3 s), leading to a total imaging time of ~2.5 min under the electro-kinetic flow condition. The data collected within these 2.5 min enable tomographic imaging of all the objects that passed above the sensor during the acquisition time. Multi-angle illumination was automatically provided by a computer-controlled rotation stage holding the light source. The center of rotation of the light source was roughly adjusted to coincide with the center of the sensor, which is located in the x–y plane. Exemplary LR holograms recorded with this set-up at different illumination angles are illustrated in Fig. 6. As expected, the extent of the holograms along x increases at larger angles. By means of the sub-pixel shifts of the worm during its flow within the micro-channel, together with PSR techniques, SR holograms of the sample at each illumination angle were obtained, as presented in Fig. 6. These SR holograms exhibit high spatial frequency fringes that were undersampled in the corresponding raw lensfree holograms.
https://static-content.springer.com/image/art%3A10.1007%2Fs10439-011-0385-3/MediaObjects/10439_2011_385_Fig6_HTML.gif
Figure 6

(Top) Recorded holograms of a C. elegans sample at three different illumination angles (θ = 0°, 34°, and −34°). (Bottom) Pixel super-resolved holograms of the worm for the same illumination angles, which exhibit higher-frequency holographic fringes that are otherwise undersampled in the recorded holograms. Lensfree holograms are naturally wider for the tilted illumination angles when compared to the vertical illumination hologram

The reconstruction of the SR holograms for oblique illumination angles is performed similarly to the vertical illumination case. Accordingly, the SR holograms are digitally multiplied by a tilted reference wave before being propagated back to the object plane. Once the hologram is multiplied by this tilted plane wave, the back-propagation algorithm essentially propagates the hologram along the direction of this tilted wave, i.e., toward the actual position of the object. The tilt angle of this reconstruction wave is not equal to the tilt of the light source in air, because of the refraction of light in the microfluidic chamber and its walls. Instead, the digital reconstruction angle (θ) of a projection hologram is determined by calculating the inverse tangent of the ratio Δd/z2, where Δd denotes the lateral shift of the object’s hologram with respect to its position in the vertical projection image, and z2 is determined from the digital reconstruction distance of the vertical projection hologram. Note that even though tilted illumination angles are used, the recorded holograms are still in-line holograms, not off-axis holograms,9 since the reference wave and the object wave propagate co-axially. As a result, the same iterative phase recovery algorithm26 described in “Partially Coherent Lensfree Digital In-Line Holography” section can be utilized to reconstruct projection images without the twin-image noise. Throughout these iterations, where the object support and the measured hologram amplitude at the sensor plane are used as constraints, the optical field is propagated back and forth between the parallel hologram and object planes until convergence is achieved. Then, the projection of the complex field onto the plane normal to the illumination direction is obtained by interpolating the recovered field on a grid whose dimension along the tilt direction is rescaled by cos(θ). Finally, projection images, some of which are shown in Fig. 7 for various θ values, are obtained, which can now be back-projected to compute tomograms of the flowing objects.
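Restating the reconstruction geometry compactly (with Δd and z2 as defined above, and a plane tilted reference wave assumed for illustration):
$$ \theta = \tan^{-1}\!\left( \frac{\Delta d}{z_{2}} \right), \qquad R_{\text{tilt}}\left( x,y \right) = \exp\!\left( i\,\frac{2\pi }{\lambda }\, x \sin\theta \right). $$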
https://static-content.springer.com/image/art%3A10.1007%2Fs10439-011-0385-3/MediaObjects/10439_2011_385_Fig7_HTML.jpg
Figure 7

Digitally reconstructed amplitude images of the worm in Fig. 6 at illumination angles of θ = −34°, 0°, +34°. A bright-field microscope image, obtained with a 40× objective lens, is also provided for visual comparison

Figure 8 demonstrates the tomographic imaging performance of HOT using a C. elegans sample by showing several slice images through the worm body. Different depth sections of the worm reveal distinct details of its structure, which are otherwise unattainable using a single SR image. In other words, the tomographic imaging approach significantly mitigates the well-known depth-of-focus problem (which indeed becomes beneficial for tomographic imaging by satisfying the projection approximation) inherent in holographic reconstruction modalities,5,24 and allows optofluidic sectional imaging with significantly improved axial resolution. The entire tomographic reconstruction process (including the synthesis of the SR holograms and the filtered back-projection) takes less than 3.5 min using a single GPU, and could be significantly accelerated by using several GPUs in parallel.
https://static-content.springer.com/image/art%3A10.1007%2Fs10439-011-0385-3/MediaObjects/10439_2011_385_Fig8_HTML.jpg
Figure 8

Tomographic optofluidic imaging of a C. elegans worm. Different slice images are provided within a depth range of z = −6 to +6 μm. The color bar applies to all lensfree slice images. Scale bars: 50 μm

The lensfree holographic on-chip imaging approach taken in HOT is one of the key enablers for obtaining multiple views of the flowing objects within a microfluidic channel. That is, the distance traversed by the object-wave until it reaches the sensor rapidly increases for large illumination angles, e.g., 40–50°, which necessitates the capability to propagate the recorded image back to the object plane to prevent artifacts caused by increased blurring (or defocusing) at large angles. Fortunately, HOT can digitally correct for this varying distance between the sensor and the object as a function of the illumination angle. Similarly, potential fluctuations of the object along the z direction (normal to the sensor plane) can also be digitally corrected for. This brings robustness to this optofluidic tomography platform, as the vertical position of objects may vary during the flow. Therefore, the channel height can be increased to avoid clogging issues, which is an important practical advantage of this platform.

An important factor limiting the achievable axial resolution in HOT is the limited angular range of illumination. As a result of performing tomographic reconstruction with limited views, a wedge-shaped region in the Fourier space of the object remains empty, commonly referred to as the “missing wedge problem.”2 The most significant implication of this missing information is the elongation of the PSF in the axial direction. Therefore, the axial resolution is limited to ~4 μm, although the lateral resolution can be less than 1 μm. Nevertheless, using next-generation sensors that offer pixels with better angular response could significantly improve the current resolution. In the meantime, for applications demanding higher axial resolution and better sectioning ability, a dual-axis illumination scheme can be employed,20 where the light source is rotated along two orthogonal directions as opposed to only one. Moreover, iterative approaches can help increase the axial resolution if certain properties of the object, such as its 3D support, are known a priori or can be estimated.36

Summary and Outlook

Incorporating high-resolution imaging into microfluidic devices would enable new functionalities, broadening the possible uses of such devices. In particular, on-chip microscopy and tomography within microfluidic channels could lead to high-throughput screening and manipulation of samples in compact and cost-effective ways.

The holographic on-chip approach described in this article has the advantages of robustness, simplicity, and the ability to perform phase imaging as well as amplitude imaging. The disadvantages of limited lateral and axial resolutions are mitigated by use of digital algorithms for processing multiple shifted frames and multiple illumination angles as a function of the fluidic flow.

There remains significant room for improvement in the performance of these lensfree on-chip imaging devices. Better resolution, color sensitivity for detecting stains, and faster and more efficient digital algorithms could further improve their performance, leading to tighter integration of imaging into microfluidic systems and more versatile optofluidic devices. Finally, the extension of the presented optofluidic microscopy approaches to lensfree fluorescent imaging is also an important research direction in which we expect significant advances over its current status, as described in Treurniet-Donker et al.7 and Arnott and Stenning.8

Copyright information

© Biomedical Engineering Society 2011