# Widefield fluorescence microscopy with extended resolution


DOI: 10.1007/s00418-008-0506-8

- Cite this article as:
- Stemmer, A., Beck, M. & Fiolka, R. Histochem Cell Biol (2008) 130: 807. doi:10.1007/s00418-008-0506-8

## Abstract

Widefield fluorescence microscopy is seeing dramatic improvements in resolution, today reaching 100 nm in all three dimensions. This gain in resolution is achieved by dispensing with uniform Köhler illumination. Instead, non-uniform excitation light patterns with sinusoidal intensity variations in one, two, or three dimensions are applied in combination with powerful image reconstruction techniques. Taking advantage of the non-linear response of fluorophores to the excitation field, the resolution can be further improved to a few tens of nanometers. In this review article, we describe the image formation in the microscope and the computational reconstruction of the high-resolution dataset when exciting the specimen with a harmonic light pattern conveniently generated by interfering laser beams forming standing waves. We will also discuss extensions to total internal reflection microscopy, non-linear microscopy, and three-dimensional imaging.

### Keywords

Structured illumination, Extended resolution, Widefield fluorescence microscopy

## Introduction

An ever-growing selection of highly specific fluorescent markers binding to or being expressed as part of single molecules and molecular assemblies makes light microscopy the method of choice to study molecular architecture and dynamics in cells (Taylor and Wang 1980; Stephens and Allan 2003). Particularly attractive is the possibility to simultaneously observe a multitude of cellular constituents that are labeled with markers of distinct emission wavelengths, in order to quantify their spatial and temporal distribution and possible colocalization. Although the numerical aperture (NA) of the objective and the wavelength of light physically limit the attainable resolution, a fact described by Ernst Abbe in 1873 (Abbe 1873), separations much smaller than this limit can be determined between two isolated markers emitting at two different wavelengths, because the position of each signal source can be measured independently with very high precision. For each wavelength, however, the width and axial extent of the point spread function (PSF), i.e., the image of an arbitrarily small light emitter, determines the observable size of an object and how close two objects can be and still be resolved. In practice, the point resolution of a standard widefield light microscope with well corrected high-NA objectives amounts to approximately 230 nm laterally and 800 nm in the axial direction, which is not sufficient to resolve a large class of biological structures. Hence, much effort has been devoted to extending the optical resolution beyond the classical diffraction limit.

In the following sections, we will discuss the concepts and developments that led to the dramatic resolution improvements feasible in widefield fluorescence microscopy today. We will outline image formation in the microscope and the computational reconstruction of the high-resolution dataset when exciting a fluorescent specimen with a harmonic light pattern. The corresponding illumination set-ups are also explained. Finally, we provide an overview of further developments exploiting non-linearities in the fluorophore response to reach a resolution of a few tens of nanometers with visible light and standard microscope objectives.

## Developments toward extended resolution

The confocal microscope invented by Marvin Minsky in 1955 marks the first practical concept to extend the classical resolution limit (Minsky 1988). Using a pinhole to reject out-of-focus light, the optical sectioning capability and hence axial resolution was dramatically improved over standard widefield microscopes (White et al. 1987). Theoretically, a small pinhole also increases lateral resolution by up to a factor of 1.4. In practice, however, this resolution enhancement is seldom realized, since resolution is traded off against signal strength when the pinhole is closed down.

In 1963, Lukosz and Marchand introduced a general concept to increase the optical resolution using structured illumination instead of uniform light as effected by Köhler illumination (Lukosz 1966). Structured illumination, e.g., by a grid pattern, improves spatial resolution at the cost of temporal resolution since several images with shifted illumination pattern have to be acquired in order to extract more information. In this respect, the confocal microscope represents a limiting case of structured illumination, since a spot of light is scanned across the field of view and the signal is acquired for each position.

It was not until 1993 that Lanni and Bailey introduced a first application of structured illumination, namely standing wave fluorescence microscopy (Bailey et al. 1993). In their set-up, two interfering laser beams generated a standing wave in the axial direction. The alternating nodal (dark) and antinodal (bright) planes parallel to the object plane enabled selective excitation of individual sections in the sample. However, this technique was only suitable for very thin samples, on the order of the period of the standing wave. In thicker samples, nodal planes above and/or below the focal plane create out-of-focus blur due to the poor axial resolution of widefield microscopes.

Gustafsson et al. (1995) improved the basic concept of standing wave microscopy by using two objectives for illumination *and* detection. In their set-up, the focal plane is selectively excited by the interference of counter-propagating light beams originating from an incoherent source. Additionally, the images collected by the two objectives are coherently superimposed on a CCD detector. The resulting I5M microscopy achieved a sevenfold higher axial resolution than conventional microscopes. In parallel, Hell and co-workers introduced 4Pi microscopy, the point-scanning analog to I5M that also employs two objectives (Hell and Stelzer 1992). Both techniques demand a very careful alignment of the objectives and the optical train used for image formation, and the image is reconstructed by computational post-processing. For ease of use, the 4Pi microscope has to be operated in the two-photon excitation regime (Hell and Stelzer 1992).

Surprisingly, these concepts aimed only at increasing the axial resolution, which is of course an important issue when studying three-dimensional objects; the lateral resolution remained within the classical limit.

Lateral resolution, as suggested by Lukosz, can be improved in a similar way by illuminating the object with standing waves extending in the lateral direction. A first practical implementation of this concept was published by Heintzmann and Cremer (1999), and later Gustafsson (2000) and Frohn et al. (2000) independently demonstrated microscope set-ups reaching twice the optical resolution. So and co-workers recognized the possibility of further extending the lateral resolution by applying standing wave illumination to total internal reflection fluorescence (TIRF) microscopy (Cragg and So 2000; Chung et al. 2006, 2007).

In 1994, Hell and co-workers took advantage of optical non-linearities to fundamentally improve resolution in a point scanning microscope (Hell and Wichmann 1994; Klar et al. 2000). To this end, they depleted the fluorescence at the rim of a focused spot by stimulated emission. Later, Heintzmann et al. (2002) extended the concept of resolution enhancement by non-linear phenomena to widefield microscopy and structured illumination, and Gustafsson (2005) succeeded in experimentally demonstrating a lateral resolution of 50 nm. Techniques relying on non-linear fluorophore response typically require long acquisition times and may suffer from increased photobleaching and phototoxicity, limiting their application to biological specimens unless particularly stable fluorophores are employed.

Although simultaneous enhancement of lateral and axial resolution had been suggested earlier (Heintzmann and Cremer 1999; Gustafsson et al. 2000; Frohn et al. 2001), technical challenges meant that a near-isotropic resolution of about 100 nm using structured light was only recently achieved by Gustafsson and co-workers (Gustafsson et al. 2008; Schermelleh et al. 2008; Shao et al. 2008). The remarkable gain in resolution becomes very apparent in complex biological specimens.

## Image formation

In real space, the smallest resolvable lateral separation between two point emitters in a conventional widefield microscope is given by the diffraction limit *δx* = *λ*/(2NA), where *λ* denotes the wavelength of light in vacuum.
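As a quick numeric check of the diffraction limit, the wavelength and NA below are assumed, plausible values rather than figures from the article:

```python
# Abbe limit delta-x = lambda / (2 * NA); illustrative parameter values.
wavelength_nm = 600.0   # assumed emission wavelength in vacuum
na = 1.3                # assumed numerical aperture of the objective

dx = wavelength_nm / (2.0 * na)
print(f"delta-x = {dx:.0f} nm")  # ~231 nm, of the order quoted in the introduction
```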

Alternatively, one may describe the imaging process in Fourier (reciprocal/frequency) space. Any object can be decomposed into a sum of sinusoids of different spatial frequencies and amplitudes. The objective, however, can only collect a limited set of low spatial frequencies. The so-called optical transfer function (OTF) describes how strongly each spatial frequency is transmitted; the region in Fourier space where it is non-zero defines the passband (Fig. 2, see also Fig. 10b). The larger the extent of the OTF, the higher the resolution. Figure 2 illustrates how the image of an object is blurred by the PSF and, correspondingly, how the object spectrum is filtered by the OTF.
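The filtering of the object spectrum by the OTF can be sketched in a few lines. The binary passband below is an idealization for illustration; a real OTF falls off gradually toward the cut-off:

```python
import numpy as np

# Imaging as low-pass filtering in Fourier space (1D toy model).
n = 256
obj = np.zeros(n)
obj[[100, 106]] = 1.0                      # two nearby point emitters

k = np.fft.fftfreq(n)                      # spatial frequencies in cycles/pixel
k_c = 0.05                                 # assumed cut-off frequency of the objective
otf = (np.abs(k) <= k_c).astype(float)     # idealized binary passband

# Image spectrum = object spectrum x OTF; frequencies above k_c are lost,
# so the two emitters merge into a single blurred blob in real space.
img = np.fft.ifft(np.fft.fft(obj) * otf).real
```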

## HELM theory

### Real space

How does a harmonic excitation pattern, e.g., the standing wave created by interfering laser beams, lead to higher resolution? The underlying physics is frequency mixing, i.e., the transformation of the object’s spatial frequencies into a set of lower and a set of higher frequencies by subtracting or adding the spatial frequency of the harmonic excitation pattern, respectively. In fluorescence microscopy, we deal with a multiplication of the illumination pattern with the labeled specimen. Hence we may apply the product-to-sum identity (a direct consequence of Euler’s formula) to describe the multiplication of the object frequencies *k*_{1} with the harmonic illumination pattern of spatial frequency *u*: cos(*k*_{1}*x*)cos(*ux*) = 1/2[cos((*k*_{1} + *u*)*x*) + cos((*k*_{1} − *u*)*x*)].

The resulting signal is separated into a component with higher frequencies (*k*_{1} + *u*) and, more interestingly for light microscopy, into a component with lower frequencies *k* = (*k*_{1} − *u*). Provided *k* remains within the microscope’s passband, i.e., the region with non-zero OTF, the low-frequency component can carry higher spatial-frequency information into the passband than predicted by the objective’s resolution limit. One should note, however, that this higher-frequency information cannot be properly retrieved from a single image. Obviously, with a harmonic excitation pattern, we do not uniformly illuminate the specimen and must acquire additional images with shifted patterns that excite previously dark portions. As we will see below, the additional images are also required to retrieve the high-frequency information. We can now calculate the highest spatial frequency transmitted through an optical passband with cut-off frequency *k*_{c} = 2NA/*λ*: *k*_{max} = *k*_{c} + *u*. For *u* = *k*_{c} this leads to a doubling in resolution. The high-resolution information may be retrieved in real space by comparing images acquired with different shifts of the illumination pattern (So et al. 2001).
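The frequency-mixing step can be verified numerically: multiplying two sampled cosines and inspecting the Fourier spectrum reveals exactly the sum and difference frequencies (all values below are illustrative, not from the article):

```python
import numpy as np

# Frequency mixing: cos(k1*x) * cos(u*x) produces components at k1+u and k1-u.
n = 1024
x = np.arange(n)
k1, u = 60 / n, 50 / n                 # object and pattern frequencies (cycles/pixel)

product = np.cos(2 * np.pi * k1 * x) * np.cos(2 * np.pi * u * x)
spectrum = np.abs(np.fft.rfft(product))

peaks = np.argsort(spectrum)[-2:]      # the two strongest frequency bins
# peaks lie at bins 10 (= 60 - 50) and 110 (= 60 + 50)
```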

### Fourier space

In Fourier space, the harmonic illumination shifts copies of the object spectrum by the pattern frequency *u* = 2*π*/Λ, where Λ is the period of the illumination pattern. This shift brings higher spatial frequencies of the image spectrum (open squares) into the passband so they can be detected, albeit at the wrong position in frequency space. Illumination by a two-dimensional (2D) harmonic light field (Fig. 3b) gives rise to a fluorescence spectrum \( \tilde{\Upphi } \) consisting of a linear combination of five spectral components:
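A sketch of this linear combination, reconstructed here from the surrounding description (the exact normalization and sign conventions are assumptions, not necessarily the authors’ Eq. 1), reads:

```latex
\tilde{\Upphi}_i(\vec{k}) \;=\; \mathrm{OTF}(\vec{k})\Big[
  \tilde{\Uppsi}_1(\vec{k})
  + e^{+i\,\Delta\varphi_{x,i}}\,\tilde{\Uppsi}_2(\vec{k})
  + e^{-i\,\Delta\varphi_{x,i}}\,\tilde{\Uppsi}_3(\vec{k})
  + e^{+i\,\Delta\varphi_{y,i}}\,\tilde{\Uppsi}_4(\vec{k})
  + e^{-i\,\Delta\varphi_{y,i}}\,\tilde{\Uppsi}_5(\vec{k})
\Big]
```

where \( \tilde{\Uppsi}_1 \) denotes the unshifted object spectrum and \( \tilde{\Uppsi}_{2,\ldots,5} \) its copies shifted by \( \pm \vec{u}_x \) and \( \pm \vec{u}_y \).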

To reconstruct the extended object spectrum (Fig. 3b), a sequence of *i* = 1,…,5 raw images with different phases of the illumination pattern (Δ*φ*_{x}, Δ*φ*_{y})_{i} is acquired. The phase steps are chosen such that the sequence of pattern shifts illuminates the entire field of view, e.g., 0° and ±90°, or ±120° to reduce bleaching effects in long-running experiments. Applying image arithmetic to the image set allows one to separate the spectral components \( \tilde{\Uppsi }_{1, \ldots ,5} \) and correctly rearrange them (Heintzmann and Cremer 1999).

### Image reconstruction

To separate and rearrange the spectral components \( \tilde{\Uppsi }_{1, \ldots ,5} \) one must determine the exact period and orientation of the excitation pattern. To this end, one may record a calibration image without fluorescence filters and analyze the pattern by Fourier transformation. The peak positions are obtained by interpolation. A widely used three-point Gaussian fit typically determines the pattern period within ~2 nm, which is not sufficient for reconstruction. Better results are achieved by using the fit value as an initial guess to iteratively minimize the mean square error between the recorded image and an analytical cosine pattern. The iteration is stopped when the pattern period reaches a convergence band of <0.1 nm. Alternatively, one may determine the wave vector *u* of the illumination pattern from the overlap region of the separated spectral components, i.e., without a calibration image (Heintzmann and Cremer 1999; Gustafsson et al. 2000; Gustafsson 2000).

The image spectra of the raw images are compensated with the measured OTF of the microscope. To avoid noise amplification we employ a Wiener filter for deconvolution. The sequence of *i* = 1,…,5 compensated image spectra with known phase shifts of the illumination pattern (Δ*φ*_{x}, Δ*φ*_{y})_{i} forms a system of equations according to Eq. 1. To determine the spectral components \( \tilde{\Uppsi }_{1, \ldots ,5} \) the equation system is solved for each pixel by inverse matrix multiplication. The spectra are then shifted back to the origin of frequency space by applying the Fourier shift theorem in real space, i.e., by multiplying the Fourier back-transform by the corresponding non-stationary phases \( (e^{{ \pm i\vec{u}_{x} \vec{x}}} ,e^{{ \pm i\vec{u}_{y} \vec{x}}} ). \)
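A minimal sketch of the per-pixel separation step, assuming an illustrative set of five phase steps (the specific values used by the authors are not given here), could look as follows:

```python
import numpy as np

# Separation of the five spectral components: at each frequency pixel the
# compensated raw spectra Phi_i are a linear mixture of the unknown components
# Psi_1..5, with known phase factors from the pattern shifts (assumed values).
phase_steps = [(0.0, 0.0),
               (+np.pi / 2, 0.0), (-np.pi / 2, 0.0),
               (0.0, +np.pi / 2), (0.0, -np.pi / 2)]

# Mixing matrix M such that Phi_i = sum_j M[i, j] * Psi_j
M = np.array([[1.0,
               np.exp(+1j * px), np.exp(-1j * px),
               np.exp(+1j * py), np.exp(-1j * py)]
              for px, py in phase_steps])
M_inv = np.linalg.inv(M)

def separate(phi_stack):
    """phi_stack: (5, ny, nx) compensated spectra -> (5, ny, nx) components."""
    return np.tensordot(M_inv, phi_stack, axes=1)
```

With ±90° steps the 5×5 matrix is well conditioned, so the inversion is computed once and applied to all pixels at once.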

The detection of the initial phase of the illumination pattern is a critical step in the reconstruction. In the presence of phase offsets, assuming the initial phase of the pattern to be zero results in wrong weighting of the spectral components. False intensities and a lateral shift of the reconstructed image would result (Schaefer et al. 2004).

The initial phase can be determined from the overlap regions of the different spectral components. To this end, one applies a cross-correlation analysis of the phase angles \( \theta_{i} = \arctan\left(\text{Im}(\tilde{\Uppsi}_{i})/\text{Re}(\tilde{\Uppsi}_{i})\right) \) between the unshifted spectral component and the shifted components in a finite pixel field of the overlap region, and iteratively maximizes the correlation coefficient (Beck et al. 2008).

Once all spectral components are superimposed into a single data set to form the extended HELM passband, the sharp transition to zero at the rim of the HELM passband needs to be smoothed by an apodization function to avoid ringing artifacts in the extended resolution image calculated by Fourier back-transform. Since apodization not only reduces ringing artifacts but also broadens spectral features, we apply a function with constant weighting over most of the HELM passband and a Gaussian decay at the edge to damp the spectrum down to 0.5% at the limit of the extended OTF (Beck et al. 2008).
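A sketch of such an apodization mask, with an assumed flat fraction and a Gaussian edge that damps the spectrum to 0.5% at the passband limit:

```python
import numpy as np

def apodization(n, k_limit, flat_fraction=0.8):
    """Radial apodization mask for an n x n centered spectrum.

    Constant weighting out to flat_fraction * k_limit, then a Gaussian decay
    reaching 0.5% at k_limit (flat_fraction is an assumed value).
    """
    y, x = np.indices((n, n)) - n // 2
    r = np.hypot(x, y)
    k_flat = flat_fraction * k_limit
    # choose sigma so the weight equals 0.005 exactly at r = k_limit
    sigma = (k_limit - k_flat) / np.sqrt(2.0 * np.log(1.0 / 0.005))
    mask = np.exp(-((r - k_flat) ** 2) / (2.0 * sigma ** 2))
    mask[r <= k_flat] = 1.0
    mask[r > k_limit] = 0.0
    return mask
```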

The final extended-resolution HELM image is obtained after Fourier transforming the extended spectrum into real space. To accommodate the higher resolution, spectra are re-sampled by zero padding. The pixel-size reduction is usually chosen to meet the Shannon–Nyquist criterion; larger reductions may be used to produce smoother reconstructions.
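The zero-padding step can be sketched as follows; the 2× factor and the image content are illustrative assumptions:

```python
import numpy as np

def upsample2(img):
    """Re-sample a square, even-sized image on a 2x finer grid by embedding
    its centered spectrum in a larger zero-filled array (band-limited
    interpolation; no new information is added)."""
    n = img.shape[0]
    spec = np.fft.fftshift(np.fft.fft2(img))
    padded = np.zeros((2 * n, 2 * n), dtype=complex)
    padded[n // 2: n // 2 + n, n // 2: n // 2 + n] = spec
    # factor 4 restores intensity values after doubling the sampling in x and y
    return np.fft.ifft2(np.fft.ifftshift(padded)).real * 4
```

The interpolated image passes exactly through the original samples, so the content is unchanged while the pixel size is halved.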

## Instrumentation

To extend lateral resolution, the standing wave pattern of Fig. 4a needs to be turned by 90°. Figure 4b displays a cross-section of a prism launch set-up coupling two pairs (only one pair shown) of interfering laser beams into the specimen chamber to create a two-dimensional grid-like harmonic excitation pattern (Frohn et al. 2000). The laser beams enter the specimen chamber at an oblique angle relative to the cover slide and propagate into the objective to permit observation of the standing wave pattern. This feature facilitates calibration of the illumination pattern (see above) but is not strictly required.

When illuminating the specimen with two standing wave patterns simultaneously, oriented along the *x*- and *y*-axis, image analysis is facilitated when the two patterns do not cross-interfere. To this end, one may select s-polarization for the laser beams, which additionally increases the pattern contrast (modulation depth). As a result of the limited number of polarization directions, however, fluorophores with fixed dipole axes may not be excited if oriented in an unfavorable direction.

A harmonic excitation pattern may also be created by projecting a diffractive phase grating into the specimen plane. Figure 4c illustrates the principle using the example of a two-dimensional phase grating. Selecting only the first diffraction orders and blocking the zero order results in a two-dimensional sinusoidal intensity variation. Blocking the zero order also doubles the spatial frequency of the pattern. The grating approach benefits from a very stable illumination pattern, since spatial drifts of the grating are strongly demagnified. A practical set-up is shown in Fig. 4d. The diffracted beams are focused into the back focal plane of the objective to obtain plane waves in the specimen plane. The fluorescence signal collected by the objective is separated from the excitation light by a dichromatic mirror and a bandpass filter and recorded with a CCD camera.

To shift the harmonic excitation pattern one usually delays one laser beam of the interfering pair. To this end, piezo-actuated mirrors are inserted into the beam to adjust the optical path length (Frohn et al. 2000). Alternatively, one may insert an electrically tunable phase plate (Beck et al. 2008), which offers the additional advantage of maintaining a straight beam trajectory. When projecting a grating (Fig. 4d), translating the grating laterally shifts the illumination pattern, while rotating the grating (e.g., 45° for a 2D-grating or 120° and 240° for a 1D-grating) adds spectral copies along further directions, yielding a more isotropic extended HELM passband than that of Fig. 3b (Gustafsson 2000).
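The relation between path delay and pattern phase can be sketched numerically; the values are illustrative, and the delay is assumed to be introduced in air:

```python
import numpy as np

# Delaying one beam of an interfering pair by an optical path difference dL
# shifts the phase of the standing wave pattern by d_phi = 2*pi*dL/lambda.
wavelength = 488.0   # nm, assumed excitation wavelength in vacuum
dL = 122.0           # nm, extra path length in one beam (= lambda/4)

d_phi = 2 * np.pi * dL / wavelength
print(np.degrees(d_phi))   # a quarter-wave delay shifts the pattern by 90 degrees
```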

## Applications and further developments of HELM

### TIRF–HELM

Figure 5b shows a TIRF–HELM example acquired at a beam incidence angle of *α* = 63.2°.

### Multi-color HELM

The attainable resolution is determined by the period of the standing wave pattern, i.e., by the excitation wavelength, the angle between the interfering laser beams, and the refractive index. Hence, even for single-wavelength excitation, e.g., when using quantum dot labels, the resulting resolution will be higher for shorter emission wavelengths than for longer ones. Using the grating approach for TIRF–HELM (Fig. 4d), excitation with different wavelengths requires adjustment of the diffraction angle, e.g., by a tunable grating (Beck et al. 2008) or a spatial light modulator (Fiolka et al. 2008), to maintain the condition of total internal reflection. The period of the standing wave and its penetration depth into the sample will vary with wavelength. For example, the TIRF–HELM images shown in Fig. 6c, d were acquired with excitation wavelengths of 488 and 532 nm. The periods of the standing wave pattern were Λ_{488} = 177 nm and Λ_{532} = 194 nm, corresponding to penetration depths of d_{488} = 104 nm and d_{532} = 143 nm, respectively.
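The wavelength dependence can be illustrated with the standard expressions for an evanescent standing wave; the refractive indices and angle below are assumed values, so the results will not exactly reproduce the figures quoted above:

```python
import numpy as np

# Lateral period and penetration depth of a standing evanescent wave formed by
# two counter-propagating, totally internally reflected beams (assumed values).
wavelength = 488.0       # nm, excitation wavelength in vacuum
n1, n2 = 1.52, 1.37      # assumed refractive indices of glass and specimen
alpha = np.radians(66)   # assumed incidence angle, above the critical angle

period = wavelength / (2 * n1 * np.sin(alpha))                           # pattern period
depth = wavelength / (4 * np.pi * np.sqrt((n1 * np.sin(alpha))**2 - n2**2))  # 1/e depth
```

Both quantities scale with the vacuum wavelength, which is why the 532 nm pattern is both coarser and deeper than the 488 nm one.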

### Saturated structured illumination microscopy (SSIM)

Non-linearities in the fluorophore response that can be described by an *n*-th order polynomial result in the generation of *n* harmonics. If the relation between excitation and emission is described by an exponential function, an infinite number of harmonics will occur. Such a relation would theoretically allow infinite resolution. Figure 8 illustrates the occurrence of higher harmonics in the fluorophore response when bleaching an initially homogeneous fluorescent layer with a two-dimensional harmonic excitation field. Due to a non-linear bleaching rate, sharp intensity peaks of unbleached fluorophores form after a short time at the locations of the intensity zeros of the excitation field. Figure 8a–d shows the temporal evolution of the remaining active fluorophore population along with the corresponding image spectra (Fig. 8e).
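The emergence of higher harmonics from a non-linear response can be sketched numerically; the saturating response model below is an illustrative assumption, not the bleaching model of Fig. 8:

```python
import numpy as np

# A non-linear fluorophore response turns a purely sinusoidal excitation
# pattern into an emission pattern containing higher harmonics.
n = 1024
x = np.arange(n)
excitation = 0.5 * (1 + np.cos(2 * np.pi * 8 * x / n))   # harmonic pattern, 8 periods

linear = excitation                          # linear response: fundamental only
saturated = 1 - np.exp(-5.0 * excitation)    # assumed saturating response

lin_spec = np.abs(np.fft.rfft(linear))
sat_spec = np.abs(np.fft.rfft(saturated))
# lin_spec has energy only at bins 0 and 8; sat_spec also shows harmonics at
# bins 16, 24, ... that carry the additional resolution information.
```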

In saturated structured illumination microscopy, the excitation intensity is raised until the fluorophore response saturates: most fluorophores are driven into the *on*-state and only small volumes, located at the intensity zeros, remain in the *off*-state. Similar to the HELM technique described above, a sequence of raw images with shifted excitation pattern is acquired to reconstruct the final extended-resolution image. The required number of raw images increases with the order of harmonics to be resolved and the number of angular pattern orientations applied to achieve isotropic resolution. With enough laser intensity, this technique can theoretically achieve infinite resolution. In biological imaging, photostability and phototoxicity as well as long acquisition times will set a practical limit. With a picosecond laser an image resolution of 50 nm has been demonstrated (Fig. 9) (Gustafsson 2005).

Instead of excitation into saturation, photo-switchable fluorophores may be employed to create non-linearities in the fluorophore response (Enderlein 2005; Hofmann et al. 2005; Keller et al. 2007). Referred to as RESOLFT (reversible saturable/switchable optical fluorescence transition), this approach has reached a reported resolution of λ/12 in a field-scanning implementation (Schwentker et al. 2007). The real-space field-scanning approach requires less post-processing compared to the reconstruction in Fourier space. Field-scanning, however, typically requires more images to correctly sample the specimen; we estimate at least twice the number of images to achieve the same resolution per direction.

Although the number of higher harmonics is infinite, at least in principle, only a finite number of harmonics will rise above the noise level. The difficulty in creating a real intensity zero also puts a practical limit on the achievable resolution with such non-linear techniques.

### 3D-HELM

Figure 10a shows an *x–z* section of a 3D widefield PSF for fluorescent light emission. The elliptical shape along the *z* coordinate causes the poor *z*-resolution. In addition to the central ellipsoid, the widefield PSF features an hourglass-shaped structure. Owing to this structure, objects that are not in focus still contribute to the image, although extremely blurred. Thereby out-of-focus objects increase the background level and overshadow the in-focus image. The corresponding cross-section of the 3D-OTF, which is the Fourier transform of the 3D-PSF, is shown in Fig. 10b. Owing to the toroidal shape of the OTF, information in a conical region around the *k*_{z} axis, known as the missing cone, is not transferred. The missing cone is responsible for the poor axial resolution and the occurrence of out-of-focus blur in widefield microscopy. In confocal microscopy the missing cone is filled, improving axial resolution to about 700–800 nm, which is still not sufficient to resolve typical biological structures along the *z*-direction.

Structured illumination with harmonic patterns that are tilted above the horizontal plane encodes three-dimensional information that is normally not transferred by the microscope. The tilted patterns create copies of the OTF that reach into the missing cone (Fig. 10c). Filling the missing cone improves axial resolution and deterministically removes out-of-focus blur. Frohn et al. (2001) suggested a set-up for 3D-HELM using two facing objectives and beam steering units for sequential illumination of the specimen with one-dimensional interference patterns.

## Conclusions

Confocal microscopes becoming commercially available and a rapidly growing selection of fluorescent tags marked the beginning of a lasting renaissance of light microscopy in life science. The ability to visualize the three-dimensional dynamic architecture of cells with, at that time, unprecedented clarity paved the way to new insights into complex molecular mechanisms. Imaging three-dimensional specimens in widefield fluorescence microscopy has remained somewhat of a specialist technique, quite likely because in the past the computational methods necessary to reconstruct clear images appeared involved. Widefield microscopes, however, collect light emitted from the entire field of view in parallel and very efficiently, since there are no pinholes cutting off light. We expect that the recent demonstrations of superb resolution and clarity achieved with structured illumination will now spur growing interest in applying widefield fluorescence microscopy to the study of ever finer structures and processes.

## Acknowledgments

We are grateful to Arnold Hayer and Dr. Helge Ewers from the Institute of Biochemistry at ETH Zurich for preparing labeled CV-1 and HeLa cells.