Optical Review
Volume 23, Issue 5, pp 859–864

Multi-aperture optics as a universal platform for computational imaging

Special Section: Regular Paper, The 10th International Conference on Optics-Photonics Design & Fabrication (ODF’16), Weingarten, Germany


Abstract

Computational imaging is a novel imaging framework based on optical encoding and computational decoding. To avoid heuristic designs that depend on the particular problem to be solved, multi-aperture optics is useful as a universal platform for optical encoding. In this paper, the fundamental properties of multi-aperture optics are summarized. Then some interesting functions implemented by multi-aperture optics are explained, together with some effective applications.


Keywords: Multi-aperture optics · Compound-eye imaging · Computational imaging · Image processing · Compressive sensing

1 Introduction

Progress made in information technology has enabled the realization of a novel imaging framework called computational photography or computational imaging [1]. Dynamic range extension [2] and depth-of-field extension [3] are practical examples of the benefits achieved using this framework. A plenoptic camera [4] is also an attractive example demonstrating unique functionalities, such as refocusing and the ability to translate the viewing point after image capturing.

The fundamental procedure of computational imaging consists of optical encoding of the object signals and computational decoding for image reconstruction. Combinatorial variations of these processes provide a variety of imaging modalities. Optical encoding is achieved by various methods, such as specific optical devices [3], mechanical scanning [5], pattern illumination [6] and scattering media [7]. Once the optical signals are encoded, computational decoding is performed by digital processing on a computer. Although the resources and the processing throughput of the digital processing should be taken into account, optical encoding is a key process in computational imaging for practical applications.

Most optical encoding techniques are designed by a heuristic approach suitable for a given problem. However, such a design process makes it difficult to achieve a systematic design that is applicable to a wide range of problems. To alleviate this situation, a versatile optical platform capable of flexible optical encoding is desired. Multi-aperture optics is a promising and effective solution to this problem.

In this paper, the fundamental properties of multi-aperture optics are summarized to illustrate this technique’s suitability as a universal platform for computational imaging. Some examples of interesting functions implemented by multi-aperture optics are explained. Then, applications of the multi-aperture optical systems are demonstrated to show their effectiveness and extendibility.

2 Computational imaging

Figure 1 illustrates the difference between conventional and computational imaging systems centering on opto-electronic conversion. In the conventional imaging system, the optics are designed to form a complete replica of the object, and processing is mainly employed to improve the image quality. On the other hand, in the computational imaging system, optical encoding changes the form of the object signal, and computational decoding reconstructs the signal to achieve high-functionality and high-performance imaging.
Fig. 1

Conventional and computational imaging systems

Dowski and Cathey proposed an interesting scheme to extend the depth of field of imaging optics [3]. In their method, the point spread function (PSF) of the imaging system is intentionally modified so as to be insensitive to defocus; a phase plate that induces a cubic phase retardation is introduced for this purpose. Because the decoder knows the modified PSF, the blurred image is restored by deconvolution with that PSF. The result is a deblurred image with an extended depth of field.
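The decoding step can be sketched with a simple one-dimensional simulation: a toy scene is blurred by a known PSF, and the deconvolution described above, here in regularized (Wiener-type) form, recovers it. All values below are illustrative, and a Gaussian kernel stands in for the actual wave-front-coded PSF.

```python
import numpy as np

n = 64
obj = np.zeros(n)
obj[20], obj[45] = 1.0, 0.5                      # two point sources as a toy scene
x = np.arange(n)
psf = np.exp(-0.5 * ((x - n // 2) / 2.0) ** 2)   # known blur kernel (Gaussian stand-in)
psf /= psf.sum()

P = np.fft.fft(np.fft.ifftshift(psf))            # optical transfer function
blur = np.real(np.fft.ifft(np.fft.fft(obj) * P)) # encoded (blurred) observation

# Decoding: deconvolution with the known PSF, regularized against noise amplification
eps = 1e-3
rec = np.real(np.fft.ifft(np.fft.fft(blur) * np.conj(P) / (np.abs(P) ** 2 + eps)))
```

In wave-front coding the cubic phase plate makes the PSF nearly invariant to defocus, so a single deconvolution of this kind serves all object depths at once.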

The author’s group presented a demonstration to extend the field of view with image superposition [8]. The imaging system is formulated as a linear system expressed by
$$\begin{aligned} \mathbf {g} = \mathrm {H}\mathbf {f} \end{aligned}$$
where \(\mathbf {f}\) and \(\mathbf {g}\) are the object and the observation vectors, respectively, and \(\mathrm {H}\) is a system matrix describing the property of the imaging optics. Imaging is regarded as an inverse problem in which the object \(\mathbf {f}\) is retrieved from a given set of observations \(\mathbf {g}\). Compressive sensing provides an effective method of estimating a sparse vector \(\mathbf {f}\) from fewer observations \(\mathbf {g}\) [9]. The estimation \(\hat{\mathbf {f}}\) is obtained by
$$\begin{aligned} \hat{\mathbf {f}} = \mathop {\mathrm{argmin}}\limits _{\mathbf {f}} ||\mathbf {f}||_1 \quad \text {subject to}\quad \mathbf {g}=\mathrm {H}\mathbf {f} \end{aligned}$$
where \(||\cdot ||_1\) indicates the L1 norm. In general, an appropriate transformation, such as a discrete cosine transform, can convert the object signal into a sparse vector. Using this property, a signal decoder based on compressive sensing [10] is applied to retrieve the field of view before superposition, which allows the field of view to be extended.
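A minimal numerical sketch of this decoding step is shown below. It uses greedy orthogonal matching pursuit as a stand-in for L1 solvers such as TwIST [10], and the system matrix and sparse object are random toy data, not values from the paper; the point is only that a sparse f can be recovered from an underdetermined observation g = Hf.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 128, 64, 5                           # signal length, measurements, sparsity
H = rng.standard_normal((m, n)) / np.sqrt(m)   # random system matrix (encoder model)
f_true = np.zeros(n)
f_true[rng.choice(n, size=k, replace=False)] = rng.choice((-1.0, 1.0), size=k)
g = H @ f_true                                 # fewer observations than unknowns (m < n)

def omp(H, g, k):
    """Greedy sparse decoder: repeatedly pick the column best matching the residual."""
    idx, r, coef = [], g.copy(), None
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(H.T @ r))))
        coef, *_ = np.linalg.lstsq(H[:, idx], g, rcond=None)
        r = g - H[:, idx] @ coef               # residual orthogonal to chosen columns
    f = np.zeros(H.shape[1])
    f[idx] = coef
    return f

f_hat = omp(H, g, k)                           # recovers the sparse object vector
```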
Table 1 summarizes some examples of demonstrations achieved by computational imaging. A combination of optical encoding and computational decoding ensures the extendibility of the imaging framework. Note that computational decoding on a computer is much more flexible than an optical encoder implemented by an optical system. Therefore, there is a strong demand for a universal platform for optical encoding that can be used in a broad range of applications of computational imaging. Multi-aperture optics is a promising and effective solution.
Table 1

Demonstration examples of computational imaging

Achieved feature             Optical encoding        Computational decoding

Performance extension
  Depth-of-field extension   PSF modulation          Deconvolution
  Field-of-view extension    Image superposition     CS decoder
  Dynamic range extension    Assorted pixels         Signal mixture

Functional extension
  Refocusing                 Light field capturing   Ray processing
  Viewpoint change           Light field capturing   Ray processing

Novel imaging
  –                          –                       CS decoder
  Phase imaging              Coded aperture          CDI, CS decoder

CS compressive sensing, CDI coherent diffractive imaging

3 Multi-aperture optics

3.1 Composition

Multi-aperture optics is an optical system composed of multiple sets of elementary optics. As shown in Fig. 2, two basic strategies can be used to construct a multi-aperture optical system from a baseline optical system: multiplication and division of the baseline optics. From an engineering viewpoint, the latter is more interesting owing to its capabilities for hardware miniaturization and functional integration. Although division restricts the number of pixels and the arrangement of the elementary optics, its geometrical stability is an advantage in the decoding processing.
Fig. 2

Two basic strategies for multi-aperture imaging

With respect to the fields of view of the individual elementary optics, two types of designs are possible: field division and field overlapping. The former divides the field of view among the individual elementary optics, whereas in the latter, all elementary optics observe almost the same field of view. Field division is suitable for imaging applications because simple stitch processing is enough to reconstruct the observed field; it is also effective for wide field-of-view observation. On the other hand, field overlapping requires elaborate processing to retrieve the object information, but different properties of the signals can be captured simultaneously, which is useful for multi-dimensional signal observation.

As for the uniformity of the elementary optics, homogeneous and heterogeneous configurations can be selected. In the homogeneous configuration, elementary optics of the same type are arranged. This has advantages in terms of ease of fabrication and the capability to capture parallax and light field signals. The heterogeneous configuration consists of elementary optics with different types of optical properties. It is suitable for flexible design and integration of different functions. Figure 3 shows some examples of these types of multi-aperture optics [4, 11, 12, 13, 14].
Fig. 3

Different types of multi-aperture optical systems

3.2 Optical properties

A compound-eye imaging system known as TOMBO [13], shown in Fig. 4, is a typical example of the divided type of multi-aperture optics. Division of the imaging optics provides interesting optical properties. For \(N^2\)-divided optics, under the conditions of an equivalent F-number of the imaging lens and an equivalent image sensor, the focal length is reduced to 1/N of the baseline. The image sensor is divided into \(N^2\) sections, so that the pixel number of a unit image is reduced to \(1/N^2\). This enlarges the pixel size relative to the object, and the permissible circle of confusion increases. As a result, the depth of field is increased N-fold.

Owing to the overlap between the fields of view of the units, the captured pixel signals are not necessarily unique. Because the degree of overlap depends on the distance from the microlens array, the total number of unique pixel signals changes along the optical axis. In the case of a planar arrangement of multiple imaging lenses with the same focal length, the number of effective pixels in the image sensor becomes \(1/N^2\) at infinity. In general, increasing the number of system parameters contributes to greater flexibility in optical system design.
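The scaling rules above amount to simple arithmetic. In the sketch below, the baseline focal length and pixel count are hypothetical values chosen only for illustration:

```python
# N x N division of baseline optics with the same F-number and the same image sensor
N = 3
f_baseline = 9.0               # baseline focal length in mm (hypothetical)
pixels_baseline = 1200 * 1200  # baseline sensor pixel count (hypothetical)

f_unit = f_baseline / N                   # focal length reduced to 1/N -> thinner optics
pixels_unit = pixels_baseline // N**2     # each unit image keeps 1/N^2 of the pixels
dof_gain = N                              # larger permissible circle of confusion -> N-fold depth of field
effective_pixels_at_infinity = pixels_baseline // N**2  # identical units see identical images
```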
Fig. 4

TOMBO: a compound-eye imaging system

3.3 Implementation

Introducing an array of pinholes or microlenses into the imaging optics is an easy way to implement multi-aperture optics. A plenoptic camera [4] is a typical instance. A simpler implementation is just to set an array of microlenses in front of the image sensor to focus on the sensor surface. A thin wafer-level camera [12] and TOMBO [13] are typical examples. As a variant of multi-aperture optics, a sophisticated phase imaging scheme in which an aperture array is inserted into the light propagation field has been proposed [15].

4 Functions achieved by multi-aperture optics

Multi-aperture optics provides versatile functionalities achieved by specific techniques. Table 2 summarizes some examples of functions achieved by multi-aperture optics. Among them, some interesting examples are explained in the following.
Table 2

Examples of functions achieved by multi-aperture optics

Function                     Achieved feature          Key technique

Geometrical measurement      Distant observation       Irregular-arranged aperture
Combinatorial measurement    Spectral imaging          Multi-channel filters
Combinatorial measurement    Gonio-imaging             Geometrical displacement
Light field capturing        Compact hardware          Elimination of main lens
Hardware compaction          Functional integration    Heterogeneous configuration
Focus sweeping               Extended depth of field   Superposition-eye optics


4.1 Geometrical measurement

Because multiple elementary optics observe the object from different positions, parallax signals can be obtained for geometrical measurements. Multiple baselines with different lengths and orientations for stereo matching can be assigned to the observations, which improves the measurement performance [16]. Note that, in the case of multi-aperture optics in a regular arrangement and having the same focal length, the images observed by all elementary optics become identical at infinity. This considerably reduces the total amount of observation signals. To alleviate this situation, an irregular arrangement of the elementary optics is effective [17]. It was demonstrated that the imaging properties for observation of a distant object were improved with a slightly disordered arrangement of elementary imaging optics.
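The benefit of multiple baselines can be checked with the triangulation relation z = fB/d for focal length f, baseline B, and disparity d. In the sketch below (all numbers are hypothetical), a fixed matching uncertainty on the sensor causes a smaller depth error as the baseline between elementary optics grows:

```python
f = 4.0e-3        # unit focal length in meters (hypothetical)
z = 0.5           # true object depth in meters
d_err = 2.0e-6    # disparity uncertainty of roughly one pixel (hypothetical)

depth_errors = []
for B in (5e-3, 10e-3, 20e-3):        # baselines between elementary optics
    d = f * B / z                      # ideal disparity for this baseline
    z_est = f * B / (d + d_err)        # depth estimated from a perturbed disparity
    depth_errors.append(z - z_est)     # error shrinks as B (and hence d) grows
```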

4.2 Combinatorial measurement

Using a heterogeneous configuration, various optical properties can be assigned to the individual elementary optics. For example, optical filters with different intensity attenuations, wavelength transmittances, and polarization orientations are utilized to capture the corresponding optical properties. By combining the different observations, various kinds of object information can be retrieved. Spectral imaging is achieved by placing wavelength filters on the individual elementary optics [18]. Gonio-imaging simply utilizes the geometrical displacement of the apertures [19]. With appropriate decoding processing, such as a compressive sensing decoder [10], it is possible to retrieve not only separable signals but also mixed ones.

4.3 Focus sweeping

Mechanical scanning of the focusing position is a useful optical encoding technique for equalizing the imaging properties of the system, for example, the PSF. Because the equalized PSF is regarded as shift invariant, simple deconvolution can be adopted to remove the blur and to extend the depth of field. A specific sort of compound eye, called a superposition eye, has the interesting property that it superposes the images of objects located at different distances along the optical axis, as shown in Fig. 5 [20]. Using these specific optics, focus sweeping is achieved without mechanical scanning. Deblurring of the captured image restores the object signals and also extends the depth of field.
Fig. 5

Focus sweeping with a superposition eye

5 Applications

Compact integration of multiple functions is an attractive feature of multi-aperture optics for product development. In this section, some examples of sophisticated applications based on multi-aperture optical systems are presented.

5.1 Intra-oral diagnosis system

Multi-aperture optical systems are promising for medical applications owing to their suitability for compact, multi-functional optical sensing. The author’s group is developing an integrated multi-functional measurement system based on the TOMBO system for intra-oral diagnosis [21]. Figure 6 shows an example of the images observed with different optical properties. The 3-D shape of the teeth and gingiva is measured using parallax information. In addition, the component distribution of gingival tissue is estimated by multiple regression analysis [22]. By exploiting the compactness and multi-functionality of this system, a stick-shaped TOMBO is being developed for clinical applications.
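The multiple regression step can be sketched as follows. The sketch assumes six heterogeneous units with different (here randomly generated) spectral sensitivities and reflectances lying in a low-dimensional smooth subspace; a regression matrix learned from training spectra then maps the six camera responses back to a full spectrum. All sensitivities, bases, and dimensions below are illustrative assumptions, not values from [22].

```python
import numpy as np

rng = np.random.default_rng(2)
n_bands, n_basis, n_ch, n_train = 31, 5, 6, 100
wavelengths = np.linspace(400.0, 700.0, n_bands)

# Smooth spectral basis (Gaussian bumps): reflectances are assumed to lie in its span
centers = np.linspace(420.0, 680.0, n_basis)
B = np.exp(-0.5 * ((wavelengths[:, None] - centers[None, :]) / 40.0) ** 2)

R_train = (B @ rng.random((n_basis, n_train))).T   # training reflectances (n_train x n_bands)
S = rng.random((n_ch, n_bands))                    # spectral sensitivities of the 6 units
C_train = R_train @ S.T                            # simulated camera responses

# Multiple regression: least-squares matrix mapping responses to spectra
W, *_ = np.linalg.lstsq(C_train, R_train, rcond=None)

r_true = B @ rng.random(n_basis)                   # an unseen reflectance
r_est = (S @ r_true) @ W                           # estimate from six responses only
```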
Fig. 6

Functions of elementary optical systems for intra-oral diagnosis

5.2 Single-shot phase imaging

An aperture array inside a light propagation field works as a sieve that samples the observation field, reducing the complexity of the decoding process and extending the field of view. This technique has been proposed as single-shot phase imaging with a coded aperture (SPICA) [15]. SPICA employs compressive Fresnel holography [23] and coherent diffractive imaging [24] iteratively to retrieve the amplitude and phase signals of the object. The technique is a variant of multi-aperture optics and shows the potential of this optical system.
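SPICA itself alternates between compressive holographic and CDI steps. As a minimal stand-in for such iterative phase retrieval (a simplification, not the SPICA algorithm), the classic Gerchberg-Saxton alternation below enforces known amplitude constraints in two planes and reduces the far-field amplitude error from a random initial phase:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 32
obj = rng.random((n, n)) * np.exp(1j * rng.uniform(-np.pi, np.pi, (n, n)))
a_obj = np.abs(obj)                   # known amplitude in the object plane
a_far = np.abs(np.fft.fft2(obj))      # measured amplitude in the Fourier plane

def far_error(x):
    """Relative mismatch between |FFT(x)| and the measured far-field amplitude."""
    return np.linalg.norm(np.abs(np.fft.fft2(x)) - a_far) / np.linalg.norm(a_far)

x = a_obj * np.exp(1j * rng.uniform(-np.pi, np.pi, (n, n)))  # random phase guess
err0 = far_error(x)
for _ in range(200):                              # Gerchberg-Saxton iterations
    X = np.fft.fft2(x)
    X = a_far * np.exp(1j * np.angle(X))          # impose Fourier-plane amplitude
    x = np.fft.ifft2(X)
    x = a_obj * np.exp(1j * np.angle(x))          # impose object-plane amplitude
err = far_error(x)                                # non-increasing over iterations
```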

5.3 Wide-field and deep-focused imaging

A superposition eye was applied to wide-field and deep-focused imaging [20]. In addition to the focus sweep property, point symmetry of the optics enables omnidirectional observation, as shown in Fig. 5. Image blur caused by a shift-invariant PSF can be restored with deconvolution, which extends the depth of field and widens the field of view.

5.4 Optical system virtualization

The ultimate goal of computational imaging is to completely replace conventional optical systems by digital processing. This is nothing but virtualization of optics. At this stage, the available computing power and resources are not sufficient to realize the idea, but an interesting scheme was presented using multi-aperture optics [25]. Figure 7 illustrates depth-of-field extension based on a virtual optical system. Multi-aperture optics is used to observe the light field in the object space. Once the light field is captured, the ray behavior can be controlled by ray manipulation; any optical elements, such as lenses or phase plates, can be emulated without having to physically fabricate them. In the demonstration, tilted microlenses were emulated by the virtual optical system, and image restoration was applied to the resultant image for depth-of-field extension. Although the signal resolution is not yet comparable with that of real optics, the flexibility and extendibility of the scheme indicate the future direction of computational imaging.
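The ray manipulation underlying such a virtual optical system can be illustrated with shift-and-add refocusing of a toy one-dimensional light field (the view count, point position, and slope below are arbitrary illustrative values): shifting each view according to a chosen disparity emulates a lens focused at the corresponding depth.

```python
import numpy as np

n_u, n_x = 5, 64                      # number of views and pixels per view
us = np.arange(n_u) - n_u // 2        # view indices -2..2
x0, slope = 30, 2                     # a point source and its disparity per view

L = np.zeros((n_u, n_x))              # toy light field: the point seen with parallax
L[np.arange(n_u), x0 + us * slope] = 1.0

def refocus(L, alpha):
    """Shift-and-add: emulate a lens focused at the plane with disparity alpha."""
    return sum(np.roll(L[i], -u * alpha) for i, u in enumerate(us))

sharp = refocus(L, slope)             # all views align: peak of height n_u at x0
blurred = refocus(L, 0)               # wrong focus: energy spread across views
```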
Fig. 7

Virtual optical system for depth-of-field extension

6 Conclusions

Multi-aperture optics is promising as a universal platform for computational imaging because of its versatility in optical signal encoding. Division of baseline optics is an effective strategy for realizing multi-aperture optics owing to the present capabilities of hardware miniaturization and functional integration. Suitable design and fabrication methodologies for multi-aperture optics will be required to promote widespread use of this promising technology.


  1. Levoy, M.: Light fields and computational imaging. Computer 39(8), 46–55 (2006)
  2. Nayar, S.K.: Computational cameras: redefining the image. Computer 39(8), 30–38 (2006)
  3. Dowski, E.R., Cathey, W.T.: Extended depth of field through wave-front coding. Appl. Opt. 34, 1859–1866 (1995)
  4. Ng, R., Levoy, M., Brédif, M., Duval, G., Horowitz, M., Hanrahan, P.: Light field photography with a hand-held plenoptic camera. Comput. Sci. Tech. Rep. CSTR 2(11), 1–11 (2005)
  5. Kuthirummal, S., Nagahara, H., Zhou, C., Nayar, S.K.: Flexible depth of field photography. IEEE Trans. Pattern Anal. Mach. Intell. 33, 58–71 (2011)
  6. Horisaki, R., Fukata, N., Tanida, J.: A compressive active stereo imaging system with random pattern projection. Appl. Phys. Express 5(7), 072501 (2012)
  7. Nakamura, T., Horisaki, R., Tanida, J.: Compact wide-field-of-view imager with a designed disordered medium. Opt. Rev. 22, 19–24 (2015)
  8. Horisaki, R., Tanida, J.: Multi-channel data acquisition using multiplexed imaging with spatial encoding. Opt. Express 18, 23041–23053 (2010)
  9. Donoho, D.L.: Compressed sensing. IEEE Trans. Inf. Theory 52, 1289–1306 (2006)
  10. Bioucas-Dias, J.M., Figueiredo, M.A.T.: A new TwIST: two-step iterative shrinkage/thresholding algorithms for image restoration. IEEE Trans. Image Process. 16, 2992–3004 (2007)
  11. Jeong, K.-H., Kim, J., Lee, L.P.: Biologically inspired artificial compound eyes. Science 312(5773), 557–561 (2006)
  12. Brückner, A., Duparré, J., Leitel, R., Dannberg, P., Bräuer, A., Tünnermann, A.: Thin wafer-level camera lenses inspired by insect compound eyes. Opt. Express 18, 24379–24394 (2010)
  13. Tanida, J., Kumagai, T., Yamada, K., Miyatake, S., Ishida, K., Morimoto, T., Kondou, N., Miyazaki, D., Ichioka, Y.: Thin observation module by bound optics (TOMBO): concept and experimental verification. Appl. Opt. 40, 1806–1813 (2001)
  14. Plemmons, R.J., Prasad, S., Matthews, S., Mirotznik, M., Barnard, R., Gray, B., Pauca, V.P., Torgersen, T.C., van der Gracht, J., Behrmann, G.: PERIODIC: integrated computational array imaging technology. In: Adaptive Optics: Analysis and Methods/Computational Optical Sensing and Imaging/Information Photonics/Signal Recovery and Synthesis Topical Meetings, OSA Technical Digest (CD), paper CMA1. Optical Society of America (2007)
  15. Horisaki, R., Ogura, Y., Aino, M., Tanida, J.: Single-shot phase imaging with a coded aperture. Opt. Lett. 39, 6466–6469 (2014)
  16. Okutomi, M., Kanade, T.: A multiple-baseline stereo. IEEE Trans. Pattern Anal. Mach. Intell. 15, 353–363 (1993)
  17. Horisaki, R., Kagawa, K., Nakao, Y., Toyoda, T., Masaki, Y., Tanida, J.: Irregular lens arrangement design to improve imaging performance of compound-eye imaging systems. Appl. Phys. Express 3, 022501 (2010)
  18. Shogenji, R., Kitamura, Y., Yamada, K., Miyatake, S., Tanida, J.: Multispectral imaging using compact compound optics. Opt. Express 12, 1643–1655 (2004)
  19. Akao, Y., Shogenji, R., Tsumura, N., Yamaguchi, M., Tanida, J.: Efficient gonio-imaging of optically variable devices by compound-eye image-capturing system. Opt. Express 19, 3353–3362 (2011)
  20. Nakamura, T., Horisaki, R., Tanida, J.: Computational superposition compound eye imaging for extended depth-of-field and field-of-view. Opt. Express 20, 27482–27495 (2012)
  21. Tanida, J., Mima, H., Kagawa, K., Ogata, C., Umeda, M.: Application of a compound imaging system to odontotherapy. Opt. Rev. 22, 322–328 (2015)
  22. Tsumura, N., Haneishi, H., Miyake, Y.: Estimation of spectral reflectances from multi-band images by multiple regression analysis. Jpn. J. Opt. 27, 384 (1998) (in Japanese)
  23. Rivenson, Y., Stern, A., Javidi, B.: Compressive Fresnel holography. J. Display Technol. 6, 506–509 (2010)
  24. Marchesini, S., Chapman, H., Hau-Riege, S., London, R., Szoke, A., He, H., Howells, M., Padmore, H., Rosen, R., Spence, J., Weierstall, U.: Coherent X-ray diffractive imaging: applications and limitations. Opt. Express 11, 2344–2353 (2003)
  25. Nakamura, T., Horisaki, R., Tanida, J.: Computational phase modulation in light field imaging. Opt. Express 21, 29523–29543 (2013)

Copyright information

© The Optical Society of Japan 2016

Authors and Affiliations

  1. Department of Information and Physical Sciences, Graduate School of Information Science and Technology, Osaka University, Suita, Japan
