
Disparity Estimation for Image Fusion in a Multi-aperture Camera

  • Conference paper
  • Published in: Computer Analysis of Images and Patterns (CAIP 2015)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 9257)

Abstract

In this paper, an image fusion algorithm is proposed for a multi-aperture camera. Such a camera is a worthy alternative to a traditional Bayer filter camera in terms of image quality, camera size and camera features. The camera consists of several camera units, each with dedicated optics and a color filter. The main challenge of a multi-aperture camera arises from the fact that each camera unit has a slightly different viewpoint. Our image fusion algorithm corrects the parallax error between the sub-images using a disparity map, which is estimated from the multi-spectral images. We improve the disparity estimation by combining matching costs over multiple views with the help of trifocal tensors. Images are matched using two alternative matching costs, mutual information and the Census transform. We also compare two disparity estimation methods, graph cuts and semi-global matching. The results show that the overall quality of the fused images is close to that of the reference images.
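The Census transform mentioned in the abstract encodes each pixel by its intensity ordering relative to its neighbours, so the resulting matching cost depends only on local intensity ranks and therefore tolerates the radiometric differences between color-filtered sub-images. A minimal sketch of this idea in Python (the function names and the window size are illustrative choices, not taken from the paper; image borders are handled by wrap-around for brevity):

```python
import numpy as np

def census_transform(img, window=5):
    """Encode each pixel as a bit string of comparisons against its
    neighbours inside a (window x window) region. Bit strings from two
    images are then matched with Hamming distance, which is invariant
    to monotonic intensity changes between the sub-images."""
    r = window // 2
    out = np.zeros(img.shape, dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            # Shift the image so each pixel is compared to one neighbour;
            # np.roll wraps around at borders (acceptable for a sketch).
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            out = (out << np.uint64(1)) | (shifted < img).astype(np.uint64)
    return out

def hamming_cost(census_a, census_b):
    """Per-pixel matching cost: Hamming distance between bit strings."""
    x = census_a ^ census_b
    # popcount of the XOR gives the number of differing comparison bits
    return np.array([bin(v).count("1") for v in x.ravel()],
                    dtype=np.uint8).reshape(x.shape)
```

Because only the ordering of intensities enters the bit string, adding a constant offset or applying any monotone mapping to one image leaves its Census codes, and hence the matching cost, unchanged.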



Author information

Corresponding author: Janne Mustaniemi.


Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Mustaniemi, J., Kannala, J., Heikkilä, J. (2015). Disparity Estimation for Image Fusion in a Multi-aperture Camera. In: Azzopardi, G., Petkov, N. (eds) Computer Analysis of Images and Patterns. CAIP 2015. Lecture Notes in Computer Science, vol 9257. Springer, Cham. https://doi.org/10.1007/978-3-319-23117-4_14

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-23117-4_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-23116-7

  • Online ISBN: 978-3-319-23117-4

  • eBook Packages: Computer Science, Computer Science (R0)
