Multimedia Tools and Applications, Volume 76, Issue 6, pp 8175–8193

Image fusion based on simultaneous empirical wavelet transform


Abstract

In this paper, a new multi-scale image fusion algorithm for multi-sensor images is proposed based on the Empirical Wavelet Transform (EWT). Unlike the traditional wavelet transform, the wavelets of the EWT are not fixed; they are generated from the processed signals themselves, which ensures that these wavelets are optimal for the processed signals. To make the EWT applicable to image fusion, a Simultaneous Empirical Wavelet Transform (SEWT) is proposed for 1D and 2D signals, by which different signals can be projected onto the same wavelet set generated from all of the signals jointly. The fusion algorithm built on the 2D SEWT consists of three steps: first, the source images are decomposed into a coarse layer and a detail layer; then, the detail layers are fused by selecting the coefficients with the maximum absolute values, and the coarse layers are fused by maximum global contrast selection; finally, the coefficients of all the fused layers are combined into the final fused image using the inverse 2D SEWT. Experiments on various images are conducted to examine the performance of the proposed algorithm. The experimental results show that the fused images obtained by the proposed algorithm achieve satisfying visual perception, and that the algorithm is superior to other traditional algorithms in terms of objective measures.
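The three-step fusion scheme described above can be sketched in code. The sketch below is a minimal illustration of the fusion rules only: it substitutes a simple Gaussian coarse/detail split for the paper's 2D SEWT decomposition (which is not reproduced here), and uses the standard deviation as an illustrative global-contrast measure, since the abstract does not give the exact definition. The names sewt_decompose and global_contrast are hypothetical.

```python
# Sketch of the coarse/detail fusion rules from the abstract.
# NOTE: the decomposition below is a stand-in for the paper's 2D SEWT.
import numpy as np
from scipy.ndimage import gaussian_filter

def sewt_decompose(img, sigma=2.0):
    """Stand-in decomposition: Gaussian low-pass as the coarse layer,
    residual as the detail layer (the paper uses the 2D SEWT instead)."""
    coarse = gaussian_filter(img.astype(float), sigma)
    detail = img.astype(float) - coarse
    return coarse, detail

def global_contrast(layer):
    """Illustrative global-contrast measure (standard deviation);
    the paper's exact measure is not specified in the abstract."""
    return layer.std()

def fuse(img_a, img_b):
    coarse_a, detail_a = sewt_decompose(img_a)
    coarse_b, detail_b = sewt_decompose(img_b)
    # Detail layers: keep the coefficient with the larger absolute value.
    detail_f = np.where(np.abs(detail_a) >= np.abs(detail_b), detail_a, detail_b)
    # Coarse layers: select the layer with the larger global contrast.
    coarse_f = coarse_a if global_contrast(coarse_a) >= global_contrast(coarse_b) else coarse_b
    # Recombination; the paper applies the inverse 2D SEWT at this step.
    return coarse_f + detail_f
```

With the paper's actual 2D SEWT in place of the Gaussian split, the same selection rules would be applied to the empirical wavelet coefficients before the inverse transform.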

Keywords

Image fusion · Empirical wavelet transforms · Data driven · Simultaneous image decomposition


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Key Laboratory of Symbol Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun, China
  2. College of Computer Science and Technology, Jilin University, Changchun, China
