Novel and fast EMD-based image fusion via morphological filter

  • Original article
  • Published in The Visual Computer

Abstract

This paper presents a novel and fast EMD-based (empirical mode decomposition-based) image fusion approach built on morphological filtering. First, we develop a multi-channel bidimensional EMD method based on morphological filters to conduct image fusion. It uses morphological dilation and erosion filters to compute the upper and lower envelopes of a multi-channel image during the sifting process, and decomposes the input source images into several intrinsic mode functions (IMFs) at different scales plus a residue. This significantly improves the computational efficiency of EMD for multi-channel images while maintaining decomposition quality. Second, we adopt a patch-based fusion strategy with overlapping partitions to fuse the IMFs and the residue, instead of the pixel-based fusion commonly used in EMD-based image fusion: an energy-based maximum selection rule fuses the IMFs, and the feature information extracted by the IMFs guides the merging of the residue. This strategy extracts the salient information of the source images well and reduces the spatial artifacts introduced by the noisy character of pixel-wise decision maps. Extensive comparative experiments on commonly used multi-focus and multi-modal image data sets show that the proposed fusion method obtains much better results than existing EMD-based image fusion approaches, and that it is highly competitive with state-of-the-art image fusion methods in visual quality, objective metrics, and running time. The code of the proposed method can be downloaded from https://github.com/neepuhjp/MFMBEMD-ImageFusion.
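
To make the two main steps above concrete, the following short Python sketch (our illustration, not the authors' released code; the single-channel simplification, the window sizes, and the names morph_sift, morph_bemd, and fuse_imfs_patchwise are assumptions) shows how grey-scale dilation and erosion can act as fast upper- and lower-envelope estimators in a sifting step, and how an energy-based maximum selection rule can fuse corresponding IMFs over overlapping patches.

import numpy as np
from scipy import ndimage

def morph_sift(img, win):
    # One simplified sifting step: morphological envelopes -> local mean -> detail.
    # Grey-scale dilation/erosion act as moving max/min filters and give rough
    # upper and lower envelopes of the 2-D image.
    upper = ndimage.grey_dilation(img, size=(win, win))
    lower = ndimage.grey_erosion(img, size=(win, win))
    # Smooth both envelopes with an averaging filter before taking their mean.
    upper = ndimage.uniform_filter(upper, size=win)
    lower = ndimage.uniform_filter(lower, size=win)
    mean_env = 0.5 * (upper + lower)
    return img - mean_env, mean_env  # detail (IMF candidate), local mean

def morph_bemd(img, n_imfs=3, win=7):
    # Decompose one channel into n_imfs IMFs plus a residue (per-channel here
    # for brevity; the paper handles the channels of a multi-channel image jointly).
    residue = np.asarray(img, dtype=np.float64)
    imfs = []
    for _ in range(n_imfs):
        imf, residue = morph_sift(residue, win)
        imfs.append(imf)
        win = 2 * win + 1  # grow the window so later IMFs capture coarser scales
    return imfs, residue

def fuse_imfs_patchwise(imf_a, imf_b, patch=16, stride=8):
    # Energy-based maximum selection over overlapping patches of two matching IMFs.
    h, w = imf_a.shape
    fused = np.zeros((h, w))
    weight = np.zeros((h, w))
    for y in range(0, h - patch + 1, stride):
        for x in range(0, w - patch + 1, stride):
            pa = imf_a[y:y + patch, x:x + patch]
            pb = imf_b[y:y + patch, x:x + patch]
            # Keep the patch whose coefficients carry the larger energy.
            chosen = pa if np.sum(pa ** 2) >= np.sum(pb ** 2) else pb
            fused[y:y + patch, x:x + patch] += chosen
            weight[y:y + patch, x:x + patch] += 1.0
    # Average overlapping contributions (border strips not covered by a full
    # patch are left at zero in this sketch).
    return fused / np.maximum(weight, 1.0)

Averaging the overlapping patch contributions at the end is what damps the blocky, noisy artifacts that a purely pixel-wise maximum map tends to introduce; the residue would be merged with a similar patch scheme, guided by the feature information carried by the IMFs.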

Notes

  1. https://github.com/yuliu316316/MFIF.

  2. https://github.com/uzeful/IFCNN.

  3. https://github.com/arbabsufyan/Proposed-Code.git.

Acknowledgements

We would like to thank the anonymous reviewers for their helpful comments. This work was supported in part by the National Science Foundation of the USA (IIS-1812606, IIS-1715985), the National Natural Science Foundation of China (Nos. 61672149, 61532002, 61602344, 61802279), and the Open Project Program of the State Key Lab of CAD&CG (No. A2105), Zhejiang University.

Author information

Corresponding authors

Correspondence to Jianping Hu or Xiaochao Wang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Xie, Q., Hu, J., Wang, X. et al. Novel and fast EMD-based image fusion via morphological filter. Vis Comput 39, 4249–4265 (2023). https://doi.org/10.1007/s00371-022-02588-x
