
Research on fundus image registration and fusion method based on nonsubsampled contourlet and adaptive pulse coupled neural network


Abstract

We present a registration and fusion method for fluorescein fundus angiography images and color fundus images that combines the Nonsubsampled Contourlet Transform (NSCT) and an adaptive Pulse Coupled Neural Network (PCNN). First, we coarsely match the two images using Speeded Up Robust Features (SURF) feature points together with the nearest-neighbor to next-nearest-neighbor distance ratio test, eliminating gross spatial differences between the source images. Second, we apply the Random Sample Consensus (RANSAC) algorithm to obtain a precise set of matched feature points. The transformation parameters estimated by RANSAC are then applied to the floating image to complete the registration. Finally, NSCT decomposition of the registered images yields low-frequency and high-frequency sub-bands. The low-frequency sub-bands are fused with a regional-energy rule, while the high-frequency sub-bands are fused with a simplified PCNN model whose parameters are optimized by the Particle Swarm Optimization algorithm. The link strength of the simplified PCNN is an improved Laplacian energy, and coefficients are selected according to the number of times each pixel's neuron fires. Compared with existing fundus image fusion methods, the proposed method achieves higher average gradient (AG) and information entropy (IE) values and a lower relative dimensionless global error in synthesis (ERGAS). The fused image accurately integrates the information of the source images, renders details more clearly, and better preserves spectral quality. It therefore provides an effective reference for the clinical diagnosis of fundus diseases.
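The registration stage described above can be pictured with a minimal sketch, assuming OpenCV as the implementation (the paper does not name a library). The Hessian threshold, the 0.7 ratio-test value, the 5-pixel RANSAC tolerance, and the choice of a projective transform are illustrative placeholders rather than values or choices reported in the paper.

```python
# Minimal registration sketch: SURF keypoints + ratio test + RANSAC + warp.
# Requires opencv-contrib-python for the SURF implementation.
import cv2
import numpy as np

def register_fundus_pair(reference_bgr, floating_bgr, ratio=0.7):
    """Align the floating image to the reference via SURF + ratio test + RANSAC."""
    ref_gray = cv2.cvtColor(reference_bgr, cv2.COLOR_BGR2GRAY)
    flo_gray = cv2.cvtColor(floating_bgr, cv2.COLOR_BGR2GRAY)

    # SURF keypoints and descriptors.
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_ref, des_ref = surf.detectAndCompute(ref_gray, None)
    kp_flo, des_flo = surf.detectAndCompute(flo_gray, None)

    # Coarse matching: keep a match only if its nearest-neighbour distance is
    # clearly smaller than its next-nearest-neighbour distance.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = []
    for pair in matcher.knnMatch(des_flo, des_ref, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])

    # RANSAC rejects remaining outliers and estimates the spatial transform.
    src = np.float32([kp_flo[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Warp the floating image into the reference frame to finish registration.
    h, w = ref_gray.shape
    return cv2.warpPerspective(floating_bgr, H, (w, h))
```

Similarly, the high-frequency fusion rule can be illustrated with a simplified-PCNN firing-count sketch in NumPy. The NSCT decomposition is assumed to be computed elsewhere, and the iteration count, threshold decay, and amplitude constants below are placeholders: in the paper such parameters are tuned by Particle Swarm Optimization, and the link strength is an improved Laplacian energy. The function and parameter names in both sketches are hypothetical and show the structure of the pipeline, not the authors' implementation.

```python
# Minimal simplified-PCNN fusion sketch for one pair of high-frequency sub-bands.
import numpy as np
from scipy.ndimage import convolve

def firing_counts(coeff, beta, iterations=30, alpha_e=0.2, v_e=20.0):
    """Count how often each coefficient's neuron fires in a simplified PCNN."""
    F = np.abs(coeff)
    F = F / (F.max() + 1e-12)              # feeding input: normalised magnitude
    link_kernel = np.array([[0.5, 1.0, 0.5],
                            [1.0, 0.0, 1.0],
                            [0.5, 1.0, 0.5]])
    Y = np.zeros_like(F)                   # spike output
    E = np.ones_like(F)                    # dynamic threshold
    fires = np.zeros_like(F)
    for _ in range(iterations):
        L = convolve(Y, link_kernel, mode="nearest")   # linking input from neighbours
        U = F * (1.0 + beta * L)                       # internal activity
        Y = (U > E).astype(F.dtype)                    # fire where activity beats threshold
        E = np.exp(-alpha_e) * E + v_e * Y             # threshold decays, then resets on firing
        fires += Y
    return fires

def fuse_highpass(coeff_a, coeff_b, beta_a, beta_b):
    """Keep, at each position, the coefficient whose neuron fired more often."""
    fa, fb = firing_counts(coeff_a, beta_a), firing_counts(coeff_b, beta_b)
    return np.where(fa >= fb, coeff_a, coeff_b)
```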



Acknowledgements

This work was supported by the National Natural Science Foundation of China, under grant No. 61771340; Tianjin Science and Technology Major Projects and Engineering, under grant No. 17ZXHLSY00040, No. 17ZXSCSY00060 and No. 17ZXSCSY00090.

Author information


Corresponding author

Correspondence to Zhitao Xiao.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Preliminary results of this work were presented at the 3rd International Symposium on Artificial Intelligence and Robotics (ISAIR) 2018, Nanjing, China.


About this article


Cite this article

Wu, J., Ren, X., Xiao, Z. et al. Research on fundus image registration and fusion method based on nonsubsampled contourlet and adaptive pulse coupled neural network. Multimed Tools Appl 79, 34795–34812 (2020). https://doi.org/10.1007/s11042-019-08194-9

