MRI–PET Medical Image Fusion Technique by Combining Contourlet and Wavelet Transform

  • Ch. Hima Bindu
  • K. Satya Prasad
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 150)

Abstract

This paper proposes a hybrid multiscale transform for medical image fusion. Multimodal medical image fusion plays an important role in clinical applications, providing physicians with more complete information for diagnosis. A new fusion scheme for Magnetic Resonance Imaging (MRI) and Positron Emission Tomography (PET) images based on hybrid transforms is proposed. PET/MRI fusion has important clinical significance: fusion is the step that follows registration and produces an integrated display of the two images. The PET image reflects brain function but has low spatial resolution; the MRI image shows brain tissue anatomy but contains no functional information. An ideal fused image should therefore preserve both the functional information and the spatial detail, with no spatial or color distortion. First, each source image is decomposed into low- and high-frequency subband coefficients with the discrete wavelet transform (DWT). The contourlet transform is then applied to these coefficients individually before fusion. The fusion is performed on the contourlet components of each subband using the spatial frequency criterion. Finally, the results of the proposed algorithm are compared with those of other multiscale transform techniques. Simulation results show that the algorithm retains useful information from the source images.
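The spatial frequency criterion mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the block size of 8 and the block-wise maximum-selection rule are assumptions, and the standard definition of spatial frequency (root mean square of row-wise and column-wise first differences) is used.

```python
import numpy as np

def spatial_frequency(block):
    """Spatial frequency SF = sqrt(RF^2 + CF^2), where RF and CF are the
    RMS values of horizontal and vertical first differences."""
    block = block.astype(np.float64)
    rf = np.sqrt(np.mean(np.diff(block, axis=1) ** 2))  # row frequency
    cf = np.sqrt(np.mean(np.diff(block, axis=0) ** 2))  # column frequency
    return np.sqrt(rf ** 2 + cf ** 2)

def fuse_by_spatial_frequency(coeff_a, coeff_b, block=8):
    """Block-wise fusion rule (a sketch): for each block, keep the
    coefficients from the source whose block has the higher spatial
    frequency, i.e. the one carrying more local detail."""
    fused = np.copy(coeff_a)
    h, w = coeff_a.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            a = coeff_a[i:i + block, j:j + block]
            b = coeff_b[i:i + block, j:j + block]
            if spatial_frequency(b) > spatial_frequency(a):
                fused[i:i + block, j:j + block] = b
    return fused
```

In the full pipeline described above, `coeff_a` and `coeff_b` would be matching contourlet coefficient subbands of the MRI and PET images rather than raw pixels.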

Keywords

Image fusion · Discrete wavelet transform · Contourlet transform


Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  1. Department of Electronics and Communications, QIS College of Engineering and Technology, Ongole, India
  2. Department of Electronics and Communications, JNTU College of Engineering, Kakinada, India
