Remote Sensing Image Fusion Based on Adaptive RBF Neural Network
With the availability of multi-sensor and multi-frequency image data from operational observation satellites, image fusion has become an important tool in remote sensing image evaluation and segmentation. This paper presents a novel Radial Basis Function (RBF) neural network with distinctive training strategies that integrates multiple information sources efficiently and exploits the potential advantages of each feature. Multi-scale features extracted from remote sensing images are weighted adaptively and used for segmentation. Experimental results on both artificial and real data demonstrate the effectiveness of the proposal.
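The paper does not spell out its training procedure in the abstract, but the core building block it names, an RBF network that maps per-pixel feature vectors to a segmentation decision, can be sketched as follows. This is an illustrative sketch only: the toy data, the random center selection, the fixed Gaussian width `sigma`, and the least-squares fit of the output weights are all assumptions, not the authors' actual adaptive training strategy.

```python
import numpy as np

def rbf_features(X, centers, sigma):
    """Gaussian hidden-unit activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2*sigma^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_rbf(X, y, centers, sigma):
    """Fit the hidden-to-output weights by linear least squares on the activations."""
    Phi = rbf_features(X, centers, sigma)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def predict_rbf(X, centers, sigma, w):
    return rbf_features(X, centers, sigma) @ w

# Toy example (assumed data): fuse two per-pixel feature channels into a
# binary "segmentation" label separated by a linear boundary.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))          # two per-pixel features
y = (X[:, 0] + X[:, 1] > 0).astype(float)      # ground-truth class
centers = X[rng.choice(len(X), 10, replace=False)]  # crude center selection
w = train_rbf(X, y, centers, sigma=0.5)
pred = predict_rbf(X, centers, sigma=0.5, w=w) > 0.5
accuracy = (pred == y.astype(bool)).mean()
```

In the paper's setting, each input dimension would instead be one multi-scale feature (e.g. a texture or SAR-derived measure), and the feature weighting would be learned adaptively rather than fixed as here.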
Keywords: Image Fusion · Synthetic Aperture Radar · Synthetic Aperture Radar Image · Hidden Unit · Feature Weight