A Supervised Band Selection Method for Hyperspectral Images Based on Information Gain Ratio and Clustering

  • Sonia Sarmah
  • Sanjib K. Kalita
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11942)


The high spectral dimensionality of hyperspectral images increases the computational complexity and processing time required to analyse them. Dimension reduction is therefore an essential step prior to processing hyperspectral images. In this work we present a dimension reduction technique for hyperspectral images based on a supervised band selection method. First, the significance of each band is evaluated by computing its information gain ratio. Then, a clustering technique groups similar bands into k clusters, and from each cluster the band with the maximum information gain ratio is selected as the representative band. A subsequent band pruning step further reduces this set of representative bands by eliminating those with low information gain ratio. Experimental results show that adequate accuracy can be achieved with a relatively small number of bands selected by the proposed method.
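The pipeline described in the abstract (per-band information gain ratio, clustering of similar bands, per-cluster representative selection, and pruning) can be sketched as below. This is a minimal illustration, not the authors' implementation: the paper does not specify the discretisation, the clustering algorithm, or the pruning threshold used here, so the bin count, the k-means clustering over normalised band vectors, and the quantile-based pruning fraction are all assumptions.

```python
import numpy as np

def information_gain_ratio(band, labels, n_bins=16):
    """IGR of one band w.r.t. class labels: (H(Y) - H(Y|X)) / H(X).
    Band values are discretised into n_bins bins (assumed choice)."""
    edges = np.histogram_bin_edges(band, bins=n_bins)
    x = np.digitize(band, edges[1:-1])  # discretised band values

    def entropy(v):
        _, counts = np.unique(v, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    h_y = entropy(labels)
    h_y_given_x = sum((x == val).mean() * entropy(labels[x == val])
                      for val in np.unique(x))
    h_x = entropy(x)
    return (h_y - h_y_given_x) / h_x if h_x > 0 else 0.0

def select_bands(cube, labels, k=10, prune_frac=0.2, n_iter=20, seed=0):
    """cube: (n_pixels, n_bands) labelled pixels; labels: (n_pixels,).
    Returns indices of selected bands."""
    n_bands = cube.shape[1]
    igr = np.array([information_gain_ratio(cube[:, b], labels)
                    for b in range(n_bands)])

    # Group similar bands: plain k-means over z-scored band vectors
    # (one feature row per band) -- an assumed clustering choice.
    feats = ((cube - cube.mean(0)) / (cube.std(0) + 1e-9)).T
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(n_bands, size=k, replace=False)]
    for _ in range(n_iter):
        assign = np.argmin(((feats[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for c in range(k):
            if np.any(assign == c):
                centers[c] = feats[assign == c].mean(0)

    # Representative band per cluster: the one with maximum IGR.
    reps = [int(np.argmax(np.where(assign == c, igr, -np.inf)))
            for c in range(k) if np.any(assign == c)]

    # Band pruning: drop representatives whose IGR falls in the
    # bottom prune_frac quantile (assumed pruning rule).
    thresh = np.quantile(igr[reps], prune_frac)
    return sorted(b for b in reps if igr[b] >= thresh)
```

A band that carries strong class information will have the highest information gain ratio in whatever cluster it lands in, so it survives both the representative-selection and pruning steps; pure-noise bands are either absorbed into clusters represented by better bands or pruned.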


Hyperspectral · Supervised band selection · Clustering · Information gain ratio · Band pruning



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Gauhati University, Guwahati, India
  2. Assam Don Bosco University, Guwahati, India
