
Band Selection with Bhattacharyya Distance Based on the Gaussian Mixture Model for Hyperspectral Image Classification

  • Mohammed Lahlimi
  • Mounir Ait Kerroum
  • Youssef Fakhri
Conference paper
Part of the Advances in Science, Technology & Innovation book series (ASTI)

Abstract

This paper investigates a new band selection approach based on the Bhattacharyya distance modelled with the Gaussian Mixture Model (GMM) for hyperspectral image classification. Our main motivation for modelling the Bhattacharyya distance with a GMM is that this tool is well known for capturing the non-Gaussian statistics of multivariate data while being less sensitive to estimation errors than purely non-parametric models. The GMM parameters are estimated with a Robust Expectation-Maximization (REM) algorithm, which overcomes a shortcoming of the classical Expectation-Maximization (EM) algorithm by dynamically adapting the number of clusters to the data structure. The bands selected with the proposed approach are compared, in terms of classification accuracy, with those selected using the Bhattacharyya distance in its parametric form and the Bhattacharyya distance modelled with a GMM estimated by the classical EM algorithm. Experiments on two real hyperspectral images, the Indian Pines (92AV3C) sub-scene and the Kennedy Space Center (KSC) dataset, demonstrate the effectiveness of the proposed method, which achieves higher classification accuracy with fewer bands.
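A minimal sketch of the band-ranking idea described above is given below. It is not the authors' implementation: scikit-learn's GaussianMixture (classical EM) stands in for the paper's Robust EM estimator, the Bhattacharyya coefficient between two mixtures is approximated by Monte Carlo sampling because it has no closed form for GMMs, and the helper names bhattacharyya_gmm and rank_bands are hypothetical. Each band is scored by the average pairwise Bhattacharyya distance between class-conditional GMMs, and the top-ranked bands are kept for classification.

```python
# Minimal sketch: rank spectral bands by a GMM-based Bhattacharyya distance.
# Assumptions: sklearn's GaussianMixture (classical EM) replaces the paper's
# Robust EM (REM), and the Bhattacharyya coefficient between mixtures is
# estimated by Monte Carlo sampling (no closed form exists for GMMs).
import numpy as np
from sklearn.mixture import GaussianMixture

def bhattacharyya_gmm(gmm_p, gmm_q, n_samples=5000):
    """Monte Carlo estimate of the Bhattacharyya distance between two GMMs."""
    x, _ = gmm_p.sample(n_samples)                    # draw x ~ p
    log_p = gmm_p.score_samples(x)                    # log p(x)
    log_q = gmm_q.score_samples(x)                    # log q(x)
    bc = np.mean(np.exp(0.5 * (log_q - log_p)))       # E_p[sqrt(q/p)] = BC
    return -np.log(max(bc, 1e-300))                   # D_B = -ln BC

def rank_bands(X, y, n_components=3, random_state=0):
    """Score each band by the mean pairwise class separability."""
    classes = np.unique(y)
    scores = np.zeros(X.shape[1])
    for b in range(X.shape[1]):
        xb = X[:, b:b + 1]                            # one spectral band at a time
        gmms = {c: GaussianMixture(n_components, random_state=random_state)
                     .fit(xb[y == c]) for c in classes}
        d = [bhattacharyya_gmm(gmms[ci], gmms[cj])
             for i, ci in enumerate(classes) for cj in classes[i + 1:]]
        scores[b] = np.mean(d)
    return np.argsort(scores)[::-1]                   # most separable bands first
```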

Keywords

Hyperspectral image · Remote sensing · Band selection · Bhattacharyya distance · Probability estimation · Gaussian mixture model · 92AV3C · KSC


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Mohammed Lahlimi (1)
  • Mounir Ait Kerroum (1)
  • Youssef Fakhri (1)

  1. LaRIT, Faculty of Sciences, Ibn Tofail University, Kenitra, Morocco
