Unsupervised Classification of SAR Images Using Hierarchical Agglomeration and EM

  • Koray Kayabol
  • Vladimir A. Krylov
  • Josiane Zerubia
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7252)

Abstract

We implement an unsupervised classification algorithm for high resolution Synthetic Aperture Radar (SAR) images. The algorithm is based on Classification Expectation-Maximization (CEM). To overcome two drawbacks of EM-type algorithms, namely initialization and model order selection, we combine the CEM algorithm with a hierarchical agglomeration strategy and a model order selection criterion called Integrated Completed Likelihood (ICL). We exploit amplitude statistics in a Finite Mixture Model (FMM) and use a Multinomial Logistic (MnL) latent class label model as the mixture density to obtain spatially smooth class segments. We test our algorithm on TerraSAR-X data.
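
To illustrate how CEM, hierarchical agglomeration and ICL fit together, the following Python sketch (not taken from the paper) alternates CEM runs with pairwise class merging and keeps the model order that maximizes ICL. It is a minimal sketch under simplifying assumptions: a 1-D Gaussian mixture stands in for the SAR amplitude densities, a nearest-means rule stands in for a likelihood-based merging criterion, and the multinomial logistic spatial prior is omitted; all function names and distributional choices are illustrative.

import numpy as np


def cem_gaussian(x, means, variances, weights, n_iter=30):
    """Classification EM on 1-D data x for a fixed number of classes."""
    k = len(means)
    for _ in range(n_iter):
        # E-step: posterior class probabilities under the current mixture.
        resp = np.stack([w * np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2 * np.pi * v)
                         for m, v, w in zip(means, variances, weights)], axis=1)
        # C-step: hard assignment of each sample to its most probable class.
        labels = np.argmax(resp, axis=1)
        # M-step: re-estimate each class from its assigned samples.
        for j in range(k):
            xj = x[labels == j]
            if xj.size < 2:
                continue
            means[j], variances[j] = xj.mean(), xj.var() + 1e-6
            weights[j] = xj.size / x.size
    return means, variances, weights, labels


def icl(x, means, variances, weights, labels):
    """BIC-type penalty applied to the classification (complete) likelihood."""
    k = len(means)
    n_params = 3 * k - 1  # k means, k variances, k - 1 free weights
    cll = 0.0
    for j in range(k):
        xj = x[labels == j]
        if xj.size == 0:
            continue
        cll += xj.size * np.log(weights[j] + 1e-300)
        cll -= 0.5 * ((xj - means[j]) ** 2 / variances[j]
                      + np.log(2 * np.pi * variances[j])).sum()
    return cll - 0.5 * n_params * np.log(x.size)


def agglomerative_cem(x, k_max=8):
    """Start from k_max classes, alternate CEM with pairwise merging,
    and keep the model order that maximizes ICL."""
    qs = np.linspace(0, 1, k_max + 2)[1:-1]
    means = list(np.quantile(x, qs))            # quantile-based initialization
    variances = [float(x.var()) / k_max] * k_max
    weights = [1.0 / k_max] * k_max
    best = None
    while True:
        means, variances, weights, labels = cem_gaussian(x, means, variances, weights)
        score = icl(x, means, variances, weights, labels)
        if best is None or score > best[0]:
            best = (score, labels.copy(), len(means))
        if len(means) == 1:
            return best
        # Agglomeration step: merge the two classes with the closest means
        # (a simple stand-in for a likelihood-based merging criterion).
        i, j = min(((a, b) for a in range(len(means)) for b in range(a + 1, len(means))),
                   key=lambda p: abs(means[p[0]] - means[p[1]]))
        wi, wj = weights[i], weights[j]
        m = (wi * means[i] + wj * means[j]) / (wi + wj)
        v = (wi * (variances[i] + means[i] ** 2)
             + wj * (variances[j] + means[j] ** 2)) / (wi + wj) - m ** 2
        for idx in sorted((i, j), reverse=True):
            del means[idx], variances[idx], weights[idx]
        means.append(m)
        variances.append(v + 1e-6)
        weights.append(wi + wj)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 1-D stand-in for SAR amplitude samples drawn from three classes.
    amp = np.concatenate([rng.normal(m, 0.3, 500) for m in (1.0, 2.5, 4.0)])
    score, labels, k = agglomerative_cem(amp, k_max=6)
    print(f"selected {k} classes, ICL = {score:.1f}")

Starting from an intentionally large number of classes and merging downward sidesteps the usual sensitivity of EM to initialization, while ICL picks the final class count without a separate validation step.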

Keywords

High resolution SAR, TerraSAR-X, classification, texture, multinomial logistic, Classification EM, hierarchical agglomeration



Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Koray Kayabol (1)
  • Vladimir A. Krylov (1)
  • Josiane Zerubia (1)
  1. Ariana, INRIA Sophia Antipolis Méditerranée, Sophia Antipolis Cedex, France
