A Learning Scheme for Recognizing Sub-classes from Model Trained on Aggregate Classes

  • Ranga Raju Vatsavai
  • Shashi Shekhar
  • Budhendra Bhaduri
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5342)

Abstract

In many practical situations it is not feasible to collect labeled samples for all available classes in a domain. In supervised classification of remotely sensed images in particular, it is often impossible to collect ground-truth information for all thematic classes over large geographic regions. As a result, analysts often collect labels only for aggregate classes. In this paper we present a novel learning scheme that automatically learns sub-classes from user-given aggregate classes. We model each aggregate class as a finite Gaussian mixture, instead of making the classical assumption of a unimodal Gaussian per class, and the number of components in each mixture is estimated automatically. Experimental results on real remotely sensed image classification show not only improved accuracy on the aggregate classes but also that the proposed method recognizes meaningful sub-classes.
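The core idea described above, fitting one finite Gaussian mixture per aggregate class and selecting the number of components automatically, can be sketched as follows. This is a minimal illustration using scikit-learn's `GaussianMixture` with BIC-based model selection as a stand-in; the paper's own estimator and selection criterion may differ, and the function and variable names here are illustrative, not from the paper.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_aggregate_class(samples, max_components=5, random_state=0):
    """Fit a finite Gaussian mixture to the samples of one aggregate
    class, choosing the number of components by the lowest BIC."""
    best_gmm, best_bic = None, np.inf
    for k in range(1, max_components + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="full",
                              random_state=random_state).fit(samples)
        bic = gmm.bic(samples)
        if bic < best_bic:
            best_gmm, best_bic = gmm, bic
    return best_gmm

# Toy example: one "aggregate" class whose samples actually come from
# two well-separated sub-class modes in a 2-band feature space.
rng = np.random.default_rng(0)
samples = np.vstack([rng.normal(0.0, 0.5, (200, 2)),
                     rng.normal(3.0, 0.5, (200, 2))])
gmm = fit_aggregate_class(samples)
sub_class = gmm.predict(samples)  # mixture component = learned sub-class
```

On data like this, the selected mixture recovers the two sub-class modes, so each mixture component can be read as a discovered sub-class of the aggregate label.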

Keywords

Semi-supervised learning · EM · GMM · Remote sensing


Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Ranga Raju Vatsavai (1)
  • Shashi Shekhar (2)
  • Budhendra Bhaduri (1)
  1. Computational Sciences and Engineering Division, Oak Ridge National Laboratory, Oak Ridge, USA
  2. Dept. of Computer Science, University of Minnesota, USA