Abstract

Gaussian mixture models (GMMs) are widely used to model complex distributions. The parameters of a GMM are usually determined in a maximum likelihood (ML) framework. A practical deficiency of ML fitting of GMMs is poor performance on high-dimensional data, since a much larger sample size is needed to match the numerical accuracy attainable in low dimensions. In this paper we propose a method for fitting GMMs based on the projection pursuit (PP) strategy. By means of simulations we show that the proposed method outperforms ML fitting of GMMs for small training-set sizes.
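For context, ML fitting of a GMM is typically carried out with the expectation-maximisation (EM) algorithm. The sketch below is a minimal plain-NumPy illustration of that ML baseline on a two-component toy problem; the function name `fit_gmm_em` and the deterministic initialisation are our own choices for reproducibility, and this is not the authors' projection-pursuit procedure.

```python
import numpy as np

def fit_gmm_em(X, k, n_iter=100):
    """Maximum-likelihood fit of a k-component Gaussian mixture via EM.

    Minimal sketch of the ML baseline; not the paper's PP method.
    """
    n, d = X.shape
    # crude deterministic init: k points spread along the first coordinate
    idx = np.argsort(X[:, 0])[np.linspace(0, n - 1, k).astype(int)]
    means = X[idx].copy()
    covs = np.stack([np.eye(d)] * k)
    weights = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        log_r = np.empty((n, k))
        for j in range(k):
            diff = X - means[j]
            _, logdet = np.linalg.slogdet(covs[j])
            maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(covs[j]), diff)
            log_r[:, j] = (np.log(weights[j])
                           - 0.5 * (d * np.log(2 * np.pi) + logdet + maha))
        log_r -= log_r.max(axis=1, keepdims=True)   # numerical stability
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: closed-form ML updates of weights, means, covariances
        nk = r.sum(axis=0)
        weights = nk / n
        means = (r.T @ X) / nk[:, None]
        for j in range(k):
            diff = X - means[j]
            covs[j] = ((r[:, j, None] * diff).T @ diff) / nk[j] + 1e-6 * np.eye(d)
    return weights, means, covs

# two well-separated Gaussian clusters in 2-D
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-3.0, 1.0, (200, 2)),
               rng.normal(3.0, 1.0, (200, 2))])
w, mu, _ = fit_gmm_em(X, k=2)
```

Each full covariance matrix carries d(d+1)/2 free parameters, so the number of parameters grows quadratically with the dimension d; this is one concrete reason the ML fit degrades for small training sets in high dimensions, which motivates the projection-pursuit strategy proposed in the paper.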

Keywords

Bayesian Information Criterion, Gaussian Mixture Model, Latent Variable Model, Training Sample Size


Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Mayer Aladjem, Department of Electrical and Computer Engineering, Ben-Gurion University of the Negev, Beer-Sheva, Israel
