
Novel Mixtures Based on the Dirichlet Distribution: Application to Data and Image Classification

  • Nizar Bouguila
  • Djemel Ziou
  • Jean Vaillancourt
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2734)

Abstract

The Dirichlet distribution offers high flexibility for modeling data. This paper describes two new mixture models based on this density: the generalized Dirichlet distribution (GDD) mixture and the multinomial Dirichlet distribution (MDD) mixture, which are used to model continuous and discrete data, respectively. We propose a method for estimating the parameters of these mixtures and test its performance through contextual evaluations: we compare Gaussian and GDD mixtures on the classification of several pattern-recognition data sets, and we apply the MDD mixture to the problem of summarizing image databases.
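To make the mixture-modeling idea concrete, the following minimal Python sketch (not the authors' estimator) evaluates a two-component Dirichlet mixture density on the probability simplex and computes the posterior component responsibilities that an EM-style parameter-estimation step would use. The component weights and Dirichlet parameters here are illustrative placeholders, not values from the paper.

    # Minimal sketch of a Dirichlet mixture density and EM responsibilities.
    # Weights and alpha parameters below are illustrative placeholders.
    import numpy as np
    from scipy.stats import dirichlet

    weights = np.array([0.6, 0.4])            # mixing proportions p_j
    alphas = [np.array([2.0, 5.0, 3.0]),      # Dirichlet parameters alpha_j
              np.array([7.0, 1.5, 1.5])]

    def mixture_pdf(x, weights, alphas):
        # Density of the Dirichlet mixture at a point x on the simplex.
        return sum(w * dirichlet.pdf(x, a) for w, a in zip(weights, alphas))

    def responsibilities(x, weights, alphas):
        # E-step: posterior probability that x came from each component.
        comp = np.array([w * dirichlet.pdf(x, a)
                         for w, a in zip(weights, alphas)])
        return comp / comp.sum()

    x = np.array([0.2, 0.5, 0.3])             # a point on the 2-simplex
    print(mixture_pdf(x, weights, alphas))
    print(responsibilities(x, weights, alphas))

In a full EM fit, the responsibilities computed above would drive the M-step updates of the mixing proportions and of the Dirichlet parameters of each component.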

Keywords

Expectation Maximization Algorithm, Dirichlet Distribution, Initialization Method, Natural Gradient, Wisconsin Breast Cancer



Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Nizar Bouguila (1)
  • Djemel Ziou (1)
  • Jean Vaillancourt (1)
  1. DMI, Faculté des Sciences, Université de Sherbrooke, Sherbrooke, Canada
