Variational Bayesian Dirichlet-Multinomial Allocation for Exponential Family Mixtures

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4212)


This paper studies a Bayesian framework for density modeling with mixtures of exponential-family distributions. Variational Bayesian Dirichlet-Multinomial allocation (VBDMA) is introduced, which performs inference and learning efficiently using variational Bayesian methods and carries out automatic model selection. The model is closely related to Dirichlet process mixture models and exhibits similar automatic model selection behavior in the variational Bayesian setting.
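The automatic model selection described in the abstract arises because a small symmetric Dirichlet prior on the mixing weights drives the expected weight of unneeded components toward zero during variational updates. The following stdlib-only Python sketch illustrates this pruning effect on a toy one-dimensional Gaussian mixture with known unit variance; it is an illustrative simplification, not the paper's full VBDMA algorithm, and all function names and parameter values here are invented for the example.

```python
import math

def digamma(x):
    # Digamma via the recurrence psi(x) = psi(x+1) - 1/x plus an
    # asymptotic series for large arguments (stdlib-only helper).
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    x2 = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - x2 * (1.0 / 12 - x2 * (1.0 / 120 - x2 / 252))

def vb_mixture(data, K=6, alpha0=1e-3, n_iter=200):
    """Toy variational Bayes for a 1-D Gaussian mixture with known unit
    variance and a symmetric Dirichlet(alpha0) prior on the mixing
    weights. Components that receive little data are starved of mass
    because E[log pi_k] = digamma(alpha_k) - digamma(sum alpha) becomes
    very negative when alpha_k is near alpha0 -- the pruning effect."""
    lo, hi = min(data), max(data)
    mu = [lo + (hi - lo) * k / (K - 1) for k in range(K)]   # grid init
    alpha = [alpha0 + len(data) / K] * K                     # uniform init
    for _ in range(n_iter):
        # E-step: responsibilities r[n][k] prop. to exp(E[log pi_k]) * N(x_n | mu_k, 1)
        s = sum(alpha)
        elogpi = [digamma(a) - digamma(s) for a in alpha]
        Nk = [0.0] * K
        Sk = [0.0] * K
        for x in data:
            logw = [elogpi[k] - 0.5 * (x - mu[k]) ** 2 for k in range(K)]
            m = max(logw)
            w = [math.exp(l - m) for l in logw]
            z = sum(w)
            for k in range(K):
                r = w[k] / z
                Nk[k] += r
                Sk[k] += r * x
        # M-step: update Dirichlet pseudo-counts and component means
        alpha = [alpha0 + Nk[k] for k in range(K)]
        mu = [Sk[k] / Nk[k] if Nk[k] > 1e-8 else mu[k] for k in range(K)]
    return alpha, mu

# Two well-separated clusters; with K=6 components, the surplus ones
# should end with expected counts near alpha0 (i.e., get pruned).
data = [0.1 * i - 2.0 for i in range(20)] + [0.1 * i + 4.0 for i in range(20)]
alpha, mu = vb_mixture(data)
```

After convergence, `alpha[k] - alpha0` approximates the expected number of points assigned to component k, so inspecting `alpha` reveals which of the K candidate components the data actually supports.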


Keywords: Mixture Model · Exponential Family · Dirichlet Process · Dirichlet Distribution · Prior Term



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  1. Institute for Computer Science, University of Munich, Germany
  2. Siemens Corporate Technology, Munich, Germany
