
Split-Merge Augmented Gibbs Sampling for Hierarchical Dirichlet Processes

  • Santu Rana
  • Dinh Phung
  • Svetha Venkatesh
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7819)

Abstract

The Hierarchical Dirichlet Process (HDP) model is an important tool for topic analysis. Inference can be performed through a Gibbs sampler using the auxiliary variable method. We propose a split-merge procedure to augment this method of inference, facilitating faster convergence. Whilst the incremental Gibbs sampler changes the topic assignment of each word conditioned on the previous observations and the model hyper-parameters, the split-merge sampler changes the topic assignments over a group of words in a single move. This allows efficient exploration of the state space. We evaluate the proposed sampler on a synthetic test set and two benchmark document corpora and show that the proposed sampler enables the MCMC chain to converge faster to the desired stationary distribution.
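The abstract contrasts per-word incremental Gibbs updates with split-merge moves that reassign a whole group of words at once. As a rough illustration only (not the paper's algorithm), the following Python sketch implements the simplest "naive uniform" split-merge Metropolis-Hastings move for a collapsed Dirichlet-multinomial mixture under a Chinese Restaurant Process prior — a simplification of the restricted-Gibbs proposals of Jain and Neal, and of the HDP setting itself. All function names and the flat (non-hierarchical) model are our own assumptions for exposition.

```python
import math
import random
from collections import Counter, defaultdict

def crp_log_prior(z, alpha):
    """Log CRP probability of the partition z (one cluster label per item)."""
    sizes = list(Counter(z).values())
    lp = len(sizes) * math.log(alpha)                     # one alpha per cluster
    lp += sum(math.lgamma(s) for s in sizes)              # (size - 1)! per cluster
    lp -= sum(math.log(alpha + i) for i in range(len(z))) # normalising terms
    return lp

def dm_log_marginal(cluster_words, V, beta):
    """Collapsed Dirichlet-multinomial log marginal likelihood of one cluster."""
    n = len(cluster_words)
    counts = Counter(cluster_words)
    lp = math.lgamma(V * beta) - math.lgamma(V * beta + n)
    lp += sum(math.lgamma(beta + c) - math.lgamma(beta) for c in counts.values())
    return lp

def log_posterior(words, z, V, alpha, beta):
    """Unnormalised log posterior of a partition: CRP prior x per-cluster marginals."""
    clusters = defaultdict(list)
    for w, k in zip(words, z):
        clusters[k].append(w)
    return crp_log_prior(z, alpha) + sum(
        dm_log_marginal(ws, V, beta) for ws in clusters.values())

def split_merge_step(words, z, V, alpha, beta, rng=random):
    """One naive split-merge MH move: pick two items; if they share a cluster,
    propose a uniform random split anchored on them, otherwise propose a merge."""
    n = len(z)
    i, j = rng.sample(range(n), 2)
    z_new = list(z)
    if z[i] == z[j]:                                      # split proposal
        new_label = max(z) + 1
        others = [t for t in range(n) if z[t] == z[i] and t not in (i, j)]
        z_new[j] = new_label
        for t in others:                                  # fair coin per member
            if rng.random() < 0.5:
                z_new[t] = new_label
        log_q_fwd = -len(others) * math.log(2.0)          # prob of this split
        log_q_rev = 0.0                                   # merge is deterministic
    else:                                                 # merge proposal
        merged = [t for t in range(n) if z[t] in (z[i], z[j])]
        for t in merged:
            z_new[t] = z[i]
        log_q_fwd = 0.0
        log_q_rev = -(len(merged) - 2) * math.log(2.0)    # reverse uniform split
    log_accept = (log_posterior(words, z_new, V, alpha, beta)
                  - log_posterior(words, z, V, alpha, beta)
                  + log_q_rev - log_q_fwd)
    return z_new if math.log(rng.random()) < log_accept else list(z)
```

The point of the sketch is the shape of the move: a single accepted proposal relabels many words at once, whereas an incremental Gibbs sweep would have to move them one at a time through low-probability intermediate states.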

Keywords

Gibbs Sampler · Latent Dirichlet Allocation · Document Corpus · Dirichlet Process Mixture · MCMC Chain



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Santu Rana¹
  • Dinh Phung¹
  • Svetha Venkatesh¹

  1. Center for Pattern Recognition and Data Analytics, Deakin University, Waurn Ponds, Australia
