
Statistics and Computing, Volume 21, Issue 1, pp 93–105

Slice sampling mixture models

  • Maria Kalli
  • Jim E. Griffin (corresponding author)
  • Stephen G. Walker
Article

Abstract

We propose a more efficient version of the slice sampler for Dirichlet process mixture models described by Walker (Commun. Stat., Simul. Comput. 36:45–54, 2007). This new sampler allows for the fitting of infinite mixture models with a wide range of prior specifications. To illustrate this flexibility we consider priors defined through infinite sequences of independent positive random variables. Two applications are considered: density estimation using mixture models and hazard function estimation. In each case we show how the slice-efficient sampler can be applied to make inference in the models. In the mixture case, two submodels are studied in detail. The first one assumes that the positive random variables are Gamma distributed and the second assumes that they are inverse-Gaussian distributed. Both priors have two hyperparameters and we consider their effect on the prior distribution of the number of occupied clusters in a sample. Extensive computational comparisons with alternative “conditional” simulation techniques for mixture models using the standard Dirichlet process prior and our new priors are made. The properties of the new priors are illustrated on a density estimation problem.
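
For orientation only, the sketch below illustrates the basic slice-augmented Gibbs sweep of Walker (2007) for a Dirichlet process mixture of normals with stick-breaking weights; it is a minimal, assumption-laden illustration, not the more efficient sampler or the normalized-weights (Gamma / inverse-Gaussian) priors proposed in this paper. All function and parameter names (slice_sampler_sweep, alpha, kernel_sd, prior_sd) and the toy data are hypothetical.

```python
# A minimal illustrative sketch (assumed names and settings, not the paper's code) of the
# basic slice-augmented Gibbs sweep of Walker (2007) for a Dirichlet process mixture of
# normals with stick-breaking weights w_k = v_k * prod_{j<k}(1 - v_j), v_k ~ Beta(1, alpha).
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(v):
    """w_k = v_k * prod_{j<k} (1 - v_j)."""
    return v * np.concatenate(([1.0], np.cumprod(1.0 - v[:-1])))

def slice_sampler_sweep(y, d, v, theta, alpha=1.0, kernel_sd=1.0, prior_sd=3.0):
    """One Gibbs sweep: y data, d allocations, v stick fractions, theta component means."""
    K = len(v)

    # 1. Stick fractions given allocations: v_k | d ~ Beta(1 + n_k, alpha + sum_{j>k} n_j).
    counts = np.bincount(d, minlength=K)
    tail = counts[::-1].cumsum()[::-1] - counts
    v = rng.beta(1.0 + counts, alpha + tail)

    # 2. Component means given their allocated data (conjugate normal-normal, prior N(0, prior_sd^2)).
    theta = theta.copy()
    for k in range(K):
        yk = y[d == k]
        prec = 1.0 / prior_sd**2 + len(yk) / kernel_sd**2
        theta[k] = rng.normal((yk.sum() / kernel_sd**2) / prec, np.sqrt(1.0 / prec))

    # 3. Slice variables: u_i ~ Uniform(0, w_{d_i}). Only components with w_k > u_i matter.
    w = stick_breaking_weights(v)
    u = rng.uniform(0.0, w[d])

    # 4. Extend the representation until the leftover stick mass prod_j(1 - v_j) falls
    #    below min(u), so every component that could be selected in step 5 exists.
    while np.prod(1.0 - v) > u.min():
        v = np.append(v, rng.beta(1.0, alpha))
        theta = np.append(theta, rng.normal(0.0, prior_sd))
    w = stick_breaking_weights(v)

    # 5. Reallocate each observation within the finite set {k : w_k > u_i}
    #    (the current d_i always belongs to this set, so it is never empty).
    for i, yi in enumerate(y):
        ok = np.flatnonzero(w > u[i])
        logp = -0.5 * ((yi - theta[ok]) / kernel_sd) ** 2
        p = np.exp(logp - logp.max())
        d[i] = rng.choice(ok, p=p / p.sum())
    return d, v, theta

# Toy usage: two well-separated clusters.
y = np.concatenate([rng.normal(-3.0, 1.0, 50), rng.normal(3.0, 1.0, 50)])
d = np.zeros(len(y), dtype=int)
v, theta = np.array([0.5]), np.array([0.0])
for _ in range(200):
    d, v, theta = slice_sampler_sweep(y, d, v, theta)
print("occupied clusters:", len(np.unique(d)))
```

The point of the slice variables u_i is that they truncate the infinite mixture to the finite set {k : w_k > u_i}, so each sweep only touches finitely many components while still targeting the exact (non-truncated) posterior.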

Keywords

Dirichlet process · Markov chain Monte Carlo · Mixture model · Normalized weights · Slice sampler · Hazard function

References

  1. Celeux, G., Hurn, M., Robert, C.P.: Computational and inferential difficulties with mixture posterior distributions. J. Am. Stat. Assoc. 95, 957–970 (2000)
  2. Devroye, L.: Non-Uniform Random Variate Generation. Springer, New York (1986)
  3. Dunson, D.: Kernel local partition processes for functional data. Discussion paper 2008-26, Department of Statistical Science, Duke University (2008)
  4. Escobar, M.D.: Estimating the means of several normal populations by nonparametric estimation of the distribution of the means. Unpublished Ph.D. dissertation, Department of Statistics, Yale University (1988)
  5. Escobar, M.D.: Estimating normal means with a Dirichlet process prior. J. Am. Stat. Assoc. 89, 268–277 (1994)
  6. Escobar, M.D., West, M.: Bayesian density estimation and inference using mixtures. J. Am. Stat. Assoc. 90, 577–588 (1995)
  7. Ferguson, T.S.: A Bayesian analysis of some nonparametric problems. Ann. Stat. 1, 209–230 (1973)
  8. Gilks, W.R., Best, N.G., Tan, K.K.C.: Adaptive rejection Metropolis sampling within Gibbs sampling. Appl. Stat. 44, 455–472 (1995)
  9. Green, P.J., Richardson, S.: Modelling heterogeneity with and without the Dirichlet process. Scand. J. Stat. 28, 355–375 (2001)
  10. Ishwaran, H., James, L.F.: Gibbs sampling methods for stick-breaking priors. J. Am. Stat. Assoc. 96, 161–173 (2001)
  11. Ishwaran, H., Zarepour, M.: Markov chain Monte Carlo in approximate Dirichlet and beta two-parameter process hierarchical models. Biometrika 87, 371–390 (2000)
  12. Lijoi, A., Mena, R.H., Prünster, I.: Hierarchical mixture modeling with normalized inverse-Gaussian priors. J. Am. Stat. Assoc. 100, 1278–1291 (2005)
  13. Lijoi, A., Mena, R.H., Prünster, I.: Controlling the reinforcement in Bayesian nonparametric mixture models. J. R. Stat. Soc. B 69, 715–740 (2007)
  14. Lo, A.Y.: On a class of Bayesian nonparametric estimates I. Density estimates. Ann. Stat. 12, 351–357 (1984)
  15. MacEachern, S.N.: Estimating normal means with a conjugate style Dirichlet process prior. Commun. Stat., Simul. Comput. 23, 727–741 (1994)
  16. MacEachern, S.N., Müller, P.: Estimating mixtures of Dirichlet process models. J. Comput. Graph. Stat. 7, 223–238 (1998)
  17. Neal, R.: Markov chain sampling methods for Dirichlet process mixture models. J. Comput. Graph. Stat. 9, 249–265 (2000)
  18. Papaspiliopoulos, O.: A note on posterior sampling from Dirichlet mixture models. Preprint (2008)
  19. Papaspiliopoulos, O., Roberts, G.O.: Retrospective Markov chain Monte Carlo methods for Dirichlet process hierarchical models. Biometrika 95, 169–186 (2008)
  20. Sethuraman, J.: A constructive definition of Dirichlet priors. Stat. Sin. 4, 639–650 (1994)
  21. Sokal, A.: Monte Carlo methods in statistical mechanics: foundations and new algorithms. In: Functional Integration (Cargèse, 1996). NATO Adv. Sci. Inst. Ser. B Phys., vol. 361, pp. 131–192. Plenum, New York (1997)
  22. Smith, A.F.M., Roberts, G.O.: Bayesian computations via the Gibbs sampler and related Markov chain Monte Carlo methods. J. R. Stat. Soc., Ser. B 55, 3–23 (1993)
  23. Van Gael, J., Saatci, Y., Teh, Y.W., Ghahramani, Z.: Beam sampling for the infinite hidden Markov model. Technical report, Engineering Department, University of Cambridge (2008)
  24. Walker, S.G.: Sampling the Dirichlet mixture model with slices. Commun. Stat., Simul. Comput. 36, 45–54 (2007)
  25. Yau, C., Papaspiliopoulos, O., Roberts, G.O., Holmes, C.: Bayesian nonparametric hidden Markov models with application to the analysis of copy-number-variation in mammalian genomes. Technical report, Man Institute, Oxford (2008)

Copyright information

© Springer Science+Business Media, LLC 2009

Authors and Affiliations

  • Maria Kalli (1)
  • Jim E. Griffin (2) (corresponding author)
  • Stephen G. Walker (2)

  1. Centre for Health Services Studies, University of Kent, Canterbury, UK
  2. Institute of Mathematics, Statistics & Actuarial Science, University of Kent, Canterbury, UK
