Statistics and Computing, Volume 27, Issue 1, pp 131–145

Sequential Monte Carlo methods for mixtures with normalized random measures with independent increments priors

  • J. E. Griffin

Abstract

Normalized random measures with independent increments are a general, tractable class of nonparametric prior. This paper describes sequential Monte Carlo (SMC) methods for both conjugate and non-conjugate nonparametric mixture models with these priors. A simulation study is used to compare the efficiency of the different algorithms for density estimation, and comparisons are made with Markov chain Monte Carlo methods. The SMC methods are further illustrated by applications to dynamically fitting a nonparametric stochastic volatility model and to estimating the marginal likelihood in a goodness-of-fit testing example.
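The abstract describes the algorithms only at a high level. As a rough illustration of the kind of sequential update involved, the sketch below runs sequential importance sampling with resampling for a conjugate Dirichlet process mixture of normals, the simplest special case of a normalized-random-measure mixture; it is an assumption made for this page, not the paper's algorithm, and the model, hyperparameters, and names (predictive, smc_dp_mixture, N_PART) are hypothetical.

```python
# Minimal sketch (assumed example, not the paper's method): sequential importance
# sampling with resampling for a conjugate Dirichlet process mixture of normals.
import numpy as np

rng = np.random.default_rng(0)

# Assumed model: y_i | c_i ~ N(theta_{c_i}, sigma2), theta_k ~ N(0, tau2),
# cluster labels c_i drawn from a Dirichlet process with concentration alpha.
alpha, sigma2, tau2 = 1.0, 1.0, 4.0
N_PART = 500      # number of particles
ESS_FRAC = 0.5    # resample when ESS falls below this fraction of N_PART


def predictive(y, n_k, s_k):
    """Posterior predictive density of y for a cluster with n_k members and
    data sum s_k (Normal likelihood, Normal prior on the mean); n_k = 0
    gives the predictive under a brand-new cluster."""
    prec = n_k / sigma2 + 1.0 / tau2
    mean = (s_k / sigma2) / prec
    var = 1.0 / prec + sigma2
    return np.exp(-0.5 * (y - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)


def smc_dp_mixture(y):
    """Filter the cluster allocations sequentially; return particles, weights
    and a log marginal likelihood estimate (a natural by-product of SMC)."""
    counts = [[] for _ in range(N_PART)]   # cluster sizes, one list per particle
    sums = [[] for _ in range(N_PART)]     # cluster data sums, one list per particle
    W = np.full(N_PART, 1.0 / N_PART)      # normalized particle weights
    log_ml = 0.0
    for t, yt in enumerate(y):
        inc = np.empty(N_PART)
        for i in range(N_PART):
            # Polya-urn prior over existing clusters plus one new cluster.
            prior = np.append(np.array(counts[i], dtype=float), alpha) / (t + alpha)
            pred = np.array([predictive(yt, n, s)
                             for n, s in zip(counts[i] + [0], sums[i] + [0.0])])
            probs = prior * pred
            inc[i] = probs.sum()                               # p(y_t | particle i)
            k = rng.choice(len(probs), p=probs / probs.sum())  # conjugate proposal
            if k == len(counts[i]):                            # open a new cluster
                counts[i].append(0)
                sums[i].append(0.0)
            counts[i][k] += 1
            sums[i][k] += yt
        log_ml += np.log(np.sum(W * inc))      # evidence update before reweighting
        W = W * inc
        W /= W.sum()
        if 1.0 / np.sum(W ** 2) < ESS_FRAC * N_PART:           # low ESS: resample
            idx = rng.choice(N_PART, size=N_PART, p=W)
            counts = [list(counts[j]) for j in idx]
            sums = [list(sums[j]) for j in idx]
            W = np.full(N_PART, 1.0 / N_PART)
    return counts, sums, W, log_ml


# Toy usage: data from a two-component mixture; log_ml estimates the marginal
# likelihood, the quantity used in the goodness-of-fit example of the abstract.
y = np.concatenate([rng.normal(-2.0, 1.0, 50), rng.normal(2.0, 1.0, 50)])
_, _, _, log_ml = smc_dp_mixture(y)
print("estimated log marginal likelihood:", log_ml)
```

Sampling each allocation from its conditional posterior and weighting by the one-step predictive is the standard conjugate-case choice; handling non-conjugate models and adding particle Gibbs moves, as treated in the paper, replace exactly these two steps.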

Keywords

Bayesian nonparametrics · Dirichlet process · Normalized generalized gamma process · Nonparametric stochastic volatility · Slice sampling · Particle Gibbs sampling

Supplementary material

Supplementary material 1: 11222_2015_9612_MOESM1_ESM.pdf (PDF, 89 kb)


Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  1. School of Mathematics, Statistics and Actuarial Science, University of Kent, Kent, UK