Statistics and Computing, Volume 25, Issue 1, pp 67–78

On a class of \(\sigma \)-stable Poisson–Kingman models and an effective marginalized sampler



We investigate the use of a large class of discrete random probability measures, referred to as the class \(\mathcal {Q}\), in the context of Bayesian nonparametric mixture modeling. The class \(\mathcal {Q}\) encompasses both the two-parameter Poisson–Dirichlet process and the normalized generalized Gamma process, thus allowing us to comparatively study the inferential advantages of these two well-known nonparametric priors. Apart from a highly flexible parameterization, the distinguishing feature of the class \(\mathcal {Q}\) is the availability of a tractable posterior distribution. This feature, in turn, allows us to derive an efficient marginal MCMC algorithm for posterior sampling within the framework of mixture models. We demonstrate the efficacy of our modeling framework on both one-dimensional and multi-dimensional datasets.
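As background to the priors compared in the abstract: the two-parameter Poisson–Dirichlet (Pitman–Yor) process \(\mathrm{PY}(\sigma, \theta)\) admits a well-known stick-breaking representation, with \(V_k \sim \mathrm{Beta}(1-\sigma,\, \theta + k\sigma)\) and weights \(w_k = V_k \prod_{j<k}(1-V_j)\). The sketch below draws a truncated set of such weights; it is an illustrative simulation of the prior, not the marginalized MCMC sampler developed in the paper.

```python
import numpy as np

def pitman_yor_stick_breaking(sigma, theta, n_atoms, rng=None):
    """Truncated stick-breaking draw of the weights of a two-parameter
    Poisson-Dirichlet (Pitman-Yor) process PY(sigma, theta).

    V_k ~ Beta(1 - sigma, theta + k*sigma),  k = 1, ..., n_atoms
    w_k = V_k * prod_{j<k} (1 - V_j)
    """
    rng = np.random.default_rng(rng)
    # Stick-breaking proportions V_1, ..., V_n
    betas = rng.beta(1.0 - sigma, theta + sigma * np.arange(1, n_atoms + 1))
    # Length of stick remaining before each break: prod_{j<k} (1 - V_j)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    return betas * remaining

# sigma = 0 recovers the Dirichlet process; sigma in (0, 1) gives
# the heavier-tailed Pitman-Yor weights discussed in the paper.
w = pitman_yor_stick_breaking(sigma=0.5, theta=1.0, n_atoms=1000, rng=0)
```

The truncated weights are nonnegative and sum to slightly less than one; the discarded mass shrinks as `n_atoms` grows.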


Keywords

Bayesian nonparametrics · Normalized generalized Gamma process · Marginalized MCMC sampler · Mixture model · \(\sigma \)-stable Poisson–Kingman model · Two-parameter Poisson–Dirichlet process

Supplementary material

11222_2014_9499_MOESM1_ESM.pdf (195 kb)

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. University of Torino and Collegio Carlo Alberto, Turin, Italy
  2. Gatsby Computational Neuroscience Unit, University College London, London, UK
  3. Department of Statistics, University of Oxford, Oxford, UK