Statistics and Computing, Volume 21, Issue 4, pp 649–656

Diffusive nested sampling

  • Brendon J. Brewer
  • Livia B. Pártay
  • Gábor Csányi
Open Access


Abstract

We introduce a general Monte Carlo method based on Nested Sampling (NS), for sampling complex probability distributions and estimating the normalising constant. The method uses one or more particles, which explore a mixture of nested probability distributions, each successive distribution occupying approximately e^{-1} times the enclosed prior mass of the previous distribution. While NS technically requires independent generation of particles, Markov Chain Monte Carlo (MCMC) exploration fits naturally into this technique. We illustrate the new method on a test problem and find that it can achieve four times the accuracy of classic MCMC-based Nested Sampling for the same computational effort, equivalent to a factor of 16 speedup. An additional benefit is that more samples and a more accurate evidence value can be obtained simply by continuing the run for longer, as in standard MCMC.


Keywords: Nested sampling · Bayesian computation · Markov chain Monte Carlo
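The abstract's key geometric idea — each level's likelihood threshold encloses roughly e^{-1} of the prior mass enclosed by the previous level — can be illustrated with a toy sketch. This is not the authors' implementation: the likelihood, the rejection-sampling level construction, and all parameter values below are illustrative assumptions (a real implementation would reuse MCMC exploration of the level mixture, as the abstract describes).

```python
import math
import random

random.seed(42)

def log_like(x):
    """Toy unimodal log-likelihood on the unit interval (illustrative only)."""
    return -0.5 * ((x - 0.3) / 0.01) ** 2

def build_levels(n_levels=5, n_draws=2000):
    """Construct nested likelihood thresholds, each enclosing ~e^-1
    of the prior mass enclosed by the previous threshold."""
    thresholds = [-math.inf]
    for _ in range(n_levels):
        # Draw prior samples lying above the current threshold
        # (simple rejection here; diffusive NS instead explores a
        # mixture of the levels with MCMC).
        inside = []
        while len(inside) < n_draws:
            x = random.random()  # uniform prior on [0, 1]
            if log_like(x) > thresholds[-1]:
                inside.append(log_like(x))
        inside.sort()
        # Keep the top e^-1 fraction: its lower edge becomes the
        # next, more restrictive, threshold.
        k = int(len(inside) * (1.0 - math.exp(-1.0)))
        thresholds.append(inside[k])
    return thresholds

levels = build_levels()
# Thresholds rise monotonically as the levels compress toward the peak.
assert all(a < b for a, b in zip(levels, levels[1:]))
```

Each pass shrinks the enclosed prior mass by a factor of about e^{-1} (up to Monte Carlo noise in the quantile estimate), so level j encloses roughly e^{-j} of the prior — the exponential compression that lets nested sampling reach the high-likelihood region in a number of levels logarithmic in the mass of the posterior bulk.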



Copyright information

© The Author(s) 2010

This is an open access article distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.

Authors and Affiliations

  • Brendon J. Brewer (1)
  • Livia B. Pártay (2)
  • Gábor Csányi (3)

  1. Department of Physics, University of California, Santa Barbara, USA
  2. University Chemical Laboratory, University of Cambridge, Cambridge, UK
  3. Engineering Laboratory, University of Cambridge, Cambridge, UK
