
Importance tempering

Statistics and Computing

Abstract

Simulated tempering (ST) is an established Markov chain Monte Carlo (MCMC) method for sampling from a multimodal density π(θ). Typically, ST involves introducing an auxiliary variable k taking values in a finite subset of [0,1] and indexing a set of tempered distributions, say π_k(θ) ∝ π(θ)^k. In this case, small values of k encourage better mixing, but samples from π are obtained only when the joint chain for (θ,k) reaches k=1. However, the entire chain can be used to estimate expectations under π of functions of interest, provided that importance sampling (IS) weights are calculated. Unfortunately, this method, which we call importance tempering (IT), can disappoint. This is partly because the most immediately obvious implementation is naïve and can lead to high-variance estimators. We derive a new optimal method for combining multiple IS estimators and prove that the resulting estimator has a highly desirable property related to the notion of effective sample size. We briefly report on the success of the optimal combination in two modelling scenarios requiring reversible-jump MCMC, where the naïve approach fails.
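
To make the reweighting idea concrete, here is a minimal Python sketch (not from the paper) of how simulated-tempering output might be post-processed. The toy target log_pi and the ESS-proportional combination of per-temperature estimators are illustrative assumptions; the paper derives the genuinely optimal combination weights, which this sketch does not reproduce.

```python
import numpy as np


def log_pi(theta):
    """Unnormalised log target density: equal mixture of N(-3, 1) and N(3, 1).
    Purely illustrative -- substitute the log of your own target here."""
    return np.logaddexp(-0.5 * (theta + 3.0) ** 2, -0.5 * (theta - 3.0) ** 2)


def importance_tempering_estimate(theta, k, h):
    """Estimate E_pi[h(theta)] from simulated-tempering output (theta, k).

    Each state theta[i] was drawn (approximately) from the tempered target
    pi_{k[i]}(theta) propto pi(theta)^{k[i]}, so reweighting by
    pi(theta)^(1 - k) and self-normalising within each temperature gives a
    valid IS estimator there.  The per-temperature estimators are then
    combined with weights proportional to their effective sample sizes --
    an illustrative choice only, not the paper's optimal combination.
    """
    theta, k = np.asarray(theta), np.asarray(k)
    estimates, ess = [], []
    for kk in np.unique(k):
        th = theta[k == kk]
        logw = (1.0 - kk) * log_pi(th)          # w propto pi(th)^(1 - kk)
        w = np.exp(logw - logw.max())
        w /= w.sum()                            # self-normalised weights
        estimates.append(np.sum(w * h(th)))     # IS estimate at this temperature
        ess.append(1.0 / np.sum(w ** 2))        # effective sample size
    lam = np.array(ess) / np.sum(ess)           # combination weights
    return float(np.sum(lam * np.array(estimates))), float(np.sum(ess))


# Example call, given arrays theta, k produced by an ST sampler:
#   est, total_ess = importance_tempering_estimate(theta, k, h=lambda t: t)
```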



Author information

Correspondence to Robert Gramacy.


About this article

Cite this article

Gramacy, R., Samworth, R. & King, R. Importance tempering. Stat Comput 20, 1–7 (2010). https://doi.org/10.1007/s11222-008-9108-5

