
Bayesian Approaches to the Design of Markov Chain Monte Carlo Samplers

  • Conference paper
  • In: Monte Carlo and Quasi-Monte Carlo Methods 2012
  • Springer Proceedings in Mathematics & Statistics, vol. 65

Abstract

In the decades since Markov chain Monte Carlo methods were first introduced, they have revolutionised Bayesian approaches to statistical inference. Each new advance in MCMC methodology produces near-immediate benefits for Bayesian practitioners, expanding the range of problems they can feasibly solve. In this paper, we explore ways in which Bayesian approaches can repay something of the debt owed to MCMC, by using explicitly Bayesian concepts to aid in the design of MCMC samplers. The art of efficient MCMC sampling lies in designing a Markov process that (a) has the required limiting distribution, (b) has good convergence and mixing properties and (c) can be implemented in a computationally efficient manner. In this paper, we explore the idea that the selection of an appropriate process, and in particular the tuning of its parameters to achieve the above goals, can be regarded as a problem of estimation. As such, it is amenable to a conventional Bayesian approach: a prior distribution for the optimal parameters of the sampler is specified, data relevant to sampler performance are obtained, and a posterior distribution for the optimal parameters is formed. Sampling from this posterior distribution can then be incorporated into the MCMC sampler to produce an adaptive method. We present a new MCMC algorithm for Bayesian adaptive Metropolis-Hastings sampling (BAMS), using explicitly Bayesian inference to update the proposal distribution. We show that the first author's earlier Bayesian adaptive independence sampler (BAIS) and a new Bayesian adaptive random walk sampler (BARS) emerge as special cases. More importantly, BAMS provides a general framework within which to explore adaptive schemes that are guaranteed to converge to the required limiting distribution.
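The adaptive recipe described in the abstract (propose, accept or reject, then update the proposal parameters from the chain's own history with a diminishing step) can be illustrated with a minimal sketch. The code below is not the paper's BAMS algorithm: it is a generic adaptive random-walk Metropolis sampler in which the proposal scale is nudged toward a target acceptance rate, a rule of thumb from the adaptive MCMC literature, and all names and parameter choices are illustrative.

```python
import math
import random

def adaptive_rwm(log_target, x0, n_iters, target_accept=0.44, seed=0):
    """Random-walk Metropolis with diminishing adaptation of the proposal
    scale. Illustrative only: the scale is tuned toward a target acceptance
    rate rather than updated by the paper's Bayesian scheme."""
    rng = random.Random(seed)
    x, log_lik = x0, log_target(x0)
    log_scale = 0.0                    # log of the proposal standard deviation
    samples = []
    for i in range(1, n_iters + 1):
        y = x + math.exp(log_scale) * rng.gauss(0.0, 1.0)
        y_lik = log_target(y)
        accepted = rng.random() < math.exp(min(0.0, y_lik - log_lik))
        if accepted:
            x, log_lik = y, y_lik
        # Step size 1/i: the adaptation vanishes over time, one standard
        # way to avoid disturbing the required limiting distribution.
        log_scale += ((1.0 if accepted else 0.0) - target_accept) / i
        samples.append(x)
    return samples

# Usage: target a standard normal via its log-density (up to a constant).
draws = adaptive_rwm(lambda x: -0.5 * x * x, x0=0.0, n_iters=20000, seed=1)
```

Because the adaptation step decays at rate 1/i, this sampler adapts aggressively early on and behaves like an ordinary fixed-scale Metropolis sampler in the long run; the Bayesian schemes in the paper instead adapt by sampling proposal parameters from a posterior distribution.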


References

  1. Andrieu, C., Moulines, E.: On the ergodicity properties of some adaptive MCMC algorithms. Ann. Appl. Probab. 16, 1462–1505 (2006)

  2. Andrieu, C., Thoms, J.: A tutorial on adaptive MCMC. Stat. Comput. 18, 343–373 (2008)

  3. Gelman, A., Roberts, G.O., Gilks, W.R.: Efficient Metropolis jumping rules. Bayesian Statist. 5, 599–607 (1996)

  4. Haario, H., Laine, M., Mira, A., Saksman, E.: DRAM: efficient adaptive MCMC. Stat. Comput. 16, 339–354 (2006)

  5. Hastings, W.K.: Monte Carlo sampling methods using Markov chains and their applications. Biometrika 57, 97–109 (1970)

  6. Higdon, D.M.: Auxiliary variable methods for Markov chain Monte Carlo with applications. J. Amer. Statist. Assoc. 93, 585–595 (1998)

  7. Keith, J.M., Kroese, D.P., Bryant, D.: A generalized Markov sampler. Methodol. Comput. Appl. Probab. 6, 29–53 (2004)

  8. Keith, J.M., Kroese, D.P., Sofronov, G.Y.: Adaptive independence samplers. Stat. Comput. 18, 409–420 (2008)

  9. Metropolis, N., Rosenbluth, A.W., Rosenbluth, M.N., Teller, A.H., Teller, E.: Equation of state calculations by fast computing machines. J. Chem. Phys. 21, 1087–1092 (1953)

  10. R Development Core Team: R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna. ISBN 3-900051-07-0. http://www.R-project.org (2008)

  11. Roberts, G.O., Rosenthal, J.S.: Examples of adaptive MCMC. J. Comput. Graph. Statist. 18, 349–367 (2009)


Author information

Correspondence to Jonathan M. Keith.

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Keith, J.M., Davey, C.M. (2013). Bayesian Approaches to the Design of Markov Chain Monte Carlo Samplers. In: Dick, J., Kuo, F., Peters, G., Sloan, I. (eds) Monte Carlo and Quasi-Monte Carlo Methods 2012. Springer Proceedings in Mathematics & Statistics, vol 65. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41095-6_22
