
On adaptive Metropolis–Hastings methods


Abstract

This paper presents a method for adaptation in Metropolis–Hastings algorithms. A product of a proposal density and K copies of the target density is used to define a joint density, which is sampled by a Gibbs sampler that includes a Metropolis step. This provides a framework for adaptation, since the current values of all K copies of the target distribution can be used in the proposal distribution. The methodology is justified by standard Gibbs sampling theory and generalizes several previously proposed algorithms. It is particularly suited to Metropolis-within-Gibbs updating, and we discuss the application of our methods in this context. The method is illustrated with both a Metropolis–Hastings independence sampler and a Metropolis-within-Gibbs independence sampler. Comparisons are made with standard adaptive Metropolis–Hastings methods.
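To make the construction concrete, the sketch below (Python, using NumPy and SciPy) shows one toy realisation of the idea rather than the paper's exact algorithm: K copies of the target are maintained, and each copy is updated by a Metropolis–Hastings independence step whose Gaussian proposal is fitted to the current states of the other copies. Because the joint density factorises over the copies, the quantities driving the adaptation are held fixed during each update, which is the standard Gibbs-sampling justification the abstract alludes to. The bivariate Gaussian target, the value K = 20, and all tuning constants are illustrative assumptions, not values taken from the paper.

# Illustrative sketch only (not the authors' algorithm): K coupled copies of
# the target are updated in turn by a Metropolis-Hastings independence step
# whose Gaussian proposal is fitted to the current states of the OTHER copies.
# Since the joint density is a product over the copies, the proposal parameters
# depend only on variables held fixed during the update.

import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Illustrative target: a correlated bivariate Gaussian.
TARGET_COV = np.array([[1.0, 0.9], [0.9, 1.0]])
target = multivariate_normal(mean=np.zeros(2), cov=TARGET_COV)

K = 20          # number of copies of the target (arbitrary choice)
dim = 2
n_sweeps = 5000
jitter = 1e-6   # keeps the fitted proposal covariance positive definite

# Initialise the K copies from an over-dispersed starting distribution.
X = rng.normal(scale=3.0, size=(K, dim))
samples = []

for sweep in range(n_sweeps):
    for k in range(K):
        others = np.delete(X, k, axis=0)
        # Independence proposal fitted to the other K-1 copies.
        mu = others.mean(axis=0)
        Sigma = np.cov(others, rowvar=False) + jitter * np.eye(dim)
        proposal = multivariate_normal(mean=mu, cov=Sigma)

        x_new = proposal.rvs(random_state=rng)
        # Metropolis-Hastings acceptance ratio for an independence proposal.
        log_alpha = (target.logpdf(x_new) - target.logpdf(X[k])
                     + proposal.logpdf(X[k]) - proposal.logpdf(x_new))
        if np.log(rng.uniform()) < log_alpha:
            X[k] = x_new
    samples.append(X[0].copy())

samples = np.array(samples)
print("posterior mean estimate:", samples.mean(axis=0))

In this sketch the Gaussian fit is chosen only to keep the code short; the proposal could equally be any density indexed by the other copies, such as a kernel density estimate or a mixture fit.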



Author information

Correspondence to Jim E. Griffin.


About this article

Cite this article

Griffin, J.E., Walker, S.G. On adaptive Metropolis–Hastings methods. Stat Comput 23, 123–134 (2013). https://doi.org/10.1007/s11222-011-9296-2
