Asymptotically Independent Samplers

  • Luca Martino
  • David Luengo
  • Joaquín Míguez
Chapter
Part of the Statistics and Computing book series (SCO)

Abstract

Markov Chain Monte Carlo (MCMC) methods are possibly the most popular tools for random sampling nowadays. They generate “chains” (sequences) of samples from a target distribution that can be selected with few constraints. However, as highlighted by the term “chain,” the draws output by an MCMC method are statistically dependent (and often highly correlated), which makes such algorithms not directly comparable to the methods in the rest of this monograph. In this chapter, we describe two families of non-standard MCMC techniques that enjoy the property of producing samples that become asymptotically independent, either as a parameter grows to infinity or as the number of random draws in the algorithm is increased. The methods of the first family are based on generating a pool of candidate samples at each iteration of the chain, instead of only one as in conventional procedures. The techniques in the second family rely on an adaptive, non-parametric approximation of the target density, which is improved as new samples are generated. We describe the general methodology for the two families, and provide some specific algorithms as examples.
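To illustrate the first family (a pool of candidates per iteration), the following is a minimal sketch of an independent multiple-try Metropolis scheme, not the chapter's exact algorithm: it draws `n_tries` candidates from a fixed Gaussian proposal, selects one with probability proportional to its importance weight, and accepts it against the current state. All names (`imtm`, `log_target`, `scale`) are illustrative assumptions; as `n_tries` grows, the selected draws behave more like independent samples from the target.

```python
import numpy as np

def imtm(log_target, rng, n_iter=5000, n_tries=10, scale=3.0, x0=0.0):
    """Independent multiple-try Metropolis sketch with a N(0, scale^2) proposal.

    Each iteration draws a pool of candidates, selects one in proportion
    to its importance weight w = pi/q, and accepts it with a ratio that
    compares the pool's total weight with and without the current state.
    """
    def log_q(x):
        # Log-density of the Gaussian proposal N(0, scale^2)
        return -0.5 * (x / scale) ** 2 - np.log(scale * np.sqrt(2.0 * np.pi))

    chain = np.empty(n_iter)
    x = x0
    lw_x = log_target(x) - log_q(x)  # log importance weight of current state
    for t in range(n_iter):
        y = rng.normal(0.0, scale, size=n_tries)   # candidate pool
        lw = log_target(y) - log_q(y)              # log importance weights
        w = np.exp(lw - lw.max())
        j = rng.choice(n_tries, p=w / w.sum())     # select one candidate
        # Acceptance ratio: total weight of the pool vs. the pool with the
        # selected candidate replaced by the current state (log-sum-exp form).
        num = np.logaddexp.reduce(lw)
        den = np.logaddexp.reduce(np.append(np.delete(lw, j), lw_x))
        if np.log(rng.uniform()) < num - den:
            x, lw_x = y[j], lw[j]
        chain[t] = x
    return chain
```

For instance, with `log_target = lambda x: -0.5 * x**2` (a standard Gaussian up to a constant), the resulting chain's sample mean and standard deviation approach 0 and 1.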


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Luca Martino (1)
  • David Luengo (2)
  • Joaquín Míguez (1)
  1. Department of Signal Theory and Communications, Carlos III University of Madrid, Madrid, Spain
  2. Department of Signal Theory and Communications, Technical University of Madrid, Madrid, Spain