Advanced Topics in MCMC

Part of the Statistics and Computing book series (SCO)


The pace of research on MCMC methods is so quick that any survey of advanced topics is immediately obsolete. The highly eclectic and decidedly biased coverage in our final chapter begins with a discussion of Markov random fields. Our limited aims here are to prove the Hammersley-Clifford theorem and introduce the Swendsen-Wang algorithm, a clever form of slice sampling. In the Ising model, the Swendsen-Wang algorithm is much more efficient than standard Gibbs sampling.
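To make the comparison with single-site Gibbs sampling concrete, here is a minimal sketch of one Swendsen-Wang sweep for the zero-field Ising model on a periodic square lattice. The function name `swendsen_wang_sweep`, the coupling parameter `beta`, and the lattice conventions are illustrative assumptions for this sketch, not the chapter's own code.

```python
# A minimal sketch of one Swendsen-Wang sweep for the zero-field Ising model
# on an L x L lattice with periodic boundaries. Names and conventions here
# (swendsen_wang_sweep, beta) are assumptions for illustration.
import numpy as np

def swendsen_wang_sweep(spins, beta, rng):
    """One Swendsen-Wang update: bond formation, cluster labeling, cluster flips."""
    L = spins.shape[0]
    parent = np.arange(L * L)  # union-find structure over the L*L sites

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[rj] = ri

    p_bond = 1.0 - np.exp(-2.0 * beta)  # bond probability for aligned neighbors
    for x in range(L):
        for y in range(L):
            i = x * L + y
            for dx, dy in ((1, 0), (0, 1)):  # right and down neighbors, each edge once
                xn, yn = (x + dx) % L, (y + dy) % L
                j = xn * L + yn
                if spins[x, y] == spins[xn, yn] and rng.random() < p_bond:
                    union(i, j)

    # Flip each connected cluster to +1 or -1 uniformly at random.
    roots = np.array([find(i) for i in range(L * L)])
    new_value = {r: rng.choice([-1, 1]) for r in np.unique(roots)}
    return np.array([new_value[r] for r in roots]).reshape(L, L)

# Usage: a few sweeps near the 2D critical coupling beta_c = ln(1 + sqrt(2))/2 ~ 0.4407.
rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(16, 16))
for _ in range(10):
    spins = swendsen_wang_sweep(spins, 0.44, rng)
print(spins.mean())  # magnetization per site
```

Because each sweep flips entire clusters of correlated spins at once rather than one site at a time, this construction mixes far faster than single-site Gibbs sampling near the critical temperature, which is the point made in the chapter.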


Keywords: Markov chain, Gibbs sampling, transition probability matrix, total variation distance





Copyright information

© Springer New York 2010

Authors and Affiliations

  1. Departments of Biomathematics, Human Genetics, and Statistics, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, USA
