Statistics and Computing, Volume 16, Issue 2, pp. 193–202

Markov Chain Monte Carlo in small worlds

  • Yongtao Guan (Department of Mathematics, University of Idaho)
  • Roland Fleißner (Department of Mathematics, University of Idaho)
  • Paul Joyce (Department of Mathematics and Department of Statistics, University of Idaho; corresponding author)
  • Stephen M. Krone (Department of Mathematics, University of Idaho)


As the number of applications for Markov Chain Monte Carlo (MCMC) grows, both the power of these methods and their shortcomings become more apparent. While MCMC yields an almost automatic way to sample a space according to some distribution, its implementations often fall short of this task, as they may lead to chains that converge too slowly or get trapped within one mode of a multi-modal space. Moreover, it may be difficult to determine whether a chain is only sampling a certain area of the space or has indeed reached stationarity.

In this paper, we show how a simple modification of the proposal mechanism results in faster convergence of the chain and helps to circumvent the problems described above. This mechanism, which is based on an idea from the field of “small-world” networks, amounts to adding occasional “wild” proposals to any local proposal scheme. We demonstrate through both theory and extensive simulations that these new proposal distributions can greatly outperform the traditional local proposals when it comes to exploring complex heterogeneous spaces and multi-modal distributions. Our method can easily be applied to most, if not all, problems involving MCMC, and unlike many other remedies that improve the performance of MCMC, it preserves the simplicity of the underlying algorithm.
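To make the idea concrete, here is a minimal sketch of a Metropolis-Hastings sampler whose proposal is a small-world-style mixture: with a small probability a "wild" long-range move is proposed, otherwise an ordinary local move. This is not the authors' exact scheme; the bimodal target, the mixing probability `p_wild`, and the two step sizes are illustrative assumptions. Because both mixture components are symmetric random-walk kernels, the full mixture proposal is symmetric, so the acceptance ratio reduces to the ratio of target densities.

```python
import math
import random

def target(x):
    """Unnormalized bimodal density: two well-separated Gaussian modes
    (an illustrative stand-in for a multi-modal posterior)."""
    return math.exp(-0.5 * (x + 5.0) ** 2) + math.exp(-0.5 * (x - 5.0) ** 2)

def small_world_mh(n_steps, p_wild=0.1, local_sd=0.5, wild_sd=20.0, seed=0):
    """Metropolis-Hastings with a small-world mixture proposal.

    With probability p_wild a 'wild' (long-range) Gaussian step is
    proposed; otherwise a local Gaussian step. Both components are
    symmetric, so the Hastings correction cancels and the move is
    accepted with probability min(1, target(y) / target(x)).
    """
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_steps):
        sd = wild_sd if rng.random() < p_wild else local_sd
        y = x + rng.gauss(0.0, sd)        # propose local or wild move
        if rng.random() < min(1.0, target(y) / target(x)):
            x = y                          # accept; otherwise stay put
        samples.append(x)
    return samples
```

With `p_wild = 0`, the chain is a purely local random walk and, once it settles into one of the two modes at ±5, it essentially never crosses the low-density region between them; the occasional wild proposals provide the "shortcut" jumps that let the chain hop directly between modes.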


Keywords: Markov Chain Monte Carlo · Metropolis-Hastings algorithm · Proposal distributions · Small-world networks · Importance sampling