Machine Learning, Volume 50, Issue 1–2, pp 175–196

Population Markov Chain Monte Carlo

  • Kathryn Blackmond Laskey
  • James W. Myers


Abstract

Stochastic search algorithms inspired by physical and biological systems are applied to the problem of learning directed graphical probability models in the presence of missing observations and hidden variables. For this class of problems, deterministic search algorithms tend to halt at local optima, requiring random restarts to obtain solutions of acceptable quality. We compare three stochastic search algorithms: a Metropolis-Hastings sampler (MHS), an evolutionary algorithm (EA), and a new hybrid algorithm called Population Markov Chain Monte Carlo, or popMCMC. PopMCMC uses statistical information from a population of MHSs to inform the proposal distributions of the individual samplers in the population. Experimental results show that popMCMC and EAs learn more efficiently than an MHS with no information exchange. Populations of MCMC samplers exhibit more diversity than populations evolving under EAs that do not satisfy physics-inspired local reversibility conditions.
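The population-based idea described in the abstract can be sketched in miniature: a population of Metropolis-Hastings samplers runs in parallel, and a population-level statistic informs each sampler's proposal distribution. The sketch below is an illustrative assumption, not the paper's actual construction — it uses the population's empirical spread to set the proposal width, and targets a toy one-dimensional density rather than a graphical-model score.

```python
import math
import random

def mh_step(x, log_score, propose, rng):
    """One Metropolis-Hastings step with a symmetric proposal.

    Accept the candidate with probability min(1, p(y)/p(x)); in log
    space this means comparing log(u) to the log-score difference.
    """
    y = propose(x, rng)
    if math.log(rng.random()) < log_score(y) - log_score(x):
        return y
    return x

def population_mcmc(log_score, pop_size, n_steps, seed=0):
    """Run a population of MH samplers whose proposal width is set from
    the population's spread (an illustrative stand-in for the
    population statistics popMCMC uses to shape proposals)."""
    rng = random.Random(seed)
    pop = [rng.gauss(0.0, 3.0) for _ in range(pop_size)]
    for _ in range(n_steps):
        mean = sum(pop) / pop_size
        spread = math.sqrt(sum((x - mean) ** 2 for x in pop) / pop_size) or 1.0
        # Each sampler proposes a Gaussian move scaled by the
        # population spread, then takes an ordinary MH accept/reject.
        propose = lambda x, r, s=spread: x + r.gauss(0.0, s)
        pop = [mh_step(x, log_score, propose, rng) for x in pop]
    return pop

# Toy usage: a population of 20 samplers targeting a standard normal.
final_pop = population_mcmc(lambda x: -0.5 * x * x, pop_size=20, n_steps=500)
```

In the paper's setting the state of each sampler is a graphical-model structure and the score is a Bayesian network score, but the accept/reject mechanics and the population-informed proposal follow the same pattern.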

Keywords: Markov chain Monte Carlo · Metropolis-Hastings algorithm · graphical probabilistic models · Bayesian networks · Bayesian learning · evolutionary algorithms

Copyright information

© Kluwer Academic Publishers 2003

Authors and Affiliations

  • Kathryn Blackmond Laskey (1)
  • James W. Myers (2)
  1. Department of Systems Engineering and Operations Research, George Mason University, Fairfax, USA
  2. TRW, VAR1/9D02, Reston
