Error Bounds Between Marginal Probabilities and Beliefs of Loopy Belief Propagation Algorithm

  • Nobuyuki Taga
  • Shigeru Mase
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4293)

Abstract

The belief propagation (BP) algorithm has become an increasingly popular method for probabilistic inference on general graphical models. When the network has loops, the algorithm is called loopy BP (LBP); it may fail to converge and, even if it converges, the beliefs it produces, i.e., the results of the algorithm, may not equal the exact marginal probabilities. Tatikonda and Jordan applied the theory of Gibbs measures to the LBP algorithm and derived a sufficient condition for convergence. In this paper, we use Gibbs measure theory to investigate the discrepancy between a marginal probability and the corresponding belief. In particular, we obtain an error bound that holds when the algorithm converges under a certain condition, which gives a general result on the accuracy of the algorithm. We also perform numerical experiments to examine the effectiveness of the result.
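
As a minimal illustrative sketch (not taken from the paper), the following Python snippet runs loopy belief propagation on a small three-node binary pairwise model containing a single loop and compares the resulting beliefs with exact marginals computed by brute force. The graph, the random potentials, and the plain parallel sum-product updates (functions loopy_bp and exact_marginals) are assumptions made purely for illustration of the belief-versus-marginal discrepancy discussed above.

    # Illustrative sketch only: loopy BP on a 3-node binary pairwise model with one loop,
    # compared against exact marginals. Potentials and graph are arbitrary assumptions.
    import itertools
    import numpy as np

    nodes = [0, 1, 2]
    edges = [(0, 1), (1, 2), (0, 2)]          # a single loop
    np.random.seed(0)
    node_pot = {i: np.random.rand(2) + 0.5 for i in nodes}      # phi_i(x_i)
    edge_pot = {e: np.random.rand(2, 2) + 0.5 for e in edges}   # psi_ij(x_i, x_j)

    def exact_marginals():
        # Brute-force marginals of p(x) proportional to prod_i phi_i * prod_ij psi_ij.
        marg = {i: np.zeros(2) for i in nodes}
        for x in itertools.product([0, 1], repeat=len(nodes)):
            w = np.prod([node_pot[i][x[i]] for i in nodes])
            w *= np.prod([edge_pot[(i, j)][x[i], x[j]] for (i, j) in edges])
            for i in nodes:
                marg[i][x[i]] += w
        return {i: m / m.sum() for i, m in marg.items()}

    def loopy_bp(n_iter=100):
        # Parallel sum-product message passing; belief b_i(x_i) proportional to
        # phi_i(x_i) times the product of incoming messages.
        msgs = {(i, j): np.ones(2) for (i, j) in edges}
        msgs.update({(j, i): np.ones(2) for (i, j) in edges})
        nbrs = {i: [j for j in nodes if (i, j) in msgs] for i in nodes}
        for _ in range(n_iter):
            new = {}
            for (i, j) in msgs:
                psi = edge_pot[(i, j)] if (i, j) in edge_pot else edge_pot[(j, i)].T
                # m_{i->j}(x_j) = sum_{x_i} phi_i(x_i) psi_ij(x_i,x_j) prod_{k!=j} m_{k->i}(x_i)
                prod_in = node_pot[i] * np.prod([msgs[(k, i)] for k in nbrs[i] if k != j], axis=0)
                m = psi.T @ prod_in
                new[(i, j)] = m / m.sum()
            msgs = new
        beliefs = {}
        for i in nodes:
            b = node_pot[i] * np.prod([msgs[(k, i)] for k in nbrs[i]], axis=0)
            beliefs[i] = b / b.sum()
        return beliefs

    exact = exact_marginals()
    bel = loopy_bp()
    for i in nodes:
        err = np.abs(exact[i] - bel[i]).max()
        print(i, "exact:", np.round(exact[i], 4), "belief:", np.round(bel[i], 4), "error:", np.round(err, 4))

For each node, the script prints the exact marginal, the LBP belief, and their maximum absolute difference, which is the kind of discrepancy that the error bound discussed in this paper is meant to control.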

Keywords

Ising model · Error bound · Belief propagation · Gibbs measure · Marginal probability
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.

References

  1. Frey, B.J.: Graphical Models for Machine Learning and Digital Communication. MIT Press, Cambridge (1998)
  2. Georgii, H.-O.: Gibbs Measures and Phase Transitions. Walter de Gruyter, Berlin, New York (1988)
  3. Jensen, F.: An Introduction to Bayesian Networks. UCL Press, London (1996)
  4. McEliece, R.J., MacKay, D.J.C., Cheng, J.F.: Turbo Decoding as an Instance of Pearl's "Belief Propagation" Algorithm. IEEE Journal on Selected Areas in Communications 16(2), 140–152 (1998)
  5. Murphy, K.P., Weiss, Y., Jordan, M.I.: Loopy Belief Propagation for Approximate Inference: An Empirical Study. In: Proc. of the 15th Conf. on Uncertainty in Artificial Intelligence, pp. 467–475. Morgan Kaufmann, San Francisco (1999)
  6. Pearl, J.: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, San Francisco (1988)
  7. Taga, N., Mase, S.: On the Convergence of Loopy Belief Propagation Algorithm for Different Update Rules. IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences E89-(2), 575–582 (2006)
  8. Tatikonda, S.C., Jordan, M.I.: Loopy Belief Propagation and Gibbs Measures. In: Proc. of the 18th Conf. on Uncertainty in Artificial Intelligence, pp. 493–500. Morgan Kaufmann, San Francisco (2002)
  9. Weiss, Y.: Correctness of Local Probability Propagation in Graphical Models with Loops. Neural Computation 12, 1–41 (2000)
  10. Weiss, Y., Freeman, W.T.: Correctness of Belief Propagation in Gaussian Graphical Models of Arbitrary Topology. Neural Computation 12, 2173–2200 (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Nobuyuki Taga, Tokyo Institute of Technology, Tokyo, Japan
  • Shigeru Mase, Tokyo Institute of Technology, Tokyo, Japan