Error Bounds Between Marginal Probabilities and Beliefs of Loopy Belief Propagation Algorithm
The belief propagation (BP) algorithm has become an increasingly popular method for probabilistic inference on general graphical models. When a network contains loops, the algorithm is called loopy BP (LBP); it may fail to converge, and even when it does converge, the beliefs it produces may differ from the exact marginal probabilities. Tatikonda and Jordan applied the theory of Gibbs measures to the LBP algorithm and derived a sufficient condition for convergence. In this paper, we use Gibbs measure theory to investigate the discrepancy between a marginal probability and the corresponding belief. In particular, we obtain an error bound that holds whenever the algorithm converges under a certain condition, which gives a general guarantee on the accuracy of the algorithm. We also perform numerical experiments to examine the effectiveness of the result.
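To make the quantity studied in the paper concrete, the following sketch runs loopy BP on a tiny binary Ising model (a 3-node cycle) and measures the gap between the converged beliefs and the exact marginals obtained by brute-force enumeration. This is an illustration only: the couplings `J`, fields `h`, graph, and iteration count are arbitrary demo choices, not values or bounds from the paper.

```python
import itertools
import numpy as np

states = [-1.0, 1.0]                       # Ising spins
edges = [(0, 1), (1, 2), (0, 2)]           # a single cycle, so BP is "loopy"
J = 0.4                                    # uniform pairwise coupling (demo value)
h = [0.3, -0.2, 0.1]                       # local external fields (demo values)
n = 3
nbrs = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

node_pot = lambda i, s: np.exp(h[i] * states[s])       # psi_i(x_i) = exp(h_i x_i)
edge_pot = lambda si, sj: np.exp(J * states[si] * states[sj])  # exp(J x_i x_j)

# m[(i, j)][sj]: message from node i to node j, as a function of x_j
m = {(i, j): np.ones(2) / 2 for i in range(n) for j in nbrs[i]}

for _ in range(200):                       # synchronous updates toward a fixed point
    m_new = {}
    for (i, j) in m:
        out = np.zeros(2)
        for sj in range(2):
            out[sj] = sum(
                node_pot(i, si) * edge_pot(si, sj) *
                np.prod([m[(k, i)][si] for k in nbrs[i] if k != j])
                for si in range(2))
        m_new[(i, j)] = out / out.sum()    # normalize for numerical stability
    m = m_new

# belief b_i(x_i) ∝ psi_i(x_i) * Π_{k in N(i)} m_{k→i}(x_i)
beliefs = np.array([[node_pot(i, s) * np.prod([m[(k, i)][s] for k in nbrs[i]])
                     for s in range(2)] for i in range(n)])
beliefs /= beliefs.sum(axis=1, keepdims=True)

# exact marginals by summing over all 2^3 configurations
weights = {x: np.prod([node_pot(i, x[i]) for i in range(n)]) *
              np.prod([edge_pot(x[a], x[b]) for (a, b) in edges])
           for x in itertools.product(range(2), repeat=n)}
Z = sum(weights.values())
exact = np.zeros((n, 2))
for x, w in weights.items():
    for i in range(n):
        exact[i, x[i]] += w / Z

# the discrepancy the paper's error bound controls
max_err = np.abs(beliefs - exact).max()
print(f"max |belief - marginal| = {max_err:.4f}")
```

With weak couplings like these, LBP converges and the discrepancy is small but nonzero; the paper's contribution is a bound on exactly this kind of error when a convergence condition holds.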
Keywords: Ising Model · Error Bound · Belief Propagation · Gibbs Measure · Marginal Probability
- 1. Frey, B.J.: Graphical Models for Pattern Classification, Data Compression and Channel Coding. MIT Press, Cambridge (1998)
- 3. Jensen, F.: An Introduction to Bayesian Networks. UCL Press, London (1996)
- 5. Murphy, K.P., Weiss, Y., Jordan, M.I.: Loopy belief propagation for approximate inference: an empirical study. In: Proceedings of the 15th Conference on Uncertainty in Artificial Intelligence, pp. 467–475. Morgan Kaufmann, San Francisco (1999)
- 6. Pearl, J.: Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann, San Francisco (1988)
- 8. Tatikonda, S.C., Jordan, M.I.: Loopy Belief Propagation and Gibbs Measures. In: Proceedings of the 18th Conference on Uncertainty in Artificial Intelligence, pp. 493–500. Morgan Kaufmann, San Francisco (2002)