
Factor graph fragmentization of expectation propagation

  • Research Article
  • Journal of the Korean Statistical Society

Abstract

Expectation propagation is a general approach to fast approximate inference for graphical models. The existing literature treats models separately when deriving and coding expectation propagation inference algorithms, at the cost of similar, long-winded algebraic steps being repeated, which slows algorithmic development. We demonstrate how factor graph fragmentization overcomes this impediment. The approach involves adopting the message-passing-on-a-factor-graph formulation of expectation propagation and identifying factor graph sub-graphs, which we call fragments, that are common to wide classes of models. Key fragments and their corresponding messages are catalogued, so their algebra does not need to be re-derived for each new model. This allows compartmentalization of coding and efficient software development.
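To make the fragment idea concrete, the following is a minimal sketch of expectation propagation organized around reusable fragments, assuming a univariate Gaussian approximating family and natural parameters (precision, precision × mean). The class and function names (GaussianLikelihoodFragment, ep_update) are hypothetical illustrations, not the fragments catalogued in the paper.

```python
# A minimal sketch of the "fragment" idea for expectation propagation (EP).
# Assumes a univariate Gaussian approximating family; all names here are
# illustrative, not the authors' catalogued fragments.

class GaussianLikelihoodFragment:
    """Fragment for a factor p(y | theta) = N(y; theta, noise_var).

    Its EP message is itself Gaussian, so the moment-matching step is
    exact and the message can be derived once and reused in any model
    containing this sub-graph.
    """

    def __init__(self, y, noise_var):
        self.y = y
        self.noise_var = noise_var

    def message(self, cavity_prec, cavity_mean_prec):
        # Message in natural parameters (precision, precision * mean).
        # For a Gaussian factor the update ignores the cavity; a
        # non-Gaussian fragment would moment-match a tilted density here.
        prec = 1.0 / self.noise_var
        return prec, prec * self.y


def ep_update(fragments, prior_prec, prior_mean_prec, n_sweeps=5):
    """Cycle over fragments, refreshing one site message at a time."""
    sites = [(0.0, 0.0) for _ in fragments]  # initial site messages
    for _ in range(n_sweeps):
        for i, frag in enumerate(fragments):
            # Full approximate posterior in natural parameters.
            q_prec = prior_prec + sum(s[0] for s in sites)
            q_mp = prior_mean_prec + sum(s[1] for s in sites)
            # Cavity distribution: posterior with site i removed.
            cav_prec = q_prec - sites[i][0]
            cav_mp = q_mp - sites[i][1]
            sites[i] = frag.message(cav_prec, cav_mp)
    q_prec = prior_prec + sum(s[0] for s in sites)
    q_mp = prior_mean_prec + sum(s[1] for s in sites)
    return q_mp / q_prec, 1.0 / q_prec  # posterior mean and variance


# Usage: posterior for a Gaussian mean under a N(0, 10) prior.
data = [1.2, 0.7, 1.5]
frags = [GaussianLikelihoodFragment(y, noise_var=1.0) for y in data]
mean, var = ep_update(frags, prior_prec=0.1, prior_mean_prec=0.0)
print(mean, var)  # exact answer, since every factor is Gaussian
```

Because the Gaussian fragment's message does not depend on the cavity, this example converges in one sweep; a fragment with a non-Gaussian factor (e.g. a probit or Student-t likelihood) would instead moment-match a tilted distribution inside message(), while the surrounding ep_update loop stays unchanged, which is the compartmentalization the abstract describes.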



Acknowledgements

This research was supported by Australian Research Council Discovery Project DP180100597.

Author information


Corresponding author

Correspondence to Matt P. Wand.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (pdf 245 KB)


About this article


Cite this article

Chen, W.Y., Wand, M.P. Factor graph fragmentization of expectation propagation. J. Korean Stat. Soc. 49, 722–756 (2020). https://doi.org/10.1007/s42952-019-00033-9

