Dealing with Continuous Variables in Graphical Models

  • Christophe Gonzales
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11940)


Uncertain reasoning over both continuous and discrete random variables is important for many applications in artificial intelligence. Unfortunately, dealing with continuous variables is not an easy task. In this tutorial, we will study some of the methods and models developed in the literature for this purpose. We will start with the discretization of continuous random variables. A special focus will be placed on the numerous issues it raises, ranging from which discretization criterion to use to the appropriate way of exploiting discretizations during structure learning. These issues will justify the use of hybrid models designed to encode mixed probability distributions. Several such models have been proposed in the literature. Among them, Conditional Linear Gaussian models are very popular. They support very efficient inference, but they lack flexibility in the sense that they require the continuous random variables to follow conditional Normal distributions and to be related to other variables through linear relations. Other popular models are mixtures of truncated exponentials, mixtures of polynomials and mixtures of truncated basis functions. Through a clever use of mixtures of distributions, these models can approximate arbitrary mixed probability distributions very well. However, exact inference in these models can be very time consuming. Therefore, when choosing which model to exploit, one has to trade off the flexibility of the uncertainty model against the computational complexity of its learning and inference mechanisms.
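To make the Conditional Linear Gaussian assumption concrete, the sketch below shows a CLG conditional distribution for a continuous child Y with one discrete parent D and one continuous parent X: for each value d of D, Y follows a Normal distribution whose mean is a linear function of x. This is a minimal illustration with made-up parameter values, not code from the tutorial.

```python
import math
import random

# Illustrative CLG parameters: for each state d of the discrete parent D,
# Y | D=d, X=x  ~  N(a[d] + b[d]*x, sigma[d]^2).
# The numbers below are arbitrary, chosen only for the example.
PARAMS = {
    "low":  {"a": 0.0, "b": 1.0, "sigma": 1.0},
    "high": {"a": 2.0, "b": 0.5, "sigma": 0.5},
}

def clg_mean(d, x):
    """Conditional mean of Y: linear in the continuous parent x."""
    p = PARAMS[d]
    return p["a"] + p["b"] * x

def clg_density(y, d, x):
    """Conditional Normal density of Y given D=d and X=x."""
    p = PARAMS[d]
    z = (y - clg_mean(d, x)) / p["sigma"]
    return math.exp(-0.5 * z * z) / (p["sigma"] * math.sqrt(2.0 * math.pi))

def clg_sample(d, x, rng=random):
    """Draw one sample of Y given D=d and X=x."""
    return rng.gauss(clg_mean(d, x), PARAMS[d]["sigma"])
```

The flexibility limitation mentioned above is visible here: whatever the data look like, the mean must be linear in x and the noise Gaussian; mixtures of truncated exponentials, polynomials, or basis functions relax exactly these restrictions, at the price of costlier exact inference.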


Keywords: Continuous variable · Hybrid graphical model · Discretization



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Aix-Marseille Université, CNRS, LIS, Marseille, France
