Learning Conditional Distributions Using Mixtures of Truncated Basis Functions
Abstract
Mixtures of Truncated Basis Functions (MoTBFs) have recently been proposed for modelling univariate and joint distributions in hybrid Bayesian networks. In this paper we analyse the problem of learning conditional MoTBF distributions from data. Our approach builds on a new technique for learning joint MoTBF densities, and then proposes a method for using these joint densities to generate the conditional distributions. The main contribution of this work is an empirical investigation into the properties of the new learning procedure, in which we also compare our approach with existing proposals.
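As a sketch of the underlying idea (the exact construction used in the paper may differ), a conditional density can in principle be recovered from a learned joint MoTBF density and the corresponding marginal of the conditioning variables:

$$ f(y \mid \mathbf{x}) \;=\; \frac{f(\mathbf{x}, y)}{f(\mathbf{x})}, \qquad f(\mathbf{x}) = \int f(\mathbf{x}, y)\,\mathrm{d}y. $$

Since the class of MoTBFs is not closed under this division, practical learning procedures typically approximate the conditional, or work with the learned joint directly, rather than compute the ratio exactly.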
Keywords
Mixtures of truncated basis functions · Hybrid Bayesian networks · Joint density · Conditional density
Acknowledgments
This research has been partly funded by the Spanish Ministry of Economy and Competitiveness through project TIN2013-46638-C3-1-P, by Junta de Andalucía through project P11-TIC-7821, and by ERDF funds. Part of this work was performed within the AMIDST project, which has received funding from the European Union's Seventh Framework Programme for research, technological development and demonstration under grant agreement no. 619209.