Abstract
We use a Markov chain Monte Carlo (MCMC) MML algorithm to learn hybrid Bayesian networks from observational data. Hybrid networks represent local structure using conditional probability tables (CPTs), logit models, decision trees, or hybrid models, i.e., combinations of the three. We compare this method with alternative local-structure learning algorithms using the MDL and BDe metrics. Results are presented for both real and artificial data sets. Hybrid models compare favourably with other local-structure learners, allowing simple representations given limited data combined with richer representations given massive data.
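To illustrate the general idea of MCMC search guided by an MML-style score, here is a toy sketch, not the authors' algorithm: a Metropolis sampler over the parent set of a single binary variable, scored by a crude two-part message length for a full CPT. The function names, the parameter-cost term, and the single-node restriction are all illustrative assumptions, not details from the paper.

```python
import math
import random

def cpt_code_length(data, child, parents):
    """Crude two-part code length (bits) for a binary child
    modelled by a full CPT over the given parent set."""
    configs = {}
    for row in data:
        key = tuple(row[p] for p in parents)
        counts = configs.setdefault(key, [0, 0])
        counts[row[child]] += 1
    # First part: state the CPT parameters, one free parameter per
    # parent configuration; 0.5*log2(N+1) bits each is a rough cost.
    param_bits = 0.5 * (2 ** len(parents)) * math.log2(len(data) + 1)
    # Second part: encode the data with Laplace-smoothed estimates.
    data_bits = 0.0
    for counts in configs.values():
        total = sum(counts)
        for c in counts:
            data_bits += -c * math.log2((c + 1) / (total + 2))
    return param_bits + data_bits

def mcmc_parent_set(data, child, candidates, steps=2000, seed=0):
    """Metropolis sampler over parent sets; posterior ~ 2^(-message length)."""
    rng = random.Random(seed)
    current = frozenset()
    cur_len = cpt_code_length(data, child, current)
    best, best_len = current, cur_len
    for _ in range(steps):
        proposal = current ^ {rng.choice(candidates)}  # toggle one arc
        prop_len = cpt_code_length(data, child, proposal)
        # Accept shorter messages always, longer ones with prob 2^(-increase).
        if prop_len <= cur_len or rng.random() < 2 ** (cur_len - prop_len):
            current, cur_len = proposal, prop_len
            if cur_len < best_len:
                best, best_len = current, cur_len
    return best, best_len

# Demo: child (column 2) copies column 0 with 10% noise; column 1 is irrelevant.
gen = random.Random(1)
data = []
for _ in range(300):
    x0, x1 = gen.randint(0, 1), gen.randint(0, 1)
    data.append((x0, x1, x0 if gen.random() > 0.1 else 1 - x0))
best, best_len = mcmc_parent_set(data, child=2, candidates=[0, 1])
```

The sampler recovers the true parent (column 0) and rejects the irrelevant one, because adding a spurious parent doubles the CPT's parameter cost without shortening the data part of the message. The paper's actual method additionally samples among CPT, logit, and decision-tree local representations.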
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
O’Donnell, R.T., Allison, L., Korb, K.B. (2006). Learning Hybrid Bayesian Networks by MML. In: Sattar, A., Kang, Bh. (eds) AI 2006: Advances in Artificial Intelligence. AI 2006. Lecture Notes in Computer Science(), vol 4304. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11941439_23
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-49787-5
Online ISBN: 978-3-540-49788-2
eBook Packages: Computer Science (R0)