Compiling Bayesian Networks for Parameter Learning Based on Shared BDDs

  • Masakazu Ishihata
  • Taisuke Sato
  • Shin-ichi Minato
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7106)


Abstract

Compiling Bayesian networks (BNs) is one of the most effective approaches to exact inference because a logical representation enables the exploitation of local structures in BNs (i.e., determinism and context-specific independence). In this paper, a new parameter learning method based on compiling BNs is proposed. First, a target BN together with multiple evidence sets is compiled into a single shared binary decision diagram (SBDD), which shares common sub-graphs among multiple BDDs. Second, all conditional expectations required for executing the EM algorithm are computed simultaneously on the SBDD, sharing their common local probabilities and expectations. Owing to these two levels of sharing, the proposed method is computationally more efficient than an EM algorithm that naively uses an existing BN compiler for exact inference.
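The sharing idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the tuple-based node encoding, the `prob` function, and the example formulas are assumptions made for illustration. A BDD node is a tuple `(var, lo, hi)` with boolean terminals; because structurally identical tuples compare equal, a single memo table reuses the local probability of every common sub-graph across several root functions, loosely mimicking how the SBDD shares computations among multiple evidence sets.

```python
# Hypothetical sketch (not the paper's algorithm): bottom-up probability
# computation on BDDs with a memo table shared across queries.
# A node is (var, lo, hi); terminals are the booleans True / False.

def prob(node, theta, memo):
    """P(f = 1) when each variable v is independently True with prob. theta[v]."""
    if node is True:
        return 1.0
    if node is False:
        return 0.0
    if node in memo:            # common sub-graph: reuse its local probability
        return memo[node]
    var, lo, hi = node
    p = (1.0 - theta[var]) * prob(lo, theta, memo) \
        + theta[var] * prob(hi, theta, memo)
    memo[node] = p
    return p

# Example: f1 = x1 OR x2; the BDD of f2 = x2 is a sub-graph of f1's BDD.
x2 = ('x2', False, True)
f1 = ('x1', x2, True)
theta = {'x1': 0.5, 'x2': 0.5}

memo = {}                       # one table shared by both queries
p1 = prob(f1, theta, memo)      # 0.75
p2 = prob(x2, theta, memo)      # 0.5, answered directly from the shared memo
```

In the paper's setting, an analogous top-down (outside) pass over the same shared structure yields the conditional expectations needed by the EM algorithm, so both the graph and the intermediate quantities are computed once and reused.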


Keywords: Bayesian Network · Local Structure · Boolean Function · Conditional Expectation · Parameter Learning





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Masakazu Ishihata (1)
  • Taisuke Sato (1)
  • Shin-ichi Minato (2)
  1. Tokyo Institute of Technology, Tokyo, Japan
  2. Hokkaido University, Sapporo, Japan
