Compiling Bayesian Networks for Parameter Learning Based on Shared BDDs
Compiling Bayesian networks (BNs) is one of the most effective approaches to exact inference because a logical approach enables the exploitation of local structures in BNs (i.e., determinism and context-specific independence). In this paper, a new parameter learning method based on compiling BNs is proposed. First, a target BN with multiple evidence sets is compiled into a single shared binary decision diagram (SBDD), which shares common sub-graphs among multiple BDDs. Second, all conditional expectations required for executing the EM algorithm are computed simultaneously on the SBDD, while their common local probabilities and expectations are shared. Owing to these two types of sharing, the proposed method is more computationally efficient than an EM algorithm that naively uses an existing BN compiler for exact inference.
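The two kinds of sharing described above can be illustrated with a minimal sketch. The following is a hypothetical Python illustration, not the paper's implementation: BDD nodes are tuples, several roots (one per evidence set) share sub-graphs, and a single memo table reuses the local probability of every shared sub-graph across all roots. All names (`prob`, `theta`, the tuple node encoding) are assumptions made for this sketch.

```python
# Hypothetical sketch of probability computation on a shared BDD (SBDD).
# A node is either a terminal (True/False) or a tuple
# ("var", name, lo, hi) with lo/hi the 0-/1-branches.
# Multiple roots share sub-graphs; the memo table `cache` shares
# local probabilities across all roots, mimicking the sharing idea.

def prob(node, theta, cache):
    """Probability that the Boolean function rooted at `node` is true,
    given theta[name] = P(variable `name` is true)."""
    if node is True:
        return 1.0
    if node is False:
        return 0.0
    key = id(node)
    if key in cache:            # shared sub-graph: reuse its probability
        return cache[key]
    _, name, lo, hi = node
    p = theta[name] * prob(hi, theta, cache) \
        + (1.0 - theta[name]) * prob(lo, theta, cache)
    cache[key] = p
    return p

# Two evidence BDDs sharing the sub-graph `g` for variable y:
g = ("var", "y", False, True)   # the function "y"
f1 = ("var", "x", False, g)     # x AND y
f2 = ("var", "x", g, True)      # x OR y
theta = {"x": 0.6, "y": 0.3}
cache = {}                      # one table shared across both roots
p1 = prob(f1, theta, cache)     # 0.6 * 0.3        = 0.18
p2 = prob(f2, theta, cache)     # 0.6 + 0.4 * 0.3  = 0.72
```

Because `g` is visited once and cached, the second root reuses its probability instead of recomputing it; in the paper, the same traversal structure also accumulates the conditional expectations needed by each EM iteration.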