
Adaptive Inference on Probabilistic Relational Models

  • Tanya Braun
  • Ralf Möller
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11320)

Abstract

Standard approaches for inference in probabilistic relational models include lifted variable elimination (LVE) for single queries. To handle multiple queries efficiently, the lifted junction tree algorithm (LJT) uses a first-order cluster representation of a model and employs LVE as a subroutine in its steps. Adaptive inference concerns efficient inference under changes in a model. If the model changes, LJT restarts, possibly discarding information unnecessarily. The purpose of this paper is twofold: (i) to adapt the cluster representation to incremental changes, and (ii) to transform LJT into an adaptive version, enabling LJT to preserve as many computations as possible. Adaptive LJT quickly returns to answering queries after changes, which is especially important for time-critical applications or online query answering.
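
The adaptation idea can be pictured on an ordinary cluster tree: when a change affects one cluster, only the messages directed away from that cluster become stale, while all other cached computations stay valid. The Python sketch below illustrates this invalidation step on a generic cluster tree; it is a minimal illustration under these assumptions, not the authors' algorithm, and all names (JunctionTree, neighbours, messages, invalidate_from) are hypothetical.

    from collections import defaultdict

    class JunctionTree:
        """Generic cluster tree with cached messages between adjacent clusters."""

        def __init__(self):
            self.neighbours = defaultdict(set)  # cluster -> adjacent clusters
            self.messages = {}                  # (sender, receiver) -> cached message

        def add_edge(self, a, b):
            self.neighbours[a].add(b)
            self.neighbours[b].add(a)

        def invalidate_from(self, changed_cluster):
            # Messages directed away from the changed cluster depend on its
            # (now outdated) local model and must be recomputed; messages
            # directed towards it remain valid and are kept in the cache.
            stale = []
            stack = [(changed_cluster, None)]
            while stack:
                node, parent = stack.pop()
                for nb in self.neighbours[node]:
                    if nb != parent:
                        stale.append((node, nb))
                        stack.append((nb, node))
            for edge in stale:
                self.messages.pop(edge, None)
            return stale

    if __name__ == "__main__":
        jt = JunctionTree()
        jt.add_edge("C1", "C2")
        jt.add_edge("C2", "C3")
        jt.messages = {("C1", "C2"): "m12", ("C2", "C1"): "m21",
                       ("C2", "C3"): "m23", ("C3", "C2"): "m32"}
        print(jt.invalidate_from("C3"))  # [('C3', 'C2'), ('C2', 'C1')]

In this toy chain C1-C2-C3, a change at C3 invalidates only the messages C3 to C2 and C2 to C1; the messages flowing towards C3 are preserved, which is the kind of reuse the adaptive variant aims for.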

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Institute of Information Systems, University of Lübeck, Lübeck, Germany
