
Structural Learning of Probabilistic Graphical Models of Cumulative Phenomena

  • Daniele Ramazzotti
  • Marco S. Nobile
  • Marco Antoniotti
  • Alex Graudenzi (corresponding author)
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10860)

Abstract

One of the critical issues when adopting Bayesian networks (BNs) to model dependencies among random variables is to “learn” their structure. This is a well-known NP-hard problem in its most general and classical formulation, further complicated by known pitfalls such as I-equivalence among different structures. In this work we restrict the investigation to a specific class of networks, namely those representing the dynamics of phenomena characterized by the monotonic accumulation of events. Such phenomena allow one to impose specific structural constraints based on Suppes’ theory of probabilistic causation and, accordingly, to define constrained BNs, named Suppes-Bayes Causal Networks (SBCNs). Within this framework, we study the structure learning of SBCNs via extensive simulations with various state-of-the-art search strategies, such as canonical local search techniques and Genetic Algorithms. This investigation extends and clarifies our previous work on SBCN structure learning. Among the main results, we show that Suppes’ constraints do simplify the learning task, by reducing the solution search space and by providing a temporal ordering on the variables, which mitigates the complications arising from I-equivalent structures. Finally, we report on the trade-offs among the different optimization techniques that can be used to learn SBCNs.
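
To make the role of Suppes’ constraints concrete, the following minimal Python sketch (with hypothetical function and variable names, not code from the paper) estimates temporal priority, P(i) > P(j), and probability raising, P(j | i) > P(j | ¬i), from binary cross-sectional data and keeps only the arcs satisfying both conditions; in the actual framework these conditions are assessed via statistical tests, and the pruned arc set is then explored by a regularized score-based search (e.g., hill climbing or a Genetic Algorithm).

```python
# Minimal sketch (hypothetical names): pruning candidate SBCN arcs with Suppes'
# conditions on binary cross-sectional data, before any score-based search.
import numpy as np


def suppes_candidate_arcs(data, eps=1e-9):
    """Return (i, j) pairs satisfying temporal priority and probability raising.

    data: (n_samples, n_variables) 0/1 matrix; entry [s, v] = 1 if event v is
    observed in sample s. No hypothesis testing is performed here, only a
    direct comparison of the sample estimates.
    """
    n_vars = data.shape[1]
    marginal = data.mean(axis=0)  # sample estimate of P(v) for each event v
    arcs = []
    for i in range(n_vars):
        for j in range(n_vars):
            if i == j:
                continue
            # Temporal priority: the putative cause is more frequent than the effect.
            if marginal[i] <= marginal[j]:
                continue
            # Probability raising: P(j | i) > P(j | not i).
            with_i = data[data[:, i] == 1]
            without_i = data[data[:, i] == 0]
            if len(with_i) == 0 or len(without_i) == 0:
                continue
            if with_i[:, j].mean() > without_i[:, j].mean() + eps:
                arcs.append((i, j))
    return arcs


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy cumulative process: event 0 occurs first, event 1 tends to follow it.
    e0 = rng.random(500) < 0.6
    e1 = e0 & (rng.random(500) < 0.5)
    data = np.column_stack([e0, e1]).astype(int)
    print(suppes_candidate_arcs(data))  # expected: [(0, 1)]
```

The surviving arcs define the (much smaller) search space over which any of the optimization strategies compared in the paper can then be run.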

Acknowledgments

This work was supported in part by the ASTIL Program of Regione Lombardia, by the ELIXIR-ITA network, and by the SysBioNet project, a MIUR initiative for the Italian Roadmap of the European Strategy Forum on Research Infrastructures (ESFRI). We would like to thank our colleagues Giulio Caravagna (ICR, London, UK), Giancarlo Mauri (DISCo, Università degli Studi di Milano-Bicocca, Milan, Italy), and Bud Mishra (Courant Institute of Mathematical Sciences, New York University, NY, USA) for the useful discussions.

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Daniele Ramazzotti (1)
  • Marco S. Nobile (2)
  • Marco Antoniotti (2)
  • Alex Graudenzi (2), corresponding author
  1. Department of Pathology, Stanford University, Stanford, USA
  2. Department of Informatics, Systems and Communication, Università degli Studi di Milano-Bicocca, Milan, Italy
