An Approach to Reduce the Number of Conditional Independence Tests in the PC Algorithm


Part of the Lecture Notes in Computer Science book series (LNAI, volume 12873)

Abstract

The PC algorithm is one of the most prominent constraint-based methods for learning causal structures from observational data. It relies on conditional independence (CI) tests to infer the structure, and its running time depends heavily on the number of CI tests performed. We present a modification, called ED-PC, such that, in the oracle model, both ED-PC and the original PC algorithm infer the same structure. However, using a new idea that allows detecting a v-structure without explicit knowledge of a separating set, our method significantly reduces the number of CI tests needed, since nonadjacencies are detected considerably earlier.
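To make the role of CI tests concrete, the following is a minimal, illustrative sketch of the *classical* PC skeleton phase in the oracle model (the baseline the abstract refers to — not the ED-PC variant introduced in the paper). The four-node DAG, the d-separation oracle, and all function names are hypothetical choices for this sketch; the test counter shows the quantity that ED-PC aims to reduce.

```python
from itertools import combinations

# Hypothetical ground-truth DAG: A -> C <- B (a v-structure) and C -> D.
PARENTS = {"A": [], "B": [], "C": ["A", "B"], "D": ["C"]}
NODES = sorted(PARENTS)
CHILDREN = {v: [c for c in NODES if v in PARENTS[c]] for v in NODES}

def descendants(v):
    """All descendants of v in the true DAG (excluding v itself)."""
    seen, stack = set(), list(CHILDREN[v])
    while stack:
        w = stack.pop()
        if w not in seen:
            seen.add(w)
            stack.extend(CHILDREN[w])
    return seen

def true_paths(x, y):
    """All simple paths between x and y in the true DAG's skeleton."""
    adj = {v: set(PARENTS[v]) | set(CHILDREN[v]) for v in NODES}
    paths, stack = [], [[x]]
    while stack:
        p = stack.pop()
        if p[-1] == y:
            paths.append(p)
            continue
        for w in adj[p[-1]]:
            if w not in p:
                stack.append(p + [w])
    return paths

def d_separated(x, y, z):
    """Oracle CI test: x _||_ y | z iff every path is blocked (d-separation)."""
    for path in true_paths(x, y):
        blocked = False
        for i in range(1, len(path) - 1):
            a, w, b = path[i - 1], path[i], path[i + 1]
            collider = a in PARENTS[w] and b in PARENTS[w]
            if collider:
                # A collider blocks unless it (or a descendant) is conditioned on.
                if w not in z and not (descendants(w) & set(z)):
                    blocked = True
                    break
            elif w in z:
                # A non-collider blocks when it is conditioned on.
                blocked = True
                break
        if not blocked:
            return False
    return True

def pc_skeleton():
    """Simplified classical PC skeleton phase: start from the complete graph
    and delete an edge x-y once some subset S of adj(x)\\{y} separates x and y.
    Returns the learned edges and the number of oracle CI tests performed."""
    edges = {frozenset(p) for p in combinations(NODES, 2)}
    tests = 0
    for level in range(len(NODES) - 1):
        for e in sorted(edges, key=sorted):
            if e not in edges:      # removed earlier in this level
                continue
            x, y = sorted(e)
            adj_x = {v for f in edges if x in f for v in f} - {x, y}
            for S in combinations(sorted(adj_x), level):
                tests += 1
                if d_separated(x, y, set(S)):
                    edges.discard(e)
                    break
    return edges, tests

skeleton, n_tests = pc_skeleton()
print(sorted(tuple(sorted(e)) for e in skeleton), n_tests)
```

On this toy graph the skeleton phase recovers exactly the edges A–C, B–C, C–D. The conditioning sets grow level by level, so the test count scales badly with density; detecting nonadjacencies (and v-structures without an explicit separating set) earlier, as ED-PC does, prunes the candidate adjacency sets before the expensive higher-order levels are reached.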

Keywords

  • Graphical models
  • Causality
  • Bayesian networks


Notes

  1. For exact definitions of the notions used in the algorithm, see Sect. 2.

  2. We use "iff" as shorthand for "if and only if".


Acknowledgements

This work was supported by the Deutsche Forschungsgemeinschaft (DFG) grant LI634/4-2.

Author information

Correspondence to Marcel Wienöbst.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper

Cite this paper

Wienöbst, M., Liśkiewicz, M. (2021). An Approach to Reduce the Number of Conditional Independence Tests in the PC Algorithm. In: Edelkamp, S., Möller, R., Rueckert, E. (eds) KI 2021: Advances in Artificial Intelligence. Lecture Notes in Computer Science, vol 12873. Springer, Cham. https://doi.org/10.1007/978-3-030-87626-5_21

  • DOI: https://doi.org/10.1007/978-3-030-87626-5_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-87625-8

  • Online ISBN: 978-3-030-87626-5

  • eBook Packages: Computer Science (R0)