Classical Bayesian Theory and Networks

Chapter in The Art and Science of Machine Intelligence

Abstract

By their very nature, Bayesian networks (BN) represent cause-effect relationships through their parent-child structure. Given observations of some events, a Bayesian network can be evaluated to estimate the probabilities of other events. Another significant advantage is that they cope well with missing information: they produce the most accurate estimate possible from whatever information (or knowledge) is available, and they do so in a computationally efficient manner.
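
The parent-child factorisation and the evidence-driven queries described above can be illustrated in a few lines of code. The sketch below is not taken from the chapter; the network (Rain and Sprinkler as parents of WetGrass), the probability tables, and the function names are invented for illustration. It computes a posterior by brute-force enumeration of the joint distribution, which is feasible only for very small networks.

    from itertools import product

    # Toy Bayesian network (invented for illustration):
    #   Rain -> WetGrass <- Sprinkler, all variables binary.
    P_rain = {True: 0.2, False: 0.8}
    P_sprinkler = {True: 0.1, False: 0.9}
    # P(WetGrass = True | Sprinkler, Rain)
    P_wet_given = {
        (True, True): 0.99,
        (True, False): 0.90,
        (False, True): 0.80,
        (False, False): 0.05,
    }

    def joint(rain, sprinkler, wet):
        """Joint probability factorised along the parent-child structure."""
        p_wet_true = P_wet_given[(sprinkler, rain)]
        p_wet = p_wet_true if wet else 1.0 - p_wet_true
        return P_rain[rain] * P_sprinkler[sprinkler] * p_wet

    def posterior_rain_given_wet(wet=True):
        """P(Rain | WetGrass = wet) by summing out the hidden variable."""
        num = sum(joint(True, s, wet) for s in (True, False))
        den = sum(joint(r, s, wet) for r, s in product((True, False), repeat=2))
        return num / den

    print(f"P(Rain | WetGrass=True) = {posterior_rain_given_wet(True):.3f}")  # ~0.645

The same enumeration still works when an observation is missing: an unobserved variable is simply summed out, which is how a Bayesian network produces its best estimate from whatever information is available.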

This chapter comprises three sections. The first develops some of the basic probability concepts on which classical Bayes theory is based. The second develops Bayes theorem and works through several examples of its use. The third addresses classical methods for constructing the Bayesian network structure.
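
As a flavour of the kind of Bayes-theorem example the second section works through, consider a screening-test calculation (the numbers here are invented for illustration and are not taken from the chapter). It uses the true positive rate (TPR), true negative rate (TNR), and positive predictive value (PPV) listed under the abbreviations below.

    # Bayes theorem for a screening test:
    # P(D | +) = P(+ | D) P(D) / [ P(+ | D) P(D) + P(+ | not D) P(not D) ]
    prevalence = 0.01            # P(D), prior probability of disease
    sensitivity = 0.95           # P(+ | D), the TPR
    specificity = 0.90           # P(- | not D), the TNR
    false_positive_rate = 1.0 - specificity

    # Total probability of a positive result (law of total probability).
    p_positive = (sensitivity * prevalence
                  + false_positive_rate * (1.0 - prevalence))

    # Posterior probability of disease given a positive test: the PPV.
    ppv = sensitivity * prevalence / p_positive
    print(f"P(D | +) = PPV = {ppv:.3f}")  # about 0.088 with these numbers

Even with a sensitive and fairly specific test, the low prior drives the posterior down; this prior-times-likelihood reasoning is exactly what the chapter formalises.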

Notes

  1. Parent order is defined in Sect. 6.10.4.1.

Abbreviations

AUC: Area under curve
BN: Bayesian network
CH: Cooper-Herskovitz
CI: Conditional independence
DAG: Directed acyclic graph
FN: False negative
FP: False positive
GA: Genetic algorithm
K2: Metric by Cooper and Herskovitz
MI: Machine intelligence
NPV: Negative predictive value
ORACLE: GRNN oracle
PC: Prediction-causal
PPV: Positive predictive value
ROC: Receiver operating characteristic
SVM: Support vector machine
TN: True negative
TNR: True negative rate
TP: True positive
TPR: True positive rate

References

  • Bishop CM (2006) Pattern recognition and machine learning. Springer, New York. ISBN 978-0-387-31073-2

  • Buntine W (1991) Theory refinement on Bayesian networks. In: Proceedings of the seventh conference on uncertainty in artificial intelligence, Los Angeles, pp 52–60

  • Cooper GF, Herskovitz E (1992) A Bayesian method for the induction of probabilistic networks from data. Mach Learn 9(4):309–347

  • Keynes JM (1962) The principle of indifference. In: A treatise on probability, chap IV. Harper Torchbook, New York, pp 41–64

  • Kjaerulff U, Madsen AL (2008) Bayesian networks and influence diagrams: a guide to construction and analysis. Springer, New York

  • Kjaerulff UB, Madsen AL (2013) Bayesian networks and influence diagrams: a guide to construction and analysis, 2nd edn. Springer, New York. ISBN 978-1-4614-5103-7

  • Neapolitan RE (2005) Learning Bayesian networks. Prentice Hall series in artificial intelligence, Upper Saddle River. ISBN 0-13-012534-2

  • Singh M, Valtorta M (1995) Construction of Bayesian network structures from data: a brief survey and an efficient algorithm. Int J Approx Reason 12(2):111–131

  • Spirtes P, Glymour C (1991) An algorithm for fast recovery of sparse causal graphs. Soc Sci Comput Rev 9:62–72

  • Spirtes P, Glymour C, Scheines R (1993) Causation, prediction, and search. Lecture notes in statistics, vol 81. Springer, New York


Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Land, W.H., Schaffer, J.D. (2020). Classical Bayesian Theory and Networks. In: The Art and Science of Machine Intelligence. Springer, Cham. https://doi.org/10.1007/978-3-030-18496-4_6

  • DOI: https://doi.org/10.1007/978-3-030-18496-4_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-18495-7

  • Online ISBN: 978-3-030-18496-4
