Abstract
A central problem in machine learning is to estimate and infer the values of unknown variables (e.g., class labels) based on observed evidence (e.g., training samples). Probabilistic models provide a framework that casts learning problems as computing the probability distributions of variables.
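As a minimal illustration of this view (not taken from the chapter), the sketch below infers an unknown class label from observed evidence with Bayes' rule; the class names and probability values are invented for the example:

```python
# Inference in a probabilistic model: compute the distribution of an
# unknown variable C (class label) given observed evidence x, via
#   P(C | x) ∝ P(C) * P(x | C)

prior = {"spam": 0.3, "ham": 0.7}   # P(C): assumed prior, for illustration
likelihood = {"spam": 0.8, "ham": 0.1}  # P(x | C): prob. the evidence appears

# Unnormalised posterior P(C) * P(x | C)
unnorm = {c: prior[c] * likelihood[c] for c in prior}

# Normalise by the evidence probability P(x) = sum over classes
z = sum(unnorm.values())
posterior = {c: p / z for c, p in unnorm.items()}

print(posterior)  # P(spam | x) = 0.24 / 0.31 ≈ 0.774
```

The same computation, enumeration over the joint distribution followed by normalisation, underlies exact inference in the graphical models the chapter covers; the models differ in how they factorise the joint so that this enumeration stays tractable.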
Copyright information
© 2021 Springer Nature Singapore Pte Ltd.
About this chapter
Cite this chapter
Zhou, ZH. (2021). Probabilistic Graphical Models. In: Machine Learning. Springer, Singapore. https://doi.org/10.1007/978-981-15-1967-3_14
DOI: https://doi.org/10.1007/978-981-15-1967-3_14
Publisher Name: Springer, Singapore
Print ISBN: 978-981-15-1966-6
Online ISBN: 978-981-15-1967-3
eBook Packages: Computer Science (R0)