In this chapter, we will describe a statistical model that conforms to the maximum entropy principle (we will call it the maximum entropy model, or ME model for short) [68, 69]. Through mathematical derivations, we will show that the maximum entropy model is a kind of exponential model, and a close sibling of the Gibbs distribution described in Chap. 6. An essential difference between the two is that the former is a discriminative model, while the latter is a generative model. Through a model complexity analysis, we will show why discriminative models are generally superior to generative models in terms of data modeling power. We will also describe the Conditional Random Field (CRF), one of the latest discriminative models in the literature, and prove that the CRF is equivalent to the maximum entropy model. At the end of this chapter, we will provide a case study in which the ME model is applied to baseball highlight detection and compared with the HMM model described in Chap. 7.
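To make the "exponential model" claim concrete, the standard textbook form of the maximum entropy model can be written as follows (the feature functions f_i and weights lambda_i are generic notation here, not necessarily the symbols used later in the chapter):

```latex
% Conditional maximum entropy (log-linear) model:
% f_i(x, y) are binary or real-valued feature functions,
% lambda_i are the weights learned from training data.
p(y \mid x) \;=\; \frac{1}{Z(x)} \exp\!\Big( \sum_{i} \lambda_i f_i(x, y) \Big),
\qquad
Z(x) \;=\; \sum_{y'} \exp\!\Big( \sum_{i} \lambda_i f_i(x, y') \Big)
```

Because the model conditions on the observation x (the partition function Z(x) normalizes over labels y only), it is discriminative; the Gibbs distribution of Chap. 6, by contrast, defines a joint (generative) distribution over the data itself.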
© 2007 Springer Science+Business Media, LLC
Cite this chapter
Maximum Entropy Model and Conditional Random Field. In: Machine Learning for Multimedia Content Analysis, vol 30. Springer, Boston, MA, 2007. https://doi.org/10.1007/978-0-387-69942-4_9
Print ISBN: 978-0-387-69938-7
Online ISBN: 978-0-387-69942-4