The Information Criterion I(g : f), which measures the deviation of a model specified by the probability distribution f from the true distribution g, is defined by the formula

I(g : f) = E log g(X) − E log f(X).
Here E denotes the expectation with respect to the true distribution g of X. The criterion measures the deviation of the model f from the true model g, which is the best possible model for handling the problem at hand.
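The definition above can be checked numerically by Monte Carlo: draw samples from g and average the log-density difference. This is a minimal sketch, not part of the original entry, in which g and f are hypothetical choices of two unit-variance normal densities.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_normal_pdf(x, mean, sd):
    """Log density of the normal distribution N(mean, sd^2)."""
    return -0.5 * np.log(2 * np.pi * sd**2) - (x - mean) ** 2 / (2 * sd**2)

# Hypothetical example: true distribution g = N(0, 1), model f = N(1, 1).
x = rng.normal(loc=0.0, scale=1.0, size=200_000)   # samples from g

# I(g : f) = E log g(X) - E log f(X), estimated by a sample average over g.
I_gf = np.mean(log_normal_pdf(x, 0.0, 1.0) - log_normal_pdf(x, 1.0, 1.0))

# For two normals with equal sd, the exact value is (difference of means)^2 / 2,
# so the estimate should be close to 0.5 here.
print(I_gf)
```

The estimate is nonnegative and vanishes only when f coincides with g, which is what makes I(g : f) usable as a deviation measure.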
The following relation illustrates the significant characteristic of the log likelihood:

E log f(X) = E log g(X) − I(g : f).
This formula shows that for an observation x of X the log likelihood log f(x) provides a relative measure of the closeness of the model f to the truth, or the goodness of the model: the term E log g(X) is common to all candidate models, so maximizing the expected log likelihood is equivalent to minimizing I(g : f). This measure is useful even when the true structure g is unknown.
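The point that log f(x) ranks models without knowledge of g can be illustrated with a small simulation; this is a sketch with hypothetical candidate models, not code from the entry. The model with the larger mean log likelihood on data drawn from g is the one with the smaller I(g : f), since E log g(X) cancels from the comparison.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_normal_pdf(x, mean, sd):
    """Log density of the normal distribution N(mean, sd^2)."""
    return -0.5 * np.log(2 * np.pi * sd**2) - (x - mean) ** 2 / (2 * sd**2)

# Data from a truth g = N(0, 1) that the comparison never uses directly.
x = rng.normal(0.0, 1.0, size=100_000)

# Two hypothetical candidate models: f1 = N(0.1, 1) and f2 = N(1, 1).
ll1 = np.mean(log_normal_pdf(x, 0.1, 1.0))  # f1 is closer to g
ll2 = np.mean(log_normal_pdf(x, 1.0, 1.0))  # f2 is farther from g

# Ranking by mean log likelihood recovers the ranking by I(g : f).
print(ll1 > ll2)
```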
For a model f(X | a) with unknown parameter a, the maximum likelihood estimate a(x) is defined as the value of a that maximizes the likelihood f(x | a).
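The maximization defining a(x) can be sketched numerically; the following is an illustrative example, not from the entry, using the model f(x | a) = N(a, 1) with the log likelihood evaluated on a grid of candidate parameter values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Observations from N(2, 1); the parameter a plays the role of the mean.
x = rng.normal(loc=2.0, scale=1.0, size=5_000)

def log_likelihood(a, x):
    """Log likelihood of the sample x under the model f(x | a) = N(a, 1)."""
    return np.sum(-0.5 * np.log(2 * np.pi) - (x - a) ** 2 / 2)

# Maximize over a grid of candidate values for a.
grid = np.linspace(0.0, 4.0, 4001)
lls = np.array([log_likelihood(a, x) for a in grid])
a_hat = grid[np.argmax(lls)]

# For this model the maximizer has a closed form: the sample mean.
print(a_hat)
```

For this particular model the grid maximizer agrees with the sample mean up to the grid resolution, which confirms the numerical search against the closed-form answer.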
© 2011 Springer-Verlag Berlin Heidelberg
Cite this entry: Akaike, H. (2011). Akaike’s Information Criterion. In: Lovric, M. (ed.) International Encyclopedia of Statistical Science. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04898-2_110

Print ISBN: 978-3-642-04897-5 · Online ISBN: 978-3-642-04898-2