Psychometrika, Volume 52, Issue 3, pp 345–370
Model selection and Akaike's Information Criterion (AIC): The general theory and its analytical extensions
Hamparsum Bozdogan, Department of Mathematics, Math-Astronomy Building, University of Virginia
Abstract
During the last fifteen years, Akaike's entropy-based Information Criterion (AIC) has had a fundamental impact on statistical model evaluation problems. This paper studies the general theory of the AIC procedure and provides its analytical extensions in two ways without violating Akaike's main principles. These extensions make AIC asymptotically consistent and penalize overparameterization more stringently to pick only the simplest of the “true” models. These selection criteria are called CAIC and CAICF. Asymptotic properties of AIC and its extensions are investigated, and empirical performances of these criteria are studied in choosing the correct degree of a polynomial model in two different Monte Carlo experiments under different conditions.
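The criteria compared in the paper are commonly stated as AIC = −2 log L + 2k and CAIC = −2 log L + k(log n + 1), where L is the maximized likelihood, k the number of estimated parameters, and n the sample size; CAIC's heavier, sample-size-dependent penalty is what yields consistency. A minimal sketch of the paper's Monte Carlo setting — choosing a polynomial degree under Gaussian errors — using these standard formulas; the data, function names, and simulation details here are illustrative and not taken from the paper:

```python
import numpy as np

def gaussian_loglik(y, yhat):
    """Maximized Gaussian log-likelihood, plugging in sigma^2 = RSS/n."""
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return -0.5 * n * (np.log(2 * np.pi * rss / n) + 1)

def select_degree(x, y, max_degree=5):
    """Return the polynomial degrees minimizing AIC and CAIC, respectively."""
    n = len(y)
    scores = {}
    for d in range(max_degree + 1):
        coefs = np.polyfit(x, y, d)          # least-squares fit of degree d
        yhat = np.polyval(coefs, x)
        k = d + 2                            # d+1 coefficients + error variance
        ll = gaussian_loglik(y, yhat)
        aic = -2 * ll + 2 * k                # Akaike's criterion
        caic = -2 * ll + k * (np.log(n) + 1) # consistent AIC: penalty grows with n
        scores[d] = (aic, caic)
    best_aic = min(scores, key=lambda d: scores[d][0])
    best_caic = min(scores, key=lambda d: scores[d][1])
    return best_aic, best_caic

# Simulated data with true degree 2 (illustrative, not the paper's design)
rng = np.random.default_rng(0)
x = np.linspace(-2, 2, 200)
y = 1.0 - 0.5 * x + 2.0 * x**2 + rng.normal(0, 0.3, x.size)
print(select_degree(x, y))
```

Because CAIC's per-parameter penalty (log n + 1) exceeds AIC's constant 2 once n > e, CAIC never favors a higher degree than AIC on the same data — the behavior the paper formalizes as stricter penalization of overparameterization. (CAICF additionally involves the determinant of the estimated Fisher information matrix and is omitted from this sketch.)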
Key words
model selection, Akaike's information criterion, AIC, CAIC, CAICF, asymptotic properties

Journal
Psychometrika, Volume 52, Issue 3, pp 345–370

Cover Date
1987-09

DOI
10.1007/BF02294361

Print ISSN
0033-3123

Online ISSN
1860-0980

Publisher
Springer-Verlag

Authors
Hamparsum Bozdogan (1)

Author Affiliations
1. Department of Mathematics, Math-Astronomy Building, University of Virginia, Charlottesville, VA 22903