Psychometrika, Volume 52, Issue 3, pp 345–370

Model selection and Akaike's Information Criterion (AIC): The general theory and its analytical extensions

  • Hamparsum Bozdogan
Special Section

DOI: 10.1007/BF02294361

Cite this article as:
Bozdogan, H. Psychometrika (1987) 52: 345. doi:10.1007/BF02294361


During the last fifteen years, Akaike's entropy-based Information Criterion (AIC) has had a fundamental impact on statistical model evaluation problems. This paper studies the general theory of the AIC procedure and provides its analytical extensions in two ways without violating Akaike's main principles. These extensions make AIC asymptotically consistent and penalize overparameterization more stringently to pick only the simplest of the “true” models. These selection criteria are called CAIC and CAICF. Asymptotic properties of AIC and its extensions are investigated, and the empirical performance of these criteria is studied in choosing the correct degree of a polynomial model in two different Monte Carlo experiments under different conditions.
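The polynomial-degree selection studied in the paper can be illustrated with a small sketch. It uses the standard forms AIC = −2 log L + 2k and Bozdogan's consistent criterion CAIC = −2 log L + k(log n + 1), where k counts the estimated parameters (the d + 1 polynomial coefficients plus the error variance). The data-generating setup below (true degree 2, Gaussian noise, n = 100) is an illustrative assumption, not the paper's actual Monte Carlo design:

```python
import numpy as np

def fit_poly_criteria(x, y, max_degree=6):
    """Fit polynomials of degree 0..max_degree; score each with AIC and CAIC."""
    n = len(y)
    scores = {}
    for d in range(max_degree + 1):
        coeffs = np.polyfit(x, y, d)
        resid = y - np.polyval(coeffs, x)
        sigma2 = np.mean(resid ** 2)  # ML estimate of the error variance
        # Gaussian log-likelihood at the ML estimates
        loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
        k = d + 2  # d+1 coefficients plus sigma^2
        aic = -2 * loglik + 2 * k                  # Akaike's criterion
        caic = -2 * loglik + k * (np.log(n) + 1)   # consistent AIC (CAIC)
        scores[d] = (aic, caic)
    return scores

# Illustrative data: true model is a degree-2 polynomial plus Gaussian noise
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 100)
y = 1 + 2 * x - 3 * x ** 2 + rng.normal(0, 0.3, x.size)

scores = fit_poly_criteria(x, y)
best_aic = min(scores, key=lambda d: scores[d][0])
best_caic = min(scores, key=lambda d: scores[d][1])
```

Because log n + 1 > 2 once n > e, CAIC charges each extra parameter more than AIC does, which is the sense in which it penalizes overparameterization more stringently and selects the simplest adequate model.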

Key words

model selection; Akaike's information criterion; AIC; CAIC; CAICF; asymptotic properties

Copyright information

© The Psychometric Society 1987

Authors and Affiliations

  • Hamparsum Bozdogan
  1. Department of Mathematics, Math-Astronomy Building, University of Virginia, Charlottesville
