An information criterion for model selection with missing data via complete-data divergence

Article

DOI: 10.1007/s10463-016-0592-7

Cite this article as:
Shimodaira, H. & Maeda, H. Ann Inst Stat Math (2017). doi:10.1007/s10463-016-0592-7

Abstract

We derive an information criterion for selecting a parametric model of the complete-data distribution when only incomplete or partially observed data are available. Compared with AIC, our new criterion has an additional penalty term for missing data, which is expressed in terms of the Fisher information matrices of the complete data and the incomplete data. We prove that our criterion is an asymptotically unbiased estimator of the complete-data divergence, namely the expected Kullback–Leibler divergence between the true distribution and the estimated distribution for the complete data, whereas AIC is an asymptotically unbiased estimator of the corresponding divergence for the incomplete data. The additional penalty term of our criterion for missing data turns out to be only half the value of that in the previously proposed information criteria PDIO and AICcd. This difference in the penalty term is attributed to the fact that our criterion is derived under a weaker assumption. A simulation study under the weaker assumption shows that our criterion is unbiased while the other two criteria are biased. In addition, we review the geometrical view of the alternating minimizations of the EM algorithm; this view plays an important role in deriving our new criterion.
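The abstract's contrast between complete-data and incomplete-data Fisher information can be illustrated numerically. The sketch below is a toy example, not the paper's exact formula: complete data are a bivariate normal pair with a common mean theta, but only the first coordinate is observed. Observing only part of the data shrinks the Fisher information, and penalty terms of the kind described in the abstract are built from the two information matrices (here via the trace term tr(I_co I_ob^{-1}); the exact constant in front of such a term is what distinguishes the proposed criterion from PDIO and AICcd).

```python
# Illustrative sketch only, assuming a toy model: complete data
# (y1, y2) ~ N((theta, theta), I_2) with only y1 observed.
import numpy as np

# Fisher information for theta (matrices, to mirror the general case):
I_co = np.array([[2.0]])  # complete data: y1 and y2 each contribute 1
I_ob = np.array([[1.0]])  # observed data: y1 alone contributes 1

# Trace term comparing complete- and incomplete-data information.
trace_term = np.trace(I_co @ np.linalg.inv(I_ob))

k = 1  # number of parameters; AIC's penalty is 2k
print(f"tr(I_co I_ob^-1) = {trace_term}")  # exceeds k: missing data lose information
```

With no missing data, I_co = I_ob and the trace term reduces to k, so a penalty built from it collapses to the usual AIC-type penalty; the gap between the trace term and k reflects the information lost to missingness.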

Keywords

Akaike information criterion · Alternating projections · Data manifold · EM algorithm · Fisher information matrix · Incomplete data · Kullback–Leibler divergence · Misspecification · Takeuchi information criterion

Copyright information

© The Institute of Statistical Mathematics, Tokyo 2017

Authors and Affiliations

  1. Division of Mathematical Science, Graduate School of Engineering Science, Osaka University, Toyonaka, Japan
  2. RIKEN Center for Advanced Intelligence Project, Tokyo, Japan
  3. Kawasaki Heavy Industries, Ltd., Akashi, Japan
