
Optimum Design via I-Divergence for Stable Estimation in Generalized Regression Models

  • Katarína Burclová
  • Andrej Pázman
Conference paper
Part of the Contributions to Statistics book series (CONTRIB.STAT.)

Abstract

Optimum designs for parameter estimation in generalized regression models are usually based on the Fisher information matrix (cf. Atkinson et al. (J Stat Plan Inference 144:81–91, 2014) for a recent exposition). The corresponding optimality criteria are related to the asymptotic properties of maximum likelihood (ML) estimators in such models. However, in finite-sample experiments there can be problems with identifiability, stability and uniqueness of the ML estimate, which are not reflected by information matrices. Pázman and Pronzato (Ann Stat 42:1426–1451, 2014) and Chap. 7 of Pronzato and Pázman (Design of Experiments in Nonlinear Models. Asymptotic Normality, Optimality Criteria and Small-Sample Properties. Springer, New York, 2013) discuss how some of these estimability issues can be addressed at the design stage of an experiment in standard nonlinear regression. Here we extend this design methodology to more general models based on exponential families of distributions (binomial, Poisson, normal with parametrized variances, etc.). The main tool is the information (or Kullback–Leibler) divergence, which is closely related to ML estimation.
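For orientation, a minimal sketch of the divergence in question, written in standard one-parameter exponential-family notation (natural parameter η, sufficient statistic t(y), cumulant function κ; cf. Brown 1986). This notation is not taken from the paper itself, only the well-known closed form of the divergence:

\[
  D(\eta_1 \,\|\, \eta_2)
  = \mathbb{E}_{\eta_1}\!\left[\log \frac{f(y\mid\eta_1)}{f(y\mid\eta_2)}\right]
  = \kappa(\eta_2) - \kappa(\eta_1) - (\eta_2 - \eta_1)\,\kappa'(\eta_1),
  \qquad f(y\mid\eta) = h(y)\exp\{\eta\, t(y) - \kappa(\eta)\}.
\]

For the Poisson model, with mean \(\lambda = e^{\eta}\) and \(\kappa(\eta) = e^{\eta}\), this reduces to \(D(\lambda_1 \,\|\, \lambda_2) = \lambda_1 \log(\lambda_1/\lambda_2) - \lambda_1 + \lambda_2\). Design criteria built on such divergences, rather than on the information matrix alone, are what the paper develops.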

Keywords

Information Matrix · Exponential Family · Interior Point Algorithm · Latin Hypercube Design · Extended Criterion

Notes

Acknowledgements

The authors thank the Slovak Grant Agency VEGA, Grant No. 1/0163/13, for financial support.

References

  1. Atkinson, A.C., Fedorov, V.V., Herzberg, A.M., Zhang, R.: Elemental information matrices and optimal experimental design for generalized regression models. J. Stat. Plan. Inference 144, 81–91 (2014)
  2. Brown, L.D.: Fundamentals of Statistical Exponential Families with Applications in Statistical Decision Theory. IMS Lecture Notes—Monograph Series, vol. 9. Institute of Mathematical Statistics, Hayward (1986)
  3. Efron, B.: The geometry of exponential families. Ann. Stat. 6, 362–376 (1978)
  4. Kullback, S.: Information Theory and Statistics. Dover Publications, Mineola, NY (1997)
  5. López-Fidalgo, J., Tommasi, C., Trandafir, P.C.: An optimal experimental design criterion for discriminating between non-normal models. J. R. Stat. Soc. B 69, 231–242 (2007)
  6. Pázman, A., Pronzato, L.: Optimum design accounting for the global nonlinear behavior of the model. Ann. Stat. 42, 1426–1451 (2014)
  7. Pronzato, L., Pázman, A.: Design of Experiments in Nonlinear Models. Asymptotic Normality, Optimality Criteria and Small-Sample Properties. Springer, New York (2013)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Faculty of Mathematics, Physics and Informatics, Comenius University in Bratislava, Bratislava, Slovak Republic
