The Kullback-Leibler risk of the Stein estimator and the conditional MLE

  • Takemi Yanagimoto
Maximum Likelihood Procedures

Abstract

The decomposition of the Kullback-Leibler risk of the maximum likelihood estimator (MLE) is discussed in relation to the Stein estimator and the conditional MLE. A notable correspondence between the decomposition in terms of the Stein estimator and that in terms of the conditional MLE is observed. This decomposition reflects that of the expected log-likelihood ratio. Accordingly, it is concluded that these modified estimators reduce the risk by reducing the expected log-likelihood ratio. The empirical Bayes method is discussed from this point of view.
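
To make the abstract's claim concrete, the following sketch (not taken from the paper; the setup and all parameter choices are illustrative assumptions) compares the Kullback-Leibler risk of the MLE with that of a positive-part James-Stein estimator for p independent N(theta_i, 1) means. Since KL(N(theta, 1) || N(t, 1)) = (theta - t)^2 / 2, the Kullback-Leibler loss is half the squared-error loss in this model, so Stein-type shrinkage reduces the Kullback-Leibler risk exactly as it reduces the quadratic risk.

    # Illustrative sketch in Python (assumes NumPy); not code from the paper.
    import numpy as np

    rng = np.random.default_rng(0)
    p, n_rep = 10, 20000
    theta = rng.normal(0.0, 1.0, p)          # true means, fixed across replications

    def james_stein(x):
        # Positive-part James-Stein estimator shrinking toward the origin (p >= 3).
        shrink = max(0.0, 1.0 - (len(x) - 2) / np.sum(x ** 2))
        return shrink * x

    kl_mle = kl_js = 0.0
    for _ in range(n_rep):
        x = theta + rng.normal(0.0, 1.0, p)  # X_i ~ N(theta_i, 1); the MLE is x itself
        kl_mle += np.sum((theta - x) ** 2) / 2               # KL loss of the MLE
        kl_js += np.sum((theta - james_stein(x)) ** 2) / 2   # KL loss of the shrinkage estimate

    print("estimated KL risk, MLE        :", kl_mle / n_rep)  # about p/2 = 5
    print("estimated KL risk, James-Stein:", kl_js / n_rep)   # smaller for theta near the origin

Under these assumptions the simulation typically reports a Kullback-Leibler risk near p/2 for the MLE and a visibly smaller value for the shrinkage estimator, which is the kind of risk reduction the abstract attributes to a reduction of the expected log-likelihood ratio.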

Key words and phrases

Conditional inference, empirical Bayes method, expected log-likelihood ratio, exponential dispersion model, maximum likelihood estimator, Stein estimator

References

  1. Amari, S.-I. (1985). Differential-Geometrical Methods in Statistics, Springer, Berlin.
  2. Andersen, E. B. (1970). Asymptotic properties of conditional maximum-likelihood estimators, J. Roy. Statist. Soc. Ser. B, 32, 283–301.
  3. Barndorff-Nielsen, O. (1978). Information and Exponential Families in Statistical Theory, Wiley, New York.
  4. Berger, J. (1980). Improving on inadmissible estimators in continuous exponential families with applications to simultaneous estimation of gamma scale parameters, Ann. Statist., 8, 545–571.
  5. Blaesild, P. and Jensen, J. L. (1985). Saddlepoint formulas for reproductive exponential models, Scand. J. Statist., 12, 193–202.
  6. Brandwein, A. C. and Strawderman, W. (1990). Stein estimation: The spherically symmetric case, Statist. Sci., 5, 356–369.
  7. Brown, L. D. (1979). A heuristic method for determining admissibility of estimators—with applications, Ann. Statist., 7, 960–994.
  8. Casella, G. (1985). An introduction to empirical Bayes data analysis, Amer. Statist., 39, 83–87.
  9. Cox, D. R. and Reid, N. (1987). Parameter orthogonality and approximate conditional inference (with discussion), J. Roy. Statist. Soc. Ser. B, 49, 1–39.
  10. Dey, D. K., Ghosh, M. and Srinivasan, M. (1987). Simultaneous estimation of parameters under entropy loss, J. Statist. Plann. Inference, 15, 347–363.
  11. Efron, B. and Morris, C. (1973). Stein's estimation rule and its competitors—An empirical Bayes approach, J. Amer. Statist. Assoc., 68, 117–130.
  12. Fisher, R. A. (1935). The logic of inductive inference, J. Roy. Statist. Soc., 98, 39–54.
  13. Godambe, V. P. (1984). On ancillarity and Fisher information in the presence of a nuisance parameter, Biometrika, 71, 626–629.
  14. James, W. and Stein, C. (1961). Estimation with quadratic loss, Proc. Fourth Berkeley Symp. on Math. Statist. Prob., Vol. 1, 361–380, Univ. of California Press, Berkeley.
  15. Jorgensen, B. (1987). Exponential dispersion models (with discussion), J. Roy. Statist. Soc. Ser. B, 49, 127–162.
  16. Kullback, S. (1959). Information Theory and Statistics, Wiley, New York.
  17. Kullback, S. and Leibler, R. A. (1951). On information and sufficiency, Ann. Math. Statist., 22, 79–86.
  18. Lindsay, B. (1982). Conditional score functions: Some optimality results, Biometrika, 69, 503–512.
  19. McCullagh, P. and Nelder, J. A. (1989). Generalized Linear Models, 2nd ed., Chapman and Hall, London.
  20. Morris, C. N. (1982). Natural exponential families with quadratic variance functions, Ann. Statist., 10, 65–80.
  21. Morris, C. N. (1983). Parametric empirical Bayes inference: Theory and applications (with discussion), J. Amer. Statist. Assoc., 78, 47–65.
  22. Nelder, J. A. and Wedderburn, R. W. M. (1972). Generalized linear models, J. Roy. Statist. Soc. Ser. A, 135, 370–384.
  23. Neyman, J. and Scott, E. L. (1948). Consistent estimates based on partially consistent observations, Econometrica, 16, 1–32.
  24. Saville, D. J. and Wood, G. R. (1991). Statistical Methods: The Geometric Approach, Springer, New York.
  25. Simon, G. (1973). Additivity of information in exponential family probability laws, J. Amer. Statist. Assoc., 68, 478–482.
  26. Stein, C. M. (1956). Inadmissibility of the usual estimator for the mean of a multivariate normal distribution, Proc. Third Berkeley Symp. on Math. Statist. Prob., Vol. 1, 197–206, Univ. of California Press, Berkeley.
  27. Stein, C. M. (1962). Confidence sets for the mean of a multivariate normal distribution (with discussion), J. Roy. Statist. Soc. Ser. B, 24, 265–296.
  28. Yanagimoto, T. (1987). A notion of an obstructive residual likelihood, Ann. Inst. Statist. Math., 39, 247–261.
  29. Yanagimoto, T. (1991). Estimating a model through the conditional MLE, Ann. Inst. Statist. Math., 43, 735–746.
  30. Yanagimoto, T. and Anraku, K. (1989). Possible superiority of the conditional MLE over the unconditional MLE, Ann. Inst. Statist. Math., 41, 269–278.
  31. Yanagimoto, T. and Yanagimoto, M. (1987). The use of the marginal likelihood for a diagnostic test for the goodness of fit of the simple regression model, Technometrics, 29, 95–101.

Copyright information

© The Institute of Statistical Mathematics 1994

Authors and Affiliations

  • Takemi Yanagimoto
    1. The Institute of Statistical Mathematics, Tokyo, Japan
