Information Theory and an Extension of the Maximum Likelihood Principle

  • Chapter

Part of the Springer Series in Statistics book series (SSS)

Abstract

In this paper it is shown that the classical maximum likelihood principle can be considered a method of asymptotic realization of an optimum estimate with respect to a very general information theoretic criterion. This observation leads to an extension of the principle that provides answers to many practical problems of statistical model fitting.
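The criterion the abstract refers to is what the paper goes on to define as AIC: minus twice the maximized log likelihood plus twice the number of independently adjusted parameters, an estimate of the expected Kullback-Leibler information loss. As a minimal sketch of how the criterion is applied (illustrative code, not the paper's own procedure; the simulated data and the helper name ar_aic are assumptions made here), the following Python fragment selects the order of an autoregressive model, one of the applications treated in the paper, by minimizing AIC:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a zero-mean AR(2) process: x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.standard_normal()

def ar_aic(x, k):
    """AIC of an AR(k) model fitted by conditional Gaussian maximum likelihood."""
    n = len(x)
    if k == 0:
        resid = x                                  # model: x_t = e_t
    else:
        # Columns are the lagged series; least squares gives the conditional MLE.
        X = np.column_stack([x[k - j - 1:n - j - 1] for j in range(k)])
        y = x[k:]
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ coef
    m = len(resid)
    sigma2 = resid @ resid / m                     # MLE of the innovation variance
    loglik = -0.5 * m * (np.log(2 * np.pi * sigma2) + 1.0)
    return -2.0 * loglik + 2.0 * (k + 1)           # AIC = -2 log L + 2 (parameters)

for k in range(6):
    print(f"AR({k}): AIC = {ar_aic(x, k):.1f}")
# The minimizer of AIC should typically recover the true order k = 2.
```

Maximizing the likelihood alone would always favor the largest model; the penalty of twice the parameter count is what turns the maximum likelihood principle into a model-selection rule.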

Keywords

  • Autoregressive Model
  • Final Prediction Error
  • Maximum Likelihood Principle
  • Statistical Model Identification
  • Statistical Decision Function

Copyright information

© 1998 Springer Science+Business Media New York

About this chapter

Cite this chapter

Akaike, H. (1998). Information Theory and an Extension of the Maximum Likelihood Principle. In: Parzen, E., Tanabe, K., Kitagawa, G. (eds) Selected Papers of Hirotugu Akaike. Springer Series in Statistics. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-1694-0_15

  • DOI: https://doi.org/10.1007/978-1-4612-1694-0_15

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4612-7248-9

  • Online ISBN: 978-1-4612-1694-0

  • eBook Packages: Springer Book Archive