Interpreting entropy as a prior probability suggests a universal but "purely empirical" measure of "goodness of fit." This allows statistical techniques to be used in situations where the correct theory, and not just its parameters, is still unknown. As developed illustratively for least-squares nonlinear regression, the measure proves to be a transformation of the R² statistic. Unlike the latter, however, it diminishes rapidly as the number of fitting parameters increases.
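The abstract does not state the transformation explicitly, so the following is only an illustrative sketch, not the paper's measure: it computes the ordinary R² for a least-squares fit, together with the Schwarz (1978) criterion, a well-known goodness-of-fit measure that, like the one described above, worsens as the number of fitting parameters grows.

```python
import math

def r_squared(y, y_hat):
    """Coefficient of determination R^2 = 1 - SS_res / SS_tot.

    Measures fit quality but never penalizes extra parameters:
    adding parameters can only hold R^2 fixed or raise it.
    """
    y_bar = sum(y) / len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    ss_tot = sum((yi - y_bar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

def schwarz_criterion(y, y_hat, k):
    """Schwarz (1978) criterion for a Gaussian least-squares fit,
    BIC = n * ln(SS_res / n) + k * ln(n), where k is the number of
    fitting parameters. Lower is better; the k * ln(n) term makes
    the score deteriorate as parameters are added without a
    compensating drop in residual error.

    NOTE: this is a stand-in illustration, not the entropy-based
    measure developed in the article itself.
    """
    n = len(y)
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))
    return n * math.log(ss_res / n) + k * math.log(n)
```

For a fixed set of residuals, `schwarz_criterion(y, y_hat, k + 1)` always exceeds `schwarz_criterion(y, y_hat, k)` by `ln(n)`, which is the qualitative behavior the abstract attributes to its measure: extra parameters must earn their keep by reducing residual error.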