Basu’s Work on Likelihood and Information

  • Joseph B. Kadane
Part of the Selected Works in Probability and Statistics book series (SWPS)


It has been a joy learning from Dev Basu’s work on aspects of statistical inference, and especially his deep and often provocative essays on fallacies of common statistical principles. I will limit myself to his epic paper Statistical Information and Likelihood.

“Statistical Information and Likelihood” is a tour de force in three parts. In the first part, Basu studies the implications of the sufficiency and conditionality principles, and shows that together they lead to the likelihood function as the summary of the information in an experiment. His treatment is similar to that of Birnbaum (1962, 1972). The second part reviews non-Bayesian likelihood methods, leaning especially on Fisher’s method of maximum likelihood (MLE). He criticizes the use of sampling standard errors around the MLE to create confidence intervals on the grounds that they violate the likelihood principle. The third part gives various examples that illuminate what he finds problematic about fiducial arguments, improper Bayesian priors, and simple-null hypothesis testing. Although most of his effort is critical, on the positive side Basu advocates subjective Bayesian analysis with proper priors, and making optimal decisions using a utility (or loss) function.
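The tension Basu points to can be seen in a standard textbook illustration (not taken from his paper): observing 3 successes in 12 Bernoulli trials yields proportional likelihoods whether the experimenter fixed the number of trials (binomial sampling) or stopped at the third success (negative binomial sampling). The likelihood principle says the two experiments carry the same information about the success probability p, yet sampling-theory p-values, which depend on the stopping rule, differ. A minimal sketch:

```python
from math import comb

# Hypothetical data: x = 3 successes observed in n = 12 Bernoulli trials,
# under two different stopping rules.
x, n, r = 3, 12, 3  # successes, trials, target number of successes

def lik_binomial(p):
    # Stopping rule: fix n = 12 trials in advance; observe x = 3 successes.
    return comb(n, x) * p**x * (1 - p)**(n - x)

def lik_negbinomial(p):
    # Stopping rule: sample until the r = 3rd success; it arrives on trial 12.
    return comb(n - 1, r - 1) * p**r * (1 - p)**(n - r)

# The two likelihoods are proportional in p (ratio comb(12,3)/comb(11,2) = 4),
# so by the likelihood principle they convey identical information about p.
ratios = [lik_binomial(p) / lik_negbinomial(p) for p in (0.1, 0.25, 0.5, 0.9)]
print(ratios)  # → [4.0, 4.0, 4.0, 4.0]

# Yet one-sided sampling-theory p-values for H0: p = 1/2 differ, because the
# sample space (and hence "more extreme" outcomes) depends on the stopping rule.
p_binom = sum(comb(n, k) for k in range(x + 1)) / 2**n    # P(X <= 3 | n = 12)
p_negbin = sum(comb(11, k) for k in range(r)) / 2**11     # P(N >= 12 | r = 3)
print(round(p_binom, 4), round(p_negbin, 4))  # → 0.073 0.0327
```

At the conventional 0.05 level the two analyses even disagree about "significance," which is exactly the kind of stopping-rule dependence that procedures violating the likelihood principle exhibit.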


References

  [1] Basu, D. (1975). Statistical information and likelihood, with discussion and correspondence between Barnard and Basu. Sankhyā, Ser. A, 37, 1–71.
  [2] Birnbaum, A. (1962). On the foundations of statistical inference. J. Amer. Statist. Assoc., 57, 269–306.
  [3] Birnbaum, A. (1972). More on concepts of statistical evidence. J. Amer. Statist. Assoc., 67, 858–861.

Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  1. University Professor of Statistics, Carnegie Mellon University, Pittsburgh, USA