Abstract
This article focuses on an important piece of work by the world-renowned Indian statistician Calyampudi Radhakrishna Rao. In 1945, C. R. Rao (then 25 years old) published a pathbreaking paper [43] that had a profound impact on subsequent statistical research. Roughly speaking, Rao obtained a lower bound on the variance of an estimator. The importance of this work can be gauged, for instance, by the fact that it has been reprinted in the volume Breakthroughs in Statistics: Foundations and Basic Theory [32]. This work has had two major impacts:
- First, it answers a fundamental question that statisticians have always been interested in: how good can a statistical estimator be? Is there a fundamental limit when estimating statistical parameters?
- Second, it opens up a novel paradigm by introducing differential-geometric modeling ideas into the field of statistics. In recent years, this contribution has led to the birth of the flourishing field of Information Geometry [6].
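Both impacts stem from the same quantity, the Fisher information. As a brief editorial gloss in modern notation (symbols not taken from the original abstract), Rao's variance bound and the Riemannian metric he introduced on the parameter space read:

```latex
% Cramér-Rao lower bound: for an unbiased estimator \hat{\theta}(X) of a
% scalar parameter \theta,
\operatorname{Var}_{\theta}\!\bigl[\hat{\theta}(X)\bigr] \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial\theta}\,\log p(X;\theta)\right)^{\!2}\,\right].
% The same Fisher information, taken componentwise for a multivariate
% parameter \theta = (\theta^1, \dots, \theta^d), defines a Riemannian
% metric tensor on the parameter space:
ds^2 \;=\; \sum_{i,j} g_{ij}(\theta)\, d\theta^i\, d\theta^j,
\qquad
g_{ij}(\theta) \;=\; \mathbb{E}_{\theta}\!\left[\frac{\partial \log p(X;\theta)}{\partial\theta^i}\,\frac{\partial \log p(X;\theta)}{\partial\theta^j}\right].
% Geodesic distances under this metric give Rao's distance between
% distributions [8], the starting point of Information Geometry [6].
```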
References
Ali, S.M. and Silvey, S. D. (1966). A general class of coefficients of divergence of one distribution from another. J. Roy. Statist. Soc. Series B 28, 131–142.
Altun, Y., Smola, A. J. and Hofmann, T. (2004). Exponential families for conditional random fields. In Uncertainty in Artificial Intelligence (UAI), pp. 2–9.
Amari, S. (1995). Information geometry of the EM and em algorithms for neural networks. Neural Networks 8, 1379–1408.
Amari, S. (2009). Alpha-divergence is unique, belonging to both f-divergence and Bregman divergence classes. IEEE Trans. Inf. Theor. 55, 4925–4931.
Amari, S., Barndorff-Nielsen, O. E., Kass, R. E., Lauritzen, S. L. and Rao, C. R. (1987). Differential Geometry in Statistical Inference. Lecture Notes-Monograph Series. Institute of Mathematical Statistics.
Amari, S. and Nagaoka, H. (2000). Methods of Information Geometry. Oxford University Press.
Arwini, K. and Dodson, C. T. J. (2008). Information Geometry: Near Randomness and Near Independence. Lecture Notes in Mathematics # 1953, Berlin: Springer.
Atkinson, C. and Mitchell, A. F. S. (1981). Rao’s distance measure. Sankhyā Series A 43, 345–365.
Banerjee, A., Merugu, S., Dhillon, I. S. and Ghosh, J. (2005). Clustering with Bregman divergences. J. Machine Learning Res. 6, 1705–1749.
Barbaresco, F. (2009). Interactions between symmetric cone and information geometries: Bruhat-Tits and Siegel spaces models for high resolution autoregressive Doppler imagery. In Emerging Trends in Visual Computing (F. Nielsen, Ed.) Lecture Notes in Computer Science # 5416, pp. 124–163. Berlin / Heidelberg: Springer.
Bhatia, R. and Holbrook, J. (2006). Riemannian geometry and matrix geometric means. Linear Algebra Appl. 413, 594–618.
Bhattacharyya, A. (1943). On a measure of divergence between two statistical populations defined by their probability distributions. Bull. Calcutta Math. Soc. 35, 99–110.
Bregman, L. M. (1967). The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming. USSR Computational Mathematics and Mathematical Physics 7, 200–217.
Brown, L. D. (1986). Fundamentals of Statistical Exponential Families: with Applications in Statistical Decision Theory. Institute of Mathematical Statistics, Hayward, CA, USA.
Cardoso, J. F. (2003). Dependence, correlation and Gaussianity in independent component analysis. J. Machine Learning Res. 4, 1177–1203.
Cena, A. and Pistone, G. (2007). Exponential statistical manifold. Ann. Instt. Statist. Math. 59, 27–56.
Champkin, J. (2011). C. R. Rao. Significance 8, 175–178.
Chentsov, N. N. (1982). Statistical Decision Rules and Optimal Inference. Translations of Mathematical Monographs # 53, American Mathematical Society (published in Russian in 1972).
Cover, T. M. and Thomas, J. A. (1991). Elements of Information Theory. New York: Wiley.
Cramér, H. (1946). Mathematical Methods of Statistics. NJ, USA: Princeton University Press.
Csiszár, I. (1967). Information-type measures of difference of probability distributions and indirect observation. Studia Scientia. Mathematica. Hungarica 2, 229–318.
Darmois, G. (1945). Sur les limites de la dispersion de certaines estimations. Rev. Internat. Stat. Instt. 13.
Dawid, A. P. (2007). The geometry of proper scoring rules. Ann. Instt. Statist. Math. 59, 77–93.
del Carmen Pardo, M. C. and Vajda, I. (1997). About distances of discrete distributions satisfying the data processing theorem of information theory. IEEE Trans. Inf. Theory 43, 1288–1293.
Dempster, A. P., Laird, N. M. and Rubin, D. B. (1977). Maximum likelihood from incomplete data via the EM algorithm. J. Roy. Statist. Soc. Series B 39, 1–38.
Fisher, R. A. (1922). On the mathematical foundations of theoretical statistics. Phil. Trans. Roy. Soc. London, A 222, 309–368.
Fréchet, M. (1943). Sur l’extension de certaines évaluations statistiques au cas de petits échantillons. Internat. Statist. Rev. 11, 182–205.
Gangbo, W. and McCann, R. J. (1996). The geometry of optimal transportation. Acta Math. 177, 113–161.
Grasselli, M. R. and Streater, R. F. (2001). On the uniqueness of the Chentsov metric in quantum information geometry. Infinite Dimens. Anal., Quantum Probab. and Related Topics 4, 173–181.
Kass, R. E. and Vos, P. W. (1997). Geometrical Foundations of Asymptotic Inference. New York: Wiley.
Koopman, B. O. (1936). On distributions admitting a sufficient statistic. Trans. Amer. Math. Soc. 39, 399–409.
Kotz, S. and Johnson, N. L. (Eds.) (1993). Breakthroughs in Statistics: Foundations and Basic Theory, Volume I. New York: Springer.
Lehmann, E. L. and Casella, G. (1998). Theory of Point Estimation 2nd ed. New York: Springer.
Lovric, M., Min-Oo, M. and Ruh, E. A. (2000). Multivariate normal distributions parametrized as a Riemannian symmetric space. J. Multivariate Anal. 74, 36–48.
Mahalanobis, P. C. (1936). On the generalized distance in statistics. Proc. National Instt. Sci., India 2, 49–55.
Mahalanobis, P. C. (1948). Historical note on the D²-statistic. Sankhyā 9, 237–240.
Maybank, S., Ieng, S. and Benosman, R. (2011). A Fisher-Rao metric for para-catadioptric images of lines. Internat. J. Computer Vision, 1–19.
Morozova, E. A. and Chentsov, N. N. (1991). Markov invariant geometry on manifolds of states. J. Math. Sci. 56, 2648–2669.
Murata, N., Takenouchi, T., Kanamori, T. and Eguchi, S. (2004). Information geometry of U-boost and Bregman divergence. Neural Comput. 16, 1437–1481.
Murray, M. K. and Rice, J. W. (1993). Differential Geometry and Statistics. Chapman and Hall/CRC.
Peter, A. and Rangarajan, A. (2006). A new closed-form information metric for shape analysis. In Medical Image Computing and Computer Assisted Intervention (MICCAI) Volume 1, pp. 249–256.
Qiao, Y. and Minematsu, N. (2010). A study on invariance of f-divergence and its application to speech recognition. IEEE Trans. Signal Process. 58, 3884–3890.
Rao, C. R. (1945). Information and the accuracy attainable in the estimation of statistical parameters. Bull. Calcutta Math. Soc. 37, 81–89.
Rao, C. R. (2010). Quadratic entropy and analysis of diversity. Sankhyā, Series A, 72, 70–80.
Shen, Z. (2006). Riemann-Finsler geometry with applications to information geometry. Chinese Annals of Mathematics 27B, 73–94.
Shima, H. (2007). The Geometry of Hessian Structures. Singapore: World Scientific.
Watanabe, S. (2009). Algebraic Geometry and Statistical Learning Theory. Cambridge: Cambridge University Press.
Copyright information
© 2013 Hindustan Book Agency
About this chapter
Cite this chapter
Nielsen, F. (2013). Cramér-Rao Lower Bound and Information Geometry. In: Bhatia, R., Rajan, C.S., Singh, A.I. (eds) Connected at Infinity II. Texts and Readings in Mathematics, vol 67. Hindustan Book Agency, Gurgaon. https://doi.org/10.1007/978-93-86279-56-9_2
DOI: https://doi.org/10.1007/978-93-86279-56-9_2
Publisher Name: Hindustan Book Agency, Gurgaon
Print ISBN: 978-93-80250-51-9
Online ISBN: 978-93-86279-56-9
eBook Packages: Mathematics and Statistics (R0)