Abstract
As a preparation for the study of quantum information theory, this chapter briefly summarizes, from a unified viewpoint, the fundamentals of mathematical statistics and information geometry, which are mainly examined in a nonquantum context. Since these topics are usually treated individually, this chapter will be useful even for nonquantum applications.
Notes
- 1.
Here we adopt the convention that \( 0 \log 0 = 0 \).
- 2.
The term relative entropy is commonly used in statistical physics. In information theory, it is generally known as the Kullback–Leibler divergence, while in statistics it is known as the Kullback–Leibler information.
- 3.
In this book, monotonicity refers to only the monotonicity regarding the change in probability distributions or density matrices.
- 4.
This quantity is more commonly used in information theory, where it is called f-divergence [1]. In this text, we prefer to use the term “relative entropy” for all relative-entropy-like quantities.
- 5.
If L is not an integer, we consider the largest integer that does not exceed L.
- 6.
\( \mathop {\mathrm{argmin}}_{0\le s\le 1} f(s) \) returns the value of s that yields \( \min _{0\le s\le 1} f(s)\). \( \mathop {\mathrm{argmax}}\) is similarly defined.
- 7.
The superscript (e) means “exponential.” This is because A corresponds to the exponential representation, as discussed later.
- 8.
- 9.
This is generally true for all probability distribution families, although some regularity conditions must be imposed. For example, consider the case in which \(\varOmega \) consists of finitely many elements. These regularity conditions are satisfied when the first and second derivatives with respect to \(\theta \) are continuous. Generally, the central limit theorem is used in the proof [7].
- 10.
The interior of a set X consists of the elements of X that do not lie on its boundary. For example, in one dimension, the interior of \([0,0.5]\cup \{0.7\}\) is (0, 0.5), and the closure of that interior is [0, 0.5]. Therefore, the condition is not satisfied in this case.
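The conventions in notes 1, 2, and 6 can be illustrated concretely. The following Python sketch (the function and variable names are my own, not from the text) computes the relative entropy with the \( 0 \log 0 = 0 \) convention of note 1, and evaluates an argmin over a grid in the sense of note 6:

```python
import numpy as np

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q) in nats,
    using the convention 0 log 0 = 0 (note 1)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0  # terms with p_i = 0 contribute 0 by the convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Example: the third term vanishes by the 0 log 0 = 0 convention,
# leaving D(p||q) = 0.5 log 2 + 0.5 log 2 = log 2.
p = [0.5, 0.5, 0.0]
q = [0.25, 0.25, 0.5]
d = relative_entropy(p, q)

# argmin over a grid (note 6): the minimizer of f(s) = (s - 0.3)^2
# over s in [0, 1] is s = 0.3.
s_grid = np.linspace(0.0, 1.0, 1001)
s_star = s_grid[np.argmin((s_grid - 0.3) ** 2)]
```

Note that \( D(p\Vert p) = 0 \), and that the relative entropy is finite here only because \( q_i > 0 \) wherever \( p_i > 0 \).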
References
I. Csiszár, Information-type measures of difference of probability distributions and indirect observations. Studia Scient. Math. Hungar. 2, 299–318 (1967)
S. Amari, H. Nagaoka, Methods of Information Geometry (AMS & Oxford University Press, Oxford, 2000)
A. Rényi, On measures of information and entropy, in Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability (University of California Press, Berkeley, 1961), pp. 547–561
R.M. Fano, Transmission of Information: A Statistical Theory of Communication (Wiley, New York, 1961)
M. Hayashi, Security analysis of \(\varepsilon \)-almost dual universal\(_2\) hash functions: smoothing of min entropy vs. smoothing of Rényi entropy of order 2 (2013). arXiv:1309.1596
S. Amari, \(\alpha \)-divergence is unique, belonging to both \(f\)-divergence and Bregman divergence classes. IEEE Trans. Inform. Theory 55(11), 4925–4931 (2009)
A.W. van der Vaart, Asymptotic Statistics (Cambridge University Press, Cambridge, 1998)
I. Csiszár, J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems (Academic, 1981)
I.N. Sanov, On the probability of large deviations of random variables. Mat. Sbornik 42, 11–44 (1957) (in Russian). English translation: Selected Translat. Math. Stat. 1, 213–244 (1961)
M. Keyl, R.F. Werner, Estimating the spectrum of a density operator. Phys. Rev. A 64, 052311 (2001)
K. Matsumoto, Seminar notes (1999)
M. Hayashi, Optimal sequence of POVMs in the sense of Stein’s lemma in quantum hypothesis. J. Phys. A Math. Gen. 35, 10759–10773 (2002)
M. Hayashi, Exponents of quantum fixed-length pure state source coding. Phys. Rev. A 66, 032321 (2002)
M. Hayashi, K. Matsumoto, Variable length universal entanglement concentration by local operations and its application to teleportation and dense coding, quant-ph/0109028 (2001); K. Matsumoto, M. Hayashi, Universal entanglement concentration. Phys. Rev. A 75, 062338 (2007)
M. Hayashi, K. Matsumoto, Quantum universal variable-length source coding. Phys. Rev. A 66, 022311 (2002)
M. Hayashi, K. Matsumoto, Simple construction of quantum universal variable-length source coding. Quant. Inf. Comput. 2, Special Issue, 519–529 (2002)
M. Hayashi, Asymptotics of quantum relative entropy from a representation theoretical viewpoint. J. Phys. A Math. Gen. 34, 3413–3419 (2001)
H. Cramér, Sur un nouveau théorème-limite de la théorie des probabilités, in Actualités Scientifiques et Industrielles, no. 736, Colloque consacré à la théorie des probabilités (Hermann, Paris, 1938), pp. 5–23
J. Gärtner, On large deviations from the invariant measure. Theory Prob. Appl. 22, 24–39 (1977)
R. Ellis, Large deviations for a general class of random vectors, Ann. Probab. 12, 1, 1–12 (1984); Entropy, Large Deviations and Statistical Mechanics (Springer, Berlin, 1985)
R.R. Bahadur, On the asymptotic efficiency of tests and estimates. Sankhyā 22, 229 (1960)
R.R. Bahadur, Rates of Convergence of Estimates and Test Statistics. Ann. Math. Stat. 38, 303 (1967)
R.R. Bahadur, Some limit theorems in statistics, in Regional Conference Series in Applied Mathematics, no. 4 (SIAM, Philadelphia, 1971)
J.C. Fu, On a theorem of Bahadur on the rate of convergence of point estimators. Ann. Stat. 1, 745 (1973)
A.I. Khinchin, Mathematical Foundations of Information Theory (Dover, New York, 1957)
T.S. Han, Information-Spectrum Methods in Information Theory (Springer, Berlin, 2002) (originally appeared in Japanese in 1998)
V.D. Milman, G. Schechtman, Asymptotic theory of finite-dimensional normed spaces, vol. 1200, Lecture Notes in Mathematics (Springer, Berlin, 1986)
T. Cover, J. Thomas, Elements of Information Theory (Wiley, New York, 1991)
M. Hayashi, Exponential decreasing rate of leaked information in universal random privacy amplification. IEEE Trans. Inf. Theory 57, 3989–4001 (2011)
M. Iwamoto, J. Shikata, Information theoretic security for encryption based on conditional Rényi entropies. Inform. Theor. Secur. Lect. Notes Comput. Sci. 8317(2014), 103–121 (2014)
M. Müller-Lennert, F. Dupuis, O. Szehr, S. Fehr, M. Tomamichel, On quantum Rényi entropies: a new generalization and some properties. J. Math. Phys. 54, 122203 (2013)
E.L. Lehmann, G. Casella, Theory of Point Estimation (Springer, New York, 1998)
A. Dembo, O. Zeitouni, Large Deviation Techniques and Applications (Springer, Berlin, 1997)
J.A. Bucklew, Large Deviation Techniques in Decision, Simulation, and Estimation (Wiley, New York, 1990)
Copyright information
© 2017 Springer-Verlag Berlin Heidelberg
Cite this chapter
Hayashi, M. (2017). Information Quantities and Parameter Estimation in Classical Systems. In: Quantum Information Theory. Graduate Texts in Physics. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-49725-8_2
DOI: https://doi.org/10.1007/978-3-662-49725-8_2
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-662-49723-4
Online ISBN: 978-3-662-49725-8
eBook Packages: Physics and Astronomy (R0)