In classical probability theory, there are two important concepts which measure the amount of “information” carried by a given distribution: the Fisher information and the entropy. Various relations between these quantities form a cornerstone of classical probability theory and statistics. Voiculescu introduced free probability analogues of these quantities, called free Fisher information and free entropy, denoted by Φ and χ, respectively. However, some gaps remain in our present understanding of these quantities. In particular, there are two different approaches, each of which yields a notion of entropy and Fisher information. One hopes that the two approaches can eventually be proved to give the same result, but at the moment this is not known. Thus, for the time being, we have to distinguish the entropy χ and the free Fisher information Φ coming from the first approach (via microstates) from the free entropy χ∗ and the free Fisher information Φ∗ coming from the second, non-microstates approach (via conjugate variables).
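For orientation, it may help to recall the classical quantities that these free analogues mirror. These are the standard textbook definitions, not taken from this chapter: for a real random variable X with smooth density f, the entropy and the Fisher information are

```latex
% Classical entropy and Fisher information of a density f on R
H(X) = -\int_{\mathbb{R}} f(x)\,\log f(x)\,dx,
\qquad
I(X) = \int_{\mathbb{R}} \frac{f'(x)^{2}}{f(x)}\,dx .

% One classical relation linking the two (de Bruijn's identity),
% where Z is a standard Gaussian independent of X:
\frac{d}{dt}\, H\!\left(X + \sqrt{t}\,Z\right)
  = \frac{1}{2}\, I\!\left(X + \sqrt{t}\,Z\right).
```

The free quantities Φ, χ (and their non-microstates counterparts Φ∗, χ∗) play structurally analogous roles in free probability, with free Brownian motion replacing the Gaussian perturbation in relations of de Bruijn type.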
P. Koosis, Introduction to Hp Spaces. Cambridge Tracts in Mathematics, vol. 115, 2nd edn. (Cambridge University Press, Cambridge, 1998)
T. Mai, R. Speicher, M. Weber, Absence of algebraic relations and of zero divisors under the assumption of full non-microstates free entropy dimension. Adv. Math. 304, 1080–1107 (2017)
F.J. Murray, J. von Neumann, On rings of operators. Ann. Math. (2) 37(1), 116–229 (1936)
M. Reed, B. Simon, Methods of Modern Mathematical Physics. I: Functional Analysis (Academic, New York/London, 1980)
E.M. Stein, G. Weiss, Introduction to Fourier Analysis on Euclidean Spaces (Princeton University Press, Princeton, 1971)
D. Voiculescu, The analogues of entropy and of Fisher’s information measure in free probability theory. V. Noncommutative Hilbert transforms. Invent. Math. 132(1), 189–227 (1998)
D. Voiculescu, The analogues of entropy and of Fisher’s information measure in free probability theory. VI. Liberation and mutual free information. Adv. Math. 146(2), 101–166 (1999)