Summary
The informational divergence between stochastic matrices is not a metric. In this paper we show, however, that consistent definitions of ‘spheres’, ‘segments’ and ‘straight lines’ can be given using the divergence as a kind of ‘distance’ between stochastic matrices. This clarifies the geometric nature of many ‘reliability functions’ of Information Theory and Mathematical Statistics.
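The failure of the divergence to be a metric is easy to see numerically: it is not symmetric in its arguments. The sketch below illustrates this with a row-wise Kullback–Leibler divergence between two stochastic matrices. The unweighted row-sum definition used here is an illustrative assumption, not necessarily the paper's exact definition, and the matrices `P` and `Q` are hypothetical examples.

```python
from math import log

def kl_divergence(p, q):
    """Informational (Kullback-Leibler) divergence D(p || q)
    between two probability vectors, in nats."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def matrix_divergence(P, Q):
    """Divergence between two stochastic matrices, taken here as the
    sum of the divergences of corresponding rows. (An illustrative
    convention; a weighted row average is another common choice.)"""
    return sum(kl_divergence(p_row, q_row) for p_row, q_row in zip(P, Q))

# Two 2x2 stochastic matrices (each row sums to 1).
P = [[0.9, 0.1], [0.2, 0.8]]
Q = [[0.5, 0.5], [0.5, 0.5]]

d_pq = matrix_divergence(P, Q)
d_qp = matrix_divergence(Q, P)
print(d_pq, d_qp)  # the two values differ: the divergence is asymmetric
```

Since `d_pq != d_qp` in general, the divergence violates the symmetry axiom of a metric, which is why ‘spheres’ and ‘segments’ must be defined with care about the order of the arguments.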
Additional information
This work was done within the GNIM-CNR research activity.
Cite this article
Sgarro, A. An informational divergence geometry for stochastic matrices. Calcolo 15, 41–49 (1978). https://doi.org/10.1007/BF02576044