Periodica Mathematica Hungarica, Volume 2, Issue 1–4, pp 223–234

On the f-divergence and singularity of probability measures

  • I. Vajda




References

  1. I. Csiszár, Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizität von Markoffschen Ketten, Publ. Math. Inst. Hungar. Acad. Sci. 8 (1963), Ser. A, 85–108.
  2. I. Csiszár, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hungar. 2 (1967), 299–318.
  3. M. S. Pinsker, Information and information stability of random variables and processes, Moscow, 1960 (in Russian).
  4. S. Kullback, Information theory and statistics, New York, 1959.
  5. H. Hahn, Über die Integrale des Herrn Hellinger und die Orthogonalinvarianten der quadratischen Formen von unendlich vielen Veränderlichen, Monatsh. Math. Phys. 23 (1912), 161–224.
  6. A. M. Kagan, Towards the theory of Fisher's amount of information, Dokl. Akad. Nauk SSSR 151 (1963), 277–278 (in Russian).
  7. I. Vajda, On the amount of information contained in a sequence of independent observations, Kybernetika (Prague) 6 (1970), 306–323.
  8. A. Rényi, On the foundations of information theory, Rev. ISI 33 (1965), 1–14.
  9. A. Rényi, On measures of entropy and information, Proc. 4th Berkeley Symp. on Math. Stat. and Prob., Vol. I, Berkeley, 1960, 547–561.
  10. G. H. Hardy, J. E. Littlewood and G. Pólya, Inequalities, Cambridge, 1959.
  11. S. Kullback, A lower bound for discrimination information in terms of variation, IEEE Trans. Information Theory 13 (1967), 126–127.
  12. P. R. Halmos, Measure theory, New York, 1966.
  13. J. L. Doob, Stochastic processes, New York, 1953.
  14. A. Perez, Notions généralisées d'incertitude, d'entropie et d'information du point de vue de la théorie des martingales, Trans. 1st Prague Conf. on Information Theory, Prague, 1957, 193–208.
  15. T. E. Duncan, On the absolute continuity of measures, Ann. Math. Statist. 41 (1970), 30–38.
  16. I. Vajda, Limit theorems for total variation of Cartesian product measures, Studia Sci. Math. Hungar. 6 (1971), 317–333.
  17. S. Kakutani, On equivalence of infinite product measures, Ann. of Math. 49 (1948), 214–226.
  18. J. Hájek, On a property of normal distributions of an arbitrary stochastic process, Czechoslovak Math. J. 8 (1958), 610–618.
  19. I. Vajda, Note on discrimination information and variation, IEEE Trans. Information Theory 16 (1970), 771–773.

Copyright information

© Akadémiai Kiadó 1972

Authors and Affiliations

  • I. Vajda
  1. Institute of Information Theory and Automation, Czechoslovak Academy of Sciences, Prague 2, Czechoslovakia
