Annals of the Institute of Statistical Mathematics, Volume 55, Issue 3, pp 639–653

A new class of metric divergences on probability spaces and its applicability in statistics

  • Ferdinand Österreicher
  • Igor Vajda

DOI: 10.1007/BF02517812

Cite this article as:
Österreicher, F. & Vajda, I. Ann Inst Stat Math (2003) 55: 639. doi:10.1007/BF02517812


Abstract

The class \(I_{f_\beta}\), β ∈ (0, ∞], of f-divergences investigated in this paper is defined in terms of a class of entropies introduced by Arimoto (1971, Information and Control, 19, 181–194). It contains the squared Hellinger distance (for β = 1/2), the sum I(Q1 ‖ (Q1+Q2)/2) + I(Q2 ‖ (Q1+Q2)/2) of Kullback-Leibler divergences (for β = 1) and half of the variation distance (for β = ∞), and continuously extends the class of squared perimeter-type distances introduced by Österreicher (1996, Kybernetika, 32, 389–393) (for β ∈ (1, ∞]). It is shown that \((I_{f_\beta } (Q_1 ,Q_2 ))^{\min (\beta ,1/2)}\) are distances of probability distributions Q1, Q2 for β ∈ (0, ∞). The applicability of \(I_{f_\beta }\)-divergences in statistics is also considered. In particular, it is shown that the \(I_{f_\beta }\)-projections of appropriate empirical distributions to regular families define distribution estimates which, in the case of an i.i.d. sample of size n, are consistent. The order of consistency is investigated as well.
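The three named special cases of the family can be sketched numerically for discrete distributions. This is a minimal illustration only, not the paper's general \(I_{f_\beta}\) construction from Arimoto entropies; the function names are assumptions, and the squared-Hellinger normalization (sum of squared root differences, without a factor 1/2) is one of several conventions in the literature:

```python
import math

def hellinger_sq(p, q):
    # Squared Hellinger distance: the beta = 1/2 member of the family.
    return sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))

def kl(p, q):
    # Kullback-Leibler divergence I(P || Q), with the convention 0 log 0 = 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def capacitory(p, q):
    # The beta = 1 member: I(Q1 || M) + I(Q2 || M) with M = (Q1 + Q2)/2,
    # i.e. twice the Jensen-Shannon divergence.
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return kl(p, m) + kl(q, m)

def half_variation(p, q):
    # The beta = infinity member: half of the total variation distance.
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

P = [0.5, 0.3, 0.2]
Q = [0.2, 0.5, 0.3]

print(hellinger_sq(P, Q), capacitory(P, Q), half_variation(P, Q))
```

Per the metrization result quoted above, raising each divergence to the power min(β, 1/2) yields a metric; e.g. the square root of `hellinger_sq` (β = 1/2) satisfies the triangle inequality, while `half_variation` (β = ∞, exponent min(∞, 1/2) applied as in the paper) is itself proportional to a metric.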

Key words and phrases

Dissimilarities, metric divergences, minimum distance estimators

Copyright information

© The Institute of Statistical Mathematics 2003

Authors and Affiliations

  • Ferdinand Österreicher (1)
  • Igor Vajda (2)
  1. Institute of Mathematics, University of Salzburg, Salzburg, Austria
  2. Institute of Information Theory and Automation, Academy of Sciences, Prague, Czech Republic