Pattern Analysis and Applications, Volume 8, Issue 1–2, pp 95–101

Estimation of generalized entropies with sample spacing

  • Mark P. Wachowiak
  • Renata Smolíková
  • Georgia D. Tourassi
  • Adel S. Elmaghraby
Theoretical Advances

Abstract

In addition to the well-known Shannon entropy, generalized entropies, such as the Rényi and Tsallis entropies, are increasingly used in many applications. These entropies are typically computed with nonparametric kernel methods, which are commonly used to estimate the density function of empirical data. Generalized entropy estimation techniques for one-dimensional data based on sample spacings are proposed. Computational experiments show that these techniques are robust and accurate, compare favorably with the popular Parzen window method for entropy estimation, and, in many cases, require fewer computations than Parzen methods.
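To make the two families of estimators concrete, the sketch below contrasts a generic m-spacing plug-in estimate of the Rényi entropy H_α = (1/(1−α)) log ∫ f(x)^α dx with a resubstitution Parzen-window (Gaussian kernel) estimate, and converts the result to the Tsallis entropy S_q = (1 − ∫ f(x)^q dx)/(q − 1) of the same order. This is a minimal illustration of the approach described in the abstract, not the authors' exact algorithm; the spacing parameter m ≈ √n and Silverman's bandwidth rule are heuristic choices introduced here for the example.

```python
# A minimal sketch, not the authors' exact algorithm: a generic m-spacing
# plug-in estimator of the Renyi entropy, compared with a resubstitution
# Parzen-window (Gaussian kernel) estimate.  The choices m ~ sqrt(n) and
# Silverman's bandwidth rule are illustrative assumptions.
import numpy as np

def renyi_spacing(x, alpha, m=None):
    """m-spacing plug-in estimate of the order-alpha Renyi entropy (alpha != 1)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))          # common heuristic choice
    # Boundary convention: x_(i) = x_(1) for i < 1 and x_(i) = x_(n) for i > n.
    xp = np.concatenate([np.full(m, x[0]), x, np.full(m, x[-1])])
    spacings = xp[2 * m:] - xp[:n]                  # x_(i+m) - x_(i-m)
    fhat = 2.0 * m / (n * spacings)                 # local density estimate
    return np.log(np.mean(fhat ** (alpha - 1.0))) / (1.0 - alpha)

def renyi_parzen(x, alpha, h=None):
    """Resubstitution Parzen-window estimate of the Renyi entropy (alpha != 1)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if h is None:
        h = 1.06 * x.std(ddof=1) * n ** (-0.2)      # Silverman's rule of thumb
    diffs = (x[:, None] - x[None, :]) / h           # O(n^2) pairwise kernel sums
    fhat = np.exp(-0.5 * diffs ** 2).sum(axis=1) / (n * h * np.sqrt(2.0 * np.pi))
    return np.log(np.mean(fhat ** (alpha - 1.0))) / (1.0 - alpha)

def tsallis_from_renyi(h_alpha, q):
    """Tsallis entropy of order q from the Renyi estimate of the same order."""
    return (1.0 - np.exp((1.0 - q) * h_alpha)) / (q - 1.0)

rng = np.random.default_rng(0)
sample = rng.standard_normal(1000)
exact = np.log(2.0 * np.sqrt(np.pi))                # H_2 of N(0, 1), approx 1.2655
print(renyi_spacing(sample, 2.0), renyi_parzen(sample, 2.0), exact)
```

On 1,000 standard normal samples, both estimates fall close to the exact value H_2 = log(2√π) ≈ 1.266. The spacing estimator sorts once and avoids the O(n²) pairwise kernel sums of the Parzen approach, consistent with the abstract's claim that spacing methods often require fewer computations.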

Keywords

Generalized entropy · Rényi entropy · Parzen windows · Sample spacings · Order statistics · Nonparametric estimation

Acknowledgments

The authors thank the anonymous reviewers for helpful criticisms and suggestions.


Copyright information

© Springer-Verlag London Limited 2005

Authors and Affiliations

  • Mark P. Wachowiak (1)
  • Renata Smolíková (1)
  • Georgia D. Tourassi (2)
  • Adel S. Elmaghraby (2)
  1. Imaging Research Laboratories, Robarts Research Institute, London, Canada
  2. Department of Computer Engineering and Computer Science, University of Louisville, Louisville, USA