Multi-class and Cluster Evaluation Measures Based on Rényi and Tsallis Entropies and Mutual Information

  • Thomas Villmann
  • Tina Geweniger
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10841)

Abstract

The evaluation of cluster and classification models against ground truth information or against other models remains an open issue in many applications. It frequently leads to controversial debates about the informative content of the respective measures. This holds in particular for cluster evaluation, but similar problems also occur in classification with imbalanced class cardinalities. One way to handle such evaluation tasks more naturally is to frame comparisons in terms of shared and non-shared information. Information-theoretic quantities such as mutual information and divergence are designed to answer exactly these questions. Besides formulations based on the most prominent Shannon entropy, alternative definitions based on relaxed entropy concepts are known; examples are the Rényi and Tsallis entropies. Obviously, using these entropy concepts requires a corresponding readjustment of mutual information and of the evaluation measures derived from it. In the present paper we consider several information-theoretic evaluation measures based on different entropy concepts and compare them both theoretically and with respect to their performance in applications.
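As a point of reference (not part of the original abstract), recall the standard definitions: the Rényi entropy is H_α(p) = (1 − α)⁻¹ log Σᵢ pᵢ^α and the Tsallis entropy is S_q(p) = (1 − Σᵢ pᵢ^q)/(q − 1), both recovering the Shannon entropy −Σᵢ pᵢ log pᵢ in the limit α, q → 1. The following minimal Python sketch, hypothetical rather than taken from the paper, estimates these entropies from the contingency table of two labelings and forms a generalized mutual information via the additive decomposition I(A;B) = H(A) + H(B) − H(A,B). Note that for α ≠ 1 this additive form is only one of several inequivalent generalizations; this readjustment is exactly the issue the paper examines.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                                   # 0 * log 0 := 0 convention
    if np.isclose(alpha, 1.0):                     # alpha -> 1 recovers Shannon entropy
        return -np.sum(p * np.log(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):                         # q -> 1 recovers Shannon entropy
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def generalized_mutual_information(labels_a, labels_b, entropy, param):
    """I(A;B) = H(A) + H(B) - H(A,B), estimated from the contingency
    table of two labelings.  For param != 1 this additive form is only
    one of several inequivalent generalizations of mutual information."""
    a = np.asarray(labels_a)
    b = np.asarray(labels_b)
    joint = np.zeros((a.max() + 1, b.max() + 1))
    np.add.at(joint, (a, b), 1.0)                  # contingency counts
    joint /= joint.sum()                           # joint probabilities
    h_a = entropy(joint.sum(axis=1), param)        # marginal over clusterings A
    h_b = entropy(joint.sum(axis=0), param)        # marginal over clusterings B
    h_ab = entropy(joint.ravel(), param)           # joint entropy
    return h_a + h_b - h_ab

# Toy comparison of a clustering against a ground-truth labeling.
truth   = np.array([0, 0, 0, 1, 1, 1, 2, 2, 2])
cluster = np.array([0, 0, 1, 1, 1, 1, 2, 2, 0])
for alpha in (0.5, 1.0, 2.0):
    mi = generalized_mutual_information(truth, cluster, renyi_entropy, alpha)
    print(f"alpha = {alpha}: Renyi MI = {mi:.4f}")
```

Sweeping the parameter, as in the loop above, shows how the choice of α (or q for Tsallis) reweights rare versus dominant clusters, which is why such measures behave differently from Shannon-based ones under imbalanced class cardinalities.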

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Saxony Institute for Computational Intelligence and Machine Learning, University of Applied Sciences Mittweida, Mittweida, Germany
