Measuring information beyond communication theory—Why some generalized information measures may be useful, others not

  • Survey Paper
  • Published in: aequationes mathematicae

Summary (and Keywords)

Non-communication models for information theory: games and experiments. Measures of uncertainty and information: entropies, divergences, information improvements.
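
To fix ideas, here is a minimal numerical sketch (in Python) of three of the quantities named above: Shannon entropy, the directed (Kullback-Leibler) divergence, and an information-improvement measure in Theil's sense. The function names, the base-2 logarithm and the particular form sum p_i log(r_i/q_i) for the improvement are choices made for this sketch, not prescriptions of the paper.

    import math

    def shannon_entropy(p, base=2.0):
        """Shannon entropy H(P) = -sum p_i log p_i (terms with p_i = 0 are skipped)."""
        return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

    def directed_divergence(p, q, base=2.0):
        """Kullback-Leibler directed divergence D(P||Q) = sum p_i log(p_i / q_i)."""
        return sum(pi * math.log(pi / qi, base) for pi, qi in zip(p, q) if pi > 0)

    def information_improvement(p, q, r, base=2.0):
        """Improvement of the forecast Q into R, judged against the true P:
        I(P; Q, R) = sum p_i log(r_i / q_i) = D(P||Q) - D(P||R)."""
        return sum(pi * math.log(ri / qi, base)
                   for pi, qi, ri in zip(p, q, r) if pi > 0)

    p = [0.5, 0.3, 0.2]      # "true" distribution
    q = [1/3, 1/3, 1/3]      # initial forecast
    r = [0.45, 0.35, 0.20]   # revised forecast
    print(shannon_entropy(p))                # about 1.49 bits
    print(directed_divergence(p, q))         # > 0 unless P = Q
    print(information_improvement(p, q, r))  # > 0: the revision was an improvement

A positive value of the last call simply records that the revised forecast R is closer to P than Q was, via the decomposition I(P; Q, R) = D(P||Q) - D(P||R).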

Some useful properties of information measures: symmetry, bounds, behaviour under composition, branching, conditional measures, sources. Rényi measures, measures of higher degree.
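
As a companion sketch under the same assumptions (Python, base-2 logarithms, illustrative function names renyi_entropy and entropy_of_degree, and one common normalization for the degree-beta family), the Rényi entropy of order alpha and the Havrda-Charvát/Daróczy entropy of degree beta can be computed side by side; both reduce to the Shannon entropy as their parameter tends to 1.

    import math

    def renyi_entropy(p, alpha, base=2.0):
        """Renyi entropy of order alpha (alpha > 0, alpha != 1):
        H_alpha(P) = log(sum p_i^alpha) / (1 - alpha)."""
        return math.log(sum(pi ** alpha for pi in p if pi > 0), base) / (1.0 - alpha)

    def entropy_of_degree(p, beta, base=2.0):
        """Havrda-Charvat/Daroczy entropy of degree beta (beta != 1), normalized so
        that a fair coin has entropy 1:
        H^beta(P) = (sum p_i^beta - 1) / (2^(1 - beta) - 1)."""
        return (sum(pi ** beta for pi in p if pi > 0) - 1.0) / (base ** (1.0 - beta) - 1.0)

    def shannon_entropy(p, base=2.0):
        return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

    p = [0.5, 0.3, 0.2]
    for t in (0.5, 0.999, 2.0):
        print(t, renyi_entropy(p, t), entropy_of_degree(p, t))
    print(shannon_entropy(p))   # both families approach this value as the parameter -> 1

At parameter 0.999 both values essentially coincide with the Shannon entropy, while at order 2 the Rényi value is smaller (the Rényi entropies are non-increasing in the order); only the Rényi family remains additive over independent experiments.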

Promising and not-so-promising generalizations. Measures which depend not just upon the probabilities but also upon the subject matter, that is, upon the events themselves.
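
A hedged illustration of this last idea (again Python; the functional form, the function name inset_style_entropy and the helper size are assumptions chosen for this sketch, not the paper's definition): attach to every event E_i a size, for example the length of an interval, and let the measure depend on that size as well as on p_i, in analogy with the continuous Shannon entropy.

    import math

    def inset_style_entropy(events, p, size, base=2.0):
        """Illustrative Shannon-type measure depending on the events E_i themselves
        (through the weight size(E_i)) and not only on their probabilities:
            H = sum p_i * log(size(E_i) / p_i).
        If size(E_i) = 1 for every i, this is the ordinary Shannon entropy."""
        return sum(pi * math.log(size(e) / pi, base)
                   for e, pi in zip(events, p) if pi > 0)

    # Events modelled as subintervals of [0, 1]; their length plays the role of size(E_i).
    events = [(0.0, 0.25), (0.25, 0.75), (0.75, 1.0)]
    length = lambda e: e[1] - e[0]
    p = [0.5, 0.3, 0.2]

    print(inset_style_entropy(events, p, length))         # changes if the intervals change
    print(inset_style_entropy(events, p, lambda e: 1.0))  # equals the Shannon entropy of p

Two experiments with the same probability distribution but different underlying events then receive different values, which is the kind of dependence on the subject matter that such "inset" or mixed measures are meant to capture.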

Cite this article

Aczél, J. Measuring information beyond communication theory—Why some generalized information measures may be useful, others not. Aeq. Math. 27, 1–19 (1984). https://doi.org/10.1007/BF02192655
