Summary (and Keywords)
Non-communication models for information theory: games and experiments. Measures of uncertainty and information: entropies, divergences, information improvements.
Some useful properties of information measures: symmetry, bounds, behaviour under composition, branching, conditional measures, sources. Rényi measures, measures of higher degree.
Promising and not-so-promising generalizations. Measures which depend not only upon the probabilities but (also) upon the subject matters.
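The measures the summary alludes to have standard closed forms: the Shannon entropy, the Rényi entropy of order α, and the entropy of degree α (Havrda–Charvát, Daróczy). A minimal sketch of these definitions, with base-2 logarithms and illustrative function names (not taken from the article):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log2 p_i (zero-probability terms skipped)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha: log2(sum_i p_i^alpha) / (1 - alpha), alpha > 0, alpha != 1."""
    return math.log2(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def degree_entropy(p, alpha):
    """Entropy of degree alpha (Havrda-Charvat / Daroczy):
    (sum_i p_i^alpha - 1) / (2**(1 - alpha) - 1), alpha != 1."""
    return (sum(pi ** alpha for pi in p if pi > 0) - 1) / (2 ** (1 - alpha) - 1)
```

For a fair coin, [0.5, 0.5], all three evaluate to 1 bit; as α → 1, both generalized measures recover the Shannon entropy, which is the sense in which they are generalizations.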
References
Aczél, J., Determination of all additive quasiarithmetic mean codeword lengths. Z. Wahrsch. Verw. Gebiete 29 (1974), 351–360.
Aczél, J., A mixed theory of information—II: Additive inset entropies (of randomized systems of events) with measurable sum property. Utilitas Math. 13 (1978), 49–54.
Aczél, J., A mixed theory of information. V. How to keep the (inset) expert honest. J. Math. Anal. Appl. 75 (1980), 447–453.
Aczél, J., A mixed theory of information. VI. An example at last: A proper discrete analogue of the continuous Shannon measure of information (and its characterization). Univ. Beograd. Publ. Elektrotehn. Fak. Ser. Mat. Fiz. No. 602–633 (1978–80), 65–72.
Aczél, J., A mixed theory of information. VII. Inset information functions of all degrees. C.R. Math. Rep. Acad. Sci. Canada 2 (1980), 125–129.
Aczél, J., Functions partially constant on rings of sets. C.R. Math. Rep. Acad. Sci. Canada 2 (1980), 159–164.
Aczél, J. and Daróczy, Z., Sur la caractérisation axiomatique des entropies d'ordre positif, y comprise l'entropie de Shannon. C.R. Acad. Sci. Paris 257 (1963), 1581–1584.
Aczél, J. and Daróczy, Z., Über verallgemeinerte quasilineare Mittelwerte, die mit Gewichtsfunktionen gebildet sind. Publ. Math. Debrecen 10 (1963), 171–190.
Aczél, J. and Daróczy, Z., On measures of information and their characterizations. Academic Press, New York–San Francisco–London, 1975.
Aczél, J. and Daróczy, Z., A mixed theory of information. I. Symmetric, recursive and measurable entropies of randomized systems of events. RAIRO Inform. Théor. 12 (1978), 149–155.
Aczél, J., Forte, B., and Ng, C. T., Why the Shannon and Hartley entropies are "natural". Adv. in Appl. Probab. 6 (1974), 131–146.
Aczél, J. and Kannappan, Pl., A mixed theory of information. III. Inset entropies of degree β. Inform. and Control 39 (1978), 315–322.
Aczél, J. and Ng, C. T., Determination of all semisymmetric recursive information measures of multiplicative type on n positive discrete probability distributions. Linear Algebra Appl. 52–53 (1983), 1–30.
Campbell, L. L., A coding theorem and Rényi's entropy. Inform. and Control 8 (1965), 423–429.
Campbell, L. L., Definition of entropy by means of a coding problem. Z. Wahrsch. Verw. Gebiete 6 (1966), 113–118.
Csiszár, I., Information measures: A critical survey. In Trans. Seventh Prague Conf. Information Theory, Statist. Decision Functions, Random Processes and Eighth European Meeting of Statisticians, Vol. B, Academia, Prague, 1978, pp. 73–86.
Daróczy, Z., Über Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen. Acta Math. Acad. Sci. Hungar. 15 (1964), 203–210.
Daróczy, Z., Generalized information functions. Inform. and Control 16 (1970), 36–51.
Daróczy, Z., On the measurable solutions of a functional equation. Acta Math. Acad. Sci. Hungar. 22 (1971), 11–14.
Daróczy, Z. and Kátai, I., Additive zahlentheoretische Funktionen und das Mass der Information. Ann. Univ. Sci. Budapest. Eötvös Sect. Math. 13 (1970), 83–88.
Devijver, P. A., Entropie quadratique et reconnaissance des formes. In Computer Oriented Learning Processes, Noordhoff, Leyden, 1976, pp. 257–277.
Devijver, P. A., Entropies of degree β and lower bounds for the average error rate. Inform. and Control 34 (1977), 222–226.
Diderrich, G., Local boundedness and the Shannon entropy. Inform. and Control 29 (1975), 149–161.
Fischer, P., On the inequality\(\sum\limits_{i = 1}^n {p_i f(p_i )} \geqq \sum\limits_{i = 1}^n {p_i f(q_i )} \). Metrika 18 (1972), 199–208.
Fischer, P., On the inequality\(\sum\limits_{i = 1}^n {p_i f(p_i )/f(q_i ) \leqq 1} \). Canad. Math. Bull. 17 (1974), 193–199.
Fischer, P., On the inequality\(\sum\limits_{i = 1}^n {p_i f(p_i )/f(q_i ) \geqq 1} \). Pacific J. Math. 60 (1975), 65–74.
Forte, B., Measures of information: The general axiomatic theory. Rev. Française Informat. Recherche Opérationnelle 3 (1969), Sér. R-2, 63–89.
Guiaşu, S., Information theory with applications. McGraw-Hill International, New York–Auckland–Bogotá, 1977.
Havrda, J. and Charvát, F., Quantification method of classification processes. Concept of structural a-entropy. Kybernetika (Prague) 3 (1967), 30–35.
Kannappan, Pl., On generalizations of some measures in information theory. Glasnik Mat. 9 (29) (1974), 81–93.
Kannappan, Pl., A mixed theory of information. IV. Inset inaccuracy and directed divergence. Metrika 27 (1980), 91–98.
Kannappan, Pl. and Sander, W., A mixed theory of information. VIII. Inset measures depending upon several distributions. Aequationes Math. 25 (1982–83), 177–193.
Kapur, J. N., Information of order α and type β. Proc. Indian Acad. Sci. Sect. A 68 (1968), 65–75.
Kolmogorov, A. N., Grundbegriffe der Wahrscheinlichkeitsrechnung. Springer, Berlin, 1933.
Maksa, Gy., Bounded symmetric information functions. C.R. Math. Rep. Acad. Sci. Canada 2 (1980), 247–252.
Meginnis, J. R., A new class of symmetric utility rules for gambles, subjective marginal probability functions and a generalized Bayes rule. Bus. Econom. Statist. Sec. Proc. Amer. Statist. Assoc. 1976, 471–476.
Ng, C. T., Representation for measures of information with the branching property. Inform. and Control 25 (1974), 45–56.
Rényi, A., On measures of entropy and information. In Proc. Fourth Berkeley Symp. Math. Statist. Prob. 1960, Vol. 1, Univ. of Calif. Press, Berkeley, 1961, pp. 547–561.
Rényi, A., On the foundations of information theory. Rev. Inst. Internat. Statist. 33 (1965), 1–14.
Shannon, C. E. and Weaver, W., The mathematical theory of communication. Univ. of Ill. Press, Urbana, 1949.
Sharma, B. D. and Mittal, D. P., New non-additive measures for discrete probability distributions. J. Math. Sci. 10 (1975), 28–40.
Sharma, B. D. and Taneja, I. J., Entropy of type (α, β) and other generalized measures in information theory. Metrika 22 (1975), 205–215.
Theil, H., Economics and information theory. North Holland, Amsterdam; Rand McNally, Chicago, 1967.
Van der Pyl, Th., Axiomatique de l'information d'ordre α et de type β. C.R. Acad. Sci. Paris Sér. A 282 (1976), 1031–1033.
The twentieth international symposium on functional equations, August 1–7, 1982, Oberwolfach, Germany (compiled by B. Ebanks). Aequationes Math. 24 (1982), 261–297.
Cite this article
Aczél, J. Measuring information beyond communication theory—Why some generalized information measures may be useful, others not. Aeq. Math. 27, 1–19 (1984). https://doi.org/10.1007/BF02192655