Characterizing information measures: Approaching the end of an era

  • J. Aczél
Section IV Information Theoretic Approach
Part of the Lecture Notes in Computer Science book series (LNCS, volume 286)


We try to indicate what not to do in characterizations of information measures, either because it makes little sense or because it has already been done. For this we have to summarize, at least roughly, what has been done already. We also mention some problems that we think are worth working on.
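For readers outside this literature, the central objects can be stated briefly (standard definitions from the characterization literature, not part of the abstract above): the Shannon entropy of a finite distribution, and the fundamental equation of information satisfied by its information function, on which many characterization theorems rest.

```latex
% Shannon entropy of a probability distribution (p_1, ..., p_n):
H_n(p_1,\dots,p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i .

% Its information function f(x) = H_2(x, 1-x) satisfies the
% fundamental equation of information:
f(x) + (1-x)\, f\!\left(\frac{y}{1-x}\right)
  = f(y) + (1-y)\, f\!\left(\frac{x}{1-y}\right),
\qquad x, y \in [0,1),\ x + y \le 1 .
```

Characterization results of the kind surveyed here ask which regularity conditions (measurability, boundedness, nonnegativity, domain restrictions) force a solution of such an equation to be the Shannon information function, possibly up to a constant factor.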






  1. Abou-Zaid, S.H. 1984 On functional equations and measures of information. M. Phil. thesis. Univ. of Waterloo, Waterloo, Ont., 1984.Google Scholar
  2. Aczél, J. 1968 On different characterizations of entropies. In Probability and information theory (Proc. Internat. Symp. McMaster Univ., Canada, April 1968). (Lecture Notes in Math.Vol. 89), Springer, Berlin-Heidelberg-New York 1969, pp. 1–11.Google Scholar
  3. Aczél, J. 1974 'Keeping the expert honest' revisited — or: a method to prove the differentiability of solutions of functional inequalities. Selecta Statist. Canad. 2, 1–14.Google Scholar
  4. Aczél, J. 1978 A mixed theory of information — VI: An example at last: a proper discrete analogue of a continuous Shannon measure of information (and its characterization). Univ. Beograd. Publ. Elektrotehn. Fak. Ser. Mat. Fiz. 1978–1980, No. 602–633, 65–72.Google Scholar
  5. Aczél, J. 1980A Information functions of degree (0,β). Utilitas Math. 18, 15–26.Google Scholar
  6. Aczél, J. 1980B A mixed theory of information, V. How to keep the (inset) expert honest. J. Math. Anal. Appl. 75, 447–453.Google Scholar
  7. Aczél, J. 1980C A mixed theory of information — VII. Inset information functions of all degrees. C.R. Math. Rep. Acad. Sci. Canada 2, 125–129.Google Scholar
  8. Aczél, J. 1981 Notes on generalized information functions. Aequationes Math. 22, 97–107.Google Scholar
  9. Aczél, J. 1984 Measuring information beyond communication theory — some probably useful and some almost certainly useless generalizations. Inform. Process. and Management 20, 383–395.Google Scholar
  10. Aczél, J. and Daróczy, Z. 1963A Ueber verallgemeinerte quasilineare Mittelwerte, die mit Gewichtsfunktionen gebildet sind. Publ. Math. Debrecen 10, 171–190.Google Scholar
  11. Aczél, J. and Daróczy, Z. 1963B Sur la caractérisation axiomatique des entropies d'ordre positif, y comprise l'entropie de Shannon. C.R. Acad. Sci. Paris 257, 1581–1584.Google Scholar
  12. Aczél, J. and Daróczy, Z. 1975 On measures of information and their characterizations. (Mathematics in science and engineering. Vol. 115). Academic Press, New York-San Francisco-London, 1975.Google Scholar
  13. Aczél, J. and Daróczy, Z. 1978 A mixed theory of information. I: Symmetric, recursive and measurable entropies of randomized systems of events. RAIRO Informat. Théor. 12, 149–155.Google Scholar
  14. Aczél, J. and Forte, B. 1984 Generalized entropies and the maximum entropy principle. In Proc. 4th Workshop Max. Entropy Bayesian Meth. in Appl. Statistics, August 1984, Calgary. Cambridge Univ. Press, Cambridge, England 1986, pp. 95–100.Google Scholar
  15. Aczél, J., Forte, B. and Ng, C.T. 1974 Why the Shannon and Hartley entropies are ‘natural'. Adv. in Appl. Probab. 6, 131–146.Google Scholar
  16. Aczél, J. and Kannappan, Pl. 1978 A mixed theory of information. III. Inset entropies of degree β. Inform. and Control 39, 315–322.Google Scholar
  17. Aczél, J. and Kannappan, Pl. 1982 General two-place information functions. Resultate Math. 5, 99–106.Google Scholar
  18. Aczél, J. and Ng, C.T. 1981 On general information functions. Utilitas Math. 19, 157–170.Google Scholar
  19. Aczél, J. and Ng, C.T. 1983 Determination of all semisymmetric recursive information measures of multiplicative type on n positive discrete probability distributions. Linear Algebra and Appl. 52–53, 1–30.Google Scholar
  20. Aczél, J. and Ostrowski, A.M. 1973 On the characterization of Shannon's entropy by Shannon's inequality. J. Austral. Math. Soc. 16, 368–374.Google Scholar
  21. Aczél, J. and Pfanzagl, J. 1966 Remarks on the measurement of subjective probability and information. Metrika 11, 91–105.Google Scholar
  22. Aggarwal, N.L., Cesari, Y. and Picard, C.-F. 1972 Propriétés de branchement liées aux questionnaires de Campbell et à l'information de Rényi. C.R. Acad. Sci. Paris Sér. A 275, 437–440.Google Scholar
  23. Arimoto, S. 1970 Bayesian decision rule and quantity of equivocation (Japanese). Denshi Tsushin Gakkai Ronbunshi Sect. C 53, 16–22. (English translation: Systems-Comput.-Controls 1 (1970), 17–23.)Google Scholar
  24. Arimoto, S. 1971 Information-theoretical considerations on estimation problems. Inform. and Control 19, 181–194.Google Scholar
  25. Arimoto, S. 1972 Generalized information measure and finite-parameter estimation problems. Inform. Process. in Japan 12, 26–30.Google Scholar
  26. Batten, D.F. 1983 Spatial analysis of interacting economies. Kluwer-Nijhoff, Boston-Hague-London, 1983.Google Scholar
  27. Batty, M. 1978 Speculations on an information theoretical approach to spatial representation. In Spatial representation and spatial interaction. Nijhoff, Leiden-Boston, 1978, pp. 115–147.Google Scholar
  28. Behara, M. and Nath, P. 1973 Additive and non-additive entropies of finite measurable partitions. In Probability and information theory, II. (Lecture Notes in Math. Vol. 296). Springer, Berlin, 1973, pp. 102–138.Google Scholar
  29. Behara, M. and Nath, P. 1974 Information and entropy of countable measurable partitions. I. Kybernetika (Prague) 10, 491–503.Google Scholar
  30. Borges, R. 1967 Zur Herleitung der Shannonschen Information. Math. Z. 96, 282–287.Google Scholar
  31. Campbell, L.L. 1965 A coding theorem and Rényi's entropy. Inform. and Control 8, 423–429.Google Scholar
  32. Campbell, L.L. 1985 The relation between information theory and the differential geometry approach to statistics. Inform. Sci. 35, 199–210.Google Scholar
  33. Daróczy, Z. 1964 Ueber Mittelwerte und Entropien vollständiger Wahrscheinlichkeitsverteilungen. Acta Math. Acad. Sci. Hungar. 15, 203–210.Google Scholar
  34. Daróczy, Z. 1969 On the Shannon measure of information (Hungarian). Magyar Tud. Akad. Mat. Fiz. Oszt. Közl. 19, 9–24. (English translation in Selected translations in mathematical statistics and probability. Vol. 10. Inst. Math. Statist.-Amer. Math. Soc., Providence, RI, 1972, pp. 193–210).Google Scholar
  35. Daróczy, Z. 1970 Generalized information functions. Inform. and Control 16, 36–51.Google Scholar
  36. Daróczy, Z. 1971 On measurable solutions of a functional equation. Acta Math. Acad. Sci. Hungar. 22, 11–14.Google Scholar
  37. Daróczy, Z. and Járai, A. 1979 On the measurable solutions of a functional equation arising in information theory. Acta Math. Acad. Sci. Hungar. 34, 105–116.Google Scholar
  38. Daróczy, Z. and Maksa, Gy. 1977 Nonnegative information functions. In Analytic function methods in probability theory (Proc. Colloq. Methods of Complex Anal. in the Theory of Probab. and Statist., Kossuth L. Univ. Debrecen 1977). (Colloq. Math. Soc. János Bolyai. Vol. 21). North Holland, Amsterdam, 1979, pp. 67–78.Google Scholar
  39. De Luca, A. and Termini, S. 1972 A definition of a nonprobabilistic entropy in the setting of fuzzy set theory. Inform. and Control 20, 301–312.Google Scholar
  40. Devijver, P.A. 1976 Entropie quadratique et reconnaissance des formes. In Computer oriented processes, Noordhoff, Leyden, 1976, pp. 257–277.Google Scholar
  41. Devijver, P.A. 1977 Entropies of degree β and lower bounds of the average error rate. Inform. and Control 34, 222–226.Google Scholar
  42. Diderrich, G.T. 1975 The role of boundedness in characterizing Shannon entropy. Inform. and Control 29, 149–161.Google Scholar
  43. Diderrich, G.T. 1978 Local boundedness and the Shannon entropy. Inform. and Control 36, 292–308.Google Scholar
  44. Diderrich, G.T. 1986 Boundedness on a set of positive measure and the fundamental equation of information. Publ. Math. Debrecen 33, 1–7.Google Scholar
  45. Ebanks, B.R., Kannappan, Pl. and Ng, C.T. 1987 Generalized fundamental equation of information of multiplicative type. Aequationes Math. 32, 19–31.Google Scholar
  46. Ebanks, B.R., Kannappan, Pl. and Ng, C.T. 1988 Recursive inset entropies of multiplicative type on open domains. To appear.Google Scholar
  47. Ebanks, B. and Maksa, Gy. 1986 Measures of inset information on the open domain — I. Inset entropies and information functions of all degrees. Aequationes Math. 30, 187–206.Google Scholar
  48. Faddeev, D.K. 1956 On the concept of entropy of a finite probabilistic scheme (Russian). Uspekhi Mat. Nauk. 11, No. 1 (67), 227–231. (German translation in Arbeiten zur Informationstheorie, I. (Mathematische Forschungsberichte. No. IV). Deutscher Verlag der Wissenschaften, Berlin, 1957, pp. 88–90.)Google Scholar
  49. Feinstein, A. 1958 Foundations of information theory. McGraw-Hill, New York, 1958.Google Scholar
  50. Fischer, P. 1972 On the inequality \(\sum\limits_{i = 1}^n {p_i f(p_i )} \geqslant \sum\limits_{i = 1}^n {p_i f(q_i )}\). Metrika 18, 199–208.Google Scholar
  51. Fischer, P. 1974 On the inequality \(\sum\limits_{i = 1}^n {p_i \frac{{f(p_i )}}{{f(q_i )}}} \leqslant 1\). Canad. Math. Bull. 17, 193–199.Google Scholar
  52. Fischer, P. 1975 On the inequality \(\sum\limits_{i = 1}^n {p_i \frac{{f(p_i )}}{{f(q_i )}}} \geqslant 1\). Pacific J. Math. 60, 65–74.Google Scholar
  53. Forte, B. 1973 Why Shannon's entropy. In Symposia mathematica. Vol. XVIII (Conv. Inform. Teor. Ist. Naz. Alta Mat. Roma 1973). Monograf, Bologna, 1975, pp. 137–152.Google Scholar
  54. Forte, B. 1976 Characterization of the entropy functional for grand canonical ensembles. The discrete case. Ann. Mat. Pura Appl. (4) 111, 213–228.Google Scholar
  55. Forte, B. 1977 Subadditive entropies for a random variable. Boll. Un. Mat. Ital. B (5) 14, 118–133.Google Scholar
  56. Forte, B. 1984 Entropies with and without probabilities. Applications to questionnaires. Inform. Process. and Management 20, 397–405.Google Scholar
  57. Forte, B. and Bortone 1977 Non-symmetric entropies with the branching property. Utilitas Math. 12, 3–23.Google Scholar
  58. Forte, B. and Ng, C.T. 1974 Entropies with the branching property. Ann. Mat. Pura Appl. (4) 101, 355–373.Google Scholar
  59. Forte, B., Ng, C.T. and Lo Schiavo, M. 1984 Additive and subadditive entropies for discrete random vectors. J. Combin. Inform. System Sci. 9, 207–216.Google Scholar
  60. Gallager, R.G. 1968 Information theory and reliable communication. Wiley, New York-London-Sydney-Toronto, 1968.Google Scholar
  61. Guiasu, S. and Shenitser, A. 1985 The principle of maximum entropy. Math. Intelligencer 7, 42–48.Google Scholar
  62. Havrda, J. and Charvát, F. 1967 Quantification method of classification processes. Concept of structural α-entropy. Kybernetika (Prague) 3, 30–35.Google Scholar
  63. Járai, A. 1985 Remark 12. In Proc. 23rd Internat. Symp. Functional Equations, June 1985, Gargnano, Italy. Centre for Information Theory, Univ. of Waterloo, Waterloo, Ont. 1985, pp. 57–58.Google Scholar
  64. Jaynes, E.T. 1957A Information theory and statistical mechanics. Phys. Rev. (2) 106, 620–630.Google Scholar
  65. Jaynes, E.T. 1957B Information theory and statistical mechanics II. Phys. Rev. (2) 108, 171–190.Google Scholar
  66. Jelinek, F. 1968 Buffer overflow in variable length coding of fixed rate sources. IEEE Trans. Inform. Theory IT-14, 490–501.Google Scholar
  67. Kannappan, Pl. 1974 On a generalization of some measures in information theory. Glas. Mat. Ser. III 9(29), 81–93.Google Scholar
  68. Kannappan, Pl. 1978 Note on generalized information function. Tôhoku Math. J. 30, 251–255.Google Scholar
  69. Kannappan, Pl. 1979 An application of a differential equation in information theory. Glas. Mat. Ser. III 14(34), 269–274.Google Scholar
  70. Kannappan, Pl. 1980A A mixed theory of information — IV: Inset inaccuracy and directed divergence. Metrika 27, 91–98.Google Scholar
  71. Kannappan, Pl. 1980B On some functional equations from additive and nonadditive measures — I. Proc. Edinburgh Math. Soc. (2) 23, 145–150.Google Scholar
  72. Kannappan, Pl. 1985 On a generalization of sum form functional equation — V. Aequationes Math. 28, 255–261.Google Scholar
  73. Kannappan, Pl. and Ng, C.T. 1973 Measurable solutions of functional equations related to information theory. Proc. Amer. Math. Soc. 38, 303–310.Google Scholar
  74. Kannappan, Pl. and Ng, C.T. 1980 On functional equations and measures of information. II. J. Appl. Probab. 17, 271–277.Google Scholar
  75. Kannappan, Pl. and Sahoo, P.K. 1985 On a functional equation connected to sum form nonadditive information measures on an open domain. C.R. Math. Rep. Acad. Sci. Canada 7, 45–50.Google Scholar
  76. Kannappan, Pl. and Sahoo, P.K. 1987 On the general solution of a functional equation connected to sum form information measures on open domain. Glas. Mat. Ser. III. To appear.Google Scholar
  77. Kannappan, Pl. and Sander, W. 1982 A mixed theory of information — VIII: Inset measures depending on several distributions. Aequationes Math. 25, 177–193.Google Scholar
  78. Kapur, J.N. 1983 Twenty-five years of maximum-entropy principle. J. Math. Phys. Sci. 17, 103–156.Google Scholar
  79. Kendall, D.G. 1963 Functional equations in information theory. Z. Wahrsch. Verw. Gebiete 2, 225–229.Google Scholar
  80. Kullback, S. 1959 Information theory and statistics. Wiley, New York — Chapman & Hall, London, 1959.Google Scholar
  81. Lawrence, J., Mess, G. and Zorzitto, F. 1979 Near derivations and information functions. Proc. Amer. Math. Soc. 76, 117–122.Google Scholar
  82. Lee, P.M. 1964 On the axioms of information theory. Ann. Math. Statist. 35, 415–418.Google Scholar
  83. Losonczi, L. 1981 A characterization of entropies of degree α. Metrika 28, 237–244.Google Scholar
  84. Losonczi, L. 1985 Sum form equations on an open domain. I. C.R. Math. Rep. Acad. Sci. Canada 7, 85–90.Google Scholar
  85. Losonczi, L. 1986 Sum form equations on an open domain. II. Utilitas Math. 29, 125–132.Google Scholar
  86. Losonczi, L. and Maksa, Gy. 1981 The general solution of a functional equation in information theory. Glas. Mat. Ser. III 16(36), 261–266.Google Scholar
  87. Losonczi, L. and Maksa, Gy. 1982 On some functional equations of the information theory. Acta Math. Acad. Sci. Hungar. 39, 73–82.Google Scholar
  88. Maksa, Gy. 1980 Bounded symmetric information functions. C.R. Math. Rep. Acad. Sci. Canada 2, 247–252.Google Scholar
  89. Maksa, Gy. 1981A Characterization of nonnegative information functions. Proc. Amer. Math. Soc. 81, 406–408.Google Scholar
  90. Maksa, Gy. 1981B On the bounded solutions of a functional equation. Acta Math. Acad. Sci. Hungar. 37, 445–450.Google Scholar
  91. Maksa, Gy. 1982 Solution on the open triangle of the generalized fundamental equation of information with four unknown functions. Utilitas Math. 21C, 267–282.Google Scholar
  92. Maksa, Gy. 1988 The role of boundedness and nonnegativity in characterizing entropies of degree α. To appear.Google Scholar
  93. Meginnis, J.R. 1976 A new class of symmetric utility rules for gambles, subjective marginal probability functions, and a generalized Bayes rule. Bus. and Econ. Stat. Sec. Proc. Amer. Stat. Assoc. 1976, 471–476.Google Scholar
  94. Ng, C.T. 1974 Representation of measures of information with the branching property. Inform. and Control 25, 45–56.Google Scholar
  95. Ng, C.T. 1987 The equation F(x) + M(x)G(1/x)=0 and homogeneous biadditive forms. Linear Algebra and Appl., to appear.Google Scholar
  96. Picard, C.-F. 1972 Graphes et questionnaires. Tome 2. Questionnaires. Gauthier-Villars, 1972. (English translation: Graphs and questionnaires. Elsevier North-Holland, Amsterdam-New York, 1980.)Google Scholar
  97. Rathie, P.N. and Kannappan, Pl. 1971 On a functional equation connected with Shannon's entropy. Funkcial. Ekvac. 14, 153–159.Google Scholar
  98. Rényi, A. 1960 On measures of entropy and information. In Proc. 4th Berkeley Symp. Math. Statist. Probability, 1960. Vol. I. Univ. of California Press, Berkeley, 1961, pp. 547–561.Google Scholar
  99. Rényi, A. 1965 On the foundations of information theory. Rev. Inst. Internat. Statist. 33, 1–14.Google Scholar
  100. Sander, W. 1987 A mixed theory of information — X. Information functions and information measures. J. Math. Anal. Appl., to appear.Google Scholar
  101. Sharma, B. and Mittal, D.P. 1975 New non-additive measures for discrete probability distributions. J. Math. Sci. 10, 28–40.Google Scholar
  102. Sommerfeld, A. 1953 Thermodynamik und Statistik. (Herausg. F. Bopp und J. Meixner). Dietrich, Wiesbaden, 1953. (English translation: Thermodynamics and statistical mechanics. Lectures on theoretical physics. Vol. V. Academic Press, New York, 1956.)Google Scholar
  103. Special Session on Information Measures 1980 In Proc. 18th Internat. Symp. Functional Equations, Waterloo-Scarborough, Aug. 26-Sept. 6, 1980. Centre for Information Theory, Univ. of Waterloo, Waterloo, Ont. 1980, p. 46.Google Scholar
  104. Taneja, I.J. 1977 On the branching property of entropy. Ann. Polon. Math. 35, 67–75.Google Scholar
  105. Theil, H. 1967 Economics and information theory. North Holland, Amsterdam — Rand McNally, Chicago, 1967.Google Scholar
  106. Tverberg, H. 1958 A new derivation of the information function. Math. Scand. 6, 297–298.Google Scholar
  107. Van der Pyl, T. 1976 Axiomatique de l'information d'ordre α et de type β. C.R. Acad. Sci. Paris Sér. A 282, 1031–1033.Google Scholar
  108. Van der Pyl, T. 1977A Information d'ordre α et de type β: axiomatique, propriétés. Thèse. Univ. Pierre et Marie Curie, Paris, 1977.Google Scholar
  109. Van der Pyl, T. 1977B Propriétés de l'information d'ordre α et de type β. In Théorie de l'information; développements récents et applications. Cachan, 4–8 juillet, 1977. (Colloques internat. du C.N.R.S. No. 276). Centre National de la Recherche Scientifique, Paris, 1978, pp. 161–171.Google Scholar

Copyright information

© Springer-Verlag Berlin Heidelberg 1987

Authors and Affiliations

  • J. Aczél
  1. Centre for Information Theory, University of Waterloo, Waterloo, Canada
