Abstract
The Shannon entropy and the associated Kullback-Leibler divergence measure (relative entropy) between probability measures are fundamental from an applications point of view, and arise naturally from statistical concepts. While confined to information theory and statistics in the work of Shannon [38] and Kullback and Leibler [31] in the early 1950s, the concept of entropy later came to be used in optimization models for various problems of engineering and management science. Early work on entropy optimization problems over linear constraint sets (equality or inequality) was carried out by Charnes and Cooper [16] via convex programming techniques. Many other useful applications to a diversity of problems, such as traffic engineering, game theory, information theory, and marketing, were developed by Charnes et al. (see, e.g., [15,17-19] and the references therein). For further general information and applications, we refer the reader to Frieden [23] and Kay and Marple [30] for engineering problems, and to Lev and Theil [32] and, more recently, Theil and Fiebig [39] for economic and finance models.
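The two quantities named above can be sketched for discrete distributions as follows; this is a minimal illustration (the function names are ours, not from the chapter), using natural logarithms so that entropy is measured in nats.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (in nats).

    Terms with p_i = 0 contribute 0, by the convention 0 log 0 = 0.
    """
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p||q) = sum_i p_i log(p_i / q_i).

    Assumes q_i > 0 wherever p_i > 0 (absolute continuity of p
    with respect to q); otherwise the divergence is infinite.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# The uniform distribution maximizes entropy over a fixed support,
# and D(p||q) >= 0 with equality iff p = q.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(uniform))         # log 4, about 1.386 nats
print(kl_divergence(skewed, uniform))   # strictly positive
print(kl_divergence(uniform, uniform))  # 0.0
```

The entropy-optimization problems surveyed in the abstract minimize or maximize functionals of exactly this form subject to linear (equality or inequality) constraints on p.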
References
Ben-Tal, A., “The Entropic Penalty Approach to Stochastic Programming,” Mathematics of Operations Research 10 (1985), 263–279.
Ben-Tal, A. and A. Ben-Israel, "A Recourse Certainty Equivalent for Decisions Under Uncertainty," Annals of Operations Research 31 (1991).
Ben-Tal, A., J. Borwein, and M. Teboulle, “Spectral Estimation via Convex Programming,” in this volume.
Ben-Tal, A., A. Charnes, and B. Golany, "On N-Person Game Solutions and Convex Programs with Essentially Unconstrained Duals," Technical Report, Center for Cybernetic Studies (CCS), The University of Texas at Austin, 1987.
Ben-Tal, A., A. Charnes, and M. Teboulle, "Entropic Means," Journal of Mathematical Analysis and Applications 139 (1989), 537–551.
Ben-Tal, A. and M. Teboulle, “Expected Utility, Penalty Functions and Duality in Stochastic Non-Linear Programming,” Management Science 32 (1986), 1445–1466.
Ben-Tal, A. and M. Teboulle, "Rate Distortion Theory with Generalized Information Measures via Convex Programming Duality," IEEE Transactions on Information Theory IT-32 (1986), 630–641.
Ben-Tal, A. and M. Teboulle, "Penalty Functions and Duality in Stochastic Programming via ϕ-Divergence Functionals," Mathematics of Operations Research 12 (1987), 224–240.
Ben-Tal, A. and M. Teboulle, "Portfolio Theory for the Recourse Certainty-Equivalent Maximizing Investor," Annals of Operations Research 31 (1991), 479–500.
Ben-Tal, A. and M. Teboulle, “Extension of Some Results for Channel Capacity,” Journal of Applied Mathematics and Optimization 17 (1988), 121–132.
Berger, T., Rate Distortion Theory, Prentice-Hall, Englewood Cliffs, NJ, 1971.
Burbea, J., "The Bose-Einstein Entropy of Degree α and its Jensen Difference," Utilitas Mathematica 25 (1984), 225–240.
Burbea, J. and C.R. Rao, "On the Convexity of Some Divergence Measures Based on Entropy Functions," IEEE Transactions on Information Theory IT-28 (1982), 489–495.
Burg, J.P., Maximum Entropy Spectral Analysis, Ph.D. thesis, Stanford University. Stanford, CA, 1975.
Charnes, A., P.L. Brockett, and K. Paick, "Computation of Minimum Cross Entropy Spectral Estimates: An Unconstrained Dual Convex Programming Method," IEEE Transactions on Information Theory IT-32 (1986), 236–242.
Charnes, A. and W.W. Cooper, "Constrained Kullback-Leibler Estimation: Generalized Cobb-Douglas Balance and Unconstrained Convex Programming," Rend. Accad. Naz., Sez. VIII, 58 (1975), 568–576.
Charnes, A., W.W. Cooper, and L. Seiford, "Extremal Principles and Optimization Dualities for Khinchin-Kullback-Leibler Estimation," Optimization 9 (1) (1978), 21–29.
Charnes, A., W.W. Cooper, and D.B. Learner, "Constrained Information Theoretic Characterizations in Consumer Purchase Behaviour," Journal of the Operational Research Society 29 (1978), 833–842.
Charnes, A. and M. Keane, "Convex Nuclei and the Shapley Value," International Congress of Mathematicians, Nice, 1970.
Charnes, A. and K. Kortanek, "On Classes of Convex Preemptive Nuclei for n-Person Games," Princeton Symposium on Mathematical Programming, Princeton, NJ, 1967.
Charnes, A., S. Littlechild, and S. Sorensen, "Core-Stem Solutions of n-Person Essential Games," Journal of Socio-Economic Planning Sciences 7 (1973), 649–660.
Charnes, A., J. Rousseau, and L. Seiford, "Complements, Mollifiers and the Propensity to Disrupt," International Journal of Game Theory 7 (1976), 37–50.
Csiszár, I., "Information-Type Measures of Difference of Probability Distributions and Indirect Observations," Studia Sci. Math. Hungar. 2 (1967), 299–318.
Frieden, B.R., “Image Enhancement and Restoration,” in T.S. Huang (ed.), Picture Processing and Digital Filtering, 1975.
Gini, C., "Di una Formula Comprensiva delle Medie," Metron 13 (1938), 3–22.
Goodrick, B.K. and A. Steinhardt, "L2 Spectral Estimation," SIAM Journal on Applied Mathematics 46 (1986), 417–426.
Jaynes, E.T., “Information Theory and Statistical Mechanics,” Physical Review 106 (1957), 620–630.
Johnson, R. and J.E. Shore, "Which is the Better Entropy Expression for Speech Processing: s log s or log s?," IEEE Transactions on Acoustics, Speech, and Signal Processing ASSP-29 (1981), 129–136.
Kagan, A.M., "On the Theory of Fisher's Amount of Information," Sov. Math. Dokl. 4 (1963), 991–993.
Kapur, J.N., “On the Roles of Maximum Entropy and Minimum Discrimination Information Principles in Statistics,” 38th Annual Conference of the Indian Society of Agricultural Statistics, 1984.
Kay, S.M. and S.L. Marple, "Spectrum Analysis: A Modern Perspective," Proceedings of the IEEE 69 (1981), 1380–1419.
Kullback, S. and R.A. Leibler, "On Information and Sufficiency," Annals of Mathematical Statistics 22 (1951), 79–86.
Lev, B. and H. Theil, “A Maximum Entropy Approach to the Choice of Asset Depreciation,” Journal of Accounting Research 16 (1978), 286–293.
Luenberger, D.G., Optimization by Vector Space Methods, John Wiley and Sons, New York, 1969.
Rényi, A., "On Measures of Entropy and Information," Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, University of California, Berkeley, CA, 1961.
Schmeidler, D., "The Nucleolus of a Characteristic Function Game," SIAM Journal on Applied Mathematics 17 (1969), 1163–1170.
Seiford, L., Entropic Solutions and Disruption Solutions for n-Person Games, Ph.D. thesis, University of Texas at Austin, Austin, TX, 1977.
Sempi, C. and B. Forte, “Maximizing Conditional Entropies: A Derivation of Quantal Statistics,” Rend. Mat. 6 (1976), 551–566.
Shannon, C.E., "A Mathematical Theory of Communication," Bell System Technical Journal 27 (1948), 379–423, 623–656.
Theil, H. and D.G. Fiebig, Exploiting Continuity: Maximum Entropy Estimation of Continuous Distributions, Ballinger, Cambridge, MA, 1984.
Ziv, J. and M. Zakai, "On Functionals Satisfying a Data-Processing Theorem," IEEE Transactions on Information Theory IT-19 (1973), 275–283.
© 1992 Springer Science+Business Media New York

Teboulle, M. (1992). On ϕ-Divergence and Its Applications. In: Phillips, F.Y., Rousseau, J.J. (eds.), Systems and Management Science by Extremal Methods. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-3600-0_17

Print ISBN: 978-1-4613-6599-0
Online ISBN: 978-1-4615-3600-0