
On ϕ-Divergence and Its Applications


Abstract

The Shannon entropy and the associated Kullback-Leibler divergence (relative entropy) between probability measures are fundamental from an applications point of view and arise naturally from statistical concepts. Although restricted to information theory and statistics in the work of Shannon [38] and Kullback and Leibler [31] in the early 1950s, the concept of entropy soon came to be used in optimization models for a variety of problems in engineering and management science. Early work on entropy optimization problems over linear constraint sets (equalities or inequalities) was carried out by Charnes and Cooper [16] via convex programming techniques. Many other useful applications to a diversity of problems, such as traffic engineering, game theory, information theory, and marketing, were developed by Charnes et al. (see, e.g., [15,17-19] and the references therein). For further general information and applications, we refer the reader to Frieden [23] and Kay and Marple [30] for engineering problems, and to Lev and Theil [32] and, more recently, Theil and Fiebig [39] for economics and finance models.
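As an illustrative aside (not part of the original chapter), the ϕ-divergence family named in the title can be sketched in a few lines of Python. For discrete distributions p and q and a convex function ϕ with ϕ(1) = 0, the ϕ-divergence is D_ϕ(p, q) = Σᵢ qᵢ ϕ(pᵢ/qᵢ); the choice ϕ(t) = t log t recovers the Kullback-Leibler divergence. The function and variable names below are illustrative, not from the chapter.

```python
import math

def phi_divergence(p, q, phi):
    """D_phi(p, q) = sum_i q_i * phi(p_i / q_i) for discrete
    distributions p, q given as sequences of probabilities (q_i > 0)."""
    return sum(qi * phi(pi / qi) for pi, qi in zip(p, q))

def kl_phi(t):
    """phi(t) = t*log(t), with the convention 0*log(0) = 0.
    Plugging this into phi_divergence gives the KL divergence."""
    return t * math.log(t) if t > 0 else 0.0

p = [0.5, 0.3, 0.2]
q = [1/3, 1/3, 1/3]  # uniform reference measure
d = phi_divergence(p, q, kl_phi)  # D(p||q) >= 0, with equality iff p == q
```

Since ϕ is convex and ϕ(1) = 0, Jensen's inequality gives D_ϕ(p, q) ≥ ϕ(Σᵢ pᵢ) = 0, the nonnegativity property that underlies the penalty and duality applications the chapter surveys.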


References

  1. Ben-Tal, A., "The Entropic Penalty Approach to Stochastic Programming," Mathematics of Operations Research 10 (1985), 263–279.

  2. Ben-Tal, A. and A. Ben-Israel, "A Recourse Certainty Equivalent for Decisions Under Uncertainty," Annals of Operations Research 31 (1991).

  3. Ben-Tal, A., J. Borwein, and M. Teboulle, "Spectral Estimation via Convex Programming," in this volume.

  4. Ben-Tal, A., A. Charnes, and B. Golany, "On N-Person Game Solutions and Convex Programs with Essentially Unconstrained Duals," Technical Report, Center for Cybernetic Studies (CCS), The University of Texas, Austin, 1987.

  5. Ben-Tal, A., A. Charnes, and M. Teboulle, "Entropic Means," Journal of Mathematical Analysis and Applications 139 (1989), 537–551.

  6. Ben-Tal, A. and M. Teboulle, "Expected Utility, Penalty Functions and Duality in Stochastic Non-Linear Programming," Management Science 32 (1986), 1445–1466.

  7. Ben-Tal, A. and M. Teboulle, "Rate Distortion Theory with Generalized Information Measures via Convex Programming Duality," IEEE Transactions on Information Theory IT-32 (1986), 630–641.

  8. Ben-Tal, A. and M. Teboulle, "Penalty Functions and Duality in Stochastic Programming via ϕ-Divergence Functionals," Mathematics of Operations Research 12 (1987), 224–240.

  9. Ben-Tal, A. and M. Teboulle, "Portfolio Theory for the Recourse Certainty-Equivalent Maximizing Investor," Annals of Operations Research 31 (1991), 479–500.

  10. Ben-Tal, A. and M. Teboulle, "Extension of Some Results for Channel Capacity," Journal of Applied Mathematics and Optimization 17 (1988), 121–132.

  11. Berger, T., Rate Distortion Theory, Prentice-Hall, Englewood Cliffs, NJ, 1971.

  12. Burbea, J., "The Bose-Einstein Entropy of Degree α and Its Jensen Difference," Utilitas Mathematica 25 (1984), 225–240.

  13. Burbea, J. and C.R. Rao, "On the Convexity of Some Divergence Measures Based on Entropy Functions," IEEE Transactions on Information Theory IT-28 (1982), 489–495.

  14. Burg, J.P., Maximum Entropy Spectral Analysis, Ph.D. thesis, Stanford University, Stanford, CA, 1975.

  15. Charnes, A., P.L. Brockett, and K. Paick, "Computation of Minimum Cross Entropy Spectral Estimates: An Unconstrained Dual Convex Programming Method," IEEE Transactions on Information Theory IT-32 (1986), 236–242.

  16. Charnes, A. and W.W. Cooper, "Constrained Kullback-Leibler Estimation: Generalized Cobb-Douglas Balance and Unconstrained Convex Programming," Rend. Accad. Naz. Lincei, Sez. VIII, 58 (1975), 568–576.

  17. Charnes, A., W.W. Cooper, and L. Seiford, "Extremal Principles and Optimization Dualities for Khinchin-Kullback-Leibler Estimation," Optimization 9 (1) (1978), 21–29.

  18. Charnes, A., W.W. Cooper, and D.B. Learner, "Constrained Information Theoretic Characterizations in Consumer Purchase Behaviour," Journal of the Operational Research Society 29 (1978), 833–842.

  19. Charnes, A. and M. Keane, "Convex Nuclei and the Shapley Value," International Congress of Mathematicians, Nice, 1970.

  20. Charnes, A. and K. Kortanek, "On Classes of Convex Preemptive Nuclei for n-Person Games," Princeton Symposium on Mathematical Programming, Princeton, NJ, 1967.

  21. Charnes, A., S. Littlechild, and S. Sorensen, "Core-Stem Solutions of n-Person Essential Games," Journal of Socio-Economic Planning Sciences 7 (1973), 649–660.

  22. Charnes, A., J. Rousseau, and L. Seiford, "Complements, Mollifiers and the Propensity to Disrupt," International Journal of Game Theory 7 (1976), 37–50.

  23. Csiszár, I., "Information-Type Measures of Difference of Probability Distributions and Indirect Observations," Studia Sci. Math. Hungar. 2 (1967), 299–318.

  24. Frieden, B.R., "Image Enhancement and Restoration," in T.S. Huang (ed.), Picture Processing and Digital Filtering, 1975.

  25. Gini, C., "Di una Formula Comprensiva delle Medie," Metron 13 (1938), 3–22.

  26. Goodrich, B.K. and A. Steinhardt, "L2 Spectral Estimation," SIAM Journal on Applied Mathematics 46 (1986), 417–426.

  27. Jaynes, E.T., "Information Theory and Statistical Mechanics," Physical Review 106 (1957), 620–630.

  28. Johnson, R. and J.E. Shore, "Which is the Better Entropy Expression for Speech Processing: -s log s or log s?," IEEE Transactions on Acoustics, Speech, and Signal Processing ASSP-29 (1981), 129–136.

  29. Kagan, A.M., "On the Theory of Fisher's Amount of Information," Soviet Mathematics Doklady 4 (1963), 991–993.

  30. Kapur, J.N., "On the Roles of Maximum Entropy and Minimum Discrimination Information Principles in Statistics," 38th Annual Conference of the Indian Society of Agricultural Statistics, 1984.

  31. Kay, S.M. and S.L. Marple, "Spectrum Analysis: A Modern Perspective," Proceedings of the IEEE 69 (1981), 1380–1419.

  32. Kullback, S. and R.A. Leibler, "On Information and Sufficiency," Annals of Mathematical Statistics 22 (1951), 79–86.

  33. Lev, B. and H. Theil, "A Maximum Entropy Approach to the Choice of Asset Depreciation," Journal of Accounting Research 16 (1978), 286–293.

  34. Luenberger, D.G., Optimization by Vector Space Methods, John Wiley and Sons, New York, 1969.

  35. Rényi, A., "On Measures of Entropy and Information," Berkeley Symposium on Mathematical Statistics and Probability, University of California, Berkeley, CA, 1961.

  36. Schmeidler, D., "The Nucleolus of a Characteristic Function Game," SIAM Journal on Applied Mathematics 17 (1969), 1163–1170.

  37. Seiford, L., Entropic Solutions and Disruption Solutions for n-Person Games, Ph.D. thesis, The University of Texas at Austin, Austin, TX, 1977.

  38. Sempi, C. and B. Forte, "Maximizing Conditional Entropies: A Derivation of Quantal Statistics," Rend. Mat. 6 (1976), 551–566.

  39. Shannon, C.E., "A Mathematical Theory of Communication," Bell System Technical Journal 27 (1948), 379–423, 623–656.

  40. Theil, H. and D.G. Fiebig, Exploiting Continuity: Maximum Entropy Estimation of Continuous Distributions, Ballinger, Cambridge, MA, 1984.

  41. Ziv, J. and M. Zakai, "On Functionals Satisfying a Data-Processing Theorem," IEEE Transactions on Information Theory IT-19 (1973), 275–283.


Copyright information

© 1992 Springer Science+Business Media New York

About this chapter

Cite this chapter

Teboulle, M. (1992). On ϕ-Divergence and Its Applications. In: Phillips, F.Y., Rousseau, J.J. (eds) Systems and Management Science by Extremal Methods. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-3600-0_17


  • DOI: https://doi.org/10.1007/978-1-4615-3600-0_17

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-1-4613-6599-0

  • Online ISBN: 978-1-4615-3600-0

