The Maximum Entropy on the Mean Method, Noise and Sensitivity

  • Conference paper
Maximum Entropy and Bayesian Methods

Part of the book series: Fundamental Theories of Physics (FTPH, volume 70)

Abstract

In this paper we address the problem of building convenient criteria to solve linear, noisy inverse problems of the form y = Ax + n. Our approach is based on specifying constraints on the solution x through its membership in a given convex set C. The solution is chosen as the mean of the distribution that is closest, in the sense of the Kullback divergence (cross-entropy), to a reference measure μ on C. The method is therefore called the Maximum Entropy on the Mean Method (MEMM). In the noiseless case, this problem is shown to be equivalent to the convex program x = arg min_x F(x) subject to y = Ax. Many classical criteria are recovered as particular solutions corresponding to different reference measures μ. Except for certain measures, however, these primal criteria have no explicit expression. Nevertheless, by taking advantage of a dual formulation of the problem, the MEMM still allows a solution to be computed in such cases; such criteria could hardly have been derived without the MEMM. To incorporate additive noise into the MEMM scheme, the object and the noise are sought simultaneously in an appropriate convex set C′. The MEMM then yields an unconstrained criterion of the form x = arg min_x F(x) + G(y − Ax), where F and G are convex. The functional G is related to the prior distribution of the noise, and may be used to account for specific noise distributions. Finally, using the regularity of the criterion, the sensitivity of the solution to variations of the data is derived.
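As an illustration of the unconstrained criterion x = arg min_x F(x) + G(y − Ax) described in the abstract, the following sketch minimizes it by plain projected gradient descent. This is not the authors' algorithm (the paper works through the dual formulation); it is a minimal numerical sketch under assumed choices: F is the Shannon-type entropy term Σ x_i log x_i − x_i (which enforces positivity of x), and G is a quadratic penalty matching Gaussian noise. The function names, the weight `beta`, and the step size are arbitrary demo choices.

```python
import numpy as np

def memm_criterion(x, y, A, beta=10.0):
    """Penalized criterion F(x) + G(y - Ax) with entropic F and quadratic G."""
    F = np.sum(x * np.log(x) - x)        # entropic term F(x), requires x > 0
    r = y - A @ x
    G = 0.5 * beta * np.dot(r, r)        # Gaussian-noise data term G(y - Ax)
    return F + G

def solve_memm(y, A, beta=10.0, step=1e-3, n_iter=20000):
    """Projected gradient descent on F(x) + G(y - Ax), keeping x > 0."""
    x = np.ones(A.shape[1])
    for _ in range(n_iter):
        grad = np.log(x) - beta * (A.T @ (y - A @ x))
        x = np.clip(x - step * grad, 1e-10, None)  # stay in the domain of F
    return x

# Small synthetic problem: recover a positive object from noisy measurements.
rng = np.random.default_rng(0)
x_true = np.array([1.0, 2.0, 0.5, 3.0])
A = rng.normal(size=(8, 4))
y = A @ x_true + 0.01 * rng.normal(size=8)
x_hat = solve_memm(y, A, beta=10.0)
```

Because F has no explicit closed form for most reference measures μ, the paper's dual approach would instead optimize over Lagrange multipliers; the quadratic G above corresponds to the Gaussian-noise case mentioned in the abstract, and other noise priors would change G accordingly.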




Copyright information

© 1996 Kluwer Academic Publishers

About this paper

Cite this paper

Bercher, JF., Le Besnerais, G., Demoment, G. (1996). The Maximum Entropy on the Mean Method, Noise and Sensitivity. In: Skilling, J., Sibisi, S. (eds) Maximum Entropy and Bayesian Methods. Fundamental Theories of Physics, vol 70. Springer, Dordrecht. https://doi.org/10.1007/978-94-009-0107-0_24

  • DOI: https://doi.org/10.1007/978-94-009-0107-0_24

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-010-6534-4

  • Online ISBN: 978-94-009-0107-0

  • eBook Packages: Springer Book Archive
