Maximum Entropy and Bayesian Methods, pp. 223–232

# The Maximum Entropy on the Mean Method, Noise and Sensitivity

## Abstract

In this paper we address the problem of building convenient criteria to solve linear and noisy inverse problems of the form **y** = **Ax** + **n**. Our approach is based on the specification of constraints on the solution **x** through its belonging to a given convex set *C*. The solution is chosen as the mean of the distribution which is closest to a reference measure μ on *C* with respect to the Kullback divergence, or cross-entropy. The method is therefore called the Maximum Entropy on the Mean Method (MEMM). This problem is shown to be equivalent to the convex one **x** = arg min_**x** *F*(**x**) subject to **y** = **Ax** (in the noiseless case). Many classical criteria are found to be particular solutions corresponding to different reference measures μ. But except for a few measures, these primal criteria have no explicit expression. Nevertheless, taking advantage of a dual formulation of the problem, the MEMM enables us to compute a solution even in such cases. This indicates that such criteria could hardly have been derived without the MEMM. In order to integrate the presence of additive noise into the MEMM scheme, the object and the noise are searched for simultaneously in an appropriate convex set *C*′. The MEMM then gives an unconstrained criterion of the form **x** = arg min_**x** *F*(**x**) + *G*(**y** − **Ax**), where *F* and *G* are convex. The functional *G* is related to the prior distribution of the noise, and may be used to account for specific noise distributions. Using the regularity of the criterion, the sensitivity of the solution to variations of the data is also derived.
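The unconstrained criterion above can be illustrated numerically. The sketch below is not the paper's algorithm: it assumes a uniform reference measure on the positive orthant, which yields the Shannon-type primal functional *F*(**x**) = Σ (xᵢ log xᵢ − xᵢ + 1), a Gaussian noise prior, which yields a quadratic *G*, and plain projected gradient descent as the solver; all dimensions and values are illustrative.

```python
import numpy as np

# Hedged sketch of a MEMM-style regularized criterion F(x) + G(y - Ax):
# F is the entropy induced by an (assumed) uniform reference measure on R_+^n,
# G is quadratic, corresponding to Gaussian noise of standard deviation sigma.
rng = np.random.default_rng(0)
A = rng.uniform(0.0, 1.0, size=(20, 10))      # forward operator
x_true = rng.uniform(0.5, 2.0, size=10)       # positive object
sigma = 0.1
y = A @ x_true + sigma * rng.normal(size=20)  # noisy data y = Ax + n

def criterion(x):
    F = np.sum(x * np.log(x) - x + 1.0)                 # entropy term F(x)
    G = np.sum((y - A @ x) ** 2) / (2.0 * sigma ** 2)   # noise term G(y - Ax)
    return F + G

def solve(n_iter=20000, step=1e-4):
    x = np.ones(A.shape[1])  # start at the reference (prior mean)
    for _ in range(n_iter):
        # grad F = log(x); grad of the quadratic term = A^T (Ax - y) / sigma^2
        grad = np.log(x) + A.T @ (A @ x - y) / sigma ** 2
        x = np.clip(x - step * grad, 1e-9, None)  # stay in C = R_+^n
    return x

x_hat = solve()
```

The entropy term acts as a barrier keeping the solution positive, while the quadratic term plays the role of *G* and relaxes the exact constraint **y** = **Ax** of the noiseless case.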

## Keywords

Maximum Entropy, Dual Problem, Reference Measure, Regularized Criterion, Convex Constraint
