# The Maximum Entropy on the Mean Method, Noise and Sensitivity

• Jean-François Bercher
• Guy Le Besnerais
• Guy Demoment
Chapter in the Fundamental Theories of Physics book series (FTPH, volume 70)

## Abstract

In this paper we address the problem of building convenient criteria for solving linear and noisy inverse problems of the form y = Ax + n. Our approach is based on specifying constraints on the solution x through its membership in a given convex set C. The solution is chosen as the mean of the distribution closest, in the sense of the Kullback divergence or cross-entropy, to a reference measure μ on C. The method is therefore called the Maximum Entropy on the Mean Method (MEMM). In the noiseless case, this problem is shown to be equivalent to the convex program x̂ = arg min_x F(x) subject to y = Ax. Many classical criteria are recovered as particular solutions corresponding to different reference measures μ. Except for a few such measures, however, these primal criteria have no explicit expression. Nevertheless, taking advantage of a dual formulation of the problem, the MEMM enables us to compute a solution even in those cases; this suggests that such criteria could hardly have been derived without the MEMM. In order to account for additive noise within the MEMM scheme, the object and the noise are sought simultaneously in an appropriate convex set C′. The MEMM then yields an unconstrained criterion of the form x̂ = arg min_x F(x) + G(y − Ax), where F and G are convex. The functional G is related to the prior distribution of the noise and may be used to account for specific noise distributions. Finally, using the regularity of the criterion, the sensitivity of the solution to variations of the data is derived.
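To make the dual route concrete, here is a sketch of the dual formulation as it appears in the standard MEMM literature; the separable reference measure and the log-Laplace transforms ψ_i are notational assumptions on our part, not reproduced from the chapter:

```latex
% Noiseless MEMM dual (sketch). With a separable reference measure
% \mu = \otimes_i \mu_i and log-Laplace transforms
% \psi_i(s) = \log \int e^{s\xi} \, d\mu_i(\xi), one solves
\hat{\lambda} = \arg\max_{\lambda} \; \langle \lambda, y \rangle
              - \sum_i \psi_i\big( [A^{\mathsf T}\lambda]_i \big),
\qquad
\hat{x}_i = \psi_i'\big( [A^{\mathsf T}\hat{\lambda}]_i \big),
% so that \hat{x} is computable even when F = \sum_i \psi_i^* has no
% closed-form expression.
```

A minimal numerical sketch of the unconstrained noisy criterion x̂ = arg min_x F(x) + G(y − Ax) follows. All concrete choices below are illustrative assumptions rather than the chapter's construction: F is taken as the I-divergence from a reference mean m (one of the classical entropic criteria the MEMM recovers), G is quadratic, corresponding to zero-mean Gaussian noise of variance σ², and SciPy's general-purpose L-BFGS-B performs the minimization.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy problem y = A x + n; all sizes and values are illustrative.
n_obs, n_par = 20, 50
A = rng.standard_normal((n_obs, n_par))
x_true = rng.exponential(1.0, size=n_par)   # a positive "object"
sigma = 0.1
y = A @ x_true + sigma * rng.standard_normal(n_obs)

m = np.ones(n_par)  # mean of the reference measure (prior guess for x)

def F(x):
    # Entropic term: I-divergence of x from m (a classical MEMM-type criterion)
    return np.sum(x * np.log(x / m) - x + m)

def G(r):
    # Noise term: quadratic G, i.e. a zero-mean Gaussian noise prior
    return 0.5 * np.dot(r, r) / sigma**2

def criterion(x):
    return F(x) + G(y - A @ x)

# Unconstrained convex criterion; positivity is enforced through the domain of F.
res = minimize(criterion, x0=m, method="L-BFGS-B",
               bounds=[(1e-10, None)] * n_par)
x_hat = res.x
print("criterion at optimum:", res.fun)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Since F and G are both convex, the criterion has a unique minimizer, which is what makes the sensitivity analysis mentioned at the end of the abstract tractable.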
