Classical Maximum Entropy
This paper presents a fully Bayesian derivation of maximum entropy image reconstruction. The argument repeatedly proceeds from the particular to the general: any general theory must also apply to special cases. Two such special cases, formalised as the “Cox axioms”, lead to the well-known result that Bayesian probability theory is the only consistent language of inference. Further cases, formalised as the axioms of maximum entropy, show that the prior probability distribution for any positive, additive distribution must be monotonic in the entropy. Finally, a quantified special case shows that this monotonic function must be the exponential, leaving only a single dimensional scaling factor to be determined a posteriori. Many types of distribution, including probability distributions themselves, are positive and additive, so the entropy exponential is very general.
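The entropy exponential described above can be written out explicitly. The following is a sketch using the standard maximum-entropy conventions (the abstract itself does not fix the notation): $f$ is the positive, additive distribution being reconstructed, $m$ a default model, and $\alpha$ the single dimensional scaling factor left to be determined a posteriori.

```latex
% Prior over a positive, additive distribution f = (f_1, ..., f_N),
% relative to an assumed default model m (standard maximum-entropy form):
\[
  \Pr(f \mid \alpha) \;\propto\; \exp\bigl\{\alpha\, S(f)\bigr\},
  \qquad
  S(f) \;=\; \sum_{i}\Bigl[\, f_i - m_i - f_i \ln\frac{f_i}{m_i} \Bigr],
\]
% S(f) attains its maximum value 0 at f = m, so the prior is peaked on
% the default model; alpha sets how strongly the prior penalises
% departures from it.
```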
The following paper (Gull 1989) applies these ideas to image reconstruction, showing how a sophisticated treatment can incorporate prior expectation of spatial correlations.
Keywords: Maximum Entropy · Maximum Entropy Method · Testable Information · Prior Probability Distribution · Incoherent Light
- Gull, S.F. (1989). Developments in maximum entropy data analysis. In these Proceedings.
- Gull, S.F. & Skilling, J. (1984). The maximum entropy method. In Indirect Imaging, ed. J.A. Roberts. Cambridge University Press.
- Jaynes, E.T. (1978). “Where do we stand on maximum entropy?” Reprinted in E.T. Jaynes: Papers on Probability, Statistics and Statistical Physics, ed. R. Rosenkrantz, 1983. Dordrecht: Reidel.
- Rodriguez, C. (1989). The metrics induced by the Kullback number. In these Proceedings.
- Smith, C.R. & Erickson, G.J. (1989). From rationality and consistency to Bayesian probability. In these Proceedings.