
Synthese, Volume 187, Issue 1, pp 147–178

Deceptive updating and minimal information methods

  • Haim Gaifman
  • Anubav Vasudevan
Article

Abstract

The technique of minimizing information (infomin) has been commonly employed as a general method for both choosing and updating a subjective probability function. We argue that, in a wide class of cases, the use of infomin methods fails to cohere with our standard conception of rational degrees of belief. We introduce the notion of a deceptive updating method and argue that non-deceptiveness is a necessary condition for rational coherence. Infomin has been criticized on the grounds that there are no higher order probabilities that ‘support’ it, but the appeal to higher order probabilities is a substantial assumption that some might reject. Our elementary arguments from deceptiveness do not rely on this assumption. While deceptiveness implies lack of higher order support, the converse does not, in general, hold, which indicates that deceptiveness is a more objectionable property. We offer a new proof of the claim that infomin updating of any strictly-positive prior with respect to conditional-probability constraints is deceptive. In the case of expected-value constraints, infomin updating of the uniform prior is deceptive for some random variables but not for others. We establish both a necessary condition and a sufficient condition (which extends the scope of the phenomenon beyond cases previously considered) for deceptiveness in this setting. Along the way, we clarify the relation which obtains between the strong notion of higher order support, in which the higher order probability is defined over the full space of first order probabilities, and the apparently weaker notion, in which it is defined over some smaller parameter space. We show that under certain natural assumptions, the two are equivalent. Finally, we offer an interpretation of Jaynes, according to which his own appeal to infomin methods avoids the incoherencies discussed in this paper.
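
As a concrete illustration of the infomin procedure discussed in the abstract (not drawn from the paper itself), the sketch below computes the minimum-information update of a uniform prior on a six-sided die subject to the expected-value constraint E[X] = 4.5, in the spirit of Jaynes's well-known Brandeis dice example. It uses the standard fact that the minimizer of the relative entropy (Kullback–Leibler divergence) under a mean constraint has the exponential form q_i ∝ p_i · exp(λ·x_i); with a uniform prior this reduces to q_i ∝ exp(λ·x_i), with λ fixed by the constraint. Variable names and the use of SciPy's root-finder are illustrative choices, not part of the paper.

```python
import numpy as np
from scipy.optimize import brentq

# Infomin (minimum relative entropy / maximum entropy) updating of the
# uniform prior on {1,...,6} under the expected-value constraint E[X] = 4.5.
# The updated distribution has the form q_i proportional to exp(lam * x_i),
# where lam is the Lagrange multiplier enforcing the mean constraint.

x = np.arange(1, 7)       # possible outcomes of the die
target_mean = 4.5         # the expected-value constraint

def mean_gap(lam):
    """Difference between the mean of q(lam) and the target mean."""
    w = np.exp(lam * x)
    q = w / w.sum()
    return q @ x - target_mean

lam = brentq(mean_gap, -10.0, 10.0)          # solve for the multiplier
q = np.exp(lam * x) / np.exp(lam * x).sum()  # the infomin posterior

print("lambda =", round(lam, 4))
print("infomin posterior:", np.round(q, 4))
print("check E[X] =", round(q @ x, 4))
```

As expected, the constraint E[X] = 4.5 > 3.5 yields a positive multiplier and a posterior that tilts probability toward the higher faces while remaining strictly positive on all six outcomes.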

Keywords

Updating probabilities · Minimal information · Higher order probabilities · Maximum entropy · Cross entropy · Jaynes


References

  1. de Finetti B. (1974) Theory of probability (Vol. 1). John Wiley and Sons, New York, NY
  2. Friedman K., Shimony A. (1971) Jaynes’s maximum entropy prescription and probability theory. Journal of Statistical Physics 3(4): 381
  3. Gaifman H. (1983) Paradoxes of infinity and self-applications, I. Erkenntnis 20: 131–155
  4. Gaifman H. (1986) A theory of higher order probabilities. In: Halpern J. (ed) Theoretical aspects of reasoning about knowledge. Morgan Kaufmann Publishers Inc, San Francisco, CA
  5. Gaifman H. (2004) Reasoning with limited resources and assigning probabilities to arithmetical statements. Synthese 140: 97–119
  6. Gaifman H., Snir M. (1982) Probabilities over rich languages, testing and randomness. Journal of Symbolic Logic 47(3): 495–548
  7. Good I. J. (1972) 46,656 varieties of Bayesians. American Statistician 25: 62–63
  8. Grove A., Halpern J. (1997) Probability update: Conditioning vs. cross entropy. In: Proceedings of the thirteenth annual conference on uncertainty in artificial intelligence
  9. Hobson A., Cheng B. (1973) A comparison of the Shannon and Kullback information measures. Journal of Statistical Physics 7(4): 301–310
  10. Jaynes E. T. (1957) Information theory and statistical mechanics, I. Physical Review 106: 620–630
  11. Jaynes E. T. (1968) Prior probabilities. In: Rosenkrantz R. (ed) E. T. Jaynes: Papers on probability, statistics and statistical physics. D. Reidel Publishing Company, Boston, MA, pp 116–130
  12. Jaynes E. T. (1983) Where do we stand on maximum entropy? In: Rosenkrantz R. (ed) E. T. Jaynes: Papers on probability, statistics and statistical physics. D. Reidel Publishing Company, Boston, MA, pp 210–314
  13. Jaynes E. T. (2003) Probability theory: The logic of science. Cambridge University Press, Cambridge
  14. Jeffrey R. (1965) The logic of decision. McGraw Hill, New York
  15. Keynes J. M. (1920) A treatise on probability, 2006 edn. Cosimo, Inc., New York, NY
  16. Kullback S., Leibler R. (1951) On information and sufficiency. Annals of Mathematical Statistics 22(1): 79–86
  17. Levi I. (1985) Imprecision and indeterminacy in probability judgment. Philosophy of Science 52(3): 390–409
  18. Paris J. (1998) Common sense and maximum entropy. Synthese 117(1): 75–93
  19. Paris J., Vencovská A. (1997) In defense of the maximum entropy inference process. International Journal of Approximate Reasoning 17(1): 77–103
  20. Putnam H. (1963) Degree of confirmation and inductive logic. In: Schilpp P. (ed) The philosophy of Rudolf Carnap. The Open Court Publishing Co, La Salle, IL, pp 761–784
  21. Savage L. (1954) The foundations of statistics. John Wiley and Sons, New York
  22. Seidenfeld T. (1979) Why I am not an objective Bayesian. Theory and Decision 11: 413–440
  23. Seidenfeld T. (1987) Entropy and uncertainty (revised). In: MacNeill I., Humphreys G. (eds) Foundations of statistical inference. D. Reidel Publishing Co, Dordrecht, pp 259–287
  24. Shannon C. E. (1948) A mathematical theory of communication. The Bell System Technical Journal 27: 379–423
  25. Shimony A. (1973) Comment on the interpretation of inductive probabilities. Journal of Statistical Physics 9(2): 187–191
  26. Shimony A. (1985) The status of the principle of maximum entropy. Synthese 63: 35–53
  27. Shore J., Johnson R. (1980) Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory IT-26(1): 26–37
  28. van Fraassen B. (1981) A problem for relative information minimizers in probability kinematics. The British Journal for the Philosophy of Science 32(4): 375–379

Copyright information

© Springer Science+Business Media B.V. 2011

Authors and Affiliations

  1. Columbia University, New York, USA
