
Information Gain and Approaching True Belief


Abstract

Recent years have seen a renewed interest in the philosophical study of information. In this paper a two-part analysis of information gain—objective and subjective—in the context of doxastic change is presented and discussed. Objective information gain is analyzed in terms of doxastic movement towards true belief, while subjective information gain is analyzed as an agent’s expectation value of her objective information gain for a given doxastic change. The resulting expression for subjective information gain turns out to be a familiar one with well-known formal properties: the Kullback–Leibler divergence. The two notions of information are discussed and the suggested measure of subjective information gain is then compared with the widely held view that information gain equals uncertainty reduction.
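The measure the abstract points to can be made concrete in a few lines. The following is a minimal Python sketch (the function name and example values are mine, not the paper's): on the paper's analysis, the subjective information gain of a doxastic move from prior p to posterior q is the agent's expectation of her objective gain, which works out to the Kullback–Leibler divergence of q from p, computed here in bits.

```python
import math

def subjective_gain(p, q):
    """Expected objective information gain of moving from prior p to
    posterior q: the Kullback-Leibler divergence D(q || p) in bits.

    p and q are sequences of probabilities over the same alternatives.
    Terms with q[i] == 0 contribute nothing; the sum is infinite if q
    assigns positive probability where p assigns none.
    """
    total = 0.0
    for p_i, q_i in zip(p, q):
        if q_i > 0:
            if p_i == 0:
                return math.inf
            total += q_i * math.log2(q_i / p_i)
    return total

# Example: sharpening a uniform prior towards the first alternative.
print(subjective_gain([0.5, 0.5], [0.9, 0.1]))  # ~0.531 bits
```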



Acknowledgements

The author would like to thank John Cantwell, Tor Sandqvist, Sven Ove Hansson and Anders Eriksson for helpful comments on previous versions of this paper. Two anonymous reviewers also provided insightful and highly constructive comments. Partial funding from the Swedish Defence Research Agency (FOI) is gratefully acknowledged.

Author information

Correspondence to Jonas Clausen Mork.

Appendix: Proof of Theorem 1

Below I show that a simple functional, given as (7) below, satisfies the requirements I1 through I5 (though I attempt no uniqueness proof), provided the following conventions are assumed:

Convention 1

log2(0/0) = 0

Convention 2

log2(a/0) = ∞, a > 0

Convention 3

log2(0/a) = log20 = −∞, a > 0

$$ \mathrm{I}_{\mathrm{H}}(p,q) = \log_2\bigl(q(t)/p(t)\bigr) \tag{7} $$
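To make the conventions concrete, here is a minimal Python sketch of the functional (7). The function name objective_gain is mine, not the paper's; the zero cases follow Conventions 1–3 directly.

```python
import math

def objective_gain(p_t, q_t):
    """I_H(p, q) = log2(q(t)/p(t)), where p_t and q_t are the prior and
    posterior probabilities assigned to the true alternative t."""
    if p_t == 0 and q_t == 0:
        return 0.0           # Convention 1: log2(0/0) = 0
    if p_t == 0:
        return math.inf      # Convention 2: log2(a/0) = inf, a > 0
    if q_t == 0:
        return -math.inf     # Convention 3: log2(0/a) = -inf, a > 0
    return math.log2(q_t / p_t)
```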

Theorem 1

Assuming Conventions 1, 2 and 3, the functional (7) satisfies the requirements I1–I5.

Satisfaction of I1

For p(t) > 0, the functional log2(q(t)/p(t)) is greater than 0 if and only if q(t)/p(t) is greater than 1, and this is the case if and only if p(t) < q(t). When p(t) = 0, log2(q(t)/p(t)) is greater than 0 if and only if p(t) < q(t) (according to Conventions 1 and 2). We can therefore conclude that log2(q(t)/p(t)) > 0 iff p(t) < q(t).

Satisfaction of I2

The functional log2(q(t)/p(t)) is less than 0 if and only if either (1) p(t) > q(t) = 0 (Convention 3) or (2) p(t) > q(t) > 0. We can conclude that log2(q(t)/p(t)) < 0 iff p(t) > q(t).

Satisfaction of I3

The functional log2(q(t)/p(t)) equals 0 if and only if either (1) q(t)/p(t) equals 1, which is the case if and only if p(t) = q(t) > 0 or (2) p(t) = q(t) = 0 (according to Convention 1). So log2(q(t)/p(t)) equals 0 if and only if p(t) = q(t).
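The three sign properties, including the zero cases, can also be checked mechanically. The following assertions, reusing the objective_gain sketch above, all pass:

```python
assert objective_gain(0.2, 0.8) > 0           # I1: p(t) < q(t)
assert objective_gain(0.0, 0.3) == math.inf   # I1, zero case (Convention 2)
assert objective_gain(0.8, 0.2) < 0           # I2: p(t) > q(t)
assert objective_gain(0.3, 0.0) == -math.inf  # I2, zero case (Convention 3)
assert objective_gain(0.5, 0.5) == 0          # I3: p(t) = q(t)
assert objective_gain(0.0, 0.0) == 0          # I3, zero case (Convention 1)
```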

Satisfaction of I4

Let \(t_A\) be the true member of A and \(t_B\) the true member of B. Then \(\langle t_A, t_B\rangle\) is the true member of \(A \times B\). By assumption, \(p''(t_A, t_B) = p(t_A)\,p'(t_B)\) and \(q''(t_A, t_B) = q(t_A)\,q'(t_B)\). The following equality then holds when \(\mathrm{I}_{\mathrm{H}}(p,q) = \log_2\bigl(q(t)/p(t)\bigr)\):

$$ \mathrm{I}_{\mathrm{A}}(p,q) + \mathrm{I}_{\mathrm{B}}(p',q') = \log_2\frac{q(t_A)}{p(t_A)} + \log_2\frac{q'(t_B)}{p'(t_B)} = \log_2\frac{q''(t_A, t_B)}{p''(t_A, t_B)} = \mathrm{I}_{A \times B}(p'',q'') $$
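A quick numerical instance of the additivity just derived, again reusing the objective_gain sketch (the probability values are illustrative):

```python
p_tA, q_tA = 0.25, 0.5   # prior/posterior on the true member of A
p_tB, q_tB = 0.1, 0.4    # prior/posterior on the true member of B
# Gains over A and B separately...
lhs = objective_gain(p_tA, q_tA) + objective_gain(p_tB, q_tB)
# ...equal the gain over the product space A x B under independence.
rhs = objective_gain(p_tA * p_tB, q_tA * q_tB)
assert abs(lhs - rhs) < 1e-12  # 1 bit + 2 bits = 3 bits
```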

Satisfaction of I5

The proof is trivial. Inserting q(t) = 1 into log2(q(t)/p(t)) gives log2(1/p(t)), or −log2 p(t).
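The reduction in I5 can be seen directly in the running sketch: a move to full certainty in the truth recovers the surprisal of the prior.

```python
# I5: with q(t) = 1, the gain is -log2 p(t), the surprisal of the prior.
assert objective_gain(0.25, 1.0) == -math.log2(0.25)  # = 2.0 bits
```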

This concludes the proof.


About this article

Cite this article

Clausen Mork, J. Information Gain and Approaching True Belief. Erkenn 80, 77–96 (2015). https://doi.org/10.1007/s10670-014-9613-1
