A Quantitative Measure of Relevance Based on Kelly Gambling Theory

  • Mathias Winther Madsen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8607)


This paper proposes a quantitative measure of relevance that can quantify the difference between useful and useless facts. The measure evaluates sources of information according to how they affect the expected logarithmic utility of an agent. A number of reasons are given why this is often preferable to a naive value-of-information approach, and some properties and interpretations of the concept are presented, including a result on the relation between relevant information and Shannon information. Lastly, a number of illustrative examples of relevance measurements are discussed, including random number generation and job market signaling.
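The Shannon-information connection mentioned in the abstract can be illustrated with the classical horse-race result from Kelly gambling (Kelly 1956; Cover and Thomas 2006): for a bettor who bets proportionally to her beliefs, side information about the winner raises the expected logarithmic growth rate of wealth ("doubling rate") by exactly the mutual information between the signal and the outcome. The sketch below is illustrative only; all numbers (win probabilities, odds, signal accuracy) are assumptions, not figures from the paper.

```python
import math

# A two-horse race: a noisy signal Y about the winner X raises a
# proportional (Kelly) bettor's doubling rate by exactly I(X; Y).

p_x = [0.7, 0.3]        # assumed win probabilities for horses 0 and 1
odds = [2.0, 2.0]       # assumed even-money odds on both horses
acc = 0.9               # assumed probability that the signal is correct

# Joint distribution p(x, y): the signal y equals x with probability acc.
p_xy = [[p_x[0] * acc, p_x[0] * (1 - acc)],
        [p_x[1] * (1 - acc), p_x[1] * acc]]
p_y = [p_xy[0][0] + p_xy[1][0], p_xy[0][1] + p_xy[1][1]]

def doubling_rate(p, b, o):
    """Expected log2 wealth growth for proportional bets b at odds o under p."""
    return sum(pi * math.log2(bi * oi) for pi, bi, oi in zip(p, b, o))

# Without the signal, the Kelly bettor bets b(x) = p(x).
w_base = doubling_rate(p_x, p_x, odds)

# With the signal, she bets b(x | y) = p(x | y) after observing each y.
w_side = 0.0
for y in range(2):
    p_x_given_y = [p_xy[x][y] / p_y[y] for x in range(2)]
    w_side += p_y[y] * doubling_rate(p_x_given_y, p_x_given_y, odds)

# Mutual information I(X; Y) in bits.
mi = sum(p_xy[x][y] * math.log2(p_xy[x][y] / (p_x[x] * p_y[y]))
         for x in range(2) for y in range(2))

print(round(w_side - w_base, 6), round(mi, 6))  # the two quantities coincide
```

The odds cancel out of the difference, so the gain in doubling rate equals I(X; Y) regardless of how the odds are set, which is one sense in which Shannon information bounds the value of a source.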


Keywords: Relevant Information · Turing Machine · Horse Race · Shannon Information · Relevance Rate
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.




References

  1. Avriel, M., Williams, A.C.: The value of information and stochastic programming. Operations Research 18(5), 947–954 (1970)
  2. Borlund, P.: The concept of relevance in IR. Journal of the American Society for Information Science and Technology 54(10), 913–925 (2003)
  3. Cooper, W.S.: A definition of relevance for information retrieval. Information Storage and Retrieval 7(1), 19–37 (1971)
  4. Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley-Interscience (2006)
  5. Floridi, L.: Understanding epistemic relevance. Erkenntnis 69(1), 69–92 (2008)
  6. Glazer, J., Rubinstein, A.: A study in the pragmatics of persuasion: a game theoretical approach. Theoretical Economics 1(4), 395–410 (2006)
  7. Kelly, J.L.: A new interpretation of information rate. IRE Transactions on Information Theory 2(3), 185–189 (1956)
  8. Kullback, S., Leibler, R.A.: On information and sufficiency. The Annals of Mathematical Statistics 22(1), 79–86 (1951)
  9. MacKay, D.J.C.: Information Theory, Inference and Learning Algorithms. Cambridge University Press (2003)
  10. Shannon, C.E.: A mathematical theory of communication. Bell System Technical Journal 27, 379–423, 623–656 (1948)
  11. Spence, M.: Job market signaling. The Quarterly Journal of Economics 87(3), 355–374 (1973)
  12. Sperber, D., Wilson, D.: Relevance: Communication and Cognition. Blackwell Publishing (1995)
  13. Zucker, J.-D., Meyer, C.: Apprentissage pour l’anticipation de comportements de joueurs humains dans les jeux à information complète et imparfaite: les «mind-reading machines» [Learning to anticipate the behavior of human players in games with complete and imperfect information: “mind-reading machines”]. Revue d’Intelligence Artificielle 14(3-4), 313–338 (2000)

Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Mathias Winther Madsen, Institute for Logic, Language, and Computation, University of Amsterdam, The Netherlands
