This paper proposes a quantitative measure of relevance that quantifies the difference between useful and useless facts. The measure evaluates sources of information according to how they affect the expected logarithmic utility of an agent. Several reasons are given why this is often preferable to a naive value-of-information approach, and some properties and interpretations of the concept are presented, including a result on the relation between relevant information and Shannon information. Lastly, a number of illustrative examples of relevance measurements are discussed, including random number generation and job market signaling.
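One classical setting where expected log utility and Shannon information meet is the horse race: for a log-utility (Kelly) bettor facing even odds, the increase in expected log-wealth growth from observing side information Y about the winner X equals the mutual information I(X;Y). The sketch below illustrates this; the specific race (two horses, 2-for-1 odds, a 90%-accurate tip) is an assumed toy example, not taken from the paper.

```python
import math

def growth_rate(p, odds):
    """Expected log2 wealth growth for a Kelly bettor who bets fractions p."""
    return sum(pi * math.log2(pi * oi) for pi, oi in zip(p, odds) if pi > 0)

# Assumed toy race: two horses, even (2-for-1) odds on each.
odds = [2.0, 2.0]
p_x = [0.5, 0.5]                     # prior over the winner X
p_y = [0.5, 0.5]                     # distribution of the tip Y
p_x_given_y = {0: [0.9, 0.1],        # tip is correct 90% of the time
               1: [0.1, 0.9]}

w_without = growth_rate(p_x, odds)
w_with = sum(p_y[y] * growth_rate(p_x_given_y[y], odds) for y in (0, 1))

# Relevance of the tip, measured as the gain in expected log growth.
relevance = w_with - w_without

# Mutual information I(X;Y) in bits, computed directly.
mi = sum(
    p_y[y] * p * math.log2(p / p_x[x])
    for y in (0, 1)
    for x, p in enumerate(p_x_given_y[y])
    if p > 0
)

print(relevance, mi)  # the two quantities coincide for this race
```

With these numbers both quantities come out to about 0.531 bits, matching the textbook identity that side information raises the doubling rate by exactly I(X;Y) in this setting.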
Keywords
- Relevant Information
- Turing Machine
- Horse Race
- Shannon Information
- Relevance Rate