The Mathematical Theory of Information

  • Author: Jan Kåhre

Table of contents

  1. Front Matter
    Pages i-xiv
  2. Chapter 1
    Pages 1-21
  3. Chapter 2
    Pages 22-40
  4. Chapter 3
    Pages 41-79
  5. Chapter 4
    Pages 80-115
  6. Chapter 5
    Pages 116-144
  7. Chapter 6
    Pages 145-189
  8. Chapter 7
    Pages 190-227
  9. Chapter 8
    Pages 228-261
  10. Chapter 9
    Pages 262-288
  11. Chapter 10
    Pages 289-325
  12. Chapter 11
    Pages 326-363
  13. Chapter 12
    Pages 364-396
  14. Chapter 13
    Pages 397-430
  15. Chapter 14
    Pages 431-477
  16. Back Matter
    Pages 478-502

About this book

Introduction

The general concept of information is here, for the first time, defined mathematically by adding a single axiom to probability theory. This Mathematical Theory of Information is explored in fourteen chapters:

  1. Information can be measured in different units, in anything from bits to dollars. We argue that any measure is acceptable as long as it does not violate the Law of Diminishing Information. This law is supported by two independent arguments: one derived from the Bar-Hillel ideal receiver, the other based on Shannon's noisy channel. The entropy of 'classical information theory' is one of the measures conforming to the Law of Diminishing Information, but it has properties, such as symmetry, that make it unsuitable for some applications. The measure reliability is found to be a universal information measure.
  2. For discrete and finite signals, the Law of Diminishing Information is defined mathematically, using probability theory and matrix algebra (see the Python sketch following this introduction).
  3. The Law of Diminishing Information is used as an axiom to derive essential properties of information. Byron's law: there is more information in a lie than in gibberish. Preservation: no information is lost in a reversible channel. Etc.

The Mathematical Theory of Information supports colligation, i.e. the property of binding facts together, making 'two plus two greater than four'. Colligation is a must when the information carries knowledge or forms a basis for decisions. In such cases, reliability is always a useful information measure. Entropy does not allow colligation.
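A minimal sketch of that matrix-algebra setting, under the assumption that the Law of Diminishing Information behaves here like the data-processing inequality of classical information theory (Shannon entropy being one of the conforming measures named above): channels over discrete, finite signals are modelled as row-stochastic matrices, cascading two channels is a matrix product, and the information the output carries about the source cannot increase through the extra stage. The function and matrices below are illustrative Python, not code from the book.

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits for a source distribution p_x and a row-stochastic
    channel matrix with channel[i, j] = P(Y=j | X=i)."""
    joint = p_x[:, None] * channel          # joint P(X=i, Y=j)
    p_y = joint.sum(axis=0)                 # output marginal P(Y=j)
    indep = p_x[:, None] * p_y[None, :]     # product of the marginals
    mask = joint > 0                        # skip zero-probability cells
    return float((joint[mask] * np.log2(joint[mask] / indep[mask])).sum())

p_x = np.array([0.5, 0.5])                  # uniform binary source X

M = np.array([[0.9, 0.1],                   # noisy channel X -> Y
              [0.2, 0.8]])
T = np.array([[0.7, 0.3],                   # second noisy stage Y -> Z
              [0.3, 0.7]])

# The cascade X -> Z is the matrix product M @ T; mutual information
# about X can only diminish (or stay equal) through the extra stage.
i_xy = mutual_information(p_x, M)
i_xz = mutual_information(p_x, M @ T)
print(f"I(X;Y) = {i_xy:.4f} bits")
print(f"I(X;Z) = {i_xz:.4f} bits")
assert i_xz <= i_xy + 1e-12
```

Running the script prints a value of I(X;Z) no larger than I(X;Y), which is the diminishing behaviour the law demands of any admissible measure.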

Keywords

Information, Shannon, Symbol, Text, algorithms, communication, control engineering, game theory, information technology, information theory, logic, thermodynamics

Bibliographic information

  • DOI https://doi.org/10.1007/978-1-4615-0975-2
  • Copyright Information Kluwer Academic Publishers 2002
  • Publisher Name Springer, Boston, MA
  • eBook Packages Springer Book Archive
  • Print ISBN 978-1-4613-5332-4
  • Online ISBN 978-1-4615-0975-2
  • Series Print ISSN 0893-3405