Entropy as a Fixed Point

  • Keye Martin
Conference paper

DOI: 10.1007/978-3-540-27836-8_79

Part of the Lecture Notes in Computer Science book series (LNCS, volume 3142)
Cite this paper as:
Martin K. (2004) Entropy as a Fixed Point. In: Díaz J., Karhumäki J., Lepistö A., Sannella D. (eds) Automata, Languages and Programming. ICALP 2004. Lecture Notes in Computer Science, vol 3142. Springer, Berlin, Heidelberg

Abstract

We present general ideas about the complexity of objects and how complexity can be used to define the information in objects. In essence, the idea is that while complexity is relative to a given class of processes, information is process independent: information is complexity relative to the class of all conceivable processes. We test these ideas on the complexity of classical states. A domain is used to specify the class of processes, and both qualitative and quantitative notions of complexity for classical states emerge. The resulting theory can be used to give new proofs of fundamental results from classical information theory, to give a new characterization of entropy, to derive lower bounds on algorithmic complexity and even to establish new connections between physics and computation. All of this is a consequence of the setting which gives rise to the fixed point theorem: The least fixed point of the copying operator above complexity is information.
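
As a reminder of the standard definitions the abstract assumes (they are not restated on this page): a classical state is a probability distribution x = (x_1, ..., x_n) with x_i ≥ 0 and x_1 + ... + x_n = 1, and the entropy being characterized is Shannon entropy,

    H(x) = - \sum_{i=1}^{n} x_i \log x_i,

which, on the abstract's reading, the fixed point theorem singles out as the process-independent information in a classical state.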

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Keye Martin, Oxford University Computing Laboratory, Oxford
