Abstract
We introduce the rather wide-ranging considerations which follow with a discussion of the concept of information and its role in scientific discourse. Ever since Shannon began to talk of information theory (by which he meant a probabilistic analysis of the deleterious effects of propagating signals through channels; cf. Shannon and Weaver, 1949), the concept has been relentlessly analyzed and reanalyzed. The time and effort expended on these analyses must surely rank as one of the most unprofitable investments in modern scientific history; not only has there been no profit, but the currency itself has been debased to worthlessness. Yet in biology, for example, the terminology of information intrudes itself insistently at every level: code, signal, computation, recognition. It may be that these informational terms are simply not scientific at all; that they are a temporary anthropomorphic expedient, a façon de parler which merely reflects the immaturity of biology as a science, to be replaced at the earliest opportunity by the more rigorous terminology of force, energy, and potential which is the province of more mature sciences (i.e. physics), in which information is never mentioned. Or it may be that the informational terminology which seems to force itself upon us bespeaks something fundamental; something that is missing from physics as we now understand it. We take this latter viewpoint, and see where it leads us.
References
Burks, A. (1986) Theory of Self-Reproducing Automata (Urbana, IL: University of Illinois Press).
Handler, P. (Ed) (1970) Biology and the Future of Man (Oxford: Oxford University Press).
Higgins, J. (1967) Oscillating chemical reactions. J. Ind. & Eng. Chem. 59: 18–62.
Monod, J. (1971) Chance and Necessity (New York: Alfred A. Knopf).
von Neumann, J. (1951) The general and logical theory of automata, in L.A. Jeffress (Ed) Cerebral Mechanisms in Behavior (New York: John Wiley & Sons), pp. 1–41.
Rosen, R. (1977) Complexity as a system property. Int. J. General Systems 3: 227–32.
Rosen, R. (1978) Fundamentals of Measurement and Representation of Natural Systems (New York: Elsevier).
Rosen, R. (1979) Some comments on activation and inhibition. Bull. Math. Biophysics 41: 427–45.
Rosen, R. (1983) The role of similarity principles in data extrapolation. Am. J. Physiol. 244: R591–9.
Rosen, R. (1985) Anticipatory Systems (London: Pergamon Press), in press.
Shannon, C. and Weaver, W. (1949) The Mathematical Theory of Communication (Urbana, IL: University of Illinois Press).
Turing, A.M. (1936) On computable numbers. Proc. London Math. Soc. Ser. 2, 42: 230–65.
© 1986 International Institute for Applied Systems Analysis
Rosen, R. (1986). On Information and Complexity. In: Casti, J.L., Karlqvist, A. (eds) Complexity, Language, and Life: Mathematical Approaches. Biomathematics, vol 16. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-70953-1_7