Abstract
This paper addresses one of the fundamental problems of the philosophy of information: How does semantic information emerge within the underlying dynamics of the world?—the dynamical semantic information problem. It suggests that the canonical approach to semantic information, which defines data before meaning and meaning before use, is inadequate for pre-cognitive information media. Instead, we should follow a pragmatic approach to information, where one defines the notion of an information system as a special kind of purposeful system emerging within the underlying dynamics of the world and defines semantic information as the currency of the system. In this way, systems operating with semantic information can be viewed as patterns in the dynamics—semantic information is a dynamical system phenomenon of highly organized systems. In the simplest information systems, the syntax, semantics, and pragmatics of the information medium are co-defined. The paper proposes a new, more general theory of information semantics that focuses on the interface role of the information states in the information system—the interface theory of meaning. Finally, with the new framework, it addresses the debate between weakly semantic and strongly semantic accounts of information, siding with the strongly semantic view because the pragmatic account developed here is a better generalization of it.
Notes
The requirement is not constitutive, of course, but causal.
There are important differences between Peirce and Morris as to the notion of “semiotic” and the structure and purpose of a general theory of signs. Most of what is nowadays called semiotics is influenced more by Morris’ (and Saussure’s) approach (Burch 2010).
They are atypical, even though they are ubiquitous and highly salient in human experience. Similarly, an atmosphere high in oxygen is highly atypical as far as planetary atmospheres go, even though for us it is the stereotypical atmosphere.
Newell and Simon offered the physical symbol system hypothesis as a hypothesis about the nature of intelligence: that the collection of intelligent systems is included in the collection of physical symbol systems. In the early days of cognitive science, it was common to equate intelligence with cognition. Nowadays, it is mostly recognized that physical symbol systems are not necessary for simpler forms of cognition. Here, I am suggesting something stronger, however. I am suggesting that simpler forms of cognition are required for physical symbol systems to exist. This is either because cognition is required for creating physical symbol systems, in the case of artificial systems, or because a physical symbol system may emerge naturally only within a system that contains simpler cognitive capacities already. I will not argue for this claim here.
Nauta uses the term “emitter”. I prefer the term “effector”. “Emitter” has the connotation of something being emitted, while “effector” conveys the idea of a general causal effect.
We assume that both M and P are non-trivial systems. That is, M and P play an active role in the dynamics of s.
Peirce would have disagreed that the account is not naturalistic. This, however, is related to his somewhat obscure metaphysics, which had pan-psychic elements (Burch 2010).
It is not assumed that P operates by “representing” the value of the current state of M and using the value to select an action. It may, of course, operate like that, but such an operation would likely involve cognitive machinery. In the same way, P should not be assumed to “represent” the goal state and evaluate actions in light of which one would achieve the goal under the conditions specified by M. This would be to regard P as a desire. P is not supposed to be a straightforward generalization of a desire architecture.
This is different from (though related to) the question of the information dynamics that some (Williams and Beer 2010) investigate, which is how information propagates and changes within a computational system.
The phase space is often also called a “state space”. For example, in quantum mechanics, the Hilbert space of quantum states is called a state space; in computer science, the space of possible computation states is called a state space.
Often the economical order parameters do not have an obvious interpretation as degrees of freedom of the internal mechanisms of the sub-system.
It is important to clear up a potential terminological confusion here. The term “control parameter” is internal to Haken’s theory of synergetics (Haken 1993a, b). Haken calls external influence parameters “control parameters” because they are often used to control the behavior of self-organized systems. I also talk about control in a less technical sense: I say that a system controls its behavior, or that the locus of control lies in a system. In this more general notion of control, the control relations may depend solely on the order parameters (in Haken’s sense) of the system. Also, some control parameters (in Haken’s sense) may not have any control significance in my sense. The terminology is unfortunate, but I stick to it to be consistent with the literature. Thus, when the expression “control parameter” is used, it is always in the technical sense of synergetics. Any other expression that contains the word “control” is used in my (or a control system theoretic; Levine 1996) sense.
Here I am assuming a complex, non-linear dynamics without simple stable regions. There is no trajectory that remains forever in V.
Essentially, here I am assuming that the goal of the system is survival or, stated differently, that the G states in the definition of a purposeful system compose the region V.
Keeping the initial direction is needed for illustrative reasons only. The phase space of the problem is not physical space but an abstract space including the direction. There is no problem characterizing the stability and variability of the flow with respect to the position of the wheel.
Of course, this is relative to the phase space of interest. If the system is parametrized with two states—operative and inoperative—then a bomb can be regarded as a locus of control. It is a binary switch that moves a vehicle from both operative and inoperative states to an inoperative state.
On a low-friction surface, such as ice, often the only way to take corners fast is the so-called drift method, where the car slides sideways in the direction of the turn and one controls the attitude by adjusting the throttle (gas pedal). This is a very difficult and dangerous technique. Leave it to the professionals! Besides, most modern cars with front-wheel drive and electronic stability control cannot drift steer.
M, of course, is already defined by order parameters, which define macro-states in the global phase space. Indeed, there may be several levels of such macro-states until the right invariances are identified that isolate the system M. Still, we can think of the states of M as local micro-states in the immediate reduced phase space of M. Yet further reductions are possible, and further macro-states can be defined.
It is very tempting to describe the problem with the language of supervenience. I recommend caution in using supervenience here because the notion of supervenience has its roots in the classical object/property metaphysics, while my discussion is based on the dynamical system theory approach to system analysis. The two approaches are not incompatible, although I think that the dynamical systems approach is more general. The notion of supervenience, as used by Kim for example, is not readily convertible to the dynamical systems approach. But if I must describe the problem in terms of supervenience, I can describe it as follows: we cannot assume that the macro-properties of the sub-systems supervene on the micro-properties of the same subsystem. They may supervene on the entire environment/information system ensemble.
In fact, only a small subset of the informationally relevant states of S would, in general, be significant for the internal control pathway. This difference—a kind of informational deficiency of the system—is central for understanding the role of cognition for an organism (Vakarelov 2011).
Eating spoiled food may help an organism not die of hunger now, but it may cause food poisoning that may harm the organism later.
Some (Maturana and Varela 1980; Varela et al. 1992) have suggested that the system of color discrimination and categorization of many organisms only partially depicts the physical reflectance (or other) properties of external objects. It depends to a large degree on the internal dynamics of the visual system. In a sense, the organism imposes structure on the world that is not there independently but that is utilized by the system.
Of course, the opposite is true too. There may be many ways one can generalize a specific situation. Here I follow one specific generalization supported by the pragmatic approach.
The theoretical method of generalization and re-instantiation is a great tool for resolving disagreement between competing theories of something (e.g., of meaning). By obtaining a general theory and then showing how specific but competing scenarios are instances of the generalization, one can demonstrate that the disagreement is not conceptual but results from a different fixation of some theoretical parameters. It may turn out that both specific theories are correct, but they are theories for different domains, and moreover, both are justified in using the same concept because the concept turns out to be a specific instance of the general concept.
There is one important contention here. Is not the notion of goal, and thus of purposeful system, already semantic? Such an objection has been raised by Dretske (1988) in response to Bogdan and more generally by Floridi (2010). Careless use of goals can indeed sneak in semantics. The important thing is not to assume that goals are explicit (like desires). Goals should not be regarded as kinds of propositional attitudes. My, and I believe Bogdan’s, notion of goal is not content determining. For Bogdan’s reply to Dretske, see (Bogdan 1988b). In my case, the notion of purposeful system is purely dynamical. It captures a particular pattern of interaction between a system and its environment. Such a pattern may be selected by an external designer, in which case Floridi’s zero semantic condition (Floridi and Taddeo 2005, 2007) is not satisfied, but it could result from (or be) a natural pattern in the global dynamics. As it has been argued by some (Maturana and Varela 1980; Varela 2000; Weber and Varela 2002), convincingly at least to me, the phenomenon of life may be related to the natural emergence of purposeful systems. This, however, is a separate issue that I do not wish to discuss here.
Here I use the notion of process informally. It is assumed that the system is ultimately describable with a DSM.
... and almost everyone working in the field of IST. See Floridi (2010: 4.2) for many examples and references.
...and not merely manipulation of data. In information system theory (IST) one is often interested in manipulating information in a semantically sensitive way.
References
Ackoff, R. L. (1958). Towards a behavioral theory of communication. Management Science, 4(3), 218–234.
Aczel, P., Israel, D. J., Katagiri, Y., & Peters, S. (Eds.) (1993). Situation theory and its applications (Vol. 3). Stanford: CSLI.
Adriaans, P. (2008). Philosophy of information: Concepts and history. In P. Adriaans & J. van Benthem (Eds.), Handbook on the philosophy of information. Amsterdam: North Holland.
Bar-Hillel, Y. (1964). Language and information: Selected essays on their theory and application. London: Addison Wesley.
Barwise, J., & Seligman, J. (1997). Information flow: The logic of distributed systems. Cambridge Tracts in Theoretical Computer Science (No. 44). Cambridge: Cambridge University Press.
Barwise, J., & Perry, J. (1983). Situations and attitudes. Cambridge: Bradford Book.
Beer, R. (2000). Dynamical approaches to cognitive science. Trends in Cognitive Sciences, 4, 91–99.
Bogdan, R. J. (1988a). Information and semantic cognition: An ontological account. Mind and Language, 3(2), 81–122.
Bogdan, R. J. (1988b). Replies to commentators. Mind and Language, 3(2), 145–151.
Bogdan, R. J. (1994). Grounds for cognition: How goal-guided behavior shapes the mind. Hillsdale: Lawrence Erlbaum Associates.
Burch, R. (2010). Charles Sanders Peirce. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (2010 ed.). Stanford: Metaphysics Research Lab, CSLI, Stanford University.
Carnap, R., & Bar-Hillel, Y. (1952). An outline of a theory of semantic information. Technical Report 247. Cambridge: MIT.
Chemero, A. (2009). Radical embodied cognitive science. Cambridge: MIT.
Cooper, R., Mukai, K., Barwise, J., & Perry, J. (Eds.) (1991). Situation theory and its applications (Vol. 2). Stanford: CSLI.
Cooper, R., Mukai, K., & Perry, J. (Eds.) (1990). Situation theory and its applications (Vol. 1). Stanford: CSLI.
Dennett, D. C. (1991). Real patterns. Journal of Philosophy, LXXXVIII, 7–51.
Di Paolo, E. A. (2005). Autopoiesis, adaptivity, teleology, agency. Phenomenology and the Cognitive Sciences, 4, 429–452.
Dretske, F. (1981). Knowledge and the flow of information. Cambridge: MIT.
Dretske, F. (1988). Commentary: Bogdan on information. Mind and Language, 3(2), 141–144.
Fetzer, J. H. (2004). Information: Does it have to be true? Minds and Machines, 14, 223–229.
Floridi, L. (2003). Information. In L. Floridi (Ed.), The Blackwell guide to the philosophy of computing and information. Oxford: Blackwell.
Floridi, L. (2004). Outline of a theory of strongly semantic information. Minds and Machines, 14(2), 197–222.
Floridi, L. (2005). Is information meaningful data? Philosophy and Phenomenological Research, 70(2), 351–370.
Floridi, L. (2007). In defence of the veridical nature of semantic information. European Journal of Analytic Philosophy, 3(1), 31–41.
Floridi, L. (2008a). The method of levels of abstraction. Minds and Machines, 18(3), 303–329.
Floridi, L. (2008b). Trends in the philosophy of information. In P. Adriaans & J. van Benthem (Eds.), Handbook of philosophy of information (pp. 113–132). Amsterdam: North Holland.
Floridi, L. (2010). The philosophy of information. Oxford: Oxford University Press.
Floridi, L., & Sanders, J. (2004). Levellism and the method of abstraction. IEG Research Report IEG-RR-4. Oxford: Oxford University.
Floridi, L., & Taddeo, M. (2005). The symbol grounding problem: A critical review of fifteen years of research. Journal of Experimental and Theoretical Artificial Intelligence, 17(4), 419–445.
Floridi, L., & Taddeo, M. (2007). A praxical solution of the symbol grounding problem. Minds and Machines, 17(4), 369–389.
Gao, J., Cao, Y., Tung, W. -w., & Hu, J. (2007). Multiscale analysis of complex time series: Integration of chaos and random fractal theory, and beyond. Hoboken: Wiley-Interscience.
Grunwald, P. D., & Vitanyi, P. M. B. (2008). Algorithmic information theory. In P. Adriaans & J. van Benthem (Eds.), Handbook on the philosophy of information. Amsterdam: North Holland.
Haken, H. (1993a) Advanced synergetics: Instability hierarchies of self-organizing systems and devices (3rd ed.). Berlin: Springer.
Haken, H. (1993b). Synergetics: An introduction (3rd ed.). Berlin: Springer.
Haken, H. (2000). Information and self-organization: A macroscopic approach to complex systems (2nd ed.). Berlin: Springer.
Hinrichsen, D., & Pritchard, A. J. (2005). Mathematical systems theory I—modelling, state space analysis, stability and robustness. Berlin: Springer.
Hirsch, M. W., Smale, S., & Devaney, R. L. (2004). Differential equations, dynamical systems, and an introduction to chaos. New York: Academic.
Israel, D. J., & Perry, J. R. (1990). What is information? In P. Hanson (Ed.), Information, language and cognition: Vancouver studies in cognitive science (Vol. I). Vancouver: University of British Columbia Press.
Ivancevic, V. G., & Ivancevic, T. T. (2008). Complex nonlinearity: Chaos, phase transitions, topology change and path integrals. Understanding complex systems. Berlin: Springer.
Katok, A., & Hasselblatt, B. (1996). Introduction to the modern theory of dynamical systems. Cambridge: Cambridge University Press.
Kelso, J. A. S. (1995). Dynamic patterns: The self-organization of brain and behavior. Cambridge: MIT Press.
Ladyman, J., Ross, D., Spurrett, D., & Collier, J. (2007). Every thing must go: Metaphysics naturalized. Oxford: Oxford University Press.
Levine, W. S. (Ed.) (1996). The control handbook. New York: CRC.
Li, M., & Vitanyi, P. (1997). An introduction to Kolmogorov complexity and its applications. Berlin: Springer.
MacKay, D. M. (1969a). Information, mechanisms and meaning. Cambridge: MIT.
MacKay, D. M. (1969b). Meaning and mechanism. In Information, mechanism and meaning. Cambridge: MIT.
MacKay, D. M. (1969c). The place of ‘meaning’ in the theory of information. In Information, mechanism and meaning. Cambridge: MIT.
Maturana, H. R., & Varela, F. J. (1980). Autopoiesis and cognition: The realization of the living. Berlin: Springer.
Millikan, R. (1987). Language, thought, and other biological categories: New foundations for realism. Cambridge: MIT.
Millikan, R. (1995). White queen psychology and other essays for Alice. Cambridge: MIT.
Millikan, R. (2006). Varieties of meaning: The 2002 Jean Nicod lectures. Cambridge: MIT.
Morris, C. W. (1938). Foundations of the theory of signs. Chicago: University of Chicago Press.
Nauta, D. (1970). The meaning of information. The Hague: Mouton.
Newell, A., & Simon, H. (1981). Computer science as empirical inquiry. In J. Haugeland (Ed.), Mind design. Cambridge: MIT.
Peirce, C. (1940). Logic as semiotic: The theory of signs. In The philosophy of Peirce: Selected writings. London: Routledge and Kegan Paul.
Putnam, H. (1975). The meaning of ‘meaning’. In K. Gunderson (Ed.), Language, mind and knowledge. Minnesota Studies in the Philosophy of Science, no. VII. Minneapolis: University of Minnesota Press.
Seligman, J. (1991). Physical situations and information flow. In R. Cooper, K. Mukai, J. Barwise, & J. Perry (Eds.), Situation theory and its applications (Vol. 2). Stanford: CSLI.
Sethna, J. P. (2009). Statistical mechanics: Entropy, order parameters, and complexity. Oxford: Clarendon.
Shannon, C. (1948). The mathematical theory of communication. Bell System Technical Journal, 27, 379–423.
Thelen, E., & Smith, L. (1994). A dynamic systems approach to the development of cognition and action. Cambridge: MIT.
Turchin, V. (1990). Cybernetics and philosophy. In F. Geyer (Ed.), The cybernetics of complex systems (pp. 61–74). Salinas: Intersystems.
Vakarelov, O. (2011). The cognitive agent: Overcoming informational limits. Philosophical Psychology, in press.
van Gelder, T. (1998). The dynamical hypothesis in cognitive science. Behavioral and Brain Sciences, 21, 615–628.
Varela, F. (2000). El fenomeno de la vida. Santiago de Chile: Dolmen Ediciones.
Varela, F. J., Thompson, E. T., & Rosch, E. (1992). The embodied mind. Cambridge: MIT.
von Uexküll, J. (1909). Umwelt und Innenwelt der Tiere. Berlin: Springer.
von Uexküll, J. (1932). Streifzüge durch die Umwelten von Tieren und Menschen. Berlin: Springer.
von Uexküll, J. (1982). The theory of meaning. Semiotica, 42(1), 25–82.
Weaver, W., & Shannon, C. E. (1963). The mathematical theory of communication. Champaign: University of Illinois Press.
Weber, A., & Varela, F. (2002). Life after Kant: Natural purposes and the autopoietic foundations of biological individuality. Phenomenology and the Cognitive Sciences, 1(2), 97–125.
Wiener, N. (1965). Cybernetics: Or control and communication in the animal and the machine. Cambridge: MIT.
Williams, P. L., & Beer, R. D. (2010). Information dynamics of evolved agents. In From animals to animats 11: Proceedings of the 11th international conference on simulation of adaptive behavior.
Vakarelov, O. Pre-cognitive Semantic Information. Know Techn Pol 23, 193–226 (2010). https://doi.org/10.1007/s12130-010-9109-5