Lower bounds of computational power of a synaptic calculus

  • João Pedro Neto
  • J. Félix Costa
  • Helder Coelho
Formal Tools and Computational Models of Neurons and Neural Net Architectures
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1240)

Abstract

The majority of neural net models presented in the literature focus mainly on the neural structure of nets, leaving aside many details about synapses and dendrites. This can be very reductionist if we want to bring our models closer to real neural nets: these structures tend to be very elaborate and are able to process information in very complex ways (see [MEL 94] for details).

We will introduce a new model, the S-Net (Synaptic-Net), to represent neural nets with special emphasis on synaptic and dendritic transmission. First, we present the supporting mathematical structure of S-Nets, initially inspired by the Petri net formalism and extended with a transition-to-transition connection type. S-Nets have two main components: neurones and synaptic/dendritic units (s/d units). All activation values are integers. Neurones are similar to McCulloch-Pitts neurones, and s/d units process information within a certain class of functions.
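
To fix intuitions about this structure, a minimal Python sketch follows. It is our own reading of the abstract, not the paper's formalism, and all names (SNet, Neuron, SDUnit) are hypothetical: an S-Net is treated as a directed graph with two node kinds, where s/d units may connect directly to other s/d units (the transition-to-transition link absent from plain Petri nets) and all values are integers.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class Neuron:
    """McCulloch-Pitts-style unit with an integer activation value."""
    value: int = 0

@dataclass
class SDUnit:
    """Synaptic/dendritic unit: applies a function from a restricted class to integer inputs."""
    fn: Callable[[List[int]], int]

@dataclass
class SNet:
    """Directed graph of neurones and s/d units; s/d-to-s/d edges are allowed,
    unlike plain Petri nets, where transitions never connect directly to transitions."""
    nodes: Dict[str, object] = field(default_factory=dict)
    edges: Dict[str, List[str]] = field(default_factory=dict)

    def add(self, name: str, node: object) -> None:
        self.nodes[name] = node
        self.edges.setdefault(name, [])

    def connect(self, src: str, dst: str) -> None:
        self.edges.setdefault(src, []).append(dst)
```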

S-Nets are able to represent the spatial structure of nets in a very natural way: we can easily model the length of an axon, the joining or branching of two dendrites, or synaptic connections. Some examples are shown.
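
As a purely illustrative sketch of this spatial flavour (our own example, not one taken from the paper; all node names are hypothetical), an axon of a given length can be laid out as a chain of s/d units, each adding one step of propagation delay, while a junction of two dendrites becomes a single merging s/d unit:

```python
# Hypothetical spatial layout: nodes are just names; "sd" nodes stand for s/d units.
edges = []

# Axon length: a chain of three s/d units between soma and synapse,
# each link adding one time step of propagation delay.
chain = ["soma", "axon_sd1", "axon_sd2", "axon_sd3", "synapse"]
edges += list(zip(chain, chain[1:]))

# Branching: two dendritic inputs converge on one merging s/d unit
# before reaching the target neurone.
edges += [("dendrite_a", "merge_sd"),
          ("dendrite_b", "merge_sd"),
          ("merge_sd", "target_neurone")]

print(edges)
```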

Next, the focus will be on what kinds of functions are suited to s/d units. We will present three function types: sum, maximization, and simple negation (changing an excitatory impulse into an inhibitory one, or vice versa). With these functions for the s/d units and with simple neurones, we will prove that every recursive function can be computed by at least one specific S-Net. To achieve this, we will use the Register Machine and show how to build, for each symbolic program, an S-Net that computes the function defined by that program. A simple application example is shown. This computational power is achieved without any use of synaptic weights (i.e., all weights are one, as in McCulloch's model) or neurone activation values (i.e., all values are set to zero).
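
To make the three function types concrete, here is a small self-contained sketch. The encoding assumptions are ours (integer signals whose sign marks excitatory versus inhibitory, and a neurone emitting 1 exactly when its summed input exceeds its zero activation value); it does not reproduce the paper's Register Machine construction.

```python
# Illustrative s/d functions named in the abstract; encoding assumptions are ours.

def sd_sum(inputs):            # sum unit
    return sum(inputs)

def sd_max(inputs):            # maximization unit
    return max(inputs)

def sd_neg(inputs):            # simple negation: excitatory <-> inhibitory
    return -sum(inputs)

def neurone(inputs, value=0):  # all weights equal to one, activation value zero
    return 1 if sum(inputs) > value else 0

# Two excitatory impulses and one negated (inhibitory) impulse converge through a sum unit.
signals = [1, 1, sd_neg([1])]      # -> [1, 1, -1]
print(neurone([sd_sum(signals)]))  # -> 1 (the neurone fires)
```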

Finally, some directions for future investigation are presented, namely the possibility of synaptic-synaptic connections, how noise can be handled, and other features intended to bring this mathematical model closer to biological reality.

Keywords

Neural Networks · Synapses · Dendritic Trees · Neural Computation · Computational Theory · Spatial Representation

Bibliography

  1. ANDERSON, J. (1995). An Introduction to Neural Networks. The MIT Press.
  2. ARBIB, M. (1989). The Metaphorical Brain 2: Neural Networks and Beyond. John Wiley & Sons.
  3. CUTLAND, N. (1988). Computability: An Introduction to Recursive Function Theory. Cambridge University Press.
  4. McCULLOCH, W.; PITTS, W. (1943). A logical calculus of the ideas immanent in nervous activity. Bulletin of Mathematical Biophysics, 5, pp. 115–133.
  5. MEL, B. (1994). Information processing in dendritic trees. Neural Computation, 6, pp. 1031–1085. The MIT Press.
  6. SHEPHERD, G. (1994). Neurobiology. Oxford University Press.
  7. SIEGELMANN, H.; SONTAG, E. (1995). On the computational power of neural nets. Journal of Computer and System Sciences, 50(1). Academic Press.

Copyright information

© Springer-Verlag 1997

Authors and Affiliations

  • João Pedro Neto (1)
  • J. Félix Costa (1)
  • Helder Coelho (1)
  1. Dept. Informática, Faculdade de Ciências da Universidade de Lisboa, Lisboa, Portugal