
Capacity and parasitic fixed points control in a recursive neural network

  • V. Giménez
  • M. Pérez-Castellanos
  • J. Rios Carrion
  • F. de Mingo
Formal Tools and Computational Models of Neurons and Neural Net Architectures
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1240)

Abstract

This paper describes a new method for controlling the capacity of a Recursive Neural Network (RNN) and for reducing its number of parasitic fixed points. Based on preliminary research [1], a Recursive Neural Network may be seen as a graph. The weight matrix W exhibits certain properties for which it may be called a tetrahedral matrix [2]. The geometrical properties of this kind of matrix can be used to partition the n-dimensional state-vector space into n classes [2]. In the recall stage, a parameter vector σ may be introduced that is related to the capacity of the network [3]: it can be shown that the larger the value of the i-th component of σ, the higher the capacity of the i-th class of the state-vector space becomes [2]. Once the capacity has been controlled with the parameter σ, we introduce a new parameter that uses the statistical deviation of the prototypes to compare them with the states that appear as fixed points, thereby eliminating a large number of parasitic fixed points.
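The mechanism sketched in the abstract can be illustrated with a minimal Hopfield-style toy model. This is NOT the authors' tetrahedral construction: the Hebbian weights, the way σ rescales each component's local field against a threshold, and the tolerance `tol` in the deviation test are all assumptions introduced here for illustration only.

```python
import numpy as np

def hebbian_weights(prototypes):
    """Standard Hebbian outer-product weights with zero diagonal
    (an assumed stand-in for the paper's tetrahedral matrix W)."""
    P = np.asarray(prototypes, dtype=float)
    W = P.T @ P
    np.fill_diagonal(W, 0.0)
    return W / P.shape[1]

def recall(W, x0, sigma, theta=0.0, max_iters=100):
    """Synchronous recall. sigma[i] rescales the local field of
    component i, so against a positive threshold theta a larger
    sigma[i] makes that component easier to activate -- a crude
    stand-in for the per-class capacity control in the abstract."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        x_new = np.where(np.asarray(sigma) * (W @ x) >= theta, 1.0, -1.0)
        if np.array_equal(x_new, x):
            return x_new  # fixed point reached
        x = x_new
    return x

def is_parasitic(x, prototypes, tol=0.05):
    """Flag a fixed point as parasitic when its statistical deviation
    differs from every prototype's by more than tol (the deviation
    comparison from the abstract, with an assumed tolerance)."""
    s = np.std(x)
    return all(abs(s - np.std(p)) > tol for p in prototypes)
```

For example, a stored prototype is returned unchanged by `recall` and passes the deviation test, while a uniform state such as all-ones, whose deviation is zero, is flagged as parasitic.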


References

  1. V. Giménez, P. Gómez-Vilda, M. Pérez-Castellanos and V. Rodellar, A New Approach for Finding the Weights in a Neural Network using Graphs, Proc. of the 36th Midwest Symposium on Circuits and Systems, Detroit, August 16–18, 1993, pp. 113–116.
  2. V. Giménez, E. Torrano, P. Gómez-Vilda and M. Pérez-Castellanos, A Class of Recursive Neural Networks Based on Analytic Geometry, Proc. of the International Conference on Brain Processes, Theories and Models, Canary Islands, Spain, November 12–17, 1995, pp. 330–339.
  3. V. Giménez, P. Gómez-Vilda, M. Pérez-Castellanos and E. Torrano, A New Approach for Improving the Capacity Limit on a Recursive Neural Network, Proc. of the AMS'94, IASTED, Lugano, Switzerland, June 20–22, 1994, pp. 90–93.
  4. V. Giménez, P. Gómez-Vilda, E. Torrano and M. Pérez-Castellanos, A New Algorithm for Implementing a Recursive Neural Network, Proc. of the IWANN'95, Málaga-Torremolinos, Spain, June 1995, pp. 252–259.
  5. V. Rodellar, P. Gómez, M. Hermida and R. W. Newcomb, An Auditory Neural System for Speech Processing and Recognition, Proceedings of the ICARCV92, Singapore, September 16–18, 1992, pp. INV-6.2.1–5.
  6. Yves Kamp and Martin Hasler, Recursive Neural Networks for Associative Memory, Wiley-Interscience Series in Systems and Optimization, England, 1990, pp. 10–34.

Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • V. Giménez (1)
  • M. Pérez-Castellanos (2)
  • J. Rios Carrion (3)
  • F. de Mingo (3)
  1. Departamento de Matemática Aplicada, Facultad de Informática, Universidad Politécnica de Madrid, Madrid, Spain
  2. Departamento de Arquitectura y Tecnología de Sistemas Informáticos, Facultad de Informática, Universidad Politécnica de Madrid, Madrid, Spain
  3. Departamento de Inteligencia Artificial, Facultad de Informática, Universidad Politécnica de Madrid, Madrid, Spain
