
Sparsely interconnected artificial neural networks for associative memories

  • Derong Liu
  • Anthony N. Michel
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 686)

Abstract

In this paper we develop a design procedure for neural networks with sparse coefficient matrices. Our results guarantee that the synthesized neural networks have predetermined sparse interconnection structures and store any set of desired memory patterns as reachable memory vectors. We show that a sufficient condition for the existence of a sparse neural network design is self-feedback for every neuron in the network. The design procedure can take into account various constraints encountered in VLSI realizations of such networks; for example, it can be used to design neural networks with few or no line crossings arising from the network interconnections. Several specific examples are included to demonstrate the applicability of the methodology advanced herein.
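The abstract does not reproduce the synthesis procedure itself, but the following minimal sketch may help make the setting concrete. It stores bipolar memory patterns in a single-layer feedback network whose weight matrix is zeroed outside a predetermined sparse interconnection mask, keeps self-feedback on every neuron (the sufficient condition cited above), and recalls a pattern from a corrupted probe by iterating the network update. Everything here is an illustrative assumption: the masked outer-product (Hebbian) rule stands in for the paper's actual synthesis method, and the banded mask is just one locally connected, crossing-reducing layout.

    import numpy as np

    def sparse_weights(patterns, mask):
        # Outer-product (Hebbian) weights, zeroed outside the prescribed
        # sparse interconnection structure. patterns: (m, n) bipolar rows;
        # mask[i, j] = True where a connection from neuron j to neuron i
        # is allowed by the intended VLSI layout.
        n = patterns.shape[1]
        w = patterns.T @ patterns / n
        mask = mask | np.eye(n, dtype=bool)   # keep self-feedback on every neuron
        return np.where(mask, w, 0.0)

    def recall(w, probe, max_steps=50):
        # Synchronous recall: iterate x <- sgn(W x) until a fixed point.
        x = probe.astype(float)
        for _ in range(max_steps):
            nxt = np.sign(w @ x)
            nxt[nxt == 0] = 1.0               # break sign ties toward +1
            if np.array_equal(nxt, x):
                break
            x = nxt
        return x

    # Illustrative use: a banded mask confines each neuron to nearby
    # neighbours, the kind of local structure (cf. cellular neural
    # networks) that keeps line crossings low in a 1-D arrangement.
    rng = np.random.default_rng(0)
    n = 16
    patterns = rng.choice([-1.0, 1.0], size=(3, n))
    idx = np.arange(n)
    mask = np.abs(idx[:, None] - idx[None, :]) <= 2
    w = sparse_weights(patterns, mask)

    probe = patterns[0].copy()
    probe[:2] *= -1                           # corrupt two components
    print(recall(w, probe) @ patterns[0] / n) # overlap with stored pattern

Whether a given pattern set is actually storable under a given mask is precisely what the paper's design procedure guarantees; the masked Hebbian rule above carries no such guarantee and can fail to recall when the mask is very sparse.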



Copyright information

© Springer-Verlag Berlin Heidelberg 1993

Authors and Affiliations

  • Derong Liu (1)
  • Anthony N. Michel (1)

  1. Department of Electrical Engineering, University of Notre Dame, Notre Dame, USA
