Abstract
The rebirth of activity in the area of neural computation is stimulated by two factors: the increasing frequency with which traditional computational paradigms prove inefficient for fuzzy problems of large dimensionality (e.g. pattern recognition and associative information retrieval) and recent technological advances. Indeed, with the huge strides in VLSI and WSI technologies and the emergence of electro-optics, massively parallel systems that were unrealisable only a few years ago are coming within reach.
The paper details the efficient implementation of two neural network models (Hopfield's relaxation model and the back-propagation model) on a massively parallel, programmable, fault-tolerant architecture, the ASP (Associative String Processor), which can efficiently support low-MIMD/high-SIMD and other parallel computation paradigms.
Specifically, the paper describes the mapping of the two neural networks onto the ASP, details the steps required to execute the network computations, and reports the performance of the ASP implementations, which achieve a computational rate of giga-interconnections per second (i.e. 10⁹ interconnections per second).
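As a rough illustration of the kind of computation being parallelised here (not the ASP mapping itself, which the paper details), a Hopfield relaxation step amounts to one weighted sum per neuron, where each weight-times-activation product is one "interconnection". A minimal sketch in plain Python/NumPy, using the standard Hebbian storage rule and a hard-threshold update, might look like:

```python
import numpy as np

def train_hebbian(patterns):
    # Hebbian outer-product rule; zero the diagonal (no self-connections).
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0)
    return W

def hopfield_step(W, s):
    # One synchronous relaxation step: each neuron sums its weighted
    # inputs (the "interconnections") and applies a hard threshold.
    return np.where(W @ s >= 0, 1, -1)

# Store one bipolar pattern and recall it from a corrupted probe.
p = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train_hebbian([p])
s = p.copy()
s[0] = -s[0]                    # flip one bit to corrupt the probe
for _ in range(5):              # relax towards a stored fixed point
    s = hopfield_step(W, s)
print(s.tolist())               # → [1, -1, 1, -1, 1, -1, 1, -1]
```

For an N-neuron network, each relaxation step costs N² interconnections; the giga-interconnections-per-second figure reported for the ASP measures how many such multiply-accumulate operations are performed per second in parallel.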
References
Lea, R. M., "The ASP: a cost-effective parallel microcomputer", IEEE Micro, pp. 10–29, October 1988.
Hopfield, J. J., "Neural Networks and Physical Systems with Emergent Collective Computational Abilities", Proceedings of the National Academy of Sciences, USA, Vol. 79, pp. 2554–2558, April 1982.
Jones, W. P. and Hoskins, J., “Back-Propagation”, BYTE, pp. 155–162, October 1987.
© 1991 Springer Science+Business Media New York
Cite this chapter
Krikelis, A., Grözinger, M. (1991). Implementing Neural Networks with the Associative String Processor. In: Delgado-Frias, J.G., Moore, W.R. (eds) VLSI for Artificial Intelligence and Neural Networks. Springer, Boston, MA. https://doi.org/10.1007/978-1-4615-3752-6_39
Print ISBN: 978-1-4613-6671-3
Online ISBN: 978-1-4615-3752-6