Abstract
Feed-forward neural networks (FFNs) are today formulated in terms of mathematical frameworks rather than of the actual electronic devices that implement them. This makes analog neural integrated circuits difficult to design. Here we propose an alternative model that exploits the native computational properties of basic electronic circuits. A practical framework is described for training analog FFNs in compliance with this model; it is especially useful whenever the weight storage elements cannot be re-programmed on the fly at a high rate. To show how this framework can be applied to neural systems with non-conventional architectures, two cases are presented: the first is a neural signal processor named NESP, which has sigmoidal neurons, and the second is an innovative architecture named N-LESS.
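To make the training constraint concrete, the following is a minimal sketch (not the paper's actual method) of training a small sigmoidal FFN while snapping the weights to a coarse discrete grid only periodically, mimicking analog storage elements that cannot be re-programmed on the fly at a high rate. The quantization step, network size, and all parameter names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def quantize(w, step=0.05):
    # Snap each weight to the nearest storable analog level
    # (hypothetical grid; real devices have their own resolution).
    return np.round(w / step) * step

# Toy XOR problem: 2 inputs -> 2 hidden sigmoidal neurons -> 1 output.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 2)); b1 = np.zeros(2)
W2 = rng.normal(0, 1, (2, 1)); b2 = np.zeros(1)

lr = 0.5
for epoch in range(5000):
    # Forward pass with sigmoidal neurons.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: plain gradient descent on squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)
    # Write the weights to storage only every 100 epochs, since the
    # storage elements cannot be updated at the full training rate.
    if epoch % 100 == 99:
        W1, W2 = quantize(W1), quantize(W2)

W1, W2 = quantize(W1), quantize(W2)
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(np.round(pred.ravel(), 2))
```

The point of the sketch is the schedule: the gradient updates run at full precision, while the stored weights are committed to the coarse grid only at a low rate, which is the situation the framework in the paper is designed for.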
Cite this article
Costa, M., Palmisano, D. & Pasero, E. A System Design Methodology for Analog Feed Forward Artificial Neural Networks. Analog Integrated Circuits and Signal Processing 21, 45–55 (1999). https://doi.org/10.1023/A:1008375726853