
A System Design Methodology for Analog Feed Forward Artificial Neural Networks


Abstract

Today's Feed Forward Neural Networks (FFNs) rely on paradigms tied to mathematical frameworks rather than to actual electronic devices, which makes analog neural integrated circuits difficult to design. Here we propose an alternative model that exploits the native computational properties of basic electronic circuits. A practical framework is described for training analog FFNs in compliance with this model; it is especially useful whenever the weight storage elements cannot be re-programmed on the fly at a high rate. To show how such a framework can be applied to neural systems with non-conventional architectures, two cases are presented: the first is a neural signal processor named NESP, built from sigmoidal neurons; the second is an innovative architecture named N-LESS.
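The abstract does not detail the off-chip training framework. Purely as a hypothetical illustration, the sketch below implements a generic sign-based resilient-propagation (RPROP-style) weight update of the kind commonly used for off-chip training when analog weight storage cannot be reprogrammed at a high rate: only the sign of the gradient drives each update, and a per-weight step size adapts to gradient-sign agreement. The function name `rprop_step` and all parameter values are assumptions, not the paper's algorithm.

```python
import numpy as np

def rprop_step(w, grad, prev_grad, step,
               eta_plus=1.2, eta_minus=0.5,
               step_max=1.0, step_min=1e-6):
    """One RPROP-style update (hypothetical sketch, not the paper's method).

    Per-weight step sizes grow when the gradient sign is stable and
    shrink when it flips; each weight then moves by -sign(grad) * step,
    so the update magnitude is independent of the gradient magnitude.
    """
    sign_change = grad * prev_grad
    # Same sign as last iteration: accelerate, up to step_max.
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    # Sign flipped (overshoot): slow down, not below step_min.
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    # Where the sign flipped, skip this update (treat the gradient as zero).
    eff_grad = np.where(sign_change < 0, 0.0, grad)
    w = w - np.sign(eff_grad) * step
    return w, eff_grad, step
```

In use, the host computer would iterate this update on a software model of the analog network and write the final weights to the storage elements once, avoiding frequent reprogramming.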




Cite this article

Costa, M., Palmisano, D. & Pasero, E. A System Design Methodology for Analog Feed Forward Artificial Neural Networks. Analog Integrated Circuits and Signal Processing 21, 45–55 (1999). https://doi.org/10.1023/A:1008375726853
