A mixed parallel-sequential SHNN for large networks
This paper presents an architecture for the implementation of large Stochastic Hopfield Neural Networks (SHNNs). The sequential SHNN, as originally proposed, takes a long time to converge. The fully parallel SHNN, on the other hand, is limited to about one hundred neurons by the number of connections between chips. The multichip approach proposed in this paper overcomes both problems: using a mixed parallel-sequential strategy, the architecture reduces the number of interconnection lines to k while accelerating the convergence of the network by a factor of k. A partitioning problem is simulated to evaluate the behavior of the network.
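The mixed strategy can be pictured as updating the N neurons in blocks of k: each block is updated in parallel in one step, and the blocks are swept sequentially, so a full sweep takes N/k steps instead of N. The sketch below illustrates this on a toy network. It assumes a sigmoid firing probability for the stochastic neurons and an arbitrary mutual-inhibition weight matrix; the paper's actual probabilistic rule, hardware representation, and partitioning costs are not given here, so all names and parameters are illustrative.

```python
import numpy as np

def shnn_block_update(W, theta, state, block, rng, T=0.5):
    """Stochastically update the neurons in `block` in parallel.

    Each neuron fires with probability sigmoid(activation / T),
    a common stochastic-neuron model (an assumption; the paper's
    exact probabilistic rule may differ).
    """
    act = W[block] @ state - theta[block]
    p = 1.0 / (1.0 + np.exp(-act / T))
    state[block] = (rng.random(len(block)) < p).astype(float)
    return state

def run_mixed(W, theta, n_sweeps, k, seed=0):
    """Mixed parallel-sequential sweeps over N neurons in blocks of k.

    k = 1 is the fully sequential SHNN; k = N is the fully parallel
    one. Each sweep costs N/k block updates instead of N single-neuron
    updates, which is the factor-k acceleration.
    """
    n = W.shape[0]
    rng = np.random.default_rng(seed)
    state = rng.integers(0, 2, n).astype(float)
    for _ in range(n_sweeps):
        for start in range(0, n, k):
            block = np.arange(start, min(start + k, n))
            state = shnn_block_update(W, theta, state, block, rng)
    return state

# Toy example: mutual inhibition with a bias that favors having about
# half the neurons active (a hypothetical energy, not the paper's).
n = 8
W = -np.ones((n, n)) + np.eye(n)   # zero self-weight, -1 elsewhere
theta = np.full(n, -n / 2.0)
final = run_mixed(W, theta, n_sweeps=200, k=4)
print(int(final.sum()))            # number of active neurons
```

With k = 4 each sweep over the 8 neurons takes two parallel block updates rather than eight sequential ones, while only the k neurons of the current block need their activations exchanged at once.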
- A. Torralba, F. Colodro, "Towards a fully parallel Stochastic Hopfield Neural Network", Proc. ISCAS'93, pp. 2741–2743, May 1993.
- A. Torralba, F. Colodro, "Two digital circuits for a Fully Parallel Stochastic Neural Network", IEEE Trans. Neural Networks (to appear).
- D. E. van den Bout and T. K. Miller III, "TInMANN: The Integer Markovian Artificial Neural Network".
- M. S. Melton, T. Phan, D. S. Reeves, D. E. van den Bout, "The TInMANN VLSI chip", IEEE Trans. Neural Networks, vol. 3, no. 3, May 1992.
- Y. Kondo and Y. Sawada, "Functional abilities of a stochastic logic neural network", IEEE Trans. Neural Networks, vol. 3, no. 3, May 1992.
- C. Janer and J. M. Quero, "Fully parallel summation in a new Stochastic Neural Network architecture", Proc. IEEE Int. Conf. on Neural Networks, San Francisco, 1993.
- L. Dadda, "Some schemes for parallel multipliers", Alta Freq., vol. 19, pp. 349–356, May 1965.