Coupled Neural Networks
Multilayered feed-forward networks (perceptrons) are special cases of the general McCulloch-Pitts neural network with arbitrarily interconnected neurons. Conversely, any general “recurrent” neural network can be represented by a feed-forward perceptron, albeit one with possibly very many layers. The reason for this surprising equivalence is that the temporal evolution (3.5) of an arbitrary network constructed from binary neurons is necessarily periodic. This follows immediately from the observation that the N neurons can assume only 2^N configurations altogether, so some state of the network must recur after at most 2^N steps. Since only the present state of the network enters the right-hand side of the evolution law (14.1), the subsequent evolution proceeds strictly periodically from that moment on. If one regards the neural network at a certain moment t = n as the nth layer of a perceptron (with all layers identical!), the temporal-evolution law can be viewed as the law governing the flow of information from one layer to the next. It then suffices to take into account only a finite number of such layers: just as many as there are time steps leading up to the first repetition of a network configuration.
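The periodicity argument above can be checked directly by simulation. The sketch below is illustrative, not taken from the text: it assumes a small network of N = 6 binary (±1) neurons with a random synaptic matrix `w` and zero thresholds, updates all neurons synchronously, and records visited configurations until one recurs; by the pigeonhole argument this must happen within 2^N steps, after which the trajectory is strictly periodic.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 6                          # number of binary neurons (illustrative choice)
w = rng.normal(size=(N, N))    # hypothetical random synaptic matrix
theta = np.zeros(N)            # firing thresholds, set to zero for simplicity

def step(s):
    """One synchronous update: s_i(t+1) = sign(sum_k w_ik s_k(t) - theta_i)."""
    return np.where(w @ s - theta >= 0, 1, -1)

# Evolve from a random start and record when each configuration was first
# visited. Since only 2^N configurations exist, some state must reappear
# after at most 2^N steps; from then on the evolution is periodic.
s = rng.choice([-1, 1], size=N)
seen = {}
t = 0
while tuple(s) not in seen:
    seen[tuple(s)] = t
    s = step(s)
    t += 1

transient = seen[tuple(s)]   # steps before entering the cycle
period = t - transient       # length of the periodic orbit
print(f"state recurred at step {t}: transient {transient}, period {period}")
```

Unrolling this dynamics for `transient + period` time steps yields the equivalent feed-forward perceptron: one identical layer per time step, with `w` as the layer-to-layer coupling.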
Keywords: Original Network · Synaptic Coupling · Coupled Neural Network · Network Hierarchy · Prescribed Trajectory
- 1. It is common to treat only some of the neurons as receptor neurons; the external inputs I_i are then nonzero only for those neurons.
- 2. If the fixed-point equation (14.3) has more than one solution, which fixed point is reached may well depend on the start configuration. Here we do not consider this case of multistability further, and concentrate on a single task to be learned by the network.
- 3. The same complication occurs if lateral synaptic connections are allowed between neurons contained in the same layer of a perceptron.
- 4. The existence of a stationary state of the assistant network is guaranteed if the original network has a fixed point, since its dynamics corresponds to that of the linearized original network in the vicinity of its fixed point, but run backwards in time. The matrix w_ik of the synaptic connections of the assistant network is the transpose of the synaptic matrix w_ki of the original neural net, which means that the directions of all synapses have been reversed.
- 5. A similar differential equation was found to describe an electronic network of coupled nonlinear circuits.
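Footnote 2 notes that when the fixed-point equation has several solutions, the fixed point actually reached depends on the start configuration. A minimal sketch of this multistability, under assumptions not in the text: two orthogonal ±1 patterns `xi1` and `xi2` (both hypothetical) are stored via a Hebbian synaptic matrix, making both of them fixed points; starting the relaxation near either pattern lands the network in that pattern's fixed point.

```python
import numpy as np

# Two orthogonal, illustrative patterns; both become fixed points of the
# relaxation dynamics s_i -> sign(sum_k w_ik s_k).
xi1 = np.array([1, 1, 1, 1, -1, -1, -1, -1])
xi2 = np.array([1, -1, 1, -1, 1, -1, 1, -1])

# Hebbian synaptic matrix storing both patterns (no self-couplings).
w = (np.outer(xi1, xi1) + np.outer(xi2, xi2)) / 8.0
np.fill_diagonal(w, 0)

def relax(s, max_iter=100):
    """Iterate the synchronous update until a fixed point is reached."""
    for _ in range(max_iter):
        s_new = np.where(w @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Start configurations: each pattern with one bit flipped.
start1 = xi1.copy(); start1[0] *= -1
start2 = xi2.copy(); start2[0] *= -1

fp1 = relax(start1)   # relaxes to xi1
fp2 = relax(start2)   # relaxes to xi2
print(np.array_equal(fp1, xi1), np.array_equal(fp2, xi2))
```

The two runs reach different fixed points from different starts, which is exactly the case set aside in footnote 2 in favor of a single learned task.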