Introduction

  • Zhang Yi
  • K. K. Tan
Part of the Network Theory and Applications book series (NETA, volume 13)

Abstract

Generally, neural networks can be divided into two large classes. One class contains feedforward neural networks (FNNs), and the other contains recurrent neural networks (RNNs). This book focuses on RNNs only. The essential difference between FNNs and RNNs is the presence of a feedback mechanism among the neurons in the latter. An FNN is a network without any feedback connections among its neurons, while an RNN has at least one feedback connection. Since RNNs allow feedback connections among neurons, the network topology can be very general: any neuron can be connected to any other, even to itself. Allowing feedback connections among neurons has an advantage: it leads naturally to an analysis of the networks as dynamic systems, in which the state of a network at one moment in time depends on its state at a previous moment in time. The topology of RNNs is shown in Figure 1.1.
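
To illustrate this dynamic-systems view, the following minimal sketch (an assumption for illustration only, not the book's specific model; the weight scale, tanh activation, and network size are arbitrary choices) iterates a fully connected recurrent network whose state at step k+1 is computed from its state at step k.

```python
import numpy as np

# Illustrative sketch: a fully connected recurrent network viewed as a
# discrete-time dynamic system. Every neuron may feed back to any other
# neuron, including itself, so the next state depends on the current state.

def step(x, W, b, f=np.tanh):
    """One state update: x(k+1) = f(W x(k) + b)."""
    return f(W @ x + b)

rng = np.random.default_rng(0)
n = 4                                  # number of neurons (arbitrary)
W = 0.3 * rng.standard_normal((n, n))  # feedback weights, any-to-any (and self)
b = rng.standard_normal(n)             # external inputs / biases
x = rng.standard_normal(n)             # initial state

# Iterate the dynamics; with small weights the trajectory settles toward a
# fixed point, showing how the present state is determined by past states.
for _ in range(50):
    x = step(x, W, b)
print(x)
```

In an FNN, by contrast, the output is a one-shot map of the input with no such iteration, which is why the dynamic-systems analysis applies specifically to RNNs.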



Copyright information

© Springer Science+Business Media Dordrecht 2004

Authors and Affiliations

  • Zhang Yi (1)
  • K. K. Tan (2)
  1. School of Computer Science and Engineering, University of Electronic Science and Technology of China, Chengdu, People’s Republic of China
  2. Department of Electrical and Computer Engineering, The National University of Singapore, Singapore
