Abstract
Traditional feedforward neural networks are static structures that simply map input to output. To better reflect the dynamics of biological neural systems, time dependence is incorporated into the network by using Finite Impulse Response (FIR) linear filters to model the processes of axonal transport, synaptic modulation, and charge dissipation. While a constructive proof establishes a theoretical equivalence between the classes of problems solvable by the FIR model and the static structure, the FIR model offers certain practical and computational advantages. Adaptation of the network is achieved through an efficient gradient descent algorithm, which is shown to be a temporal generalization of the popular backpropagation algorithm for static networks. Applications of the network are discussed, with a detailed example of using the network for time series prediction.
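The core idea of the FIR model can be sketched in a few lines: each scalar synaptic weight of a static neuron is replaced by a tapped delay line (an FIR filter), so a synapse responds to a short history of its input rather than only the current sample. The following is a minimal illustrative sketch, not the paper's implementation; the function name, shapes, and the sigmoid activation are assumptions for the example.

```python
import math

def fir_neuron(x_hist, W, bias=0.0):
    """One FIR neuron: every synapse is a tapped delay line.

    x_hist[k][i] is input i at time t-k (k = 0 is the newest sample);
    W[i][k] is the k-th filter tap on synapse i.
    """
    # Each synapse convolves its input history with its taps,
    # generalizing the static product w*x to sum_k w_k * x[t-k].
    s = bias
    for i, taps in enumerate(W):
        s += sum(w_k * x_hist[k][i] for k, w_k in enumerate(taps))
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid activation

# Two inputs, 3-tap filters. With only tap 0 nonzero, the FIR neuron
# reduces to an ordinary static neuron on the current input.
x_hist = [[0.5, -0.2], [0.1, 0.4], [0.0, 0.3]]  # rows: t, t-1, t-2
W = [[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]]
out = fir_neuron(x_hist, W)  # sigmoid(1.0*0.5 + 2.0*(-0.2))
```

Setting all taps beyond the zeroth to nonzero values gives the time-dependent behavior the abstract describes, while the zero-tap-only case shows the equivalence with the static network in the degenerate (filter length 1) limit.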
Cite this article
Wan, E.A. Discrete time neural networks. Appl Intell 3, 91–105 (1993). https://doi.org/10.1007/BF00871724