Historical Consistent Complex Valued Recurrent Neural Network

  • Hans-Georg Zimmermann
  • Alexey Minin
  • Victoria Kusherbaeva
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6791)

Abstract

Recurrent Neural Networks have been of interest to the machine learning community for many years. In the current paper we discuss the Historical Consistent Recurrent Neural Network and its extension to the complex valued case. We give some insights into complex valued back propagation and its application to the training of complex valued recurrent neural networks. Finally we present the results for the modeling of the Lorenz system. In the end we discuss the advantages of the proposed algorithm and give an outlook.
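The complex valued gradient step at the heart of complex valued back propagation can be illustrated with a short sketch. The example below is not the authors' implementation; it only shows, via Wirtinger calculus, how a real-valued squared error is minimized with respect to a complex weight. The single linear complex neuron, the teacher weight w_true, and the learning rate eta are illustrative assumptions.

```python
import numpy as np

# Toy illustration of a complex-valued gradient (Wirtinger) update:
# a single complex linear neuron y = w * x trained on the squared error
# E = 0.5 * |y - t|^2.  Steepest descent on E with respect to the complex
# weight w follows the conjugate Wirtinger derivative
#   dE/d(conj(w)) = 0.5 * (y - t) * conj(x),
# i.e. the complex LMS rule.  All values below are made up for the demo.

rng = np.random.default_rng(0)
w_true = 0.7 - 0.3j        # teacher weight the neuron should recover (assumed)
w = 0.0 + 0.0j             # initial weight
eta = 0.1                  # learning rate (assumed)

for step in range(200):
    x = rng.standard_normal() + 1j * rng.standard_normal()  # complex input
    t = w_true * x                                           # teacher signal
    y = w * x                                                # neuron output
    # step along the conjugate Wirtinger derivative (factor 2 absorbed into eta)
    w -= eta * (y - t) * np.conj(x)

print("recovered weight:", w, "target:", w_true)
```

In a recurrent network the same conjugate-derivative rule is applied to every complex weight and propagated backwards through the unfolded time steps, which is, in essence, how complex valued back propagation extends to recurrent architectures.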

Keywords

complex valued neural networks, recurrent neural networks, complex valued recurrent neural networks, complex dynamics analysis

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Hans-Georg Zimmermann (1)
  • Alexey Minin (2, 3)
  • Victoria Kusherbaeva (3)
  1. Siemens AG, Corporate Technology, München, Germany
  2. Institut für Informatik VI, Technische Universität München, München, Germany
  3. Siemens LLC, Corporate Technology, St. Petersburg, Russia