
Parallel Approach to Learning of the Recurrent Jordan Neural Network

  • Jarosław Bilski
  • Jacek Smoląg
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7894)

Abstract

This paper presents a parallel architecture for the Jordan recurrent network learning algorithm. The proposed solution is based on highly parallel three-dimensional structures that speed up learning. Detailed parallel neural network structures are shown explicitly.
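For readers unfamiliar with the Jordan architecture, the sketch below shows a minimal sequential (non-parallel) Jordan network trained with a simple one-step gradient update. The class name, layer sizes, activation choices, and update rule are illustrative assumptions; they do not reproduce the paper's parallel three-dimensional learning structures, only the underlying recurrent model in which the previous output is fed back through a context layer.

```python
import numpy as np

# Minimal Jordan network sketch (assumption: standard Jordan architecture in
# which a context layer copies the previous output back into the hidden
# layer's input; the simple one-step gradient update below is illustrative
# and is not the paper's parallel learning scheme).

class JordanNetwork:
    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # hidden layer sees the external input concatenated with the context
        self.W_h = rng.normal(0.0, 0.1, (n_hidden, n_in + n_out + 1))  # +1 bias
        self.W_o = rng.normal(0.0, 0.1, (n_out, n_hidden + 1))
        self.lr = lr
        self.n_out = n_out

    def step(self, x, context):
        # forward pass for a single time step
        z = np.concatenate(([1.0], x, context))   # bias + input + context
        h = np.tanh(self.W_h @ z)
        hb = np.concatenate(([1.0], h))
        y = np.tanh(self.W_o @ hb)
        return y, h, z, hb

    def train_sequence(self, xs, ds):
        # truncated, one-step-at-a-time gradient descent (not full BPTT);
        # the context for step t is simply the output from step t-1
        context = np.zeros(self.n_out)
        total = 0.0
        for x, d in zip(xs, ds):
            y, h, z, hb = self.step(x, context)
            e = y - d
            total += 0.5 * float(e @ e)
            delta_o = e * (1.0 - y ** 2)
            delta_h = (self.W_o[:, 1:].T @ delta_o) * (1.0 - h ** 2)
            self.W_o -= self.lr * np.outer(delta_o, hb)
            self.W_h -= self.lr * np.outer(delta_h, z)
            context = y                            # Jordan feedback
        return total


if __name__ == "__main__":
    # toy usage: learn to echo the previous bit of a binary sequence
    rng = np.random.default_rng(1)
    xs = rng.integers(0, 2, size=(200, 1)).astype(float)
    ds = np.vstack([np.zeros((1, 1)), xs[:-1]])    # target = previous input
    net = JordanNetwork(n_in=1, n_hidden=8, n_out=1)
    for epoch in range(50):
        loss = net.train_sequence(xs, ds)
    print(f"final sequence loss: {loss:.4f}")
```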



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Jarosław Bilski (1)
  • Jacek Smoląg (1)
  1. Częstochowa University of Technology, Częstochowa, Poland
