A Novel Fractional Gradient-Based Learning Algorithm for Recurrent Neural Networks

  • Shujaat Khan
  • Jawwad Ahmad
  • Imran Naseem
  • Muhammad Moinuddin
Abstract

In this research, we propose a novel algorithm for learning in recurrent neural networks, called fractional back-propagation through time (FBPTT). Exploiting the potential of fractional calculus, we derive the FBPTT algorithm from a fractional calculus-based gradient descent method. The proposed FBPTT method is shown to outperform the conventional back-propagation through time algorithm on three benchmark estimation problems, namely nonlinear system identification, pattern classification, and Mackey–Glass chaotic time series prediction.
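The core idea, a gradient descent update augmented with a fractional-order derivative term, can be illustrated on a toy quadratic loss. The sketch below is a generic illustration of fractional-order gradient descent, not the paper's exact FBPTT update; the Caputo-style term `grad * |w|^(1-alpha) / Gamma(2-alpha)` is a form common in the fractional LMS literature, and the loss, step size, and order `alpha` are illustrative choices.

```python
import math

def fractional_gd(f_grad, w0, alpha=0.5, mu=0.05, steps=200):
    """Gradient descent with an added fractional-order term.

    Generic sketch (not the paper's exact FBPTT derivation): the
    integer-order gradient g is augmented with a Caputo-style term
    g * |w|^(1 - alpha) / Gamma(2 - alpha), so the update uses both
    the conventional and the fractional gradient direction.
    """
    w = w0
    c = 1.0 / math.gamma(2.0 - alpha)  # 1 / Gamma(2 - alpha)
    for _ in range(steps):
        g = f_grad(w)                          # integer-order gradient
        g_frac = g * abs(w) ** (1.0 - alpha) * c  # fractional-order term
        w -= mu * (g + g_frac)                 # combined update
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2(w - 3).
w_star = fractional_gd(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

In FBPTT the same idea is applied to the unrolled recurrent network: the weight update combines the conventional BPTT gradient with its fractional-order counterpart, which the paper reports accelerates convergence on the three benchmark problems.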

Keywords

Back-propagation through time (BPTT) · Recurrent neural network (RNN) · Gradient descent · Fractional calculus · Mackey–Glass chaotic time series · Minimum redundancy and maximum relevance (mRMR)

Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. Department of Bio and Brain Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, Republic of Korea
  2. Faculty of Engineering Science and Technology, Iqra University, Karachi, Pakistan
  3. Department of Electrical Engineering, Usman Institute of Technology, Karachi, Pakistan
  4. College of Engineering, Karachi Institute of Economics and Technology, Korangi Creek, Karachi, Pakistan
  5. School of Electrical, Electronic and Computer Engineering, The University of Western Australia, Crawley, Australia
  6. Center of Excellence in Intelligent Engineering Systems (CEIES), King Abdulaziz University, Jeddah, Saudi Arabia
