Volume 39, Issue 2, pp 167-177
Date: 23 Mar 2013

New Results on Exponential Convergence for HRNNs with Continuously Distributed Delays in the Leakage Terms


Abstract

This paper is concerned with exponential convergence for a class of high-order recurrent neural networks (HRNNs) with continuously distributed delays in the leakage terms. Without assuming boundedness of the activation functions, sufficient conditions are derived to ensure that all solutions of the networks converge exponentially to the zero point, using the Lyapunov functional method and differential inequality techniques; these conditions correct some recent results of Chen and Yang (Neural Comput Appl. doi:10.1007/s00521-012-1172-2, 2012). Moreover, a new approach is proposed to prove the exponential convergence of HRNNs with continuously distributed leakage delays.
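For context, a representative model of this class can be written as follows. This is a generic sketch of a high-order recurrent neural network with a continuously distributed delay in the leakage term, using standard (assumed) notation; the symbols $c_i$, $\sigma_i$, $a_{ij}$, $b_{ijl}$, $f_j$, $g_j$, $\tau_{ij}$, $\nu_{ijl}$, and $I_i$ are illustrative and not taken from this abstract:

```latex
% Sketch of an HRNN with continuously distributed leakage delay
% (illustrative form; notation is assumed, not quoted from the paper)
\begin{equation*}
x_i'(t) = -c_i \int_0^{\infty} \sigma_i(s)\, x_i(t-s)\, \mathrm{d}s
  + \sum_{j=1}^{n} a_{ij} f_j\bigl(x_j(t-\tau_{ij}(t))\bigr)
  + \sum_{j=1}^{n} \sum_{l=1}^{n} b_{ijl}\,
      g_j\bigl(x_j(t-\nu_{ijl}(t))\bigr)\,
      g_l\bigl(x_l(t-\nu_{ijl}(t))\bigr)
  + I_i(t), \quad i = 1, \dots, n,
\end{equation*}
```

where the integral term is the leakage term with a continuously distributed delay kernel $\sigma_i$, the double sum carries the high-order (second-order) interactions, and the convergence results concern the decay of all solutions $x_i(t)$ to zero at an exponential rate.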