Delay-dependent exponential stability results for uncertain stochastic Hopfield neural networks with interval time-varying delays

This paper is concerned with the stability analysis problem for uncertain stochastic neural networks with interval time-varying delays. The parameter uncertainties are assumed to be norm bounded, and the delay is assumed to be time varying and to belong to a given interval, meaning that both the lower and upper bounds of the delay are available. Both differentiable and non-differentiable time-varying delays are considered. Based on a Lyapunov–Krasovskii functional and stochastic stability theory, delay/interval-dependent stability criteria are obtained in terms of linear matrix inequalities (LMIs). By introducing free-weighting matrices, the stability criteria are formulated as the feasibility of an LMI. Finally, three numerical examples are provided to demonstrate the effectiveness and reduced conservatism of the proposed LMI conditions.


Introduction
In the past two decades, neural networks have received increasing interest owing to their applications in a variety of areas, such as signal processing, pattern recognition, static image processing, associative memory, and combinatorial optimization [9]. In implementations of artificial neural networks, time delays often arise in the processing of information storage and transmission. Since time delays may lead to instability and oscillation of the neural network model, the stability analysis of neural networks with time delays has received more and more attention. As is well known, time delays are frequently encountered in various engineering, biological, and economic systems. Up to now, the stability analysis problem of neural networks with time delay has attracted a large amount of research interest, and many sufficient conditions have been proposed to guarantee asymptotic or exponential stability for neural networks with various types of time delays, such as constant, time-varying, or distributed delays; see, for example, [1,2,10,12-15,17,21,24,25,28] and the references therein.
It is worth noting that in real nerve systems synaptic transmission is a noisy process, brought on by random fluctuations in the release of neurotransmitters and other probabilistic causes. Therefore, it is of practical importance to study stochastic effects on the stability of delayed neural networks; see, for example, [3-6,11,16,18,19,23,27,29]. There are also systems that are stable with some nonzero delays but unstable without delay [7,8,30]. It is therefore important to perform stability analysis of systems with nonzero delays [22], where the nonzero delay can be confined to a given interval. Recently, some results on the stability of stochastic neural networks with finite distributed delays have been reported in [16,18,19]. To the best of the authors' knowledge, however, very few results on delay/interval-dependent robust exponential stability analysis for uncertain stochastic neural networks with interval time-varying delays are available in the literature.
In this paper, a class of uncertain stochastic neural networks with interval time-varying delays is considered. The parameter uncertainties are assumed to be norm bounded. By using the Lyapunov–Krasovskii functional technique and introducing free-weighting matrices, global robust stability conditions for the considered uncertain stochastic neural networks are given in terms of LMIs, which can be easily checked with the MATLAB LMI Control Toolbox. Numerical examples are given to illustrate the effectiveness and reduced conservatism of the proposed method.
Notation: Throughout this paper, R^n and R^{n×n} denote, respectively, the n-dimensional Euclidean space and the set of all n × n real matrices. The superscript T denotes transposition, and the notation X ≥ Y (respectively, X > Y), where X and Y are symmetric matrices, means that X − Y is positive semi-definite (respectively, positive definite). I denotes the identity matrix of appropriate dimension, and | · | is the Euclidean norm in R^n. Moreover, let (Ω, F, {F_t}_{t≥0}, P) be a complete probability space with a filtration {F_t}_{t≥0} satisfying the usual conditions (i.e. the filtration contains all P-null sets and is right continuous). The symbol * always denotes the symmetric block in a symmetric matrix. The arguments of a function or matrix are sometimes omitted when no confusion can arise.

Problem description and preliminaries
Consider the following stochastic neural networks with time-varying delays and parameter uncertainties:

dx(t) = [−A(t)x(t) + B(t)f(x(t)) + C(t)f(x(t − τ(t)))] dt + σ(t, x(t), x(t − τ(t))) dω(t),    (1)

where x(t) = (x_1(t), x_2(t), ..., x_n(t))^T ∈ R^n is the state vector, f(x(·)) = (f_1(x_1(·)), ..., f_n(x_n(·)))^T are the activation functions, ω(t) is a Brownian motion defined on (Ω, F, {F_t}_{t≥0}, P), and A(t), B(t) and C(t) take the following form:

[A(t)  B(t)  C(t)] = [A  B  C] + M F(t) [N_1  N_2  N_3],

where A = diag(a_1, a_2, ..., a_n) > 0 is the self-feedback term; B = (b_ij)_{n×n} is the connection weight matrix; C = (c_ij)_{n×n} is the delayed connection weight matrix; M, N_1, N_2 and N_3 are known real constant matrices; and F(·): R_+ → R^{k×l} is an unknown time-varying matrix function satisfying F^T(t)F(t) ≤ I for all t > 0. In addition, we assume that the diffusion term σ in (1) is locally Lipschitz continuous and satisfies the linear growth condition. In the sequel, we write σ(t) for σ(t, x(t), x(t − τ(t))).
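As a purely illustrative sketch of this norm-bounded uncertainty structure, the following snippet builds A(t), B(t) and C(t) from nominal matrices and a time-varying F(t) with F^T(t)F(t) ≤ I; all numerical values are placeholders, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3

# Nominal matrices (arbitrary illustrative values).
A = np.diag([1.0, 1.2, 0.8])            # self-feedback term, A > 0
B = 0.3 * rng.standard_normal((n, n))   # connection weight matrix
C = 0.3 * rng.standard_normal((n, n))   # delayed connection weight matrix

# Uncertainty structure: [A(t) B(t) C(t)] = [A B C] + M F(t) [N1 N2 N3].
M = 0.1 * np.eye(n)
N1 = N2 = N3 = 0.1 * np.eye(n)

def F(t):
    # A norm-bounded time-varying uncertainty: F(t)^T F(t) <= I for all t.
    return np.sin(t) * np.eye(n)

t = 0.7
Ft = F(t)
# Check the norm bound: I - F^T F must be positive semi-definite.
assert np.all(np.linalg.eigvalsh(np.eye(n) - Ft.T @ Ft) >= -1e-12)

A_t = A + M @ Ft @ N1
B_t = B + M @ Ft @ N2
C_t = C + M @ Ft @ N3
print(np.round(A_t, 3))
```

The perturbation M F(t) N_i stays bounded uniformly in t, which is what the robustness analysis exploits.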
In this paper, we consider the following two classes of time-varying delays.

Case (I): τ(t) is a differentiable function satisfying 0 ≤ h_1 ≤ τ(t) ≤ h_2 and τ̇(t) ≤ μ for all t ≥ 0, where h_1, h_2 and μ are constants.

Case (II): τ(t) is a continuous function that may not be differentiable but satisfies 0 ≤ h_1 ≤ τ(t) ≤ h_2 for all t ≥ 0.

We make the following assumptions:

(A1) The time-varying delay τ(t) belongs to the given interval [h_1, h_2], i.e. it satisfies Case (I) or Case (II).

(A2) There exist constant real matrices G_1 and G_2 such that the activation functions satisfy

c_i ≤ (f_i(ξ_1) − f_i(ξ_2)) / (ξ_1 − ξ_2) ≤ c̄_i,   i = 1, 2, ..., n,

for any ξ_1, ξ_2 ∈ R with ξ_1 ≠ ξ_2. Throughout this paper, we denote C̄ = diag(c̄_1, c̄_2, ..., c̄_n) and C = diag(c_1, c_2, ..., c_n).

Now we introduce the following definition.

Definition 2.1 The equilibrium point of the uncertain delayed Hopfield-type neural network (1) is said to be robustly exponentially stable in the mean square if there exists a scalar γ > 0 such that

lim sup_{t→∞} (1/t) log( E|x(t; ξ)|^2 ) ≤ −γ

holds for every solution x(t; ξ) of (1) and all admissible uncertainties.
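For illustration, a differentiable delay of the Case (I) type can be constructed explicitly and its interval and derivative bounds checked numerically; the bounds h_1, h_2 and μ below are arbitrary choices, not values from this paper:

```python
import numpy as np

h1, h2, mu = 0.5, 2.0, 0.75  # interval bounds and derivative bound (illustrative)

def tau(t):
    # A differentiable Case (I) delay: h1 <= tau(t) <= h2 and tau'(t) <= mu,
    # since tau'(t) = mu * cos(2*mu*t/(h2 - h1)).
    return 0.5 * (h1 + h2) + 0.5 * (h2 - h1) * np.sin(2 * mu * t / (h2 - h1))

ts = np.linspace(0.0, 20.0, 2001)
vals = tau(ts)

# Interval bound check (A1).
assert np.all(vals >= h1 - 1e-12) and np.all(vals <= h2 + 1e-12)

# Numerical check of the derivative bound tau'(t) <= mu.
dvals = np.gradient(vals, ts)
assert float(np.max(dvals)) <= mu + 1e-3

print(round(float(vals.min()), 3), round(float(vals.max()), 3))
```

A Case (II) delay would satisfy only the interval check; the derivative bound is then unavailable, which is exactly the situation the rate-independent criteria address.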

Lemma 2.2 [20] Let X and Y > 0 be real constant matrices of appropriate dimensions, and let F(t) be a real matrix function satisfying F^T(t)F(t) ≤ I. Then we have:

(1) For any scalar ε > 0 and vectors x and y of appropriate dimensions, the following inequality holds:

2x^T y ≤ ε x^T x + ε^{-1} y^T y.

(2) For any vectors x, y and matrix P > 0 of appropriate dimensions, the following inequality holds:

2x^T y ≤ x^T P x + y^T P^{-1} y.

The objective of this paper is to derive LMI-based conditions guaranteeing that the uncertain stochastic delayed Hopfield neural network (1) is robustly exponentially stable in the mean square for interval time-varying delays.

Remark 2.3 In this paper, an interval time-varying delay satisfying assumption (A1) is considered, which distinguishes the present stability results from previous works. This work recovers the established results of [26] when μ = 0 and h_1 = h_2, in which case τ(t) reduces to a constant delay. Furthermore, for h_1 = 0 the condition becomes 0 ≤ τ(t) ≤ h_2, which was investigated in [11].
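The two elementary inequalities in Lemma 2.2 are easy to check numerically; the following sketch verifies both on random data (a spot check on one instance, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
x, y = rng.standard_normal(n), rng.standard_normal(n)

# (1) 2 x^T y <= eps * x^T x + (1/eps) * y^T y for any eps > 0.
for eps in (0.1, 1.0, 5.0):
    assert 2 * (x @ y) <= eps * (x @ x) + (y @ y) / eps + 1e-12

# (2) 2 x^T y <= x^T P x + y^T P^{-1} y for any P > 0.
Q = rng.standard_normal((n, n))
P = Q @ Q.T + n * np.eye(n)  # symmetric positive definite by construction
assert 2 * (x @ y) <= x @ P @ x + y @ np.linalg.inv(P) @ y + 1e-9

print("inequalities hold")
```

Inequality (2) follows from expanding (P^{1/2}x − P^{-1/2}y)^T(P^{1/2}x − P^{-1/2}y) ≥ 0, and (1) is the special case P = εI.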

Lemma 2.4 (Schur Complement)
Given constant matrices Σ_1, Σ_2 and Σ_3 with appropriate dimensions, where Σ_1^T = Σ_1 and Σ_2^T = Σ_2 > 0, we have Σ_1 + Σ_3^T Σ_2^{-1} Σ_3 < 0 if and only if

[ Σ_1   Σ_3^T
   *    −Σ_2 ] < 0.

Theorem 3.1 Under assumptions (A1) and (A2), the equilibrium point of system (1) is robustly exponentially stable in the mean square if there exist matrices of appropriate dimensions and scalars ε_1 > 0, ε_2 > 0 such that the linear matrix inequalities (LMIs) hold, with entries including ϕ_{6,13} = P_{67}, ϕ_{6,14} = −P_{66}, ϕ_{6,15} = P_{16}^T + Z_6, ϕ_{6,16} = −W_6 and ϕ_{6,17} = −X_6.

Proof Define a new state variable for the stochastic neural network (1) and consider a suitable Lyapunov–Krasovskii functional. Applying Itô's formula to this functional, bounding the resulting terms using Lemma 2.4, the inequalities of [28] and the technique in [3], and combining (10)-(21), one obtains an upper bound on the stochastic derivative of the functional; applying the Schur complement equivalence to (8) then shows that this bound is negative definite. Consequently, by Lyapunov stability theory and Definition 2.1, the equilibrium solution of the stochastic neural network (1) is robustly exponentially stable in the mean square for any τ(t) satisfying 0 ≤ h_1 ≤ τ(t) ≤ h_2 and τ̇(t) ≤ μ. The proof is completed.
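The Schur complement equivalence of Lemma 2.4 can be illustrated on a random instance; all matrices below are synthetic, chosen only so that the LMI holds:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# Build S2 = S2^T > 0 and an arbitrary S3.
Q = rng.standard_normal((n, n))
S2 = Q @ Q.T + np.eye(n)
S3 = rng.standard_normal((n, n))

# Choose S1 so that S1 + S3^T S2^{-1} S3 = -I < 0.
S1 = -(S3.T @ np.linalg.inv(S2) @ S3) - np.eye(n)

# Assemble the block matrix [[S1, S3^T], [S3, -S2]].
big = np.block([[S1, S3.T], [S3, -S2]])

def is_neg_def(X):
    # Negative definiteness via the symmetric part's eigenvalues.
    return bool(np.all(np.linalg.eigvalsh((X + X.T) / 2) < 0))

# Schur complement: S1 + S3^T S2^{-1} S3 < 0  <=>  [[S1, S3^T],[S3, -S2]] < 0.
assert is_neg_def(S1 + S3.T @ np.linalg.inv(S2) @ S3)
assert is_neg_def(big)
print("Schur complement equivalence verified on this instance")
```

This equivalence is what lets the nonlinear matrix condition in the stability analysis be recast as an LMI checkable by standard solvers.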
In the following, we discuss the robust exponential stability of the following uncertain stochastic neural networks with time-varying delays: where the time delay τ(t) satisfies 0 ≤ h_1 ≤ τ(t) ≤ h_2 and τ̇(t) ≤ μ. Then we have the following results:

Numerical examples
In this section, we will give three examples showing the effectiveness of the conditions given here.
First, we assume that the activation functions satisfy Assumption (A2) with c_1 = c_2 = c_3 = −0.5 and c̄_1 = c̄_2 = c̄_3 = 1. For μ = 0.5, it was reported in [29] that the above system is robustly exponentially stable in the mean square when 0 < τ(t) ≤ 2.2471. However, by our Theorem 3.1 and using the MATLAB LMI Toolbox, for μ = 0.5 and h_1 = 0 it is found that the equilibrium solution of the uncertain stochastic neural network (1) is robustly exponentially stable in the mean square for any τ(t) satisfying 0 < τ(t) ≤ h_2 = 4.4690. This shows that the results established in this paper are less conservative than the previous ones, since the stability region extends up to the upper bound 4.4690 instead of 2.2471 in [29].

In order to compare the results of this paper with those in [3,29], we assume that the activation functions satisfy (A2) with c_1 = c_2 = c_3 = 0 and c̄_1 = 1.2, c̄_2 = 0.5, c̄_3 = 1.3. When the time-varying delay is differentiable and μ = 0.85, Theorem 3.1 of this paper, Theorem 1 in [29] and Theorem 1 in [3] yield maximum allowable upper bounds of τ(t) of h_2 = 9.7377, h = 9.6876 and h = 7.7377, respectively. When the time-varying delay may not be differentiable, that is, when μ is unknown, Theorem 2 in [29] and Theorem 2 in [3] yield maximum allowable upper bounds h = 2.2379 and h = 2.3514, respectively. However, by our Theorem 3.3 and the MATLAB LMI Toolbox, for h_1 = 0 the equilibrium solution of the uncertain stochastic neural network (1) is found to be robustly exponentially stable in the mean square for arbitrarily large h_2 (as long as the numerical computation remains reliable). Therefore, for this example, the results given in this paper are less conservative than those in [29] and [3].
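To give a feel for the behaviour being certified, the following is a minimal Euler-Maruyama simulation of a stochastic delayed network of the form (1); the matrices, delay and noise intensity are hypothetical placeholders rather than the example data of this paper, and the observed trajectory decay is merely suggestive of mean-square exponential stability, not a substitute for the LMI tests:

```python
import numpy as np

rng = np.random.default_rng(3)
n, dt, T = 3, 1e-3, 10.0
h2 = 1.0                       # upper delay bound (illustrative)
steps = int(T / dt)
d = int(h2 / dt)               # constant delay tau(t) = h2, in time steps

A = np.diag([4.0, 5.0, 4.5])   # illustrative self-feedback term
B = 0.2 * rng.standard_normal((n, n))
C = 0.2 * rng.standard_normal((n, n))

f = np.tanh                    # activation functions satisfying a sector condition

# Euler-Maruyama for dx = [-A x + B f(x) + C f(x_tau)] dt + 0.1 x dw,
# with constant initial history x(s) = (1,1,1)^T for s in [-h2, 0].
x = np.ones((steps + 1 + d, n))
for k in range(d, steps + d):
    x_tau = x[k - d]
    drift = -A @ x[k] + B @ f(x[k]) + C @ f(x_tau)
    x[k + 1] = x[k] + drift * dt + 0.1 * x[k] * np.sqrt(dt) * rng.standard_normal(n)

# The trajectory should have decayed by many orders of magnitude.
print(round(float(np.linalg.norm(x[-1])), 6))
assert np.linalg.norm(x[-1]) < 1e-2 * np.linalg.norm(x[d])
```

With the strong diagonal self-feedback dominating the (small) connection weights and multiplicative noise, the sample path contracts toward the zero equilibrium, which is the qualitative behaviour the exponential mean-square stability criteria guarantee.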
The activation functions satisfy Assumption (A3) with c_1 = c_2 = c_3 = −0.5 and c̄_1 = c̄_2 = c̄_3 = 0.5. We note that, when μ ≤ 0.9, the LMIs in Theorem 3 of [29] and in Theorem 3.3 of this paper are feasible for arbitrarily large h_2 (as long as the numerical computation remains reliable). When μ = 0.95, by Theorem 3 in [29] the equilibrium solution of the stochastic neural network (22) is found to be robustly exponentially stable in the mean square for any delay τ(t) satisfying an upper bound h = 0.6633, whereas by Theorem 3 in this paper the same conclusion holds for h_2 = 0.6691. When the time-varying delay may not be differentiable, Theorem 4 in [29] yields the maximum allowable upper bound h = 0.6520, while by Theorem 3.5 of this paper the LMIs are feasible for arbitrarily large h_2; hence system (22) is robustly exponentially stable in the mean square, improving on the previous works in terms of the admissible upper bound.

For μ ≥ 1, the matrix Q is no longer helpful in improving the stability condition, since −(1 − μ)Q is nonnegative definite. Therefore, by setting Q = 0, a simple delay/interval-dependent, rate-independent criterion is obtained for unknown μ. For the above system, applying Theorem 2 in [3], the equilibrium solution of the stochastic neural network (24) is found to be robustly exponentially stable in the mean square for any delay τ(t) satisfying 0 < τ(t) ≤ 0.5730, whereas by Theorem 3.7 of this paper system (24) is robustly exponentially stable in the mean square for 0 < τ(t) ≤ 0.6413, again improving on the previous upper bound.

Conclusion
This paper has investigated the stability problem for uncertain stochastic neural networks with interval time-varying delays. Less conservative stability criteria have been obtained by taking into account the relationship between the time-varying delay and its lower and upper bounds when estimating the upper bound of the derivative of the Lyapunov–Krasovskii functional. By applying the free-weighting matrix technique together with a new Lyapunov–Krasovskii functional, delay/interval-dependent stability conditions have been obtained in terms of LMIs, covering both differentiable and non-differentiable time-varying delays. Numerical examples have been given to demonstrate the effectiveness of the presented criteria and their improvement over existing results.
Open Access This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.