Neural Processing Letters | 2019

Improved Delay-Derivative-Dependent Stability Analysis for Generalized Recurrent Neural Networks with Interval Time-Varying Delays

 

Abstract


In this paper, the problem of delay-derivative-dependent stability analysis for generalized neural networks with interval time-varying delays is considered. First, the whole delay interval is divided into two subintervals of unequal width, and the variation of the Lyapunov–Krasovskii functional (LKF) is examined on each subinterval, so that the information on the lower and upper bounds of the time delay and its derivative is fully exploited. Second, a new delay-derivative-dependent stability condition for systems with interval time-varying delays, expressed in terms of quadratic forms of linear matrix inequalities (LMIs), is derived by constructing the LKF via the delay-decomposition and integral-inequality approaches. Third, all conditions are presented as LMIs and can therefore be checked efficiently with the Matlab LMI Control Toolbox. Fourth, the computational complexity of the new stability conditions is reduced because fewer decision variables are involved. Finally, four numerical examples are provided to verify the effectiveness of the proposed criteria.
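Because the paper's conditions are LMIs, they can be checked with any semidefinite-programming front end. The sketch below is not the paper's delay-derivative-dependent criterion: it only illustrates how an LMI feasibility test of this kind is posed and solved numerically, using a textbook delay-independent Lyapunov–Krasovskii condition for the linear delayed system ẋ(t) = A x(t) + A_d x(t − h), with CVXPY standing in for the Matlab LMI Control Toolbox mentioned in the abstract. The matrices, tolerance, and function name are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's criterion): feasibility check of a
# classical delay-independent stability LMI for  dx/dt = A x(t) + A_d x(t - h).
# The paper's delay-derivative-dependent conditions are block LMIs of the same kind,
# but with additional decision matrices from the delay-decomposed LKF.
import numpy as np
import cvxpy as cp

def delay_independent_stable(A: np.ndarray, Ad: np.ndarray, eps: float = 1e-6) -> bool:
    """Return True if symmetric P > 0, Q > 0 exist such that
    [[A'P + PA + Q, P*Ad], [Ad'*P, -Q]] < 0 (Lyapunov-Krasovskii criterion)."""
    n = A.shape[0]
    P = cp.Variable((n, n), symmetric=True)
    Q = cp.Variable((n, n), symmetric=True)
    PA, PAd = P @ A, P @ Ad          # P is symmetric, so A'P + PA = PA.T + PA
    lmi = cp.bmat([[PA.T + PA + Q, PAd],
                   [PAd.T,         -Q ]])
    constraints = [P >> eps * np.eye(n),        # P positive definite
                   Q >> eps * np.eye(n),        # Q positive definite
                   lmi << -eps * np.eye(2 * n)] # block LMI negative definite
    prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
    prob.solve(solver=cp.SCS)
    return prob.status in (cp.OPTIMAL, cp.OPTIMAL_INACCURATE)

if __name__ == "__main__":
    # Illustrative system matrices (assumed, not taken from the paper's examples).
    A = np.array([[-2.0, 0.0], [0.0, -2.0]])
    Ad = np.array([[0.5, 0.0], [0.3, 0.5]])
    print("LMI feasible (delay-independent stability):", delay_independent_stable(A, Ad))
```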

Volume 51
Pages 427-448
DOI 10.1007/s11063-019-10088-8
Language English
Journal Neural Processing Letters
