Ailong Wu
Xi'an Jiaotong University
Publications
Featured research published by Ailong Wu.
Information Sciences | 2012
Ailong Wu; Shiping Wen; Zhigang Zeng
In this paper, we formulate and investigate a class of memristor-based recurrent neural networks. Some sufficient conditions are obtained to guarantee the exponential synchronization of the coupled networks, based on the drive-response concept, differential inclusion theory, and the Lyapunov functional method. The analysis employs results from the theory of differential equations with discontinuous right-hand sides, as introduced by Filippov. Finally, the validity of the obtained result is illustrated by a numerical example.
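For orientation, memristor-based recurrent neural networks of this kind are usually written with state-dependent (switching) connection weights. The sketch below is a generic form of such models with hypothetical symbols d_i, a_ij, f_j, I_i, not necessarily the exact system of this paper:

\dot{x}_i(t) = -d_i\, x_i(t) + \sum_{j=1}^{n} a_{ij}\bigl(x_i(t)\bigr)\, f_j\bigl(x_j(t)\bigr) + I_i, \qquad a_{ij}(x_i) \in \{\hat{a}_{ij},\, \check{a}_{ij}\}.

Because the right-hand side is discontinuous in the state, solutions are understood in Filippov's sense via the differential inclusion

\dot{x}_i(t) \in -d_i\, x_i(t) + \sum_{j=1}^{n} \operatorname{co}\{\hat{a}_{ij},\, \check{a}_{ij}\}\, f_j\bigl(x_j(t)\bigr) + I_i,

and synchronization is posed for a response copy of the network driven by a controller acting on the error e(t) = y(t) - x(t).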
IEEE Transactions on Neural Networks and Learning Systems | 2012
Ailong Wu; Zhigang Zeng
In this paper, a general class of memristive neural networks with time delays is formulated and studied. Sufficient conditions in terms of linear matrix inequalities (LMIs) are obtained to achieve exponential stabilization. The result can be applied to the closed-loop control of memristive systems. In particular, several succinct criteria are given to ascertain the exponential stabilization of memristive cellular neural networks. In addition, a simplified and effective algorithm is considered for the design of the optimal controller. These conditions improve and extend existing results in the literature. Two numerical examples are given to illustrate the theoretical results via computer simulations.
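As a rough illustration of the LMI-based design step (not the paper's actual conditions), the sketch below checks feasibility of the textbook state-feedback stabilization LMI A P + P A^T + B Y + Y^T B^T < 0 with P > 0 and recovers the gain K = Y P^{-1}, using cvxpy; the matrices are made-up values for a hypothetical two-neuron linearized network.

import numpy as np
import cvxpy as cp

# Hypothetical linearized network and input matrices (illustrative values only).
A = np.array([[-1.0, 2.0],
              [0.5, -1.5]])
B = np.eye(2)                 # control acts on every state
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)   # Lyapunov matrix
Y = cp.Variable((n, n))                   # change of variables, Y = K P
eps = 1e-3

M = A @ P + P @ A.T + B @ Y + Y.T @ B.T
constraints = [
    P >> eps * np.eye(n),                       # P positive definite
    0.5 * (M + M.T) << -eps * np.eye(n),        # closed-loop LMI (symmetrized)
]
prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve()

if prob.status in ("optimal", "optimal_inaccurate"):
    K = Y.value @ np.linalg.inv(P.value)        # stabilizing gain, u = K x
    print("Feasible; gain K =\n", K)
else:
    print("LMI infeasible for these matrices")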
Neural Networks | 2012
Ailong Wu; Zhigang Zeng
The paper introduces a general class of memristor-based recurrent neural networks with time-varying delays. Conditions for nondivergence and global attractivity are established by using local inhibition. Moreover, exponential convergence of the networks is studied by using local invariant sets. The analysis employs results from the theory of differential equations with discontinuous right-hand sides, as introduced by Filippov. The obtained results extend some previous works on conventional recurrent neural networks.
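Roughly, the notions involved can be stated as follows (a sketch of the standard definitions, not the paper's exact statements): nondivergence means every solution is bounded, \sup_{t \ge 0} \|x(t)\| < \infty; global attractivity of a compact set S means \lim_{t \to \infty} \operatorname{dist}\bigl(x(t), S\bigr) = 0 for every solution; and exponential convergence to an invariant set S means \operatorname{dist}\bigl(x(t), S\bigr) \le M e^{-\varepsilon t} for some M, \varepsilon > 0 and all trajectories starting in a neighborhood of S.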
Neurocomputing | 2011
Ailong Wu; Zhigang Zeng; Xusheng Zhu; Jine Zhang
In this paper, the synchronization control of a general class of memristor-based recurrent neural networks with time delays is investigated. A delay-dependent feedback controller is derived to achieve exponential synchronization, based on the drive-response concept, linear matrix inequalities (LMIs), and the Lyapunov functional method. Finally, a numerical example is given to illustrate the derived theoretical results.
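A minimal numerical sketch of drive-response synchronization under linear error feedback is given below. The two-neuron delayed network, the gain k, and the initial states are illustrative values only (not taken from the paper), and the constant delay is handled with a simple explicit Euler scheme and a history buffer.

import numpy as np

tau, h, T = 1.0, 0.01, 20.0                   # delay, step size, horizon
d = np.array([1.0, 1.2])                      # self-feedback rates
A = np.array([[2.0, -0.1], [-5.0, 3.0]])      # non-delayed connection weights
B = np.array([[-1.5, -0.1], [-0.2, -2.5]])    # delayed connection weights
k = 15.0                                      # error-feedback gain (assumed large enough)
f = np.tanh                                   # activation function

steps, delay_steps = int(T / h), int(tau / h)
x = np.zeros((steps + 1, 2)); y = np.zeros((steps + 1, 2))
x[0], y[0] = [0.4, -0.6], [-0.5, 0.8]         # different initial states (constant history)

def rhs(state, delayed):
    return -d * state + A @ f(state) + B @ f(delayed)

for n in range(steps):
    xd = x[max(n - delay_steps, 0)]           # delayed drive state
    yd = y[max(n - delay_steps, 0)]           # delayed response state
    u = -k * (y[n] - x[n])                    # linear error-feedback controller
    x[n + 1] = x[n] + h * rhs(x[n], xd)
    y[n + 1] = y[n] + h * (rhs(y[n], yd) + u)

print("final synchronization error:", np.linalg.norm(y[-1] - x[-1]))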
IEEE Transactions on Neural Networks and Learning Systems | 2014
Ailong Wu; Zhigang Zeng
Memristive neuromorphic systems are good candidates for creating an artificial brain. In this paper, a general class of memristive neural networks with discrete and distributed delays is introduced and studied. Some Lagrange stability criteria dependent on the network parameters are derived via nonsmooth analysis and control theory. In particular, several succinct criteria are provided to ascertain the Lagrange stability of memristive neural networks with and without delays. The proposed Lagrange stability criteria improve and extend existing results in the literature. Three numerical examples are given to show the superiority of the theoretical results.
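A generic memristive network with both discrete and distributed delays, of the kind this class covers, can be sketched as (hypothetical symbols; not necessarily the exact model of the paper):

\dot{x}_i(t) = -d_i\, x_i(t) + \sum_{j=1}^{n} a_{ij}\bigl(x_i(t)\bigr) f_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{n} b_{ij}\bigl(x_i(t)\bigr) f_j\bigl(x_j(t-\tau_{ij}(t))\bigr) + \sum_{j=1}^{n} c_{ij}\bigl(x_i(t)\bigr) \int_{t-\sigma_{ij}}^{t} f_j\bigl(x_j(s)\bigr)\, ds + I_i .

Lagrange stability then asks that all trajectories eventually enter and remain in a fixed compact set (a globally attractive set), rather than converge to a single equilibrium.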
Neural Networks | 2014
Ailong Wu; Zhigang Zeng
Memristive neural networks are studied across many fields of science. To uncover their structural design principles, the paper introduces a general class of memristive neural networks with time delays. Passivity analysis is conducted by constructing a suitable Lyapunov functional. The analysis employs results from nonsmooth analysis and the theory of linear matrix inequalities. A numerical example is provided to illustrate the effectiveness and reduced conservatism of the proposed results.
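For reference, passivity is usually formulated with respect to an exogenous input u and an output y: under zero initial conditions, the network is passive if there exists \gamma \ge 0 such that (a standard definition, not necessarily the paper's exact one)

\int_{0}^{t_p} y^{\top}(s)\, u(s)\, ds \;\ge\; -\gamma \int_{0}^{t_p} u^{\top}(s)\, u(s)\, ds \qquad \text{for all } t_p \ge 0 .

This is the inequality that the constructed Lyapunov functional and the LMI conditions are used to certify.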
Neural Networks | 2017
Ailong Wu; Ling Liu; Tingwen Huang; Zhigang Zeng
Neurodynamic systems are an emerging research field. Understanding the essential representations of neural activity through neurodynamics is an important question in cognitive system research. This paper investigates Mittag-Leffler stability of a class of fractional-order neural networks in the presence of generalized piecewise constant arguments. To identify computational principles through mathematical and computational analysis, the existence and uniqueness of the solution of the neurodynamic system is the first prerequisite. We prove that the solution of the network exists and is unique when certain conditions are satisfied. In addition, a self-active neurodynamic system demands stable internal dynamical states (equilibria). The main emphasis is therefore on several sufficient conditions that guarantee a unique equilibrium point. Furthermore, to provide deeper explanations of the neurodynamic process, Mittag-Leffler stability is studied in detail. The established results are based on the theories of fractional differential equations and differential equations with generalized piecewise constant arguments. The derived criteria improve and extend the existing related results.
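Mittag-Leffler stability replaces the exponential decay of integer-order systems with a bound of the form \|x(t)\| \le \{ m(x(0))\, E_{\alpha}(-\lambda t^{\alpha}) \}^{b}, which decays algebraically rather than exponentially for large t. A small, self-contained Python sketch of the one-parameter Mittag-Leffler function via series truncation is given below (not code from the paper, and not a production-grade evaluator).

import numpy as np
from scipy.special import gamma

def mittag_leffler(alpha, z, n_terms=100):
    """One-parameter Mittag-Leffler function E_alpha(z) via truncated power series.
    Adequate for moderate |z|; cancellation limits accuracy for large |z|."""
    k = np.arange(n_terms)
    return np.sum(np.power(z, k) / gamma(alpha * k + 1))

# Illustrative decay profile of E_alpha(-lambda * t**alpha), made-up parameters.
alpha, lam = 0.8, 1.0
for t in (0.0, 1.0, 5.0, 20.0):
    print(t, mittag_leffler(alpha, -lam * t**alpha))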
Neurocomputing | 2011
Ailong Wu; Zhigang Zeng; Chaojin Fu; Wenwen Shen
In this paper, global exponential stability in the Lagrange sense for periodic neural networks with various activation functions is further studied. By constructing appropriate Lyapunov-like functions, we provide easily verifiable criteria for the boundedness and global exponential attractivity of periodic neural networks. This theoretical analysis can narrow the search field in optimization computation, associative memories, and chaos control, and provides convenience for applications.
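In this Lagrange framework, a Lyapunov-like function V and a level \ell are typically sought such that (a generic formulation; symbols are illustrative)

V\bigl(x(t)\bigr) - \ell \;\le\; \bigl(V(x(0)) - \ell\bigr)\, e^{-\varepsilon t} \qquad \text{whenever } V(x(0)) > \ell,

so that the level set \Omega = \{x : V(x) \le \ell\} is a globally exponentially attractive set; estimating \Omega is what narrows the search region for equilibria, periodic orbits, and attractors in applications.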
IEEE Transactions on Neural Networks and Learning Systems | 2017
Ailong Wu; Zhigang Zeng
According to conventional memristive neural network theories, neurodynamic properties are powerful tools for solving many problems in the areas of brain-like associative learning, dynamic information storage or retrieval, etc. However, as has often been noted for fractional-order systems, analysis approaches for integer-order systems cannot be directly extended to deal with fractional-order systems, and consequently difficult issues arise in analyzing and controlling fractional-order memristive neural networks. By using set-valued maps and fractional-order differential inclusions, aided by a newly proposed fractional derivative inequality, this paper investigates the global Mittag-Leffler stabilization of a class of fractional-order memristive neural networks. Two types of control rules (i.e., state feedback stabilizing control and output feedback stabilizing control) are designed for the stabilization of fractional-order memristive neural networks, and a list of stabilization criteria is established. Finally, two numerical examples are given to show the effectiveness and characteristics of the obtained theoretical results.
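The sketch below is a crude numerical illustration of fractional-order state-feedback stabilization using a Grünwald-Letnikov discretization of the Caputo derivative. The two-neuron network, the gain K, and the order alpha are made-up values, initial-condition correction terms are omitted, and the memristive (state-dependent) weight switching is deliberately ignored to keep the example short.

import numpy as np

# Closed-loop system  D^alpha x = -d*x + A*tanh(x) + u,  u = -K*x  (state feedback).
alpha, h, T = 0.9, 0.01, 10.0
d = np.array([1.0, 1.0])
A = np.array([[2.0, -1.2],
              [1.8, -1.5]])
K = 5.0 * np.eye(2)                           # hypothetical stabilizing gain

steps = int(T / h)
w = np.zeros(steps + 1); w[0] = 1.0
for j in range(1, steps + 1):                 # Grünwald-Letnikov binomial weights
    w[j] = (1.0 - (alpha + 1.0) / j) * w[j - 1]

x = np.zeros((steps + 1, 2)); x[0] = [0.8, -0.5]
for n in range(1, steps + 1):
    g = -d * x[n - 1] + A @ np.tanh(x[n - 1]) - K @ x[n - 1]   # closed-loop right-hand side
    memory = w[1:n + 1][::-1] @ x[:n]                          # history term of the GL sum
    x[n] = h**alpha * g - memory

print("||x(T)|| =", np.linalg.norm(x[-1]))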
Neurocomputing | 2016
Ailong Wu; Zhigang Zeng; Xingguo Song
In this paper, stabilization control of fractional-order bidirectional associative memory neural networks is formulated and studied. By estimating the Mittag-Leffler function and using some novel analysis techniques of fractional calculus, a generalized Gronwall-like inequality for the Caputo fractional derivative is established. Then, by applying a Lyapunov approach, a linear state feedback control law and a partial state feedback control law are presented to stabilize the fractional-order bidirectional associative memory neural networks. This analysis framework can be applied to the closed-loop control of fractional-order systems. A numerical example is given to show the effectiveness of the derived results via computer simulations.
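For context, inequalities of this kind descend from the classical fractional Gronwall inequality (a standard statement given only for orientation; the paper's generalized version may differ): if u is nonnegative, b \ge 0, and a(t) is nonnegative and nondecreasing, then

u(t) \;\le\; a(t) + b \int_{0}^{t} (t-s)^{\alpha-1} u(s)\, ds \quad \Longrightarrow \quad u(t) \;\le\; a(t)\, E_{\alpha}\bigl(b\, \Gamma(\alpha)\, t^{\alpha}\bigr),

where E_{\alpha} is the Mittag-Leffler function. Bounds of this type are what allow feedback control laws to be turned into Mittag-Leffler stability estimates for the closed-loop system.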