Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Wudai Liao is active.

Publication


Featured research published by Wudai Liao.


International Conference on Intelligent Computing | 2009

Synchronization of Neural Networks by Decentralized Linear-Feedback Control

Jinghuan Chen; Zhongsheng Wang; Yanjun Liang; Wudai Liao; Xiaoxin Liao

The problem of synchronization for a class of neural networks with time delays is discussed in this paper. Using the Lyapunov stability theorem, a novel delay-independent and decentralized linear-feedback control law is designed to achieve exponential synchronization. The controllers are easier to design than those obtained previously. Illustrative examples show the effectiveness of the presented synchronization scheme.
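
As a rough illustration of the scheme described above (not the paper's exact model or gains), the following sketch simulates a drive network and a response network with delayed connections, coupled through a decentralized diagonal feedback u = -K(y - x); all parameter values are assumptions for illustration.

```python
# Minimal sketch: drive-response synchronization of a delayed neural network
# under decentralized (diagonal) linear feedback u = -K (y - x).
# Model, parameters, and gains are illustrative assumptions, not the paper's.
import numpy as np

np.random.seed(0)
n, tau, dt, T = 3, 0.5, 0.001, 20.0
steps, d = int(T / dt), int(tau / dt)

C = np.diag([1.0, 1.2, 0.9])               # self-feedback (leakage) terms
W = np.array([[ 0.5, -0.8,  0.3],
              [ 0.2,  0.4, -0.6],
              [-0.3,  0.7,  0.5]])         # delayed connection weights
K = 5.0 * np.eye(n)                        # decentralized (diagonal) feedback gains
f = np.tanh                                # activation function

# history buffers holding the states delayed by tau
x_hist = np.tile(np.array([ 1.0, -0.5, 0.3]), (d + 1, 1))
y_hist = np.tile(np.array([-1.0,  0.8, 0.1]), (d + 1, 1))

x, y = x_hist[-1].copy(), y_hist[-1].copy()
for _ in range(steps):
    xd, yd = x_hist[0], y_hist[0]          # delayed states
    dx = -C @ x + W @ f(xd)                # drive network
    dy = -C @ y + W @ f(yd) - K @ (y - x)  # response network with feedback
    x, y = x + dt * dx, y + dt * dy
    x_hist = np.vstack([x_hist[1:], x])
    y_hist = np.vstack([y_hist[1:], y])

print("final synchronization error:", np.linalg.norm(y - x))
```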


International Symposium on Neural Networks | 2007

Stochastic Stabilization of Delayed Neural Networks

Wudai Liao; Jinghuan Chen; Yulin Xu; Xiaoxin Liao

By introducing appropriate stochastic factors into neural networks, previous results have shown that the networks can be stabilized. In this paper, stochastic stabilization of delayed neural networks is studied. First, a new Razumikhin-type theorem for stochastic functional differential equations is proposed, and a rigorous proof is given using the Itô formula, the Borel-Cantelli lemma, and related tools. As a corollary, a Razumikhin-type theorem for delayed stochastic differential equations is obtained. Next, taking these results as the theoretical basis, the stabilization of delayed deterministic neural networks is examined. The results show that the neural networks can be stabilized so long as the intensity of the random perturbation is large enough, and an explicit expression for the required noise intensity is presented, which is convenient for network design.
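
The following Euler-Maruyama sketch illustrates the qualitative claim only (it does not reproduce the paper's explicit intensity expression): a delayed network whose drift is destabilizing at the origin is simulated with component-wise multiplicative noise of assumed intensity sigma, and the state norm decays when sigma is large enough.

```python
# Illustrative sketch, not the paper's intensity formula:
# Euler-Maruyama simulation of a delayed network with multiplicative noise,
#   dx = (-A x + W tanh(x(t - tau))) dt + sigma * x dB(t)  (component-wise noise).
# Roughly, the noise contributes a decay rate of about sigma^2 / 2.
import numpy as np

np.random.seed(1)
n, tau, dt, T = 2, 0.3, 0.001, 30.0
steps, d = int(T / dt), int(tau / dt)

A = np.diag([0.5, 0.5])
W = np.array([[1.5, -0.4],
              [0.6,  1.2]])                # delayed weights, destabilizing at the origin
sigma = 3.0                                # noise intensity assumed "large enough"

hist = np.tile(np.array([1.0, -1.0]), (d + 1, 1))
x = hist[-1].copy()
for _ in range(steps):
    xd = hist[0]                           # state delayed by tau
    drift = -A @ x + W @ np.tanh(xd)
    dB = np.sqrt(dt) * np.random.randn(n)  # Brownian increments
    x = x + dt * drift + sigma * x * dB    # multiplicative (linear) noise
    hist = np.vstack([hist[1:], x])

print("|x(T)| with noise:", np.linalg.norm(x))
```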


International Symposium on Neural Networks | 2010

A lower order discrete-time recurrent neural network for solving high order quadratic problems with equality constraints

Wudai Liao; Jiangfeng Wang; Junyan Wang

A lower-order discrete-time recurrent neural network is presented in this paper for solving high-order quadratic programming problems. It is based on the orthogonal decomposition method and solves high-order quadratic programs, especially in the case where the number of decision variables is close to the number of constraints. The proposed recurrent neural network is globally exponentially stable and converges to the optimal solution of the quadratic program. The condition for the network to globally converge to the optimal solution is given. An illustrative example and simulation results are presented to demonstrate its performance.
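
A minimal sketch of the general idea under assumed data (not the paper's exact network): the equality constraints are eliminated with an orthogonal null-space decomposition, and a simple discrete-time recurrent iteration is run on the lower-order reduced problem.

```python
# Minimal sketch (assumed formulation, not the paper's exact network):
# solve  min 1/2 x'Qx + c'x  s.t.  Ax = b
# by an orthogonal (null-space) decomposition x = x_p + Z y, where the columns
# of Z span null(A), and a lower-order discrete-time recurrent iteration on y.
import numpy as np

np.random.seed(2)
n, m = 6, 4                                  # number of constraints close to n
A = np.random.randn(m, n)
b = np.random.randn(m)
M = np.random.randn(n, n)
Q = M @ M.T + np.eye(n)                      # positive definite objective matrix
c = np.random.randn(n)

# orthogonal decomposition of A^T: the trailing columns span null(A)
Qfull, _ = np.linalg.qr(A.T, mode="complete")
Z = Qfull[:, m:]                             # n x (n - m), lower-order coordinates
x_p = np.linalg.lstsq(A, b, rcond=None)[0]   # particular solution of Ax = b

# reduced (lower-order) problem: min_y 1/2 y'(Z'QZ)y + (Z'(Q x_p + c))'y
Qr = Z.T @ Q @ Z
cr = Z.T @ (Q @ x_p + c)

# discrete-time recurrent iteration: y(k+1) = y(k) - h (Qr y(k) + cr)
h = 1.0 / np.linalg.eigvalsh(Qr).max()       # step size ensuring convergence
y = np.zeros(n - m)
for _ in range(2000):
    y = y - h * (Qr @ y + cr)

x = x_p + Z @ y
print("constraint residual:", np.linalg.norm(A @ x - b))
print("reduced-gradient norm:", np.linalg.norm(Z.T @ (Q @ x + c)))
```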


International Symposium on Neural Networks | 2006

Exponential stability of delayed stochastic cellular neural networks

Wudai Liao; Yulin Xu; Xiaoxin Liao

In view of the saturation linearity of the output functions of neurons in cellular neural networks, the method of decomposing the state space into sub-regions is adopted to study almost sure exponential stability of delayed cellular neural networks in a noisy environment. When the perturbation terms in the network model satisfy a Lipschitz condition, some algebraic criteria are obtained. The results show that if an equilibrium of the neural network is an interior point of a sub-region, and an appropriate matrix related to this equilibrium has a sufficient stability degree to counteract the perturbation, then the equilibrium of the delayed cellular neural network retains exponential stability. All results in the paper require only computing the eigenvalues of matrices.
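
The kind of eigenvalue test this reduces to can be sketched as follows; the affine sub-region form, the parameter values, and the comparison of the stability degree against the noise Lipschitz bound are assumptions for illustration, not the paper's exact criterion.

```python
# Sketch of the flavour of the algebraic test (assumed form, not the paper's
# exact criterion): inside a saturation sub-region the dynamics are affine,
#   x' = -C x + W x + J + (perturbation with Lipschitz bound L),
# and stability is checked through eigenvalues of a matrix built from the
# parameters, here the symmetric part of -C + W versus the perturbation bound.
import numpy as np

C = np.diag([2.0, 2.5, 2.2])                 # self-feedback coefficients
W = np.array([[ 0.3, -0.4,  0.2],
              [ 0.1,  0.2, -0.3],
              [-0.2,  0.3,  0.1]])           # connection weights in the sub-region
L = 0.2                                      # Lipschitz bound of the noise term

M = -C + W
sym = 0.5 * (M + M.T)
margin = -np.linalg.eigvalsh(sym).max()      # "stability degree" of the equilibrium

print("stability margin:", margin)
print("exponentially stable despite the perturbation:", margin > L)
```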


International Symposium on Neural Networks | 2006

Almost sure exponential stability on interval stochastic neural networks with time-varying delays

Wudai Liao; Zhongsheng Wang; Xiaoxin Liao

Because artificial neural networks are realized in VLSI and the circuit elements must be measured, noise from the circuits and errors in the network parameters are unavoidable. Making use of the stochastic version of the Razumikhin theorem for stochastic functional differential equations, Lyapunov direct methods, and matrix analysis, almost sure exponential stability of interval neural networks with time-varying delays perturbed by white noise is examined, and some sufficient algebraic criteria that depend only on the system parameters are given. For well-designed deterministic neural networks, the results also indicate how much tolerance against perturbation they have.


International Conference on Swarm Intelligence | 2010

A discrete-time recurrent neural network for solving systems of complex-valued linear equations

Wudai Liao; Jiangfeng Wang; Junyan Wang

A discrete-time recurrent neural network is presented in this paper for solving systems of complex-valued linear equations. The network is simple in structure and converges to the solutions of the complex-valued linear equations. The condition for the network to globally converge to the solution is given. An illustrative example is presented to demonstrate its performance.
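
One simple discrete-time recurrent iteration of this kind (assumed here for illustration, not necessarily the paper's exact network) is z(k+1) = z(k) - h A^H(A z(k) - b), which converges for a sufficiently small step size h, as sketched below.

```python
# Minimal sketch (assumed iteration, not necessarily the paper's exact network):
# a discrete-time recurrent scheme for the complex-valued linear system A z = b,
#   z(k+1) = z(k) - h * A^H (A z(k) - b),
# which converges whenever 0 < h < 2 / lambda_max(A^H A).
import numpy as np

np.random.seed(3)
n = 4
A = np.random.randn(n, n) + 1j * np.random.randn(n, n)
b = np.random.randn(n) + 1j * np.random.randn(n)

AhA = A.conj().T @ A
h = 1.0 / np.linalg.eigvalsh(AhA).max()      # safe step size
z = np.zeros(n, dtype=complex)
for _ in range(20000):
    z = z - h * (A.conj().T @ (A @ z - b))   # recurrent update

print("residual |Az - b|:", np.linalg.norm(A @ z - b))
```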


International Symposium on Neural Networks | 2009

Stability of Stochastic Recurrent Neural Networks with Positive Linear Activation Functions

Wudai Liao; Xuezhao Yang; Zhongsheng Wang

In view of the positive linearity of the activation functions of neurons in recurrent neural networks, the method of decomposing the state space into sub-regions is adopted to study almost sure exponential stability of delayed networks in a noisy environment. When the perturbation terms in the network model satisfy a Lipschitz condition, some algebraic criteria are obtained. The results show that if an equilibrium of the neural network is an interior point of a sub-region, and an appropriate matrix related to this equilibrium has a sufficient stability degree to counteract the perturbation, then the equilibrium of the delayed network retains exponential stability. All results require only computing the eigenvalues of matrices and include the deterministic neural network as a special case.


International Symposium on Neural Networks | 2008

Stability of Neural Networks with Parameters Disturbed by White Noises

Wuyi Zhang; Wudai Liao

Almost sure exponential stability (ASES) of neural networks whose parameters are disturbed by noise is studied. The premise is that the parameters of neural networks implemented by very large scale integration (VLSI) approaches are well described by white-noise stochastic processes, and an appropriate way to impose random factors on deterministic neural networks is proposed. Using the theory of stochastic dynamical systems and matrix theory, stability criteria are obtained that ensure the neural networks are ASES, and the convergence rate is estimated. The capacity of well-designed neural networks to endure random factors is also estimated. The results require only computing the eigenvalues, or verifying the negative definiteness, of some matrices constructed from the network parameters. An illustrative example shows the effectiveness of the results.
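
A plausible shape of such a negative-definiteness test, with assumed parameter values and an assumed mean-square-type criterion rather than the paper's exact matrices, is sketched below: the Itô correction contributed by the noise intensities must be dominated by the stability margin of the nominal parameters.

```python
# Plausible shape of the test (assumed, mean-square-type, not the paper's exact
# matrices): for dx = (-C x + W g(x)) dt + S x dB with g Lipschitz (constant L),
# a sufficient condition is negative definiteness of
#   Sym(-C) + L * ||W||_2 * I + 0.5 * S^T S,
# i.e. the Ito correction from the noise stays below the stability margin.
import numpy as np

C = np.diag([3.0, 2.8])                      # nominal self-feedback coefficients
W = np.array([[0.5, -0.6],
              [0.4,  0.3]])                  # nominal connection weights
L = 1.0                                      # Lipschitz bound of the activations
S = 0.8 * np.eye(2)                          # white-noise intensities on the state

test = -0.5 * (C + C.T) + L * np.linalg.norm(W, 2) * np.eye(2) + 0.5 * S.T @ S
eigs = np.linalg.eigvalsh(test)
print("eigenvalues of the test matrix:", eigs)
print("criterion satisfied (negative definite):", eigs.max() < 0)
```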


International Symposium on Neural Networks | 2007

Some New Stability Conditions of Delayed Neural Networks with Saturation Activation Functions

Wudai Liao; Dongyun Wang; Jianguo Xu; Xiaoxin Liao

Local and global asymptotic stability of equilibria of delayed neural networks with saturation activation functions is studied by Razumikhin-type theorems, the main approach to the stability of functional differential equations, and some new stability conditions constructed from the network parameters are obtained. For the local stability conditions, the attraction regions of the equilibria are also estimated. All results require only computing the eigenvalues of some matrices or verifying some inequalities.


International Conference on Neural Information Processing | 2006

Delay-dependent and delay-independent stability conditions of delayed cellular neural networks

Wudai Liao; Dongyun Wang; Yulin Xu; Xiaoxin Liao

By using the saturation linearity of the output functions of neurons in cellular neural networks, and by decomposing the state space into sub-regions, the equations of delayed cellular neural networks are rewritten as linear differential-difference equations in the neighbourhood of each equilibrium that is an interior point of some sub-region. Based on this linear form, and using the stability theory of linear differential-difference equations and the tool of M-matrices, delay-dependent and delay-independent algebraic stability criteria are obtained. All results require only computing the eigenvalues of some matrices, checking that certain matrices are M-matrices, or verifying some inequalities.
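
A sketch of the M-matrix flavour of such a criterion, with an assumed linearized sub-region form and assumed parameter values (not the paper's exact matrices): a standard delay-independent sufficient condition is that C - |A| - |B| is a nonsingular M-matrix.

```python
# Sketch (assumed linearized form, not the paper's exact criterion):
# inside a saturation sub-region the delayed CNN is affine,
#   x'(t) = -C x(t) + A x(t) + B x(t - tau) + J,
# and a standard delay-independent sufficient condition is that
#   M = C - |A| - |B|
# is a nonsingular M-matrix (checked here via leading principal minors).
import numpy as np

C = np.diag([3.0, 3.5, 3.2])                 # self-feedback coefficients
A = np.array([[ 0.4, -0.3,  0.2],
              [ 0.2,  0.5, -0.1],
              [-0.3,  0.2,  0.4]])           # instantaneous feedback template
B = np.array([[ 0.3,  0.1, -0.2],
              [-0.1,  0.2,  0.3],
              [ 0.2, -0.3,  0.1]])           # delayed feedback template

M = C - np.abs(A) - np.abs(B)

# M-matrix test: nonpositive off-diagonal entries and positive leading minors
offdiag_ok = np.all((M - np.diag(np.diag(M))) <= 0)
minors_ok = all(np.linalg.det(M[:k, :k]) > 0 for k in range(1, M.shape[0] + 1))
print("delay-independent stability criterion holds:", offdiag_ok and minors_ok)
```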

Collaboration


Dive into Wudai Liao's collaborations.

Top Co-Authors

Xiaoxin Liao (Huazhong University of Science and Technology)
Yulin Xu (Zhongyuan University of Technology)
Zhongsheng Wang (Zhongyuan University of Technology)
Jiangfeng Wang (Zhongyuan University of Technology)
Jinghuan Chen (Zhongyuan University of Technology)
Junyan Wang (Zhongyuan University of Technology)
Dongyun Wang (Zhongyuan University of Technology)
Jianguo Xu (Zhongyuan University of Technology)
Wuyi Zhang (Zhongyuan University of Technology)
Xuezhao Yang (Zhongyuan University of Technology)