Publication


Featured research published by Zengfu Wang.


Neural Processing Letters | 2005

Global Stability of a General Class of Discrete-Time Recurrent Neural Networks

Zhigang Zeng; De-Shuang Huang; Zengfu Wang

A general class of discrete-time recurrent neural networks (DTRNNs) is formulated and studied in this paper. Several sufficient conditions are obtained to ensure the global stability of DTRNNs with delays, based on an induction principle rather than the well-known Lyapunov methods. The results assume neither symmetry of the connection matrix nor boundedness, monotonicity, or differentiability of the activation functions. In addition, discrete-time analogues of a general class of continuous-time recurrent neural networks (CTRNNs) are derived and studied. The convergence characteristics of CTRNNs are preserved by the discrete-time analogues without any restriction on the uniform discretization step size. Finally, simulation results demonstrate the validity and feasibility of the proposed approach.
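For illustration, here is a minimal simulation of one common DTRNN form with a discrete delay. The weight matrices, tanh activation, and small-gain setting are illustrative assumptions, not the paper's exact model or conditions; under them, trajectories from opposite initial histories converge to the same point:

```python
import numpy as np

# Minimal simulation of a delayed discrete-time recurrent network
#   x(k+1) = A x(k) + W0 f(x(k)) + W1 f(x(k - d)) + u,   f = tanh.
# All matrices are illustrative; their infinity-norm row sums satisfy
# ||A|| + ||W0|| + ||W1|| < 1, a small-gain setting in which trajectories
# from any initial history converge to a single global attractor.
n, d, steps = 4, 3, 300
A  = 0.3 * np.eye(n)
W0 = 0.1 * np.array([[1., -1., 0., 1.], [0., 1., 1., -1.],
                     [1., 0., -1., 1.], [-1., 1., 0., 1.]])
W1 = 0.5 * W0
u  = np.array([0.5, -0.2, 0.1, 0.3])

def run(x0):
    hist = [x0.copy() for _ in range(d + 1)]      # constant initial history
    for _ in range(steps):
        x, xd = hist[-1], hist[-1 - d]
        hist.append(A @ x + W0 @ np.tanh(x) + W1 @ np.tanh(xd) + u)
    return hist[-1]

gap = np.linalg.norm(run(np.full(n, 5.0)) - run(np.full(n, -5.0)))
print("gap between trajectories from opposite starts:", gap)   # ~ 0
```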


International Journal of Neural Systems | 2004

Attractability and Location of Equilibrium Point of Cellular Neural Networks with Time-Varying Delays

Zhigang Zeng; De-Shuang Huang; Zengfu Wang

This paper presents new theoretical results on the global exponential stability of cellular neural networks with time-varying delays. The stability conditions depend on the external inputs, connection weights, and delays of the network. Using these results, global exponential stability can be established and an estimate of the location of the equilibrium point can be obtained. Finally, simulation results demonstrate the validity and feasibility of the proposed approach.
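A minimal numerical sketch of this kind of check, assuming a classical delay-independent row-sum contraction condition from the literature (not necessarily this paper's exact criterion) and hypothetical matrices:

```python
import numpy as np

# Hypothetical numbers for a delayed CNN  dx/dt = -x + A f(x) + A1 f(x(t - tau)) + I
# with the standard saturation f(s) = 0.5(|s + 1| - |s - 1|), so |f| <= 1 and
# f is 1-Lipschitz. A classical delay-independent sufficient condition for a
# unique, globally exponentially stable equilibrium is a row-sum contraction:
#   max_i sum_j (|a_ij| + |a1_ij|) < 1.
A  = np.array([[0.2, -0.1], [0.1, 0.3]])
A1 = np.array([[0.1,  0.0], [-0.1, 0.1]])
I  = np.array([0.5, -0.2])

gain = np.max(np.abs(A).sum(axis=1) + np.abs(A1).sum(axis=1))
print("contraction gain:", gain, "->", "stable" if gain < 1 else "inconclusive")

# Since |f| <= 1, any equilibrium x* = A f(x*) + A1 f(x*) + I satisfies
#   |x*_i| <= sum_j (|a_ij| + |a1_ij|) + |I_i|,
# a crude componentwise estimate of where the equilibrium is located.
print("equilibrium bound:", np.abs(A).sum(axis=1) + np.abs(A1).sum(axis=1) + np.abs(I))
```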


International Symposium on Neural Networks | 2004

Stability Analysis of Discrete-Time Cellular Neural Networks

Zhigang Zeng; De-Shuang Huang; Zengfu Wang

Discrete-time cellular neural networks (DTCNNs) are formulated and studied in this paper. Several sufficient conditions are obtained to ensure the global stability of DTCNNs with delays, based on comparison methods rather than the well-known Lyapunov methods. Finally, simulation results demonstrate the validity and feasibility of the proposed approach.


World Congress on Intelligent Control and Automation | 2004

Practical stability criteria for cellular neural networks described by a template

Zhigang Zeng; De-Shuang Huang; Zengfu Wang

In this paper, we show that N × M-dimensional cellular neural networks described by a template can have 2^(N×M) locally exponentially stable equilibrium points located in saturation regions. In particular, we also derive several conditions under which an equilibrium point is locally exponentially stable when it lies in a designated saturation region. Moreover, these conditions can be checked by direct examination of the template, regardless of the number of cells, and they improve and extend existing stability results in the literature.
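A sketch of such a template-only check, assuming one commonly cited dominance condition (the center feedback entry exceeding 1 plus the sum of the magnitudes of the other entries); the template values are hypothetical and the condition may differ from this paper's exact criteria:

```python
import numpy as np

# Template-based multistability check for an N-by-M cellular neural network.
# Assumed condition (illustrative, from the multistability literature): if
#   a(0,0) - 1 > sum of |a(k,l)| over the other template entries,
# then every full-saturation region contains a locally exponentially stable
# equilibrium, giving 2**(N*M) stable equilibria in total.
A_template = np.array([[0.1, 0.1, 0.1],
                       [0.1, 2.0, 0.1],
                       [0.1, 0.1, 0.1]])   # hypothetical 3x3 feedback template

center = A_template[1, 1]
others = np.abs(A_template).sum() - abs(center)
if center - 1 > others:
    N, M = 4, 4                             # the check itself is grid-size free
    print(f"condition holds: 2**(N*M) = {2**(N*M)} stable saturated equilibria")
else:
    print("condition fails: the template check is inconclusive")
```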


International Symposium on Neural Networks | 2004

Pattern Recognition Based on Stability of Discrete Time Cellular Neural Networks

Zhigang Zeng; De-Shuang Huang; Zengfu Wang

In this paper, some sufficient conditions are obtained to guarantee that discrete-time cellular neural networks (DTCNNs) can have stable memory patterns. These conditions can be derived directly from the structure of the network. Moreover, a method for estimating the attracting domain of such stable memory patterns is also described. In addition, a new design algorithm for DTCNNs is developed based on stability theory rather than the well-known perceptron training algorithm, and the convergence of the design algorithm is guaranteed by stability theorems. Finally, simulation results demonstrate the validity and feasibility of the proposed approach.
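As a hedged illustration of structurally checkable stable memories, the sketch below stores bipolar patterns with a simple Hebbian-plus-self-feedback design and verifies a saturation-margin condition; this design is an assumption for illustration, not the paper's algorithm:

```python
import numpy as np

# Storing bipolar patterns as stable states of a DTCNN
#   x(k+1) = W f(x(k)) + u,  with saturation f(s) = clip(s, -1, 1).
# A saturated pattern p in {-1,1}^n is a stable memory if every cell is
# driven strictly past saturation:  p_i * ((W p)_i + u_i) > 1 for all i.
# The Hebbian-plus-self-feedback design is illustrative only.
f = lambda s: np.clip(s, -1.0, 1.0)

patterns = np.array([[ 1., -1.,  1., -1.],
                     [ 1.,  1., -1., -1.]])
n = patterns.shape[1]
W = patterns.T @ patterns / n + 1.5 * np.eye(n)   # outer products + self-feedback
u = np.zeros(n)

for p in patterns:
    margin = (p * (W @ p + u)).min()              # saturation margin of the pattern
    x = p + 0.3 * np.sign(p)                      # start inside the attracting region
    for _ in range(20):
        x = W @ f(x) + u                          # DTCNN iteration
    print("stored" if margin > 1 else "unstable", "| recovered:", np.sign(f(x)))
```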


Intelligent Data Engineering and Automated Learning | 2004

Global Convergence of Steepest Descent for Quadratic Functions

Zhigang Zeng; De-Shuang Huang; Zengfu Wang

This paper analyzes the effect of momentum on steepest descent training for quadratic performance functions. Some global convergence conditions for the steepest descent algorithm are obtained by directly analyzing the exact momentum equations for quadratic cost functions. These conditions can be derived directly from the parameters of the Hessian matrix, in contrast to the eigenvalue-based conditions used in existing results.
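For contrast, the classical eigenvalue-based stability range for steepest descent with momentum (heavy ball) on a quadratic, 0 < alpha < 2(1 + beta)/lambda_max(H) with 0 <= beta < 1, is easy to verify numerically; the Hessian below is hypothetical:

```python
import numpy as np

# Heavy-ball (momentum) steepest descent on a quadratic
#   f(x) = 0.5 x^T H x - b^T x,   grad f(x) = H x - b.
# Classical eigenvalue-based convergence condition (the baseline the
# paper's parameter-based conditions are contrasted with):
#   0 <= beta < 1  and  0 < alpha < 2(1 + beta) / lambda_max(H).
H = np.array([[3.0, 1.0], [1.0, 2.0]])   # hypothetical positive definite Hessian
b = np.array([1.0, 0.0])
alpha, beta = 0.3, 0.5
lam_max = np.linalg.eigvalsh(H).max()
assert 0 < alpha < 2 * (1 + beta) / lam_max, "step size outside stable range"

x = np.zeros(2)
v = np.zeros(2)                           # momentum buffer v_k = x_k - x_{k-1}
for _ in range(200):
    v = beta * v - alpha * (H @ x - b)
    x = x + v
print("iterate:", x, "| exact minimizer:", np.linalg.solve(H, b))
```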


International Conference on Control, Automation, Robotics and Vision | 2004

Global exponential stability of delayed Cohen-Grossberg neural networks

Zhigang Zeng; Zengfu Wang; De-Shuang Huang

In this paper, using a fixed-point theorem and proof by contradiction, the authors obtain sufficient conditions guaranteeing that Cohen-Grossberg neural networks with discrete and distributed delays are globally exponentially stable. Since the model is more general and the assumptions relax those made in some existing works, the results presented here improve and extend the existing ones. Finally, the validity and performance of the results are illustrated by two simulation examples.
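A minimal Euler simulation of a delayed Cohen-Grossberg network under illustrative choices of the amplification functions a_i, the functions b_i, and the weights (assumptions for this sketch, not the paper's conditions), showing trajectories from different histories collapsing together:

```python
import numpy as np

# Euler simulation of a delayed Cohen-Grossberg network
#   dx_i/dt = -a_i(x_i) [ b_i(x_i) - sum_j w_ij f(x_j(t - tau)) - u_i ]
# with illustrative choices: a_i(s) = 1 + 0.5/(1 + s^2) (positive, bounded),
# b_i(s) = s, f = tanh, and row sums of |W| below 1, a regime in which the
# equilibrium is globally exponentially stable.
n, tau, dt, T = 3, 0.5, 0.01, 20.0
W = 0.2 * np.array([[1., -1., 0.], [0., 1., 1.], [1., 0., -1.]])
u = np.array([0.4, -0.1, 0.2])
d = int(tau / dt)                                 # delay in steps

def simulate(x0):
    hist = [x0.copy() for _ in range(d + 1)]      # constant initial history
    for _ in range(int(T / dt)):
        x, xd = hist[-1], hist[-1 - d]
        a = 1.0 + 0.5 / (1.0 + x**2)              # amplification a_i(x_i) > 0
        hist.append(x + dt * (-a * (x - W @ np.tanh(xd) - u)))
    return hist[-1]

gap = np.linalg.norm(simulate(np.full(n, 4.0)) - simulate(np.full(n, -4.0)))
print("gap between trajectories:", gap)           # ~ 0: global attractor
```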


Physics Letters A | 2005

Memory pattern analysis of cellular neural networks

Zhigang Zeng; De-Shuang Huang; Zengfu Wang


Applied Mathematical Modelling | 2008

Pattern memory analysis based on stability theory of cellular neural networks

Zhigang Zeng; De-Shuang Huang; Zengfu Wang


Lecture Notes in Computer Science | 2005

Globally attractive periodic state of discrete-time cellular neural networks with time-varying delays

Zhigang Zeng; Boshan Chen; Zengfu Wang

Collaboration


Dive into Zengfu Wang's collaborations.

Top Co-Authors

Zhigang Zeng (Huazhong University of Science and Technology)

De-Shuang Huang (Chinese Academy of Sciences)

Zejun Ding (University of Science and Technology of China)