Tianping Chen
Fudan University
Publication
Featured researches published by Tianping Chen.
IEEE Transactions on Circuits and Systems | 2007
Tianping Chen; Xiwei Liu; Wenlian Lu
In this paper, without assuming symmetry, irreducibility, or linearity of the couplings, we prove that a single controller can pin a coupled complex network to a homogeneous solution. Sufficient conditions are presented to guarantee the convergence of the pinning process locally and globally. An effective approach to adapt the coupling strength is proposed. Several numerical simulations are given to verify our theoretical analysis.
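The pinning idea can be previewed with a minimal numerical sketch (the network, node dynamics, and gains below are hypothetical, not the paper's construction): five identical unstable linear nodes, diffusively coupled over a complete graph, are driven to the homogeneous solution x = 0 by a single feedback controller attached to one node.

```python
import numpy as np

# Hypothetical example: N unstable nodes x_i' = x_i, diffusively coupled
# over a complete graph, with one controller pinning node 0 to s = 0.
N, c, eps = 5, 2.0, 30.0           # network size, coupling strength, pinning gain
J = np.ones((N, N))                # all-ones matrix of the complete graph
lap = N * np.eye(N) - J            # graph Laplacian of K_N
M = np.eye(N) - c * lap            # closed-loop matrix without the controller
M[0, 0] -= eps                     # pin a single node (node 0)

# The pinned network is stable iff every eigenvalue of M is negative.
assert np.max(np.linalg.eigvalsh(M)) < 0

# Euler simulation: all nodes converge to the homogeneous solution x = 0.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, N)
dt = 0.01
for _ in range(2000):              # integrate to t = 20
    x = x + dt * (M @ x)
print(np.linalg.norm(x))           # deviation from the homogeneous solution
```

Note that without the pinning term the matrix I - c*lap has the unstable eigenvalue 1 (along the all-ones direction), so the single controller is what makes the whole network converge.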
IEEE Transactions on Circuits and Systems | 2004
Wenlian Lu; Tianping Chen
We investigate synchronization of an array of linearly coupled identical connected neural networks with delays. A variational method is used to investigate local synchronization, and global exponential stability is also studied. We do not assume that the coupling matrix A is symmetric or irreducible. The linear matrix inequality approach is used to establish synchronization with the global convergence property.
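The LMI machinery behind such criteria can be illustrated with a toy feasibility check (a generic Lyapunov inequality with a made-up matrix, not this paper's synchronization criterion): a matrix A is Hurwitz if and only if A^T P + P A = -I has a symmetric positive-definite solution P.

```python
import numpy as np

A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])          # example system matrix (hypothetical)
n = A.shape[0]

# Solve the Lyapunov equation A^T P + P A = -I by vectorization:
# with row-major vec, vec(A^T P) = kron(A^T, I) vec(P) and
# vec(P A) = kron(I, A^T) vec(P).
K = np.kron(A.T, np.eye(n)) + np.kron(np.eye(n), A.T)
P = np.linalg.solve(K, -np.eye(n).flatten()).reshape(n, n)

residual = A.T @ P + P @ A + np.eye(n)
print(np.linalg.eigvalsh(P))         # all positive => A is Hurwitz
```

Dedicated solvers (e.g. semidefinite programming packages) are used for the full matrix-inequality conditions in papers like this; the vectorized solve above is just the smallest self-contained instance of the idea.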
Neural Networks | 1997
Shun-ichi Amari; Tianping Chen; Andrzej Cichocki
Recently a number of adaptive learning algorithms have been proposed for blind source separation. Although the underlying principles and approaches are different, most of them have very similar forms. Two important issues remained to be elucidated further: the statistical efficiency and the stability of learning algorithms. The present letter analyzes a general form of statistically efficient algorithms and gives a necessary and sufficient condition for the separating solution to be a stable equilibrium of a general learning algorithm. Moreover, when the separating solution is unstable, a simple method is given for stabilizing the separating solution by modifying the algorithm.
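As a hedged illustration of the kind of learning rule being analyzed (a standard natural-gradient update with a cubic nonlinearity; the sources, mixing matrix, and step size are my choices, not the letter's): two sub-Gaussian sources are mixed, and the separating matrix W is adapted by the rule dW = eta * (I - phi(y) y^T) W.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 5000
# Two independent, unit-variance, sub-Gaussian (uniform) sources.
s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), (2, T))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])            # unknown mixing matrix (hypothetical)
x = A @ s                             # observed mixtures

W = np.eye(2)                         # separating matrix, adapted iteratively
eta = 0.05
for _ in range(500):
    y = W @ x
    G = (y**3) @ y.T / T              # sample estimate of E[phi(y) y^T], phi(y) = y^3
    W = W + eta * (np.eye(2) - G) @ W # natural-gradient learning rule

P = W @ A                             # ideally a scaled permutation matrix
dominance = np.abs(P).max(axis=1) / np.abs(P).sum(axis=1)
print(dominance)                      # each row dominated by one entry
```

The cubic nonlinearity is the textbook choice for sub-Gaussian sources; the letter's stability condition explains when a separating solution like this is a stable equilibrium of the learning dynamics.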
IEEE Transactions on Circuits and Systems | 2009
Wei Wu; Wenjuan Zhou; Tianping Chen
In this paper, we focus on the problem of driving a general network to a selected cluster synchronization pattern by means of a pinning control strategy. Sufficient conditions are presented to guarantee the realization of the cluster synchronization pattern for all initial values. We also show in detail how to construct the coupling matrix and modify the control strengths. Moreover, a method for adapting the coupling strength is provided to refine the result.
international symposium on neural networks | 1993
Tianping Chen; Hong Chen
The purpose of this paper is to explore the representation capability of radial basis function (RBF) neural networks. The main results are: 1) the necessary and sufficient condition for a function of one variable to qualify as an activation function in an RBF network is that the function is not an even polynomial, and 2) the capability of RBF networks to approximate nonlinear functionals and operators is revealed, using sample data in either the frequency domain or the time domain, which can be used in system identification by neural networks.
Neural Networks | 2005
Wenlian Lu; Tianping Chen
In this paper, we discuss the dynamics of Cohen-Grossberg neural networks with discontinuous activation functions. We provide a relaxed set of sufficient conditions, based on the concept of Lyapunov diagonal stability (LDS), for Cohen-Grossberg networks to be absolutely stable. Moreover, under certain conditions we prove that the system is globally exponentially stable or globally convergent in finite time. The convergence rate for global exponential convergence and the convergence time for global convergence in finite time are also provided.
IEEE Transactions on Neural Networks | 2001
Tianping Chen; Shun-ichi Amari
In this paper, we discuss the dynamical behaviors of recurrently asymmetrically connected neural networks in detail. We propose an effective approach to study the global and local stability of these networks. Many well-known existing results are unified in our framework, which yields much better test conditions for global and local stability. Sufficient conditions for the uniqueness of the equilibrium point and its stability are also given.
IEEE Transactions on Neural Networks | 2008
Wei Wu; Tianping Chen
In this paper, global synchronization of linearly coupled neural network (NN) systems with time-varying coupling is investigated. The dynamical behavior of the uncoupled system at each node is general and can be chaotic or otherwise; the coupling configuration is time varying, i.e., the coupling matrix is not a constant matrix. Based on the Lyapunov function method and a specific property of the Householder transform, some criteria for global synchronization are obtained. With these criteria, one can verify whether a coupled system with time-varying coupling is globally synchronized, which is important and useful both for understanding and interpreting synchronization phenomena and for designing coupling configurations. Finally, two simulations are given to demonstrate the effectiveness of the theoretical results.
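The synchronization phenomenon itself is easy to reproduce numerically. The sketch below is a much simpler setting than the time-varying coupling analyzed here (two nodes, constant full-state diffusive coupling, with a coupling strength chosen by hand): two identical chaotic Lorenz systems, started apart, converge to a common trajectory.

```python
import numpy as np

def lorenz(v):
    # Classical Lorenz system with the standard parameters.
    x, y, z = v
    return np.array([10.0 * (y - x),
                     x * (28.0 - z) - y,
                     x * y - 8.0 / 3.0 * z])

k = 50.0                          # coupling strength (assumed large enough)
dt = 0.002
v1 = np.array([1.0, 1.0, 1.0])    # two different initial states
v2 = np.array([-2.0, 0.5, 3.0])
for _ in range(5000):             # Euler integration to t = 10
    d1 = lorenz(v1) + k * (v2 - v1)
    d2 = lorenz(v2) + k * (v1 - v2)
    v1, v2 = v1 + dt * d1, v2 + dt * d2
print(np.linalg.norm(v1 - v2))    # synchronization error
```

With diffusive coupling, the error e = v1 - v2 obeys e' = f(v1) - f(v2) - 2k e, so a sufficiently large k dominates the expansion of the chaotic dynamics and drives the error to zero; criteria such as those in this paper make that intuition precise for whole networks with time-varying coupling.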
Neural Computation | 2006
Wenlian Lu; Tianping Chen
In this letter, without assuming the boundedness of the activation functions, we discuss the dynamics of a class of delayed neural networks with discontinuous activation functions. A relaxed set of sufficient conditions is derived, guaranteeing the existence, uniqueness, and global stability of the equilibrium point. Convergence behaviors for both state and output are discussed. The constraints imposed on the feedback matrix are independent of the delay parameter and can be validated by the linear matrix inequality technique. We also prove that the solution of delayed neural networks with discontinuous activation functions can be regarded as a limit of the solutions of delayed neural networks with high-slope continuous activation functions.
Neural Computation | 2000
Shun-ichi Amari; Tianping Chen; Andrzej Cichocki
Independent component analysis or blind source separation extracts independent signals from their linear mixtures without assuming prior knowledge of their mixing coefficients. It is known that the independent signals in the observed mixtures can be successfully extracted except for their order and scales. In order to resolve the indeterminacy of scales, most learning algorithms impose some constraints on the magnitudes of the recovered signals. However, when the source signals are nonstationary and their average magnitudes change rapidly, the constraints force a rapid change in the magnitude of the separating matrix. This is the case with most applications (e.g., speech sounds, electroencephalogram signals). It is known that this causes numerical instability in some cases. In order to resolve this difficulty, this article introduces new nonholonomic constraints in the learning algorithm. This is motivated by the geometrical consideration that the directions of change in the separating matrix should be orthogonal to the equivalence class of separating matrices due to the scaling indeterminacy. These constraints are proved to be nonholonomic, so that the proposed algorithm is able to adapt to rapid or intermittent changes in the magnitudes of the source signals. The proposed algorithm works well even when the number of sources is overestimated, whereas existing algorithms do not (assuming the sensor noise is negligibly small), because they amplify the null components not included in the sources. Computer simulations confirm this desirable property.
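A hedged sketch of the contrast with scale-constrained rules (the cubic nonlinearity, uniform sources, and parameters are my choices, not the article's): the update below keeps only the off-diagonal part of phi(y) y^T, so the diagonal of the update vanishes and the output magnitudes are never forced toward a fixed scale.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 5000
# Two independent, unit-variance, sub-Gaussian (uniform) sources.
s = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), (2, T))
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])             # unknown mixing matrix (hypothetical)
x = A @ s

W = np.eye(2)
eta = 0.05
for _ in range(500):
    y = W @ x
    G = (y**3) @ y.T / T               # sample estimate of E[phi(y) y^T]
    # Scale-free update: the diagonal of (Lambda - G) is zero, so the
    # output scales drift freely instead of being pinned to a target.
    W = W + eta * (np.diag(np.diag(G)) - G) @ W

P = W @ A                               # approaches a scaled permutation
dominance = np.abs(P).max(axis=1) / np.abs(P).sum(axis=1)
print(dominance)
```

Compared with the rule dW = eta * (I - G) W, only the off-diagonal entries of G drive learning here, which is the mechanism that lets the algorithm tolerate rapid changes in source magnitudes.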