
Publication


Featured research published by Sitian Qin.


Information Sciences | 2013

Global exponential stability of almost periodic solution of delayed neural networks with discontinuous activations

Sitian Qin; Xiaoping Xue; Peng Wang

In this paper, we study the existence, uniqueness, and stability of almost periodic solutions for a class of delayed neural networks. The neural network considered here employs activation functions that are discontinuous, monotone increasing, and (possibly) unbounded. Under a new sufficient condition, we prove that the neural network has a unique almost periodic solution, which is globally exponentially stable. Moreover, the obtained conclusion is applied to prove the existence and stability of periodic solutions (or equilibrium points) for delayed neural networks with periodic coefficients (or constant coefficients). We also give some illustrative numerical examples to show the effectiveness of our results.
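The kind of model studied here can be illustrated with a minimal simulation. The sketch below integrates a hypothetical single delayed neuron dx/dt = -x(t) + w f(x(t - tau)) + I with the discontinuous activation f(u) = sign(u) by explicit Euler steps; all parameter values are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical single delayed neuron with a discontinuous activation:
#   dx/dt = -x(t) + w * sign(x(t - tau)) + I
# integrated by explicit Euler steps with a constant initial history.
def simulate(w=0.5, I=1.0, tau=0.1, dt=0.01, steps=2000, x0=0.2):
    delay = int(tau / dt)            # delay measured in time steps
    hist = [x0] * (delay + 1)        # constant history on [-tau, 0]
    for _ in range(steps):
        x, x_delayed = hist[-1], hist[-1 - delay]
        hist.append(x + dt * (-x + w * np.sign(x_delayed) + I))
    return hist[-1]

x_star = simulate()
# the trajectory settles at the equilibrium x* = w * sign(x*) + I = 1.5
```

In this toy case the trajectory stays positive, so the discontinuity is never crossed and the state converges exponentially to x* = 1.5, mirroring the global exponential stability the paper proves under far more general conditions.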


Neural Processing Letters | 2009

Global Exponential Stability and Global Convergence in Finite Time of Neural Networks with Discontinuous Activations

Sitian Qin; Xiaoping Xue

In this paper, we consider a general class of neural networks with arbitrary constant delays in the neuron interconnections and neuron activations belonging to the set of discontinuous, monotone increasing, and (possibly) unbounded functions. Based on topological degree theory and the Lyapunov functional method, we provide some new sufficient conditions for the global exponential stability and global convergence in finite time of these delayed neural networks. Under these conditions, the uniqueness of the solution to the initial value problem (IVP) is proved. The exponential convergence rate can be quantitatively estimated from the parameters defining the neural network. These conditions are easily testable and independent of the delay. Finally, some remarks and examples are discussed to compare the present results with existing ones.
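The phrase "global convergence in finite time" can be made concrete with a toy scalar system (not the paper's model): the discontinuous flow dx/dt = -sign(x) reaches its equilibrium x = 0 exactly at time t = |x(0)|, whereas a smooth flow such as dx/dt = -x only converges asymptotically.

```python
import math

# Finite-time convergence of the discontinuous flow dx/dt = -sign(x):
# the state hits the equilibrium x = 0 at the finite time t = |x(0)|.
x, t, dt = 1.0, 0.0, 0.001
while abs(x) > 1e-9 and t < 5.0:
    x -= dt * math.copysign(1.0, x)
    t += dt
# the loop exits near t = |x(0)| = 1.0
```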


IEEE Transactions on Neural Networks | 2015

A Two-Layer Recurrent Neural Network for Nonsmooth Convex Optimization Problems

Sitian Qin; Xiaoping Xue

In this paper, a two-layer recurrent neural network is proposed to solve nonsmooth convex optimization problems subject to convex inequality and linear equality constraints. Compared with existing neural network models, the proposed neural network has low model complexity and avoids penalty parameters. It is proved that from any initial point, the state of the proposed neural network reaches the equality feasible region in finite time and stays there thereafter. Moreover, the state is unique if the initial point lies in the equality feasible region. The equilibrium point set of the proposed neural network is proved to be equivalent to the Karush-Kuhn-Tucker optimality set of the original optimization problem. It is further proved that the equilibrium point of the proposed neural network is stable in the sense of Lyapunov. Moreover, from any initial point, the state is proved to converge to an equilibrium point of the proposed neural network. Finally, as applications, the proposed neural network is used to solve nonlinear convex programming problems with linear constraints and L1-norm minimization problems.
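As a rough illustration of the L1-norm minimization application, one can discretize a projected subgradient flow for min ||x||_1 subject to Ax = b. This is only a sketch of the problem class with invented data, not the paper's two-layer network:

```python
import numpy as np

# Projected subgradient flow for  min ||x||_1  s.t.  Ax = b,
# discretized by Euler steps (illustrative data, not from the paper).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

P = np.eye(2) - A.T @ np.linalg.inv(A @ A.T) @ A   # projector onto null(A)
Afix = A.T @ np.linalg.inv(A @ A.T)                # feasibility-restoring gain

x = np.array([2.0, -1.0])                          # feasible starting point
h = 0.005                                          # Euler step size
for _ in range(2000):
    g = np.sign(x)                                 # a subgradient of ||x||_1
    x = x - h * (P @ g + Afix @ (A @ x - b))

# x settles on the solution set {x1 + x2 = 1, x >= 0}, where ||x||_1 = 1
```

Starting from the feasible point (2, -1), the state slides along the constraint set and settles on the solution set of the problem, where the L1 norm attains its minimum value 1.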


Neural Networks | 2015

Convergence and attractivity of memristor-based cellular neural networks with time delays

Sitian Qin; Jun Wang; Xiaoping Xue

This paper presents theoretical results on the convergence and attractivity of memristor-based cellular neural networks (MCNNs) with time delays. Based on a realistic memristor model, an MCNN is modeled using a differential inclusion. The essential boundedness of its global solutions is proven. The state of an MCNN is further proven to converge to a critical-point set located in a saturated region of the activation function when the initial state lies in a saturated region. It is shown that the state convergence time period is finite and can be quantitatively estimated using given parameters. Furthermore, the positive invariance and attractivity of the state in non-saturated regions are also proven. The simulation results of several numerical examples are provided to substantiate the results.
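A toy caricature of state-dependent connection weights (invented parameters, not the paper's memristor model) already shows the saturation behavior described above: the self-connection weight a(x) switches with the state, mimicking a memristance, and the trajectory enters the saturated region of the activation and converges there.

```python
# Toy memristive cellular neuron (invented parameters): a(x) is a
# state-dependent weight and f is the standard saturating CNN activation.
def f(x):
    return 0.5 * (abs(x + 1.0) - abs(x - 1.0))

def a(x):
    return 2.0 if abs(x) < 1.0 else 1.5   # state-dependent weight

x, dt, I = 0.5, 0.01, 0.5
for _ in range(2000):
    x += dt * (-x + a(x) * f(x) + I)      # Euler step of dx/dt

# the state enters the saturated region |x| >= 1 and converges to x* = 2
```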


Neurocomputing | 2013

A new one-layer recurrent neural network for nonsmooth pseudoconvex optimization

Sitian Qin; Wei Bian; Xiaoping Xue

This paper proposes a one-layer recurrent neural network for solving nonlinear nonsmooth pseudoconvex optimization problems subject to linear equality constraints. We first prove that the equilibrium point set of the proposed neural network is equivalent to the optimal solution set of the original optimization problem, even though the objective function is only pseudoconvex. Then, it is proved that the state of the proposed neural network is stable in the sense of Lyapunov and globally convergent to an exact optimal solution of the original optimization problem. In the end, some illustrative examples are given to demonstrate the effectiveness of the proposed neural network.


IEEE Transactions on Automatic Control | 2010

Dynamical Analysis of Neural Networks of Subgradient System

Sitian Qin; Xiaoping Xue

In this technical note, we consider a class of neural networks that generalizes the neural network models considered in the optimization context. Under some mild assumptions, this neural network can be translated into a negative subgradient dynamical system. First, we study the existence and uniqueness of the solution of this neural network. Then, by a nonsmooth Łojasiewicz inequality, we prove the convergence of the trajectories of this neural network. In the end, a constrained minimization problem associated with this neural network is studied. It is proved that the local constrained minima of the cost function coincide with the stable equilibrium points of this neural network.
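A negative subgradient dynamical system is a flow dx/dt in -∂f(x). A minimal one-dimensional sketch with the illustrative choice f(x) = |x| + (x - 1)^2 (not an example from the note), discretized by explicit Euler steps:

```python
import math

# Negative subgradient flow dx/dt in -∂f(x) for the nonsmooth convex
# function f(x) = |x| + (x - 1)^2, discretized by explicit Euler steps.
def subgrad(x):
    s = 0.0 if x == 0 else math.copysign(1.0, x)   # a subgradient of |x|
    return s + 2.0 * (x - 1.0)                     # plus the smooth part

x, h = 3.0, 0.01
for _ in range(2000):
    x -= h * subgrad(x)

# the trajectory converges to the unique minimizer x* = 1/2
```

The unique minimizer x* = 1/2 satisfies 0 in ∂f(x*), and the discretized trajectory converges to it, in the spirit of the convergence result obtained via the Łojasiewicz inequality.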


International Journal of Pattern Recognition and Artificial Intelligence | 2017

Exponential Stability of Periodic Solution for Impulsive Memristor-Based Cohen–Grossberg Neural Networks with Mixed Delays

Jiqiang Feng; Qiang Ma; Sitian Qin

The memristor, regarded as a promising building block for artificial intelligence, has been widely used in pattern recognition and in signal processing from sensor arrays. The memristor-based recurrent neural network (MRNN) is an ideal model to mimic the functionalities of the human brain due to the physical properties of the memristor. In this paper, the periodicity of memristor-based Cohen–Grossberg neural networks (MCGNNs) is studied. The neural network (NN) considered in this paper is based on the memristor and involves time-varying delays, distributed delays, and impulsive effects. The boundedness and monotonicity of the activation function are not assumed. By inequality techniques and the contraction mapping principle, we prove the existence, uniqueness, and exponential stability of the periodic solution of MCGNNs. Finally, some numerical examples and comparisons are provided to illustrate the validity of our results.


Neurocomputing | 2010

Dynamical behavior of a class of nonsmooth gradient-like systems

Sitian Qin; Xiaoping Xue

In this paper, we consider a class of nonsmooth gradient-like systems that generalizes existing neural-network models. Under several assumptions, by virtue of Lyapunov functions and topological degree theory, we investigate the dynamical behaviors of this system, such as the existence of a global solution and an equilibrium point, and the global asymptotic stability of the global solution. We then apply these results to optimization problems, such as minimizing a convex objective function over the discrete set {0,1}^n and nonlinear programming problems. We also investigate the existence and global exponential stability of periodic solutions of this system with a periodic input vector. Some examples are given to illustrate our results.
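For the application of minimizing a convex objective over {0,1}^n, a common device is to relax the discrete set to the box [0,1]^n and run a projected gradient-like flow; with a linear (hence convex) objective the flow terminates at a vertex, i.e. a point of the original discrete set. The sketch below uses invented data and projected Euler steps, not the system studied in the paper:

```python
import numpy as np

# Projected gradient-like flow for the box relaxation [0,1]^n of the
# linear (hence convex) objective c^T x over {0,1}^n; the flow is
# driven to a vertex of the box, i.e. a point of the discrete set.
c = np.array([1.0, -2.0, 0.5])
x = np.full(3, 0.5)                      # start at the box center
for _ in range(1000):
    x = np.clip(x - 0.01 * c, 0.0, 1.0)  # Euler step + box projection

# the state reaches the minimizing vertex x* = (0, 1, 0)
```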


Neural Networks | 2015

Neural network for constrained nonsmooth optimization using Tikhonov regularization

Sitian Qin; Dejun Fan; Guangxi Wu; Lijun Zhao

This paper presents a one-layer neural network to solve nonsmooth convex optimization problems based on the Tikhonov regularization method. Firstly, it is shown that the optimal solution of the original problem can be approximated by the optimal solution of a strongly convex optimization problem. Then, it is proved that for any initial point, the state of the proposed neural network enters the equality feasible region in finite time and is globally convergent to the unique optimal solution of the related strongly convex optimization problem. Compared with existing neural networks, the proposed neural network has lower model complexity and does not need penalty parameters. In the end, some numerical examples and an application are given to illustrate the effectiveness and improvement of the proposed neural network.
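The Tikhonov idea can be isolated in a few lines: a problem with a non-unique solution set is replaced by a strongly convex approximation whose unique minimizer approaches the minimum-norm solution as the regularization parameter vanishes. The sketch below uses plain least squares with invented data in place of the paper's nonsmooth objective:

```python
import numpy as np

# Underdetermined least squares min ||Ax - b||^2 has a whole line of
# minimizers here; adding eps * ||x||^2 makes the problem strongly
# convex with a unique solution (Tikhonov regularization).
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

def tikhonov(eps):
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + eps * np.eye(n), A.T @ b)

x = tikhonov(1e-6)
# as eps -> 0, x tends to the minimum-norm minimizer (0.5, 0.5)
```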


Neural Processing Letters | 2014

Global Robust Exponential Stability for Interval Delayed Neural Networks with Possibly Unbounded Activation Functions

Sitian Qin; Dejun Fan; Ming Yan; Qinghe Liu

In this paper, we study the global robust exponential stability of interval delayed neural networks with possibly unbounded activation functions. Based on topological degree theory and the Lyapunov functional method, we provide some new sufficient conditions for global robust exponential stability. Under these conditions, we prove the existence, uniqueness, and global robust exponential stability of the equilibrium point. In the end, some examples are provided to demonstrate the validity of the theoretical results.

Collaboration



Top Co-Authors

Xiaoping Xue, Harbin Institute of Technology
Dejun Fan, Harbin Institute of Technology
Fengqiu Liu, Harbin University of Science and Technology
Jiahui Song, Harbin Institute of Technology
Jianmin Wang, Harbin University of Science and Technology
Qinghe Liu, Harbin Institute of Technology
Wei Bian, Harbin Institute of Technology
Jun Wang, City University of Hong Kong