
Publication


Featured research published by Chun-Ming Tang.


Applied Mathematics and Computation | 2006

A new superlinearly convergent norm-relaxed method of strongly sub-feasible direction for inequality constrained optimization

Jinbao Jian; Hai-Yan Zheng; Chun-Ming Tang; Qing-Jie Hu

The method of feasible directions (MFD) is an important class of methods for nonlinearly constrained optimization. However, all variants of MFD require an initial feasible point, which in general is not easy to find, and the computational cost of the superlinearly convergent variants is rather high. The strongly sub-feasible direction method, on the other hand, does not need an initial feasible point, but most of the proposed algorithms do not have the superlinear convergence property and cannot guarantee that the iterates become feasible after finitely many iterations. In this paper, we present a new superlinearly convergent algorithm that starts from an arbitrary initial point. At each iteration, a master direction is obtained by solving one direction finding subproblem (DFS), and an auxiliary direction is given by an explicit formula. After finitely many iterations, the iterates enter the feasible set and the master direction becomes a feasible direction of descent. Since a new generalized projection technique is embedded in the auxiliary direction formula, global and superlinear convergence of the algorithm are obtained under mild assumptions without strict complementarity.


Applied Mathematics and Computation | 2005

A new norm-relaxed method of strongly sub-feasible direction for inequality constrained optimization

Jinbao Jian; Hai-Yan Zheng; Qing-Jie Hu; Chun-Ming Tang

Combining the norm-relaxed method of feasible directions (MFD) with the idea of the strongly sub-feasible direction method, we present a new convergent algorithm with an arbitrary initial point for inequality constrained optimization. At each iteration, the new algorithm solves one direction finding subproblem (DFS), which always possesses a solution. The new algorithm has several good properties: it automatically unifies the operations of initialization (Phase I) and optimization (Phase II); the number of functions satisfying the inequality constraints is nondecreasing; and, in particular, a feasible direction of descent can be obtained by solving the DFS once the iterate enters the feasible set. Global and strong convergence of the algorithm are obtained under mild assumptions without linear independence.
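The "strongly sub-feasible" property mentioned above, that the count of satisfied inequality constraints never decreases along the iterates, can be stated in a few lines of code. This is a generic illustration of the concept; the function name `num_satisfied` is not from the paper:

```python
def num_satisfied(x, gs):
    """Count how many inequality constraints g_i(x) <= 0 hold at x.

    Strongly sub-feasible direction methods keep this count nondecreasing
    from one iterate to the next, so once an iterate is feasible it stays
    feasible.
    """
    return sum(1 for g in gs if g(x) <= 0)

# Toy check with constraints g1: x <= 2 and g2: x >= 0 (written as -x <= 0)
gs = [lambda x: x - 2, lambda x: -x]
print(num_satisfied(3.0, gs))   # only g2 holds -> 1
print(num_satisfied(1.0, gs))   # both hold -> 2
```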


Numerical Functional Analysis and Optimization | 2008

A New Superlinearly Convergent Strongly Subfeasible Sequential Quadratic Programming Algorithm for Inequality-Constrained Optimization

Jinbao Jian; Chun-Ming Tang; Qing-Jie Hu; Hai-Yan Zheng

Combining the ideas of generalized projection and the strongly subfeasible sequential quadratic programming (SQP) method, we present a new strongly subfeasible SQP algorithm for nonlinearly inequality-constrained optimization problems. The algorithm, which introduces a new unified step-length search of Armijo type, starts from an arbitrary initial point, produces a feasible point after a finite number of iterations, and from then on becomes a feasible descent SQP algorithm. At each iteration, only one quadratic program needs to be solved, and two correctional directions are obtained by explicit formulas that share the same inverse matrix. Furthermore, global and superlinear convergence are proved under mild assumptions without strict complementarity conditions. Finally, preliminary numerical results show that the proposed algorithm is stable and promising.
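An Armijo-type step-length search follows a standard pattern: start with a trial step and shrink it until a sufficient-decrease condition holds. Below is a minimal sketch of the classical Armijo backtracking rule, not the paper's unified variant; all names and parameter values are illustrative:

```python
import numpy as np

def armijo_backtracking(f, grad_f, x, d, alpha0=1.0, shrink=0.5, sigma=1e-4):
    """Shrink the step length until f decreases by at least a fraction
    sigma of what the linear model predicts (the Armijo condition)."""
    fx = f(x)
    slope = grad_f(x) @ d   # directional derivative; d must be a descent direction
    alpha = alpha0
    while f(x + alpha * d) > fx + sigma * alpha * slope:
        alpha *= shrink
    return alpha

# Example: f(x) = ||x||^2, steepest-descent direction at x = (1, 0)
f = lambda x: x @ x
grad = lambda x: 2 * x
x = np.array([1.0, 0.0])
alpha = armijo_backtracking(f, grad, x, -grad(x))
print(alpha)  # 0.5: the full step overshoots the minimizer, one halving suffices
```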


Applied Mathematics and Computation | 2007

A new version of the Liu–Storey conjugate gradient method

Chun-Ming Tang; Zengxin Wei; Guoyin Li

In this paper, the global convergence of a new version of the Liu–Storey conjugate gradient method is discussed. The method combines the Liu–Storey conjugate gradient formula with a new inexact line search. We prove that the new method is globally convergent, and preliminary numerical results show that the corresponding algorithm is efficient.
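The Liu–Storey update computes the conjugacy parameter as beta_k = g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k). A minimal sketch of a nonlinear CG loop with this beta is shown below, using a plain Armijo backtracking line search rather than the paper's new inexact line search; the function name and safeguard are illustrative:

```python
import numpy as np

def liu_storey_cg(f, grad, x, max_iter=200, tol=1e-6):
    """Nonlinear conjugate gradient with the Liu-Storey beta formula."""
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search
        alpha, fx, slope = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + 1e-4 * alpha * slope:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Liu-Storey formula: beta = g_{k+1}^T (g_{k+1} - g_k) / (-d_k^T g_k)
        beta = g_new @ (g_new - g) / (-(d @ g))
        d_new = -g_new + beta * d
        if g_new @ d_new >= 0:   # safeguard: restart when descent is lost
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x

# Convex quadratic test problem: f(x) = 0.5 x'Ax - b'x, minimizer A^{-1} b
A = np.diag([1.0, 3.0])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = liu_storey_cg(f, grad, np.zeros(2))  # approaches [1, 1/3]
```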


Applied Mathematics Letters | 2010

A new norm-relaxed SQP algorithm with global convergence

Hai-Yan Zheng; Jinbao Jian; Chun-Ming Tang; Ran Quan

A new norm-relaxed sequential quadratic programming algorithm with global convergence for inequality constrained problems is presented in this paper; its quadratic programming subproblem is solvable at each iteration. Without boundedness assumptions on any of the iterative sequences, global convergence is guaranteed by a line search with an l∞ penalty function under some mild assumptions.


Bulletin of The Australian Mathematical Society | 2007

Semilocal E-convexity and semilocal E-convex programming

Qing-Jie Hu; Jinbao Jian; Hai-Yan Zheng; Chun-Ming Tang

In this paper, a new type of generalised convexity, semilocal E-convexity, is introduced by combining the concepts of semi-E-convexity in X.S. Chen [J. Math. Anal. Appl. 275 (2002), 251–262] and semilocal convexity in G.M. Ewing [SIAM Rev. 19 (1977), 202–220], and some of its basic properties are discussed. Using the new concepts, we derive optimality conditions and establish duality results for the inequality constrained optimisation problem.


Applied Mathematics and Computation | 2015

A strongly sub-feasible primal-dual quasi interior-point algorithm for nonlinear inequality constrained optimization

Jinbao Jian; Hua-qin Pan; Chun-Ming Tang; Jianling Li

In this paper, a primal-dual quasi interior-point algorithm for inequality constrained optimization problems is presented. At each iteration, the algorithm solves only two or three reduced systems of linear equations with the same coefficient matrix. The algorithm starts from an arbitrary initial point; after finitely many iterations, the iterates enter the interior of the feasible region and the objective function decreases monotonically. Furthermore, the proposed algorithm is proved to possess global and superlinear convergence under mild conditions, including a weak positive definiteness assumption. Finally, some encouraging preliminary computational results are reported.


Journal of Computational and Applied Mathematics | 2015

A new superlinearly convergent algorithm of combining QP subproblem with system of linear equations for nonlinear optimization

Jinbao Jian; Chuan-Hao Guo; Chun-Ming Tang; Yan-Qin Bai

In this paper, a class of optimization problems with nonlinear inequality constraints is discussed. Based on the ideas of the sequential quadratic programming algorithm and the method of strongly sub-feasible directions, a new superlinearly convergent algorithm is proposed. The initial point can be chosen arbitrarily. At each iteration, the new algorithm solves one quadratic programming subproblem, which is always feasible, and one or two systems of linear equations with a common, uniformly nonsingular coefficient matrix. After finitely many iterations, the iterates enter the feasible set of the problem, and the search direction is then obtained by solving one quadratic programming subproblem and only one system of linear equations. The new algorithm possesses global and superlinear convergence under suitable assumptions without strict complementarity. Finally, numerical results are reported to show that the algorithm is promising.


European Journal of Operational Research | 2012

Strongly sub-feasible direction method for constrained optimization problems with nonsmooth objective functions

Chun-Ming Tang; Jinbao Jian

In this paper, we propose a strongly sub-feasible direction method for the solution of inequality constrained optimization problems whose objective functions are not necessarily differentiable. The algorithm combines the subgradient aggregation technique with the ideas of the generalized cutting plane method and of the strongly sub-feasible direction method, and as a result a new search direction finding subproblem and a new line search strategy are presented. The algorithm not only accepts infeasible starting points but also preserves the "strong sub-feasibility" of the current iterate without unduly increasing the objective value. Moreover, once a feasible iterate occurs, the method automatically becomes a feasible descent algorithm. Global convergence is proved, and preliminary numerical results show that the proposed algorithm is efficient.


Computational Optimization and Applications | 2011

Inverse problems and solution methods for a class of nonlinear complementarity problems

Jian-zhong Zhang; Jinbao Jian; Chun-Ming Tang

In this paper, motivated by the KKT optimality conditions for a class of quadratic programs, we first introduce a class of nonlinear complementarity problems (NCPs). We then present and discuss a kind of inverse problem of the NCPs, i.e., for a given feasible decision
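A nonlinear complementarity problem asks for a vector x with x >= 0, F(x) >= 0, and x^T F(x) = 0. One standard way to measure how far a candidate point is from solving it is the natural (min-based) residual, which vanishes exactly at solutions. This sketch is generic NCP background, not the paper's construction:

```python
import numpy as np

def ncp_residual(x, Fx):
    """Natural residual ||min(x, F(x))||: zero iff x >= 0, F(x) >= 0, x'F(x) = 0."""
    return np.linalg.norm(np.minimum(x, Fx))

# F(x) = x - a componentwise; the NCP solution is x = max(a, 0)
a = np.array([-1.0, 2.0])
F = lambda x: x - a
x_sol = np.maximum(a, 0.0)            # [0, 2], complementary to F(x_sol) = [1, 0]
print(ncp_residual(x_sol, F(x_sol)))  # 0.0 at the solution
print(ncp_residual(np.ones(2), F(np.ones(2))))  # positive at a non-solution
```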

Collaboration


Top co-authors of Chun-Ming Tang:

Jinbao Jian (Yulin Normal University)

Guoyin Li (University of New South Wales)

Gaohang Yu (Sun Yat-sen University)