V. Jeyakumar
University of New South Wales
Publications
Featured research published by V. Jeyakumar.
SIAM Journal on Optimization | 2003
V. Jeyakumar; G. M. Lee; N. Dinh
In this paper a new sequential Lagrange multiplier condition characterizing optimality without a constraint qualification for an abstract nonsmooth convex program is presented in terms of the subdifferentials and the ε-subdifferentials. A sequential condition involving only the subdifferentials, but at nearby points to the minimizer for constraints, is also derived. For a smooth convex program, the sequential condition yields a limiting Kuhn–Tucker condition at nearby points without a constraint qualification. It is shown how the sequential conditions are related to the standard Lagrange multiplier condition. Applications to semidefinite programs, semi-infinite programs, and semiconvex programs are given. Several numerical examples are discussed to illustrate the significance of the sequential conditions.
Mathematical Programming | 2007
V. Jeyakumar; Alex M. Rubinov; Zhiyou Wu
Mathematical Programming | 1992
V. Jeyakumar; Henry Wolkowicz
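The limiting Kuhn–Tucker condition from the SIAM Journal on Optimization (2003) abstract above can be illustrated on a classical textbook example (not taken from the paper itself): minimize f(x) = x subject to g(x) = x² ≤ 0. The only feasible point is x* = 0, yet no multiplier exists there because the Slater condition fails; a sequential condition at nearby points x_n → 0 does hold.

```python
# Hypothetical illustration: min f(x) = x subject to g(x) = x^2 <= 0.
# The only feasible point is x* = 0, but no KKT multiplier exists at 0
# (grad f(0) = 1 while grad g(0) = 0).  A sequential (limiting) KKT
# condition holds at nearby points x_n -> 0 with multipliers lam_n >= 0.

def grad_f(x):  # gradient of the objective f(x) = x
    return 1.0

def grad_g(x):  # gradient of the constraint g(x) = x^2
    return 2.0 * x

def sequential_kkt_residual(n):
    """KKT stationarity residual at the nearby point x_n = -1/(2n)
    with the multiplier lam_n = n >= 0."""
    x_n = -1.0 / (2.0 * n)
    lam_n = float(n)
    residual = grad_f(x_n) + lam_n * grad_g(x_n)   # = 1 - 1 = 0 exactly
    complementarity = lam_n * (x_n ** 2)           # = 1/(4n) -> 0
    return x_n, residual, complementarity

for n in (1, 10, 100):
    print(sequential_kkt_residual(n))
```

The stationarity residual vanishes identically along the sequence while x_n → 0 and the complementarity term tends to 0, even though no exact multiplier exists at the minimizer itself.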
Optimization | 1992
X.Q. Yang; V. Jeyakumar
In this paper, we first examine how global optimality of non-convex constrained optimization problems is related to Lagrange multiplier conditions. We then establish Lagrange multiplier conditions for global optimality of general quadratic minimization problems with quadratic constraints. We also obtain necessary global optimality conditions, which are different from the Lagrange multiplier conditions for special classes of quadratic optimization problems. These classes include weighted least squares with ellipsoidal constraints, and quadratic minimization with binary constraints. We discuss examples which demonstrate that our optimality conditions can effectively be used for identifying global minimizers of certain multi-extremal non-convex quadratic optimization problems.
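One well-known special case (a sketch, not the paper's own construction) where a Lagrange multiplier condition characterizes global rather than merely local optimality is the trust-region subproblem: minimize a possibly indefinite quadratic over a norm ball. There, x* is globally optimal iff some λ ≥ 0 satisfies stationarity, complementarity, and positive semidefiniteness of A + λI.

```python
import numpy as np

# Trust-region subproblem: min q(x) = x^T A x  s.t.  ||x||^2 <= 1.
# Global optimality of x* holds iff, for some lam >= 0:
#   (A + lam*I) x* = 0,  lam*(||x*||^2 - 1) = 0,  A + lam*I PSD.

A = np.diag([-1.0, 2.0])          # indefinite, so q is non-convex
x_star = np.array([1.0, 0.0])     # candidate global minimizer (on the boundary)
lam = 1.0                         # candidate multiplier

stationarity = np.allclose((A + lam * np.eye(2)) @ x_star, 0.0)
complementarity = np.isclose(lam * (x_star @ x_star - 1.0), 0.0)
psd = np.all(np.linalg.eigvalsh(A + lam * np.eye(2)) >= -1e-12)
print(stationarity, complementarity, psd)  # all True => global minimizer

# Sanity check by sampling: no feasible point beats q(x*) = -1.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, size=(20000, 2))
pts = pts[np.einsum('ij,ij->i', pts, pts) <= 1.0]
assert (np.einsum('ij,jk,ik->i', pts, A, pts)
        >= x_star @ A @ x_star - 1e-9).all()
```

The PSD requirement on A + λI is the extra ingredient beyond the ordinary first-order multiplier condition; it is what upgrades the certificate from local to global optimality.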
Journal of Optimization Theory and Applications | 1999
V. Jeyakumar; D.T. Luc
In this paper we study constraint qualifications and duality results for infinite convex programs (P) μ = inf{f(x): g(x) ∈ −S, x ∈ C}, where g = (g1, g2) and S = S1 × S2, Si are convex cones, i = 1, 2, C is a convex subset of a vector space X, and f and gi are, respectively, convex and Si-convex, i = 1, 2. In particular, we consider the special case when S2 is in a finite-dimensional space, g2 is affine and S2 is polyhedral. We show that a recently introduced simple constraint qualification, and the so-called quasi relative interior constraint qualification, both extend to (P) from the special case that g = g2 is affine and S = S2 is polyhedral in a finite-dimensional space (the so-called partially finite program). This provides generalized Slater-type conditions for (P) which are much weaker than the standard Slater condition. We exhibit the relationship between these two constraint qualifications and show how to replace the affine assumption on g2 and the finite dimensionality assumption on S2 by a local compactness assumption. We then introduce the notion of strong quasi relative interior to get parallel results for more general infinite-dimensional programs without the local compactness assumption. Our basic tool reduces to guaranteeing the closure of the sum of two closed convex cones.
SIAM Journal on Optimization | 2002
V. Jeyakumar
In this paper, a new generalized second-order directional derivative and a set-valued generalized Hessian are introduced for C^{1,1} functions in real Banach spaces. It is shown that this set-valued generalized Hessian is single-valued at a point if and only if the function is twice weakly Gâteaux differentiable at the point, and that the generalized second-order directional derivative is upper semicontinuous under a regularity condition. Various generalized calculus rules are also given for C^{1,1} functions. The generalized second-order directional derivative is applied to derive second-order necessary optimality conditions for mathematical programming problems.
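A minimal numerical sketch (not drawn from the paper) of why a generalized Hessian for C^{1,1} functions must be set-valued: f(x) = x·|x| has the Lipschitz derivative f'(x) = 2|x|, so f is C^{1,1}, yet f' is not differentiable at 0 and its one-sided second-order quotients disagree there.

```python
# f(x) = x*|x| is C^{1,1}: its derivative f'(x) = 2|x| is Lipschitz,
# but f is not twice differentiable at 0, so any generalized Hessian
# at 0 must be set-valued.

def fprime(x):
    return 2.0 * abs(x)

def second_order_quotient(t):
    """One-sided difference quotient (f'(t) - f'(0)) / t, t != 0."""
    return (fprime(t) - fprime(0.0)) / t

right = [second_order_quotient(10.0 ** -k) for k in range(1, 6)]
left = [second_order_quotient(-(10.0 ** -k)) for k in range(1, 6)]
print(right)  # every quotient equals  2.0: f''(0+) = 2
print(left)   # every quotient equals -2.0: f''(0-) = -2
# A Clarke-type generalized Hessian at 0 is the interval [-2, 2]:
# genuinely set-valued, consistent with f' not being differentiable there.
```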
Journal of Optimization Theory and Applications | 1986
V. Jeyakumar
Noncompact convexificators, which provide upper convex and lower concave approximations for a continuous function, are defined. Various calculus rules, including extremality and mean-value properties, are presented. Regularity conditions are given for convexificators to be minimal. A characterization of quasiconvexity of a continuous function is obtained in terms of the quasimonotonicity of convexificators.
Mathematical Programming | 1993
V. Jeyakumar; Xiaoqi Yang
Dual characterizations of the containment of a closed convex set, defined by infinite convex constraints, in an arbitrary polyhedral set, in a reverse-convex set defined by convex constraints, and in another convex set defined by finite convex constraints, are given. A special case of these dual characterizations has played a key role in generating knowledge-based support vector machine classifiers, which are powerful tools in data classification and mining. The conditions in these dual characterizations reduce to simple nonasymptotic conditions under Slater's constraint qualification.
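The simplest instance of such a dual containment characterization is the purely linear case, which can be checked numerically (a sketch under that assumption, not the paper's general result): a nonempty polyhedron {x : Ax ≤ b} lies in a half-space {x : c·x ≤ d} exactly when the LP max c·x over the polyhedron has value ≤ d, equivalently when some y ≥ 0 satisfies Aᵀy = c and b·y ≤ d.

```python
import numpy as np
from scipy.optimize import linprog

# Containment test: is the unit box {x : A x <= b} inside
# the half-space {x : x1 + x2 <= 2}?

A = np.array([[1., 0.], [0., 1.], [-1., 0.], [0., -1.]])  # |x_i| <= 1
b = np.array([1., 1., 1., 1.])
c = np.array([1., 1.])
d = 2.0

# Primal test: maximize c.x over the box (linprog minimizes, so negate c).
res = linprog(-c, A_ub=A, b_ub=b, bounds=[(None, None)] * 2)
contained = -res.fun <= d + 1e-9
print(contained)  # True: the box lies in {x : x1 + x2 <= 2}

# Dual certificate: y = (1, 1, 0, 0) >= 0 with A^T y = c and b.y = 2 <= d.
y = np.array([1., 1., 0., 0.])
print(np.allclose(A.T @ y, c), b @ y <= d)  # True True
```

The dual vector y is a nonasymptotic certificate of containment, of the kind the abstract says the general conditions reduce to under Slater's constraint qualification.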
Optimization | 1985
V. Jeyakumar
The paper contains a version of a minimax theorem with weakened convexity, extending a minimax theorem of Fan. The main result is obtained with the use of a generalized Gordan theorem, which is proved using a separation theorem. An example is also discussed.
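The classical finite-dimensional Gordan theorem underlying the generalized version used above can be verified computationally (an illustrative sketch, not the paper's generalized statement): for a matrix A, exactly one of (i) Ax < 0 and (ii) Aᵀy = 0, y ≥ 0, y ≠ 0 has a solution.

```python
import numpy as np
from scipy.optimize import linprog

# Gordan's theorem of the alternative: exactly one of
#   (i)  A x < 0                (strict, componentwise)
#   (ii) A^T y = 0, y >= 0, y != 0
# is solvable.  Since the solution set of (i) is a cone, A x < 0 is
# solvable iff A x <= -1 is, which is an LP feasibility problem.

def gordan_system_i_solvable(A):
    m, n = A.shape
    res = linprog(np.zeros(n), A_ub=A, b_ub=-np.ones(m),
                  bounds=[(None, None)] * n)
    return res.status == 0  # 0 = success: a feasible point was found

A1 = np.array([[1., -1.], [-1., -1.]])  # (i) holds: x = (0, 2) gives A1 x < 0
A2 = np.array([[1., 0.], [-1., 0.]])    # (ii) holds: y = (1, 1), A2^T y = 0
print(gordan_system_i_solvable(A1))  # True
print(gordan_system_i_solvable(A2))  # False, so by Gordan (ii) must hold
```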
SIAM Journal on Optimization | 2010
V. Jeyakumar; Guoyin Li
This paper examines nonsmooth constrained multi-objective optimization problems where the objective function and the constraints are compositions of convex functions with locally Lipschitz and Gâteaux differentiable functions. Lagrangian necessary conditions and new sufficient optimality conditions for efficient and properly efficient solutions are presented. Multi-objective duality results are given for convex composite problems which are not necessarily convex programming problems. Applications of the results to new and some special classes of nonlinear programming problems are discussed. A scalarization result and a characterization of the set of all properly efficient solutions for convex composite problems are also discussed under appropriate conditions.
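The standard weighted-sum scalarization behind such results can be seen on a toy convex bi-objective problem (a generic illustration, not the paper's construction): minimizers of strictly positive weighted sums of the objectives are efficient points.

```python
# Bi-objective convex problem on R: f1(x) = x^2, f2(x) = (x - 1)^2.
# The weighted sum w*f1 + (1-w)*f2 is minimized where its derivative
# 2*w*x + 2*(1-w)*(x-1) vanishes, i.e. at x = 1 - w, which sweeps out
# the efficient set [0, 1] as w ranges over (0, 1).

def f1(x):
    return x * x

def f2(x):
    return (x - 1.0) ** 2

def scalarized_minimizer(w):
    """Closed-form minimizer of w*f1 + (1-w)*f2."""
    return 1.0 - w

for w in (0.25, 0.5, 0.75):
    x_w = scalarized_minimizer(w)
    assert 0.0 <= x_w <= 1.0
    # Efficiency check against a coarse grid of competitors: no x improves
    # one objective without worsening the other.
    grid = [i / 100.0 for i in range(-100, 201)]
    dominated = any(f1(x) <= f1(x_w) and f2(x) <= f2(x_w)
                    and (f1(x) < f1(x_w) or f2(x) < f2(x_w)) for x in grid)
    assert not dominated
print("scalarized minimizers are efficient")
```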