Publication


Featured research published by V. Jeyakumar.


Siam Journal on Optimization | 2003

New sequential Lagrange multiplier conditions characterizing optimality without constraint qualification for convex programs

V. Jeyakumar; G. M. Lee; N. Dinh

In this paper a new sequential Lagrange multiplier condition characterizing optimality without a constraint qualification for an abstract nonsmooth convex program is presented in terms of the subdifferentials and the ε-subdifferentials. A sequential condition involving only the subdifferentials, but at nearby points to the minimizer for constraints, is also derived. For a smooth convex program, the sequential condition yields a limiting Kuhn–Tucker condition at nearby points without a constraint qualification. It is shown how the sequential conditions are related to the standard Lagrange multiplier condition. Applications to semidefinite programs, semi-infinite programs, and semiconvex programs are given. Several numerical examples are discussed to illustrate the significance of the sequential conditions.


Mathematical Programming | 2007

Non-convex quadratic minimization problems with quadratic constraints: global optimality conditions

V. Jeyakumar; Alex M. Rubinov; Zhiyou Wu

In this paper, we first examine how global optimality of non-convex constrained optimization problems is related to Lagrange multiplier conditions. We then establish Lagrange multiplier conditions for global optimality of general quadratic minimization problems with quadratic constraints. We also obtain necessary global optimality conditions, which are different from the Lagrange multiplier conditions, for special classes of quadratic optimization problems. These classes include weighted least squares with ellipsoidal constraints, and quadratic minimization with binary constraints. We discuss examples which demonstrate that our optimality conditions can effectively be used for identifying global minimizers of certain multi-extremal non-convex quadratic optimization problems.


Mathematical Programming | 1992

Generalizations of Slater's constraint qualification for infinite convex programs

V. Jeyakumar; Henry Wolkowicz

In this paper we study constraint qualifications and duality results for infinite convex programs (P) μ = inf{f(x): g(x) ∈ −S, x ∈ C}, where g = (g1, g2) and S = S1 × S2, the Si are convex cones, i = 1, 2, C is a convex subset of a vector space X, and f and gi are, respectively, convex and Si-convex, i = 1, 2. In particular, we consider the special case when S2 lies in a finite dimensional space, g2 is affine and S2 is polyhedral. We show that a recently introduced simple constraint qualification and the so-called quasi relative interior constraint qualification both extend to (P) from the special case that g = g2 is affine and S = S2 is polyhedral in a finite dimensional space (the so-called partially finite program). This provides generalized Slater type conditions for (P) which are much weaker than the standard Slater condition. We exhibit the relationship between these two constraint qualifications and show how to replace the affine assumption on g2 and the finite dimensionality assumption on S2 by a local compactness assumption. We then introduce the notion of strong quasi relative interior to get parallel results for more general infinite dimensional programs without the local compactness assumption. Our basic tool reduces to guaranteeing the closure of the sum of two closed convex cones.


Optimization | 1992

Generalized second-order directional derivatives and optimization with C1,1 functions

X.Q. Yang; V. Jeyakumar

In this paper, a new generalized second-order directional derivative and a set-valued generalized Hessian are introduced for C1,1 functions in real Banach spaces. It is shown that this set-valued generalized Hessian is single-valued at a point if and only if the function is twice weakly Gâteaux differentiable at the point, and that the generalized second-order directional derivative is upper semi-continuous under a regularity condition. Various generalized calculus rules are also given for C1,1 functions. The generalized second-order directional derivative is applied to derive second-order necessary optimality conditions for mathematical programming problems.


Journal of Optimization Theory and Applications | 1999

Nonsmooth calculus, minimality, and monotonicity of convexificators

V. Jeyakumar; D.T. Luc

Noncompact convexificators, which provide upper convex and lower concave approximations for a continuous function, are defined. Various calculus rules, including extremality and mean-value properties, are presented. Regularity conditions are given for convexificators to be minimal. A characterization of quasiconvexity of a continuous function is obtained in terms of the quasimonotonicity of convexificators.


Siam Journal on Optimization | 2002

Characterizing Set Containments Involving Infinite Convex Constraints and Reverse-Convex Constraints

V. Jeyakumar

Dual characterizations are given of the containment of a closed convex set, defined by infinite convex constraints, in an arbitrary polyhedral set, in a reverse-convex set defined by convex constraints, and in another convex set defined by finite convex constraints. A special case of these dual characterizations has played a key role in generating knowledge-based support vector machine classifiers, which are powerful tools in data classification and mining. The conditions in these dual characterizations reduce to simple nonasymptotic conditions under Slater's constraint qualification.


Journal of Optimization Theory and Applications | 1986

A generalization of a minimax theorem of Fan via a theorem of the alternative

V. Jeyakumar

The paper contains a version of a minimax theorem with weakened convexity, extending a minimax theorem of Fan. The main result is obtained with the use of a generalized Gordan theorem, which is proved using a separation theorem. An example is also discussed.


Mathematical Programming | 1993

Convex composite multi-objective nonsmooth programming

V. Jeyakumar; Xiaoqi Yang

This paper examines nonsmooth constrained multi-objective optimization problems where the objective function and the constraints are compositions of convex functions, and locally Lipschitz and Gâteaux differentiable functions. Lagrangian necessary conditions, and new sufficient optimality conditions for efficient and properly efficient solutions, are presented. Multi-objective duality results are given for convex composite problems which are not necessarily convex programming problems. Applications of the results to new and some special classes of nonlinear programming problems are discussed. A scalarization result and a characterization of the set of all properly efficient solutions for convex composite problems are also discussed under appropriate conditions.


Optimization | 1985

Convexlike alternative theorems and mathematical programming

V. Jeyakumar


Siam Journal on Optimization | 2010

Strong Duality in Robust Convex Programming: Complete Characterizations

V. Jeyakumar; Guoyin Li
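A minimal numerical sketch of why sequential (limiting) multiplier conditions matter, using a classical textbook example rather than anything drawn from the papers above: for the convex program min f(x) = x subject to g(x) = x² ≤ 0, the only feasible point is x* = 0, Slater's condition fails, and no Kuhn–Tucker multiplier exists at x* (the stationarity residual |f'(0) + λ g'(0)| = 1 for every λ ≥ 0); yet at nearby points x_k = −1/k a nonnegative multiplier λ_k = k/2 makes the residual vanish exactly.

```python
# Classical illustration (not from the papers above): min x subject to x**2 <= 0.
# Here f'(x) = 1 and g'(x) = 2x, so at the minimizer x* = 0 the stationarity
# equation 1 + lam * 0 = 0 has no solution: no KKT multiplier exists.

def stationarity_residual(x, lam):
    """|f'(x) + lam * g'(x)| for f(x) = x and g(x) = x**2."""
    return abs(1.0 + lam * 2.0 * x)

# No nonnegative multiplier works at the minimizer itself: the residual stays 1.
residual_at_min = min(stationarity_residual(0.0, lam) for lam in range(1000))

# At nearby points x_k = -1/k, the multiplier lam_k = k/2 >= 0 zeroes the
# residual, illustrating a limiting Kuhn-Tucker condition at nearby points.
residuals_nearby = [stationarity_residual(-1.0 / k, k / 2.0) for k in range(1, 6)]
```

Note that the multipliers λ_k = k/2 are unbounded as x_k → 0, which is exactly the behavior a pointwise multiplier condition cannot capture without a constraint qualification.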
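The quadratic minimization with binary constraints mentioned in the 2007 abstract can be made concrete with a small, hypothetical instance (the matrix Q and vector c below are chosen purely for illustration). For tiny problems, the global minimizer can be certified by brute-force enumeration; the paper's Lagrange-multiplier conditions aim to provide checkable certificates without such enumeration.

```python
import itertools

def global_min_binary_quadratic(Q, c):
    """Minimize x^T Q x + c^T x over x in {-1, 1}^n by brute-force enumeration."""
    n = len(c)
    best_x, best_val = None, float("inf")
    for x in itertools.product([-1, 1], repeat=n):
        # Quadratic form plus linear term, evaluated directly.
        val = sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))
        val += sum(c[i] * x[i] for i in range(n))
        if val < best_val:
            best_x, best_val = x, val
    return best_x, best_val

# Illustrative 2-variable instance (values are hypothetical, not from the paper).
Q = [[2.0, 1.0], [1.0, 3.0]]
c = [-1.0, 2.0]
x_star, f_star = global_min_binary_quadratic(Q, c)
# Enumerating the four points of {-1, 1}^2 shows x* = (1, -1) with value 0.
```

Enumeration costs 2^n evaluations, which is why sufficient global optimality conditions of the kind developed in the paper are valuable for larger instances.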

Collaboration


Dive into V. Jeyakumar's collaborations.

Top Co-Authors

Guoyin Li (University of New South Wales)

G. M. Lee (Pukyong National University)

D.T. Luc (University of Avignon)

B. M. Glover (Federation University Australia)

Thai Doan Chuong (University of New South Wales)

Zhiyou Wu (Chongqing Normal University)