Joydeep Dutta
Indian Institute of Technology Kanpur
Publications
Featured research published by Joydeep Dutta.
Mathematical Programming | 2012
Stephan Dempe; Joydeep Dutta
Bilevel programming problems are often reformulated using the Karush–Kuhn–Tucker conditions for the lower-level problem, resulting in a mathematical program with complementarity constraints (MPCC). Clearly, the two problems are closely related, but the answer to the question of whether they are equivalent is "No", even when the lower-level problem is a parametric convex optimization problem. This is not obvious and concerns local optimal solutions. We show that global optimal solutions of the MPCC correspond to global optimal solutions of the bilevel problem provided the lower-level problem satisfies Slater's constraint qualification. We also show by examples that this correspondence can fail if Slater's constraint qualification fails to hold at the lower level. For local solutions, the relationship between the bilevel problem and its corresponding MPCC is more complicated; we demonstrate the issues relating to local minima through examples.
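The reformulation discussed above can be sketched as follows; the symbols F, f, and g are generic placeholders for the upper-level objective, lower-level objective, and lower-level constraints, not notation taken from the paper:

```latex
% Optimistic bilevel problem with a parametric convex lower level:
\[
\min_{x,\,y}\; F(x,y)
\quad \text{s.t.} \quad
y \in \operatorname*{argmin}_{y'} \{\, f(x,y') : g(x,y') \le 0 \,\}.
\]
% Replacing the lower-level problem by its KKT conditions yields the MPCC:
\[
\min_{x,\,y,\,\lambda}\; F(x,y)
\quad \text{s.t.} \quad
\nabla_y f(x,y) + \sum_i \lambda_i \nabla_y g_i(x,y) = 0,
\qquad 0 \le \lambda \perp -g(x,y) \ge 0.
\]
```

Under Slater's condition, the KKT system fully characterizes lower-level optimality, which is what makes the correspondence between global solutions of the two problems work.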
Optimization | 2007
Stephan Dempe; Joydeep Dutta; Boris S. Mordukhovich
The article is devoted to the study of the so-called optimistic version of bilevel programming in finite-dimensional spaces. Problems of this type are intrinsically nonsmooth (even for smooth initial data) and can be treated using appropriate tools of modern variational analysis and generalized differentiation. Considering a basic optimistic model in bilevel programming, we reduce it to a one-level framework of nondifferentiable programs formulated via the (nonsmooth) optimal value function of the parametric lower-level problem in the original model. Using advanced formulas for computing basic subgradients of value/marginal functions in variational analysis, we derive new necessary optimality conditions for bilevel programs reflecting significant phenomena that have never been observed earlier. In particular, our optimality conditions for bilevel programs do not depend on the partial derivatives with respect to parameters of the smooth objective function in the parametric lower-level problem. We present efficient implementations of our approach and results obtained for bilevel programs with differentiable, convex, linear, and Lipschitzian functions describing the initial data of the lower-level and upper-level problems. This work is dedicated to the memory of Prof. Dr Alexander Moiseevich Rubinov.
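A minimal sketch of the value-function reformulation mentioned above, using generic symbols (F for the upper-level objective, f and g for the lower-level data) that are illustrative rather than the paper's own notation:

```latex
% Optimal value function of the parametric lower-level problem:
\[
\varphi(x) \;=\; \min_{y} \{\, f(x,y) : g(x,y) \le 0 \,\}.
\]
% One-level nondifferentiable reformulation of the optimistic bilevel problem:
\[
\min_{x,\,y}\; F(x,y)
\quad \text{s.t.} \quad
f(x,y) \le \varphi(x), \qquad g(x,y) \le 0.
\]
```

Since the value function φ is typically nonsmooth even for smooth data, optimality conditions must be derived through its subgradients rather than classical derivatives.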
Numerical Functional Analysis and Optimization | 2001
Joydeep Dutta; V. Vetrivel
Necessary and sufficient conditions are obtained for the existence of approximate minima in vector optimization problems. The notion of an approximate saddle point is introduced, and the relation between approximate saddle points and approximate minima is established.
Mathematical Methods of Operations Research | 2006
Joydeep Dutta; Christiane Tammer
We consider vector optimization problems on Banach spaces without convexity assumptions. Under the assumption that the objective function is locally Lipschitz, we derive Lagrangian necessary conditions on the basis of the Mordukhovich subdifferential and Ioffe's approximate subdifferential, using a non-convex scalarization scheme. Finally, we apply the results to derive necessary conditions for weakly efficient solutions of non-convex location problems.
congress on evolutionary computation | 2007
Kalyanmoy Deb; Rahul Tewari; Mayur Dixit; Joydeep Dutta
Despite the widespread applicability of evolutionary optimization procedures over the past few decades, EA researchers still face criticism about the theoretical optimality of the obtained solutions. In this paper, we address this issue for problems in which gradients of objectives and constraints can be computed exactly, numerically, or through subdifferentials. We suggest a systematic procedure for analyzing a representative set of Pareto-optimal solutions for their closeness to satisfying the Karush-Kuhn-Tucker (KKT) conditions, which every Pareto-optimal solution must satisfy. The procedure involves either a least-squares solution or an optimum solution to a linear system of equations involving Lagrange multipliers. The procedure is applied to a number of differentiable and non-differentiable test problems and to a highly nonlinear engineering design problem. The results clearly show that EAs are capable of finding solutions close to theoretically optimal solutions in various problems. As a by-product, the error metric suggested in this paper can also be used as a termination condition for an EA application. Hopefully, this study will bring EA research closer to classical optimization studies.
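The least-squares step described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code, and `kkt_error` is a hypothetical helper name: given the objective gradient and the gradients of the active constraints at a candidate point, it fits nonnegative Lagrange multipliers and reports the stationarity residual.

```python
# Hedged sketch (not the paper's implementation): measure how closely a
# candidate point satisfies KKT stationarity by fitting nonnegative
# Lagrange multipliers in a least-squares sense.
import numpy as np
from scipy.optimize import nnls

def kkt_error(grad_f, active_grads):
    """Residual of min_{lam >= 0} || grad_f + sum_j lam_j * grad_g_j ||.

    grad_f       : gradient of the objective at the candidate point
    active_grads : list of gradients of the active inequality constraints
    """
    if not active_grads:
        # No active constraints: stationarity requires grad_f itself to vanish.
        return float(np.linalg.norm(grad_f))
    A = np.column_stack(active_grads)      # columns are constraint gradients
    lam, residual = nnls(A, -np.asarray(grad_f, dtype=float))
    return float(residual)

# Example: minimize x^2 + y^2 subject to x + y >= 1, written as
# g(x, y) = 1 - x - y <= 0.  The constrained optimum is (0.5, 0.5).
x = np.array([0.5, 0.5])
grad_f = 2.0 * x                           # gradient of x^2 + y^2
grad_g = np.array([-1.0, -1.0])            # gradient of 1 - x - y
print(kkt_error(grad_f, [grad_g]))         # residual is ~0 at the true optimum
```

A point produced by an EA can be plugged in the same way; a small residual indicates proximity to a KKT point, which is the sense of "closeness" used in the study.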
Optimization | 2004
Joydeep Dutta; Suresh Chandra
In this article we study a recently introduced notion of non-smooth analysis, namely convexifactors. We study some properties of convexifactors and introduce two new chain rules. A new notion of a non-smooth pseudoconvex function is introduced and its properties are studied in terms of convexifactors. We also present some optimality conditions for vector minimization in terms of convexifactors.
Journal of Global Optimization | 2013
Joydeep Dutta; Kalyanmoy Deb; Rupesh Tulshyan; Ramnik Arora
Karush–Kuhn–Tucker (KKT) optimality conditions are often checked to investigate whether a solution obtained by an optimization algorithm is a likely candidate for the optimum. In this study, we report that although the KKT conditions must all be satisfied at the optimal point, the extent of violation of the KKT conditions at points arbitrarily close to the KKT point is not smooth, making the KKT conditions difficult to use directly to evaluate the performance of an optimization algorithm. This happens due to the complementary slackness condition associated with the KKT optimality conditions. To overcome this difficulty, we define modified …
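For context, a standard statement of the KKT conditions for minimizing f(x) subject to g_j(x) ≤ 0, in generic notation rather than the paper's own:

```latex
\[
\nabla f(x) + \sum_{j} \lambda_j \nabla g_j(x) = 0,
\qquad \lambda_j \ge 0,
\qquad \lambda_j\, g_j(x) = 0 \ \ \text{for all } j.
\]
```

The complementarity requirement λ_j g_j(x) = 0 switches abruptly as a constraint changes between active and inactive near the optimum, which is the source of the non-smooth violation measure noted in the abstract.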
Journal of Optimization Theory and Applications | 2002
Joydeep Dutta; Suresh Chandra
Operations Research Letters | 2008
Didier Aussel; Joydeep Dutta
Optimization | 2004
Joydeep Dutta; Juan Enrique Martínez-Legaz; Alexander M. Rubinov