Publication


Featured research published by Adam B. Levy.


SIAM Journal on Optimization | 1999

Stability of Locally Optimal Solutions

Adam B. Levy; R. A. Poliquin; R. T. Rockafellar

Necessary and sufficient conditions are obtained for the Lipschitzian stability of local solutions to finite-dimensional parameterized optimization problems in a very general setting. Properties of prox-regularity of the essential objective function and positive definiteness of its coderivative Hessian are the keys to these results. A previous characterization of tilt stability arises as a special case.
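For orientation, the tilt stability referenced above can be recalled in its standard form (a gloss from the Poliquin–Rockafellar literature, not quoted from this paper): a local minimizer x̄ of f is a tilt-stable local minimum if small linear "tilts" of the objective shift the minimizer in a single-valued, Lipschitz way.

```latex
% Tilt stability: for some \delta > 0, the mapping
\[
  M(v) \;=\; \operatorname*{argmin}_{\lVert x - \bar{x}\rVert \le \delta}
  \bigl\{\, f(x) - \langle v, x \rangle \,\bigr\}
\]
% is single-valued and Lipschitz continuous on a neighborhood of v = 0,
% with M(0) = \bar{x}.
```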


Mathematical Programming | 2004

Coderivatives in parametric optimization

Adam B. Levy; Boris S. Mordukhovich

We consider parametric families of constrained problems in mathematical programming and conduct a local sensitivity analysis for multivalued solution maps. Coderivatives of set-valued mappings are our basic tool to analyze the parametric sensitivity of either stationary points or stationary point-multiplier pairs associated with parameterized optimization problems. An implicit mapping theorem for coderivatives is one key to this analysis for either of these objects, and in addition, a partial coderivative rule is essential for the analysis of stationary points. We develop general results along both of these lines and apply them to study the parametric sensitivity of stationary points alone, as well as stationary point-multiplier pairs. Estimates are computed for the coderivative of the stationary point multifunction associated with a general parametric optimization model, and these estimates are refined and augmented by estimates for the coderivative of the stationary point-multiplier multifunction in the case when the constraints are representable in a special composite form. When combined with existing coderivative formulas, our estimates are entirely computable in terms of the original data of the problem.


Mathematical Programming | 1996

Implicit multifunction theorems for the sensitivity analysis of variational conditions

Adam B. Levy

We study implicit multifunctions (set-valued mappings) obtained from inclusions of the form 0 ∈ M(p,x), where M is a multifunction. Our basic implicit multifunction theorem provides an approximation for a generalized derivative of the implicit multifunction in terms of the derivative of the multifunction M. Our primary focus is on three special cases of inclusions 0 ∈ M(p,x) which represent different kinds of generalized variational inequalities, called “variational conditions”. Appropriate versions of our basic implicit multifunction theorem yield approximations for generalized derivatives of the solutions to each kind of variational condition. We characterize a well-known generalized Lipschitz property in terms of generalized derivatives, and use our implicit multifunction theorems to state sufficient conditions (and necessary in one case) for solutions of variational conditions to possess this Lipschitz property. We apply our results to a general parameterized nonlinear programming problem, and derive a new second-order condition which guarantees that the stationary points associated with the Karush-Kuhn-Tucker conditions exhibit generalized Lipschitz continuity with respect to the parameter.
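A concrete instance of the inclusion 0 ∈ M(p,x), standard in this literature though not spelled out in the abstract, is the parameterized variational inequality obtained by taking M to be a smooth field plus a normal cone mapping:

```latex
\[
  0 \in M(p,x) := f(p,x) + N_C(x), \qquad
  S(p) := \{\, x : 0 \in M(p,x) \,\},
\]
% where N_C is the normal cone mapping to a set C. For closed convex C,
% the inclusion says x \in C and \langle f(p,x), c - x \rangle \ge 0
% for all c \in C, i.e. x solves the variational inequality over C
% at parameter p, and S is the implicit multifunction being studied.
```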


SIAM Journal on Control and Optimization | 2001

Solution Sensitivity from General Principles

Adam B. Levy

We present a generic approach for the sensitivity analysis of solutions to parameterized finite-dimensional optimization problems. We study differentiability and continuity properties of quasi-solutions (stationary points or stationary point-multiplier pairs), as well as their existence and uniqueness, and the issue of when quasi-solutions are actually optimal solutions. Our approach is founded on a few general rules that can all be viewed as generalizations of the classical inverse mapping theorem, and sensitivity analyses of particular optimization models can be made by computing certain generalized derivatives in order to translate the general rules into the terminology of the particular model. The useful application of this approach hinges on an inverse mapping theorem that allows us to compute derivatives of solution mappings without computing solutions, which is crucial since numerical solutions to sensitive problems are fundamentally unreliable. We illustrate how this process works for parameterized nonlinear programs, but the generality of the rules on which our approach is based means that a similar sensitivity analysis is possible for practically any finite-dimensional optimization problem. Our approach is distinguished not only by its broad applicability but by its separate treatment of different issues that are frequently treated in tandem. In particular, meaningful generalized derivatives can be computed and continuity properties can be established even in cases of multiple or no quasi-solutions (or optimal solutions) for some parameters. This approach has not only produced unprecedented and computable conditions for traditional properties in well-studied situations, but has also characterized interesting new properties that might otherwise have remained unexplored.


Nonlinear Analysis: Theory, Methods & Applications | 1996

Variational conditions and the proto-differentiation of partial subgradient mappings

Adam B. Levy; R. T. Rockafellar

Subgradient mappings have many roles in variational analysis, such as the formulation of optimality conditions and, in the special case of normal cone mappings, the statement and analysis of variational inequalities and related expressions. Central to the study of perturbations of solutions to systems of conditions in which subgradient mappings appear are concepts of generalized differentiation such as proto-differentiability, which is unhampered by a solution mapping’s potential multivaluedness. This paper extends the known examples of proto-differentiability by showing that a large and important class of “partial” subgradient mappings have this property. Until now the main examples have been the complete subgradient mappings associated with fully amenable functions. The extension, made difficult by the need to deal geometrically with projections of graphs, relies on the notion of a fully amenable function having additional variables that provide a “compatible parameterization.” The results are applied to the sensitivity analysis of generalized variational inequalities in which the underlying set need not be convex and can vary with the parameters.


Set-Valued Analysis | 1997

Characterizing the Single-Valuedness of Multifunctions

Adam B. Levy; R. A. Poliquin

We characterize the local single-valuedness and continuity of multifunctions (set-valued mappings) in terms of their ‘premonotonicity’ and lower semicontinuity. This result completes the well-known fact that lower semicontinuous, monotone multifunctions are single-valued and continuous. We also show that a multifunction is actually a Lipschitz single-valued mapping if and only if it is premonotone and has a generalized Lipschitz property called ‘Aubin continuity’. The possible single-valuedness and continuity of multifunctions is at the heart of some of the most fundamental issues in variational analysis and its application to optimization. We investigate the impact of our characterizations on several of these issues; discovering exactly when certain generalized subderivatives can be identified with classical derivatives, and determining precisely when solutions to generalized variational inequalities are locally unique and Lipschitz continuous. As an application of our results involving generalized variational inequalities, we characterize when the Karush–Kuhn–Tucker pairs associated with a parameterized optimization problem are locally unique and Lipschitz continuous.
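The ‘Aubin continuity’ invoked here is the standard Aubin (Lipschitz-like) property of a multifunction; recalled as background rather than from the paper itself, S has this property at a point (x̄, ȳ) of its graph when

```latex
\[
  S(x') \cap V \;\subseteq\; S(x) + \kappa \,\lVert x' - x \rVert\, \mathbb{B}
  \quad \text{for all } x,\, x' \in U,
\]
% for some neighborhoods U of \bar{x} and V of \bar{y} \in S(\bar{x}),
% some constant \kappa \ge 0, and the closed unit ball \mathbb{B}.
```

This localizes Lipschitz continuity to graphs of set-valued mappings, which is why combining it with premonotonicity can force single-valuedness.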


SIAM Journal on Control and Optimization | 1999

Sensitivity of Solutions to Variational Inequalities on Banach Spaces

Adam B. Levy

We analyze the sensitivity of parameterized variational inequalities for convex polyhedric sets in reflexive Banach spaces. We compute a generalized derivative of the solution mapping where the formula for the derivative is given in terms of the solutions to an auxiliary variational inequality. These results are distinguished from other work in this area by the fact that they do not depend on the uniqueness of the solutions to the variational inequalities. To obtain our results, we use second-order epi-derivatives to analyze the second-order properties of polyhedric sets. We apply our results to sensitivity analyses of stationary points and KKT pairs associated with constrained infinite-dimensional optimization problems.


Archive | 1995

Sensitivity of Solutions in Nonlinear Programming Problems with Nonunique Multipliers

Adam B. Levy; R. T. Rockafellar

We analyze the perturbations of quasi-solutions to a parameterized nonlinear programming problem, these being feasible solutions accompanied by a Lagrange multiplier vector such that the Karush-Kuhn-Tucker optimality conditions are satisfied. We show under a standard constraint qualification, not requiring uniqueness of the multipliers, that the quasi-solution mapping is differentiable in a generalized sense, and we present a formula for its derivative. The results are distinguished from previous ones in the subject, in that they do not entail having to impose conditions to ensure that dual as well as primal elements behave well with respect to sensitivity.
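The quasi-solutions in question pair a feasible point with a multiplier vector satisfying the Karush-Kuhn-Tucker conditions. For a parameterized program in a generic inequality-constrained form (an illustration, not necessarily the paper's exact setup), these conditions read:

```latex
\[
\begin{aligned}
&\min_{x} \; f(p,x) \quad \text{subject to} \quad g_i(p,x) \le 0,
  \;\; i = 1, \dots, m,\\[2pt]
&\text{KKT:}\quad
  \nabla_x f(p,x) + \sum_{i=1}^{m} \lambda_i \,\nabla_x g_i(p,x) = 0,
  \qquad \lambda_i \ge 0,\;\; g_i(p,x) \le 0,\;\; \lambda_i\, g_i(p,x) = 0 .
\end{aligned}
\]
```

The multiplier vector λ solving this system need not be unique, which is exactly the situation the paper's generalized differentiability result accommodates.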


SIAM Journal on Optimization | 2000

Calm Minima in Parameterized Finite-Dimensional Optimization

Adam B. Levy

Calmness is a restricted form of local Lipschitz continuity in which one point of comparison is fixed. We study the calmness of solutions to parameterized finite-dimensional optimization problems.
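In standard terms (a gloss supplied here, not the paper's own statement), a solution x(p) with x(p̄) = x̄ is calm at p̄ when

```latex
\[
  \lVert x(p) - \bar{x} \rVert \;\le\; \kappa \,\lVert p - \bar{p} \rVert
  \quad \text{for all } p \text{ near } \bar{p},
\]
% for some constant \kappa \ge 0. Unlike local Lipschitz continuity,
% only the base pair (\bar{p}, \bar{x}) is fixed: nearby parameter
% values are compared to \bar{p} alone, not to each other.
```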


Proceedings of the American Mathematical Society | 1998

Convex Composite Functions in Banach Spaces and the Primal Lower-Nice Property

C. Combari; A. E. Alaoui; Adam B. Levy; R. A. Poliquin; Lionel Thibault

Collaboration

Co-authors of Adam B. Levy:

Clare Hunter, University of Washington
Kenneth Martay, University of Washington
Youri Vater, University of Washington
Lionel Thibault, University of Montpellier
Greg Dembo, University of Washington
A. E. Alaoui, University of Montpellier
C. Combari, University of Montpellier