María Laura Schuverdt
State University of Campinas
Publications
Featured research published by María Laura Schuverdt.
SIAM Journal on Optimization | 2007
Roberto Andreani; Ernesto G. Birgin; José Mario Martínez; María Laura Schuverdt
Augmented Lagrangian methods with general lower-level constraints are considered in the present research. These methods are useful when efficient algorithms exist for solving subproblems in which the constraints are only of the lower-level type. Inexact resolution of the lower-level constrained subproblems is considered. Global convergence is proved using the constant positive linear dependence constraint qualification. Conditions for boundedness of the penalty parameters are discussed. The resolution of location problems in which many constraints of the lower-level set are nonlinear is addressed, employing the spectral projected gradient method for solving the subproblems. Problems of this type with more than $3 \times 10^6$ variables and $14 \times 10^6$ constraints are solved in this way, using moderate computer time. All the codes are available at http://www.ime.usp.br/~egbirgin/tango/.
Mathematical Programming | 2007
Roberto Andreani; Ernesto G. Birgin; José Mario Martínez; María Laura Schuverdt
Mathematical Programming | 2012
Roberto Andreani; Gabriel Haeser; María Laura Schuverdt; Paulo J. S. Silva
SIAM Journal on Optimization | 2012
Roberto Andreani; Gabriel Haeser; María Laura Schuverdt; Paulo J. S. Silva
Optimization | 2007
Roberto Andreani; José Mario Martínez; María Laura Schuverdt
Computational Optimization and Applications | 2010
Roberto Andreani; Ernesto G. Birgin; José Mario Martínez; María Laura Schuverdt
Journal of Optimization Theory and Applications | 2011
Gabriel Haeser; María Laura Schuverdt
Applied Mathematics and Computation | 2012
Nélida E. Echebest; María Laura Schuverdt; R. P. Vignau
Two Augmented Lagrangian algorithms for solving KKT systems are introduced. The algorithms differ in the way in which penalty parameters are updated. Possibly infeasible accumulation points are characterized. It is proved that feasible limit points that satisfy the Constant Positive Linear Dependence constraint qualification are KKT solutions. Boundedness of the penalty parameters is proved under suitable assumptions. Numerical experiments are presented.
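The outer loop of an augmented Lagrangian method of the kind described above can be sketched in a few lines. The following is a minimal, illustrative Python sketch, not the authors' implementation: it handles a single equality constraint $h(x)=0$, uses plain gradient descent as the inner subproblem solver, and increases the penalty parameter only when the infeasibility is not reduced by a factor `tau`, which mirrors the penalty-update strategy these papers analyze. The step-size rule and all parameter values are assumptions chosen for this demo.

```python
def solve_al(f_grad, h, h_grad, x0,
             rho=10.0, tau=0.5, gamma=10.0, outer=15, inner=3000):
    """Toy augmented Lagrangian loop for min f(x) s.t. h(x) = 0 (sketch)."""
    x = list(x0)
    lam = 0.0                          # Lagrange multiplier estimate
    h_prev = abs(h(x))                 # infeasibility at the starting point
    for _ in range(outer):
        step = 0.5 / (1.0 + rho)       # crude step size; problem-dependent
        for _ in range(inner):         # inner solver: gradient descent on
            hv = h(x)                  # L(x) = f(x) + lam*h(x) + rho/2*h(x)^2
            grad = [gf + (lam + rho * hv) * gh
                    for gf, gh in zip(f_grad(x), h_grad(x))]
            x = [xi - step * gi for xi, gi in zip(x, grad)]
        hv = h(x)
        lam += rho * hv                # first-order multiplier update
        if abs(hv) > tau * h_prev:     # infeasibility not reduced enough:
            rho *= gamma               # increase the penalty parameter
        h_prev = abs(hv)
    return x, lam

# Illustrative problem: min x1^2 + x2^2  s.t.  x1 + x2 = 1
f_grad = lambda x: [2.0 * x[0], 2.0 * x[1]]
h = lambda x: x[0] + x[1] - 1.0
h_grad = lambda x: [1.0, 1.0]
x, lam = solve_al(f_grad, h, h_grad, [0.0, 0.0])
# x approaches [0.5, 0.5] and lam approaches -1 (the KKT multiplier)
```

On this convex test problem the infeasibility shrinks geometrically at every outer iteration, so the penalty parameter stays bounded at its initial value, which is exactly the behavior the boundedness results in these papers aim to guarantee.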
Computational & Applied Mathematics | 2011
Nélida E. Echebest; María Laura Schuverdt; R. P. Vignau
In this work we introduce a relaxed version of the constant positive linear dependence constraint qualification (CPLD) that we call RCPLD. This development is inspired by a recent generalization of the constant rank constraint qualification by Minchenko and Stakhovski that was called RCRCQ. We show that RCPLD is enough to ensure the convergence of an augmented Lagrangian algorithm and that it asserts the validity of an error bound. We also provide proofs and counter-examples that show the relations of RCRCQ and RCPLD with other known constraint qualifications. In particular, RCPLD is strictly weaker than CPLD and RCRCQ, while still stronger than Abadie’s constraint qualification. We also verify that the second order necessary optimality condition holds under RCRCQ.
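The relations stated in this abstract can be summarized as the following implication chain, where each implication is strict (none of the converses hold):

```latex
\[
\left.
\begin{aligned}
\text{CPLD} \\
\text{RCRCQ}
\end{aligned}
\right\}
\;\Longrightarrow\; \text{RCPLD}
\;\Longrightarrow\; \text{Abadie CQ}.
\]
```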
Journal of Optimization Theory and Applications | 2016
Nélida E. Echebest; María Daniela Sánchez; María Laura Schuverdt
We present two new constraint qualifications (CQs) that are weaker than the recently introduced relaxed constant positive linear dependence (RCPLD) CQ. RCPLD is based on the assumption that many subsets of the gradients of the active constraints preserve positive linear dependence locally. A major open question was to identify the exact set of gradients whose properties had to be preserved locally and that would still work as a CQ. This is done in the first new CQ, which we call the constant rank of the subspace component (CRSC) CQ. This new CQ also preserves many of the good properties of RCPLD, such as local stability and the validity of an error bound. We also introduce an even weaker CQ, called the constant positive generator (CPG), which can replace RCPLD in the analysis of the global convergence of algorithms. We close this work by extending convergence results of algorithms belonging to all the main classes of nonlinear optimization methods: sequential quadratic programming, augmented Lagrangians, ...
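Combined with the hierarchy of the previous paper, the two new conditions extend the chain of progressively weaker constraint qualifications described in this abstract:

```latex
\[
\text{RCPLD} \;\Longrightarrow\; \text{CRSC} \;\Longrightarrow\; \text{CPG},
\]
```

with CRSC retaining the error-bound property of RCPLD and CPG still sufficient for the global convergence analysis of the listed algorithm classes.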