Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Regina Sandra Burachik is active.

Publication


Featured research published by Regina Sandra Burachik.


Set-Valued Analysis | 2002

MAXIMAL MONOTONE OPERATORS, CONVEX FUNCTIONS AND A SPECIAL FAMILY OF ENLARGEMENTS

Regina Sandra Burachik; B. F. Svaiter

This work establishes new connections between maximal monotone operators and convex functions. Associated to each maximal monotone operator, there is a family of convex functions, each of which characterizes the operator. The basic tool in our analysis is a family of enlargements, recently introduced by Svaiter. This family of convex functions is in a one-to-one relation with a subfamily of these enlargements. We study the family of convex functions, and determine its extremal elements. An operator closely related to the Legendre–Fenchel conjugacy is introduced and we prove that this family of convex functions is invariant under this operator. The particular case in which the operator is a subdifferential of a convex function is discussed.
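For orientation (and not stated in the abstract above), the smallest member of such a family of convex functions attached to a maximal monotone operator T is nowadays widely known as the Fitzpatrick function. A minimal sketch of its definition, assuming X is a Banach space with dual X*:

```latex
% Fitzpatrick function of a maximal monotone operator T : X \rightrightarrows X^*
% (sketch for orientation; the paper works with a whole family of such functions).
\varphi_T(x, x^*) \;=\; \sup_{(y, y^*) \in G(T)}
  \bigl( \langle y^*, x \rangle + \langle x^*, y \rangle - \langle y^*, y \rangle \bigr),
\\[4pt]
\varphi_T(x, x^*) \;\ge\; \langle x^*, x \rangle \ \ \text{for all } (x, x^*),
\qquad
\varphi_T(x, x^*) \;=\; \langle x^*, x \rangle \iff x^* \in T(x).
```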


arXiv: Functional Analysis | 2003

Maximal monotonicity, conjugation and the duality product

Regina Sandra Burachik; B. F. Svaiter

Recently, the authors studied the connection between each maximal monotone operator T and a family H(T) of convex functions. Each member of this family characterizes the operator and satisfies two particular inequalities. The aim of this paper is to establish the converse of the latter fact. Namely, that every convex function satisfying those two particular inequalities is associated to a unique maximal monotone operator.
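A hedged sketch of the two inequalities referred to above, in terms of the duality product ⟨x*, x⟩ (the precise statement and hypotheses are those of the paper):

```latex
% h : X \times X^* \to \overline{\mathbb{R}} convex, lower semicontinuous, with conjugate h^*.
h(x, x^*) \;\ge\; \langle x^*, x \rangle
\quad\text{and}\quad
h^*(x^*, x) \;\ge\; \langle x^*, x \rangle
\qquad \text{for all } (x, x^*) \in X \times X^*;
\\[4pt]
% any such h determines the operator
T \;=\; \{\, (x, x^*) \in X \times X^* \;:\; h(x, x^*) = \langle x^*, x \rangle \,\},
\ \text{which is then maximal monotone.}
```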


Set-Valued Analysis | 1999

ε-Enlargements of Maximal Monotone Operators in Banach Spaces

Regina Sandra Burachik; B. F. Svaiter

Given a maximal monotone operator T in a Banach space, we consider an enlargement Tε, in which monotonicity is lost up to ε, in a very similar way to the ε-subdifferential of a convex function. We establish in this general framework some theoretical properties of Tε, like a transportation formula, local Lipschitz continuity, local boundedness, and a Brøndsted–Rockafellar property.
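A minimal sketch of the enlargement in question (the paper's definition may differ in notational details):

```latex
% \varepsilon-enlargement of a maximal monotone operator T : X \rightrightarrows X^*, for \varepsilon \ge 0:
T^{\varepsilon}(x) \;=\; \{\, v \in X^* \;:\;
  \langle v - u,\, x - y \rangle \;\ge\; -\varepsilon
  \ \ \text{for all } (y, u) \in G(T) \,\}.
\\[4pt]
% For \varepsilon = 0 one recovers T(x) (by maximality); for T = \partial f one has
% \partial_{\varepsilon} f(x) \subseteq T^{\varepsilon}(x), mirroring the \varepsilon-subdifferential.
```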


Archive | 1998

ε-Enlargements of Maximal Monotone Operators: Theory and Applications

Regina Sandra Burachik; Claudia A. Sagastizábal; B. F. Svaiter

Given a maximal monotone operator T, we consider a certain ε-enlargement Tε, playing the role of the ε-subdifferential in nonsmooth optimization. We establish some theoretical properties of Tε, including a transportation formula, its Lipschitz continuity, and a result generalizing the Brøndsted–Rockafellar theorem. Then we make use of the ε-enlargement to define an algorithm for finding a zero of T.
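As an illustration of the transportation formula mentioned above, one commonly cited form (stated as a sketch; the paper's version may carry additional generality):

```latex
% Convex combinations of enlarged points stay in an enlargement with a computable \hat{\varepsilon}.
% Given v_i \in T^{\varepsilon_i}(x_i) and weights \alpha_i \ge 0 with \sum_i \alpha_i = 1, set
\hat{x} = \sum_i \alpha_i x_i, \qquad
\hat{v} = \sum_i \alpha_i v_i, \qquad
\hat{\varepsilon} = \sum_i \alpha_i \varepsilon_i
  + \sum_i \alpha_i \langle v_i - \hat{v},\, x_i - \hat{x} \rangle;
\quad \text{then } \hat{v} \in T^{\hat{\varepsilon}}(\hat{x}).
```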


Mathematical Programming | 2005

A new geometric condition for Fenchel's duality in infinite dimensional spaces

Regina Sandra Burachik; V. Jeyakumar

In 1951, Fenchel discovered a special duality, which relates the minimization of a sum of two convex functions to the maximization of a sum of concave functions, using conjugates. Fenchel's duality is central to the study of constrained optimization. It requires the existence of an interior point of a convex set, which often has empty interior in optimization applications. The well-known relaxations of this requirement in the literature are again weaker forms of the interior point condition. Avoiding an interior point condition in duality has so far been a difficult problem. However, a non-interior point type condition is essential for the application of Fenchel's duality to optimization. In this paper we solve this problem by presenting a simple geometric condition in terms of the sum of the epigraphs of the conjugate functions. We also establish a necessary and sufficient condition for the ε-subdifferential sum formula in terms of the sum of the epigraphs of the conjugate functions. Our results offer further insight into Fenchel's duality.
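For context, the classical Fenchel duality relation and the epigraph-type condition discussed above can be sketched as follows (a non-authoritative paraphrase; see the paper for the exact hypotheses):

```latex
% Fenchel duality for proper convex lower semicontinuous f, g : X \to \overline{\mathbb{R}}:
\inf_{x \in X} \{\, f(x) + g(x) \,\}
\;\ge\;
\sup_{x^* \in X^*} \{\, -f^*(x^*) - g^*(-x^*) \,\},
% with equality and dual attainment under a regularity condition.
\\[4pt]
% The geometric condition studied here is, roughly, that the set
\operatorname{Epi} f^* + \operatorname{Epi} g^*
\quad \text{is weak}^{*} \text{ closed in } X^* \times \mathbb{R},
% which avoids interior-point assumptions on the domains of f and g.
```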


SIAM Journal on Control and Optimization | 2005

An Outer Approximation Method for the Variational Inequality Problem

Regina Sandra Burachik; J. O. Lopes; B. F. Svaiter

We study two outer approximation schemes, applied to the variational inequality problem in reflexive Banach spaces. First we propose a generic outer approximation scheme, and its convergence analysis unifies a wide class of outer approximation methods applied to the constrained optimization problem. As is standard in this setting, boundedness and optimality of weak limit points are proved to hold under two alternative conditions: (i) boundedness of the feasible set, or (ii) coerciveness of the operator. To develop a convergence analysis where (i) and (ii) do not hold, we consider a second scheme in which the approximated subproblems use a coercive approximation of the original operator. Under conditions alternative to both (i) and (ii), we obtain standard convergence results. Furthermore, when the space is uniformly convex, we establish full strong convergence of the second scheme to a solution.
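For reference, the variational inequality problem being approximated (standard formulation; assumptions as in the paper):

```latex
% VIP(T, C): given T : X \rightrightarrows X^* maximal monotone and C \subseteq X closed and convex,
% find x^* \in C and u^* \in T(x^*) such that
\langle u^*,\, y - x^* \rangle \;\ge\; 0 \qquad \text{for all } y \in C.
% An outer approximation scheme replaces C by a sequence of simpler supersets C_k \supseteq C.
```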


Proceedings of the American Mathematical Society | 2005

A simple closure condition for the normal cone intersection formula

Regina Sandra Burachik; V. Jeyakumar

In this paper it is shown that if C and D are two closed convex subsets of a Banach space X and x ∈ C ∩ D, then N_{C∩D}(x) = N_C(x) + N_D(x) whenever the convex cone Epi σ_C + Epi σ_D is weak* closed, where σ_C and N_C denote the support function and the normal cone of the set C, respectively. This closure condition is shown to be weaker than the standard interior-point-like conditions and the bounded linear regularity condition.
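For readers less familiar with the notation, the two objects appearing in the closure condition are (standard definitions, not specific to this paper):

```latex
% Support function and normal cone of a closed convex set C \subseteq X, for x \in C:
\sigma_C(x^*) \;=\; \sup_{c \in C} \langle x^*, c \rangle,
\qquad
N_C(x) \;=\; \{\, x^* \in X^* \;:\; \langle x^*, c - x \rangle \le 0 \ \text{for all } c \in C \,\}.
% Note \sigma_C = \delta_C^{*}, the Fenchel conjugate of the indicator function of C.
```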


Journal of Optimization Theory and Applications | 2001

Robustness of the Hybrid Extragradient Proximal-Point Algorithm

Regina Sandra Burachik; S. Scheimberg; B. F. Svaiter

The hybrid extragradient proximal-point method recently proposed by Solodov and Svaiter has the distinctive feature of allowing a relative error tolerance. We extend the error tolerance of this method, proving that it converges even if a summable error is added to the relative error. Furthermore, the extragradient step may be performed inexactly with a summable error. We present a convergence analysis, which encompasses other well-known variations of the proximal-point method, previously unrelated. We establish weak global convergence under mild assumptions.
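As background, a sketch of the exact proximal-point step that the hybrid extragradient method relaxes; the relative-error acceptance rule below is one common form and is given only as an assumption-laden illustration, not necessarily the exact criterion of the paper:

```latex
% Exact proximal step for 0 \in T(x): given x^k and c_k > 0, find y^k with
0 \;\in\; c_k\, T(y^k) + y^k - x^k .
\\[4pt]
% Hybrid variants accept an approximate pair (y^k, v^k), with v^k \in T(y^k), whose residual
% r^k = c_k v^k + y^k - x^k is small relative to \|y^k - x^k\| (up to a tolerance \sigma \in [0,1)),
% and then perform the extragradient update
x^{k+1} \;=\; x^k - c_k\, v^k .
```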


Optimization Methods & Software | 2006

An inexact method of partial inverses and a parallel bundle method

Regina Sandra Burachik; Claudia A. Sagastizábal; Susana Scheimberg

For a maximal monotone operator T on a Hilbert space H and a closed subspace A of H, we consider the problem of finding a pair (x, y) with y ∈ T(x), x ∈ A and y ∈ A⊥. An equivalent formulation of this problem makes use of the partial inverse operator of Spingarn. The resulting generalized equation can be solved by using the proximal point algorithm. We consider instead the use of hybrid proximal methods. Hybrid methods use enlargements of operators, close in spirit to the concept of ε-subdifferentials. We characterize the enlargement of the partial inverse operator in terms of the enlargement of T itself. We present a new resolution algorithm that combines Spingarn's method with hybrid methods, and we prove global convergence assuming only existence of solutions and maximal monotonicity of T. We also show that, under standard assumptions, the method has a linear rate of convergence. For the important problem of finding a zero of a sum of maximal monotone operators T_1, …, T_m, we present a highly parallelizable scheme. Finally, we derive a parallel bundle method for minimizing the sum of polyhedral functions.
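A sketch of Spingarn's partial inverse, which underlies the equivalent formulation mentioned above (P_A and P_{A⊥} denote the orthogonal projections onto A and A⊥; details and hypotheses as in the paper):

```latex
% Partial inverse T_A of T with respect to the closed subspace A \subseteq H:
v \in T_A(z)
\quad\Longleftrightarrow\quad
P_A v + P_{A^{\perp}} z \;\in\; T\bigl( P_A z + P_{A^{\perp}} v \bigr).
\\[4pt]
% With this definition, x \in A and y \in T(x) \cap A^{\perp} exactly when 0 \in T_A(x + y),
% so the original problem reduces to finding a zero of the operator T_A.
```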


RAIRO - Operations Research | 1999

A generalized proximal point algorithm for the nonlinear complementarity problem

Regina Sandra Burachik; Alfredo N. Iusem

We consider a generalized proximal point algorithm (GPPA) for solving the nonlinear complementarity problem with monotone operators in R^n. It differs from the classical proximal point method discussed by Rockafellar for the problem of finding zeroes of monotone operators in the use of generalized distances, called φ-divergences, instead of the Euclidean one. These distances play not only a regularization role but also a penalization one, forcing the sequence generated by the method to remain in the interior of the feasible set, so that the method behaves like an interior point one. Under appropriate assumptions on the φ-divergence and the monotone operator, we prove that the sequence converges if and only if the problem has solutions, in which case the limit is a solution. If the problem does not have solutions, then the sequence is unbounded. We extend previous results for the proximal point method concerning convex optimization problems.
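To make the comparison with the classical method concrete, here is a minimal numerical sketch of Rockafellar's exact proximal point iteration for a hypothetical affine monotone operator T(x) = Mx + q on R^n; the generalized method above replaces the implicit quadratic regularization with a φ-divergence term D_φ(x, x^k), which keeps the iterates in the interior of the feasible set. This is an illustration only, not the algorithm of the paper.

```python
import numpy as np

def proximal_point_affine(M, q, x0, c=1.0, tol=1e-10, max_iter=500):
    """Classical (exact) proximal point iteration for the affine monotone
    operator T(x) = M x + q: each step solves 0 = c*T(x_next) + x_next - x,
    i.e. (I + c*M) x_next = x - c*q.  Illustration only."""
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    A = np.eye(n) + c * M          # (I + c*M) is invertible when M is monotone (PSD)
    for _ in range(max_iter):
        x_next = np.linalg.solve(A, x - c * q)
        if np.linalg.norm(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Hypothetical example: M positive definite, so T is monotone and the unique
# zero of T solves M x = -q.
M = np.array([[2.0, 0.5], [0.5, 1.0]])
q = np.array([-1.0, 1.0])
x_star = proximal_point_affine(M, q, x0=np.zeros(2))
print(x_star, M @ x_star + q)      # residual should be numerically zero
```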

Collaboration


Dive into Regina Sandra Burachik's collaboration.

Top Co-Authors

B. F. Svaiter
Instituto Nacional de Matemática Pura e Aplicada

Susana Scheimberg
Federal University of Rio de Janeiro

Alfredo N. Iusem
Instituto Nacional de Matemática Pura e Aplicada

Claudia A. Sagastizábal
Instituto Nacional de Matemática Pura e Aplicada

C. Yalçın Kaya
University of South Australia

V. Jeyakumar
University of New South Wales

J. O. Lopes
Federal University of Rio de Janeiro

S. Scheimberg
Federal University of Rio de Janeiro

Nergiz A. Ismayilova
Eskişehir Osmangazi University