J. Y. Bello Cruz
Universidade Federal de Goiás
Publications
Featured research published by J. Y. Bello Cruz.
Numerical Functional Analysis and Optimization | 2009
J. Y. Bello Cruz; A. N. Iusem
We introduce a two-step direct method, like Korpelevich's, for solving monotone variational inequalities. The advantage of our method over Korpelevich's is that ours converges strongly in Hilbert spaces, whereas only weak convergence has been proved for Korpelevich's algorithm. Our method also has the following desirable property: the sequence converges to the solution of the problem that lies closest to the initial iterate.
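For context, the classical Korpelevich extragradient iteration that the paper builds on can be sketched in a few lines. This is a finite-dimensional illustration of the base scheme, not the authors' strongly convergent variant; the operator F, the projection project_C, and the fixed step size are placeholders.

```python
import numpy as np

def extragradient(F, project_C, x0, step=0.1, iters=1000):
    """Korpelevich's two-step extragradient scheme (illustrative sketch).

    Prediction: y_k     = P_C(x_k - step * F(x_k))
    Correction: x_{k+1} = P_C(x_k - step * F(y_k))
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        y = project_C(x - step * F(x))   # prediction step
        x = project_C(x - step * F(y))   # correction step, evaluated at y
    return x
```

The method is "direct" in the sense that, beyond the projections, no subproblem has to be solved at each iteration.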
Journal of Optimization Theory and Applications | 2014
J. Y. Bello Cruz; G. Bouza Allende
In some applications, the comparison between two elements may depend on the point, leading to the so-called variable order structure. Optimality concepts may be extended to this more general framework. In this paper, we extend the steepest descent-like method to smooth unconstrained vector optimization problems under a variable order structure. Roughly speaking, we show that every accumulation point of the generated sequence satisfies a necessary first-order condition. We discuss the consequences of this fact in the convex case.
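As a reference point, here is the classical steepest descent template that the paper generalizes, shown in the scalar-objective, fixed-order special case with a hypothetical Armijo backtracking rule; f, grad_f, and the constants are placeholders.

```python
import numpy as np

def steepest_descent_armijo(f, grad_f, x0, beta=0.5, sigma=1e-4, iters=200):
    """Steepest descent with Armijo backtracking (scalar-objective sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad_f(x)
        d = -g                        # steepest descent direction
        t = 1.0
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f(x + t * d) > f(x) + sigma * t * np.dot(g, d):
            t *= beta
        x = x + t * d
    return x
```

In the variable order setting, the descent direction is instead chosen relative to an ordering cone that may change with the current point, which is the extension the paper develops.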
Optimization | 2012
J. Y. Bello Cruz; A. N. Iusem
We introduce a fully explicit method for solving monotone variational inequalities in Hilbert spaces, where orthogonal projections onto the feasible set are replaced by projections onto suitable hyperplanes. We prove weak convergence of the whole generated sequence to a solution of the problem, under only the assumptions of continuity and monotonicity of the operator and existence of solutions.
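Part of the appeal of replacing the projection onto the feasible set is that projections onto hyperplanes and halfspaces have closed forms. A minimal sketch of the halfspace case (the vector a and offset b defining the halfspace are placeholders that the method would construct at each iteration):

```python
import numpy as np

def project_halfspace(x, a, b):
    """Closed-form projection of x onto the halfspace {z : <a, z> <= b}."""
    violation = np.dot(a, x) - b
    if violation <= 0:
        return x                               # x already lies in the halfspace
    return x - (violation / np.dot(a, a)) * a  # step along the normal direction
```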
Numerical Functional Analysis and Optimization | 2011
J. Y. Bello Cruz; A. N. Iusem
In this article, we propose a strongly convergent variant of the projected subgradient method for constrained convex minimization problems in Hilbert spaces. The advantage of the proposed method is that it converges strongly whenever the problem has solutions, without additional assumptions. The method also has the following desirable property: the sequence converges to the solution of the problem which lies closest to the initial iterate.
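For orientation, the underlying projected subgradient template looks as follows; the diminishing step-size rule shown here is a common textbook choice, not necessarily the one used in the paper, and the strongly convergent variant adds further steps on top of it.

```python
import numpy as np

def projected_subgradient(subgrad_f, project_C, x0, iters=1000):
    """Projected subgradient method (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        g = subgrad_f(x)            # any subgradient of f at x
        step = 1.0 / (k + 1)        # diminishing, non-summable step sizes
        x = project_C(x - step * g)
    return x
```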
Mathematics and Computers in Simulation | 2015
J. Y. Bello Cruz; A. N. Iusem
We analyze an explicit method for solving nonsmooth variational inequality problems, establishing convergence of the whole sequence under paramonotonicity of the operator. Previous results on similar methods required much more demanding assumptions, such as coerciveness of the operator.
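For reference, paramonotonicity of a point-to-point operator T strengthens plain monotonicity by forcing agreement whenever the monotonicity inequality is tight (this is the standard single-valued formulation; the paper works with more general operators):

```latex
\langle T(x) - T(y),\, x - y \rangle \ge 0 \quad \text{for all } x, y,
\qquad \text{and} \qquad
\langle T(x) - T(y),\, x - y \rangle = 0 \;\Longrightarrow\; T(x) = T(y).
```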
Journal of Optimization Theory and Applications | 2014
J. Y. Bello Cruz; R. Díaz Millán
We propose a direct splitting method for solving a nonsmooth variational inequality in Hilbert spaces. Weak convergence is established when the operator is the sum of two point-to-set monotone operators. The proposed method is a natural extension of the incremental subgradient method for nondifferentiable optimization, which strongly exploits the structure of the operator using projected subgradient-like techniques. The advantage of our method is that no nontrivial subproblem, such as the evaluation of the resolvent operator, needs to be solved. The need to compute proximal iterations is the main difficulty of other schemes for solving this kind of problem.
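Since the method is presented as an extension of the incremental subgradient method, a sketch of that base scheme may help; here the objective is a finite sum of convex functions minimized over C, and the cyclic order and step-size rule are illustrative choices.

```python
import numpy as np

def incremental_subgradient(subgrads, project_C, x0, iters=500):
    """Incremental subgradient method for min_{x in C} sum_i f_i(x) (sketch).

    Each inner update uses a subgradient of a single component f_i,
    analogous to how a splitting method treats each operator separately.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        step = 1.0 / (k + 1)                     # diminishing step size
        for subgrad_i in subgrads:               # cycle through the components
            x = project_C(x - step * subgrad_i(x))
    return x
```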
Journal of Global Optimization | 2014
J. Y. Bello Cruz; W. Oliveira
We propose two restricted-memory level bundle-like algorithms for minimizing a convex function over a convex set. If the memory is restricted to one linearization of the objective function, then both algorithms are variations of the projected subgradient method. The first algorithm, proposed in Hilbert space, is a conceptual one. It is shown to be strongly convergent to the solution that lies closest to the initial iterate. Furthermore, the entire sequence of iterates generated by the algorithm is contained in a ball with diameter equal to the distance between the initial point and the solution set. The second algorithm is an implementable version. It mimics the conceptual one as much as possible in order to inherit its convergence properties. The implementable algorithm is validated by numerical results on several two-stage stochastic linear programs.
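When the memory holds a single linearization, the level projection admits a closed form; the following sketch shows that one-cut step (the level value f_lev is a placeholder that the actual algorithms would update adaptively):

```python
import numpy as np

def one_cut_level_step(f, subgrad_f, project_C, x, f_lev):
    """Project x onto the level halfspace {z : f(x) + <g, z - x> <= f_lev},
    then back onto the feasible set C (single-linearization sketch)."""
    g = subgrad_f(x)
    gap = f(x) - f_lev
    if gap > 0:
        x = x - (gap / np.dot(g, g)) * g   # Polyak-type halfspace projection
    return project_C(x)
```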
Numerical Functional Analysis and Optimization | 2016
J. Y. Bello Cruz; W. de Oliveira
This work focuses on convergence analysis of the projected gradient method for solving constrained convex minimization problems in Hilbert spaces. We show that the sequence of points generated by the method employing the Armijo line search converges weakly to a solution of the considered convex optimization problem. Weak convergence is established by assuming convexity and Gâteaux differentiability of the objective function, whose Gâteaux derivative is supposed to be uniformly continuous on bounded sets. Furthermore, we propose some modifications of the classical projected gradient method in order to obtain strong convergence. The new variant has the following desirable properties: the sequence of generated points is entirely contained in a ball with diameter equal to the distance between the initial point and the solution set, and the whole sequence converges strongly to the solution of the problem that lies closest to the initial iterate. Convergence analysis of both methods is presented without a Lipschitz continuity assumption.
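A minimal sketch of the projected gradient iteration with an Armijo backtracking search along the feasible direction, in the finite-dimensional case; project_C and the Armijo constants are placeholders, and the paper's strongly convergent variant adds further modifications.

```python
import numpy as np

def projected_gradient_armijo(f, grad_f, project_C, x0,
                              beta=0.5, sigma=1e-4, iters=200):
    """Projected gradient method with Armijo line search (sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        g = grad_f(x)
        d = project_C(x - g) - x           # feasible descent direction
        t = 1.0
        # Backtrack until the Armijo sufficient-decrease test is met.
        while f(x + t * d) > f(x) + sigma * t * np.dot(g, d):
            t *= beta
        x = x + t * d
    return x
```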
Journal of Optimization Theory and Applications | 2013
J. Y. Bello Cruz; P. S. M. Santos; S. Scheimberg
We introduce an explicit algorithm for solving nonsmooth equilibrium problems in finite-dimensional spaces. Each iteration proceeds in two phases. In the first phase, an orthogonal projection onto the feasible set is replaced by projections onto suitable hyperplanes. In the second phase, a projected subgradient-type iteration is replaced by a specific projection onto a halfspace. We prove, under suitable assumptions, convergence of the whole generated sequence to a solution of the problem. The proposed algorithm has a low computational cost per iteration, and some numerical results are reported.
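For reference, the equilibrium problem in its standard format: given a closed convex set C and a bifunction f,

```latex
\text{find } x^* \in C \quad \text{such that} \quad f(x^*, y) \ge 0 \quad \text{for all } y \in C.
```

Variational inequalities are recovered as the special case f(x, y) = ⟨F(x), y − x⟩.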
Optimization | 2015
J. Y. Bello Cruz; R. Díaz Millán
In this paper, we propose variants of the Forward-Backward splitting method for finding a zero of the sum of two operators. A classical modification of the Forward-Backward method was proposed by Tseng, and it is known to converge when both operators are monotone and the forward operator is Lipschitz continuous. The conceptual algorithm proposed here improves on Tseng's method in some instances. The first and main part of our approach contains an explicit Armijo-type search in the spirit of the extragradient-like methods for variational inequalities. During the iteration process, the search performs only one evaluation of the forward-backward operator at each trial step, which yields considerable computational savings when the forward-backward operator is expensive to evaluate. The second part of the scheme consists of special projection steps. The convergence analysis of the proposed scheme is given assuming monotonicity of both operators, without a Lipschitz continuity assumption on the forward operator.
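For context, Tseng's modified forward-backward step that the paper improves on can be sketched as follows, for the inclusion 0 ∈ A(x) + B(x) with A single-valued (the forward operator) and B accessed through its resolvent (the backward operator). The resolvent_B signature and the fixed step size are placeholders; the paper's variants replace the fixed step with an Armijo-type search.

```python
import numpy as np

def tseng_splitting(A, resolvent_B, x0, step=0.1, iters=1000):
    """Tseng's forward-backward-forward method (illustrative sketch).

    z_k     = J_{step*B}(x_k - step * A(x_k))    # forward, then backward
    x_{k+1} = z_k - step * (A(z_k) - A(x_k))     # correcting forward step
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        Ax = A(x)
        z = resolvent_B(x - step * Ax, step)  # backward (resolvent) step
        x = z - step * (A(z) - Ax)            # second forward (correction) step
    return x
```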