Jefferson G. Melo
Universidade Federal de Goiás
Publications
Featured research published by Jefferson G. Melo.
Journal of Optimization Theory and Applications | 2012
G. C. Bento; Jefferson G. Melo
In this paper, a subgradient-type algorithm for solving the convex feasibility problem on a Riemannian manifold is proposed and analysed. The sequence generated by the algorithm converges to a solution of the problem, provided the sectional curvature of the manifold is non-negative. Moreover, assuming a Slater-type qualification condition, we analyse a variant of the first algorithm which generates a sequence with the finite convergence property, i.e., a feasible point is obtained after a finite number of iterations. Some examples are considered that motivate applying the algorithm to feasibility problems that are nonconvex in the usual sense.
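The paper's algorithm operates on a Riemannian manifold via exponential maps; as a rough illustration of the underlying idea only, here is a minimal Euclidean subgradient-feasibility sketch (Polyak-type steps on the most violated convex constraint). All names and the two-ball example are illustrative, not the authors' method:

```python
import numpy as np

def subgradient_feasibility(x0, constraints, subgrads, max_iter=2000, tol=1e-9):
    """Repeatedly take a Polyak-type subgradient step on the most
    violated constraint g_i(x) <= 0 until all are (approximately) satisfied."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        vals = [g(x) for g in constraints]
        i = int(np.argmax(vals))
        if vals[i] <= tol:
            break
        s = subgrads[i](x)
        # Polyak step length drives g_i toward zero when g_i is convex
        x = x - (vals[i] / np.dot(s, s)) * s
    return x

# Toy feasibility problem: intersection of two unit balls
c1, c2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
cons = [lambda x, c=c: np.dot(x - c, x - c) - 1.0 for c in (c1, c2)]
subs = [lambda x, c=c: 2.0 * (x - c) for c in (c1, c2)]
x = subgradient_feasibility(np.array([3.0, 2.0]), cons, subs)
```

The default-argument trick (`c=c`) pins each ball's center inside its lambda; without it, every closure would see the last loop value.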
Journal of Optimization Theory and Applications | 2010
Regina S. Burachik; Alfredo N. Iusem; Jefferson G. Melo
We consider a problem of minimizing an extended real-valued function defined in a Hausdorff topological space. We study the dual problem induced by a general augmented Lagrangian function. Under a simple set of assumptions on this general augmented Lagrangian function, we obtain strong duality and existence of exact penalty parameter via an abstract convexity approach. We show that every cluster point of a sub-optimal path related to the dual problem is a primal solution. Our assumptions are more general than those recently considered in the related literature.
Siam Journal on Optimization | 2017
Max L. N. Gonçalves; Jefferson G. Melo; Renato D. C. Monteiro
This paper describes a regularized variant of the alternating direction method of multipliers (ADMM) for solving linearly constrained convex programs. It is shown that the pointwise iteration-complexity of the new method is better than the corresponding one for the standard ADMM method and that, up to a logarithmic term, it is identical to the ergodic iteration-complexity of the latter method. Our analysis is based on first presenting and establishing the pointwise iteration-complexity of a regularized non-Euclidean hybrid proximal extragradient framework whose error condition at each iteration includes both a relative error and a summable error. It is then shown that the new method is a special instance of the latter framework, where the sequence of summable errors is identically zero when the ADMM stepsize is less than one, or a nontrivial sequence when the stepsize is in the interval [1, (1 + √5)/2).
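The baseline against which the regularized variant is compared, standard ADMM, alternates two proximal minimizations with a multiplier update. A minimal self-contained sketch on a toy consensus problem (this is the textbook scheme, not the paper's regularized method; the quadratic objective and all names are illustrative):

```python
import numpy as np

def admm_consensus(a, c, rho=1.0, iters=200):
    """Standard ADMM for: min 0.5||x-a||^2 + 0.5||z-c||^2  s.t.  x - z = 0.
    Uses the scaled-dual form with penalty parameter rho."""
    x = z = u = np.zeros_like(a)
    for _ in range(iters):
        x = (a + rho * (z - u)) / (1.0 + rho)  # x-update: prox of 0.5||.-a||^2
        z = (c + rho * (x + u)) / (1.0 + rho)  # z-update: prox of 0.5||.-c||^2
        u = u + x - z                          # scaled multiplier update
    return x, z

a, c = np.array([1.0, -2.0]), np.array([3.0, 4.0])
x, z = admm_consensus(a, c)
```

At consensus the minimizer of the combined objective is the midpoint (a + c)/2, so both `x` and `z` converge there.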
Journal of Computational and Applied Mathematics | 2017
Max L. N. Gonçalves; Jefferson G. Melo
In this paper, we consider the problem of solving a constrained system of nonlinear equations. We propose an algorithm based on a combination of the Newton and conditional gradient methods, and establish its local convergence analysis. Our analysis is set up by using a majorant condition technique, allowing us to prove in a unified way convergence results for two large families of nonlinear functions. The first one includes functions whose derivative satisfies a Hölder-like condition, and the second one consists of a substantial subclass of analytic functions. Numerical experiments illustrating the applicability of the proposed method are presented, and comparisons with some other methods are discussed.
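The Newton step at the core of such methods solves a linearized system at each iterate. A minimal unconstrained sketch (the paper's method additionally uses conditional-gradient steps to respect the constraint set, which this toy omits; the example system is illustrative):

```python
import numpy as np

def newton_system(F, J, x0, iters=20, tol=1e-12):
    """Plain Newton iteration for F(x) = 0 with Jacobian J."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        fx = F(x)
        if np.linalg.norm(fx) < tol:
            break
        x = x - np.linalg.solve(J(x), fx)  # full Newton step
    return x

# F(x, y) = (x^2 + y^2 - 4, x - y); a root is (sqrt(2), sqrt(2))
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
root = newton_system(F, J, np.array([1.0, 2.0]))
```

Local quadratic convergence holds near a root with invertible Jacobian, which is the regime the paper's majorant-condition analysis generalizes.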
Optimization | 2015
Regina S. Burachik; Alfredo N. Iusem; Jefferson G. Melo
Augmented Lagrangian duality provides zero duality gap and saddle point properties for nonconvex optimization. On the basis of this duality, subgradient-like methods can be applied to the (convex) dual of the original problem. These methods usually recover the optimal value of the problem, but may fail to provide a primal solution. We prove that the recovery of a primal solution by such methods can be characterized in terms of (i) the differentiability properties of the dual function and (ii) the exact penalty properties of the primal-dual pair. We also connect the property of finite termination with exact penalty properties of the dual pair. In order to establish these facts, we associate the primal-dual pair to a penalty map. This map, which we introduce here, is a convex and globally Lipschitz function and its epigraph encapsulates information on both primal and dual solution sets.
Journal of Optimization Theory and Applications | 2017
G. C. Bento; O. P. Ferreira; Jefferson G. Melo
This paper considers optimization problems on Riemannian manifolds and analyzes the iteration-complexity for gradient and subgradient methods on manifolds with nonnegative curvatures. By using tools from Riemannian convex analysis and directly exploring the tangent space of the manifold, we obtain different iteration-complexity bounds for the aforementioned methods, thereby complementing and improving related results. Moreover, we also establish an iteration-complexity bound for the proximal point method on Hadamard manifolds.
Journal of Global Optimization | 2015
Max L. N. Gonçalves; Jefferson G. Melo; L. F. Prudente
In this paper, we consider a nonlinear programming problem for which the constraint set may be infeasible. We propose an algorithm based on a large family of augmented Lagrangian functions and analyze its global convergence properties taking into account the possible infeasibility of the problem. We show that, in a finite number of iterations, the algorithm stops detecting the infeasibility of the problem or finds an approximate feasible/optimal solution with any required precision. We illustrate, by means of numerical experiments, that our algorithm is reliable for different Lagrangian/penalty functions proposed in the literature.
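A classical quadratic augmented Lagrangian illustrates the basic outer/inner structure such algorithms build on: an inner minimization of the augmented Lagrangian followed by a multiplier update. This sketch handles a single equality constraint and has no infeasibility detection, unlike the paper's algorithm; the toy problem and all parameter values are illustrative:

```python
import numpy as np

def augmented_lagrangian(x0, rho=10.0, outer=30, inner_lr=0.05, inner=200):
    """Quadratic augmented Lagrangian for:
       min x1^2 + x2^2  s.t.  x1 + x2 - 1 = 0   (solution: (0.5, 0.5)).
    Inner problems are solved inexactly by gradient descent."""
    x = np.asarray(x0, dtype=float)
    lam = 0.0
    for _ in range(outer):
        for _ in range(inner):
            h = x[0] + x[1] - 1.0
            # gradient of f(x) + lam*h(x) + (rho/2)*h(x)^2
            grad = 2.0 * x + (lam + rho * h) * np.ones(2)
            x = x - inner_lr * grad
        lam = lam + rho * (x[0] + x[1] - 1.0)  # multiplier update
    return x, lam

x, lam = augmented_lagrangian(np.zeros(2))
```

The multiplier converges to the Lagrange multiplier of the constraint (here -1), and larger `rho` speeds the outer iteration at the cost of worse inner conditioning.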
Journal of Optimization Theory and Applications | 2018
Max L. N. Gonçalves; M. Marques Alves; Jefferson G. Melo
In this paper, we obtain global pointwise and ergodic convergence rates for a variable metric proximal alternating direction method of multipliers for solving linearly constrained convex optimization problems. We first propose and study nonasymptotic convergence rates of a variable metric hybrid proximal extragradient framework for solving monotone inclusions. Then, the convergence rates for the former method are obtained essentially by showing that it falls within the latter framework. To the best of our knowledge, this is the first time that global pointwise (resp. pointwise and ergodic) convergence rates are obtained for the variable metric proximal alternating direction method of multipliers (resp. variable metric hybrid proximal extragradient framework).
Journal of Optimization Theory and Applications | 2013
M. Marques Alves; Jefferson G. Melo
We analyze a primal-dual pair of problems generated via a duality theory introduced by Svaiter. We propose a general algorithm and study its convergence properties. The focus is a general primal-dual principle for strong convergence of some classes of algorithms. In particular, we give a different viewpoint for the weak-to-strong principle of Bauschke and Combettes and unify many results concerning weak and strong convergence of subgradient type methods.
Journal of Global Optimization | 2018
V. A. Adona; Max L. N. Gonçalves; Jefferson G. Melo
This paper analyzes the iteration-complexity of a generalized alternating direction method of multipliers (G-ADMM) for solving separable linearly constrained convex optimization problems. This ADMM variant, first proposed by Bertsekas and Eckstein, introduces a relaxation parameter