Network

Latest external collaborations at the country level.

Hotspot

Research topics where Francisco Javier Aragón Artacho is active.

Publication


Featured research published by Francisco Javier Aragón Artacho.


Journal of Optimization Theory and Applications | 2014

Recent Results on Douglas–Rachford Methods for Combinatorial Optimization Problems

Francisco Javier Aragón Artacho; Jonathan M. Borwein; Matthew K. Tam

We discuss recent positive experiences applying convex feasibility algorithms of Douglas–Rachford type to highly combinatorial and far from convex problems.


ANZIAM Journal | 2014

Douglas–Rachford feasibility methods for matrix completion problems

Francisco Javier Aragón Artacho; Jonathan M. Borwein; Matthew K. Tam

In this paper, we give general recommendations for successful application of the Douglas–Rachford reflection method to convex and nonconvex real matrix completion problems. These guidelines are demonstrated by various illustrative examples. doi:10.1017/S1446181114000145


Journal of Global Optimization | 2013

Global convergence of a non-convex Douglas–Rachford iteration

Francisco Javier Aragón Artacho; Jonathan M. Borwein

We establish a region of convergence for the prototypical non-convex Douglas–Rachford iteration which finds a point in the intersection of a line and a circle. Previous work on this non-convex iteration by Borwein and Sims (Fixed-Point Algorithms for Inverse Problems in Science and Engineering, pp. 93–109, 2011) was only able to establish local convergence, and was ineffective in that no explicit region of convergence could be given.
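As a rough illustration of the iteration studied here, the following Python sketch runs the plain Douglas–Rachford scheme on a unit circle and a horizontal line. The particular line, the starting point, and the reflection order are my own choices for the example, not taken from the paper.

```python
import numpy as np

# Illustrative instance: A is the unit circle (a nonconvex sphere) and
# B is the horizontal line {x : x[1] = 0.5}; they meet at (+-sqrt(0.75), 0.5).
def proj_circle(x):
    n = np.linalg.norm(x)
    return x / n if n > 0 else np.array([1.0, 0.0])  # arbitrary choice at the origin

def proj_line(x):
    return np.array([x[0], 0.5])

def reflect(proj, x):
    return 2.0 * proj(x) - x

def douglas_rachford(x0, iters=100):
    """Plain DR iteration x_{k+1} = (x_k + R_B R_A x_k) / 2 for the sets above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = 0.5 * (x + reflect(proj_line, reflect(proj_circle, x)))
    return x

x = douglas_rachford([1.3, -0.4])
print(x, proj_circle(x))  # expected to settle near an intersection point (+-0.866, 0.5)
```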


Journal of Global Optimization | 2011

Enhanced metric regularity and Lipschitzian properties of variational systems

Francisco Javier Aragón Artacho; Boris S. Mordukhovich

This paper mainly concerns the study of a large class of variational systems governed by parametric generalized equations, which encompass variational and hemivariational inequalities, complementarity problems, first-order optimality conditions, and other optimization-related models important for optimization theory and applications. An efficient approach to these issues has been developed in our preceding work (Aragón Artacho and Mordukhovich in Nonlinear Anal 72:1149–1170, 2010) establishing qualitative and quantitative relationships between conventional metric regularity/subregularity and Lipschitzian/calmness properties in the framework of parametric generalized equations in arbitrary Banach spaces. This paper provides, on one hand, significant extensions of the major results in op.cit. to partial metric regularity and to the new hemiregularity property. On the other hand, we establish enhanced relationships between certain strong counterparts of metric regularity/hemiregularity and single-valued Lipschitzian localizations. The results obtained are new in both finite-dimensional and infinite-dimensional settings.


Journal of Global Optimization | 2016

Global behavior of the Douglas–Rachford method for a nonconvex feasibility problem

Francisco Javier Aragón Artacho; Jonathan M. Borwein; Matthew K. Tam

In recent times the Douglas–Rachford algorithm has been observed empirically to solve a variety of nonconvex feasibility problems including those of a combinatorial nature. For many of these problems current theory is not sufficient to explain this observed success and is mainly concerned with questions of local convergence. In this paper we analyze global behavior of the method for finding a point in the intersection of a half-space and a potentially non-convex set which is assumed to satisfy a well-quasi-ordering property or a property weaker than compactness. In particular, the special case in which the second set is finite is covered by our framework and provides a prototypical setting for combinatorial optimization problems.
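The combinatorial prototype mentioned in the abstract, a half-space intersected with a finite set, can be mocked up as follows. The data, the reflection order, and the iteration budget are illustrative assumptions rather than the paper's setup.

```python
import numpy as np

# Toy combinatorial feasibility instance: find a {0,1}-vector in the finite set C
# that also lies in the half-space H = {x : a.x >= b}. Data made up for illustration.
a = np.array([1.0, 2.0, 3.0])
b = 4.0
C = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0],
              [0, 0, 1], [1, 0, 1], [0, 1, 1], [1, 1, 1]], dtype=float)

def proj_halfspace(x):
    # Move onto the boundary only if the constraint a.x >= b is violated.
    gap = b - a @ x
    return x + max(gap, 0.0) * a / (a @ a)

def proj_finite(x):
    # Nearest point of the finite set, by enumeration (a nonconvex projection).
    return C[np.argmin(np.linalg.norm(C - x, axis=1))]

def dr(x0, iters=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        rC = 2 * proj_finite(x) - x          # reflect in the finite set
        rH = 2 * proj_halfspace(rC) - rC     # reflect in the half-space
        x = 0.5 * (x + rH)                   # average with the current iterate
    return proj_finite(x)

print(dr(np.array([0.2, 0.1, 0.4])))  # expected: some binary point with a.x >= 4
```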


Mathematical Programming | 2018

Accelerating the DC algorithm for smooth functions

Francisco Javier Aragón Artacho; Ronan M. T. Fleming; Phan T. Vuong

We introduce two new algorithms to minimise smooth difference of convex (DC) functions that accelerate the convergence of the classical DC algorithm (DCA). We prove that the point computed by DCA can be used to define a descent direction for the objective function evaluated at this point. Our algorithms are based on a combination of DCA together with a line search step that uses this descent direction. Convergence of the algorithms is proved and the rate of convergence is analysed under the Łojasiewicz property of the objective function. We apply our algorithms to a class of smooth DC programs arising in the study of biochemical reaction networks, where the objective function is real analytic and thus satisfies the Łojasiewicz property. Numerical tests on various biochemical models clearly show that our algorithms outperform DCA, being on average more than four times faster in both computational time and the number of iterations. Numerical experiments show that the algorithms are globally convergent to a non-equilibrium steady state of various biochemical networks, with only chemically consistent restrictions on the network topology.
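A minimal sketch of the idea described here: take a DCA step, then search along the direction from the current point to the DCA point. The quadratic DC decomposition, the Armijo-style backtracking condition, and all parameter values below are assumptions chosen for illustration; the paper's actual test problems come from biochemical reaction networks.

```python
import numpy as np

# Hypothetical smooth DC decomposition f = g - h with g, h convex and smooth:
# g(x) = 0.5*||A x||^2 + 0.5*rho*||x||^2 and h(x) = 0.5*||B x||^2 (toy data).
A = np.array([[2.0, 0.5], [0.5, 1.0]])
B = np.array([[1.0, 0.0], [0.3, 0.8]])
rho = 1.0

def f(x):  # objective f = g - h (a positive definite quadratic here)
    return 0.5 * x @ (A.T @ A + rho * np.eye(2) - B.T @ B) @ x

def grad_h(x):
    return B.T @ B @ x

def dca_subproblem(xk):
    # Classical DCA step: y_k = argmin_x g(x) - <grad h(x_k), x> (a linear solve here).
    return np.linalg.solve(A.T @ A + rho * np.eye(2), grad_h(xk))

def accelerated_dca(x0, alpha=1e-4, beta=0.5, lam0=1.0, tol=1e-8, max_iter=100):
    """Sketch: DCA combined with a backtracking search along d_k = y_k - x_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        y = dca_subproblem(x)            # plain DCA point
        d = y - x                        # direction used for the line search
        if np.linalg.norm(d) < tol:
            break
        lam = lam0
        # Simple Armijo-type backtracking starting from the DCA point y.
        while f(y + lam * d) > f(y) - alpha * lam * np.dot(d, d):
            lam *= beta
            if lam < 1e-12:
                lam = 0.0                # fall back to the plain DCA point
                break
        x = y + lam * d                  # accelerated iterate
    return x

print(accelerated_dca(np.array([3.0, -2.0])))  # expected to approach the minimizer [0, 0]
```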


Computational Optimization and Applications | 2012

A Lyusternik–Graves theorem for the proximal point method

Francisco Javier Aragón Artacho; Michaël Gaydu

We consider a generalized version of the proximal point algorithm for solving the perturbed inclusion y ∈ T(x), where y is a perturbation element near 0 and T is a set-valued mapping acting from a Banach space X to a Banach space Y which is metrically regular around some point (x̄, 0) in its graph. We study the behavior of the convergent iterates generated by the algorithm and we prove that they inherit the regularity properties of T, and vice versa. We analyze the cases when the mapping T is metrically regular and strongly regular.
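For context, here is a minimal sketch of the classical exact proximal point iteration that the paper generalizes, applied to a perturbed inclusion y ∈ T(x) with an affine monotone map; the toy data and the fixed stepsize are my own choices, not the paper's generalized scheme.

```python
import numpy as np

# Solve y in T(x) for the affine monotone map T(x) = M x + c (illustrative data).
M = np.array([[2.0, 1.0], [0.0, 1.0]])   # M + M^T is positive semidefinite, so T is monotone
c = np.array([1.0, -1.0])
y = np.array([0.1, 0.2])                 # small perturbation element near 0

def T(x):
    return M @ x + c

def proximal_point(x0, lam=1.0, iters=50):
    # Classical resolvent step x_{k+1} = (I + lam*(T - y))^{-1}(x_k):
    # solve (I + lam*M) x_{k+1} = x_k + lam*(y - c).
    x = np.asarray(x0, dtype=float)
    I = np.eye(2)
    for _ in range(iters):
        x = np.linalg.solve(I + lam * M, x + lam * (y - c))
    return x

x = proximal_point(np.zeros(2))
print(x, T(x) - y)   # the residual T(x) - y should be close to zero
```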


Computational Optimization and Applications | 2018

A new projection method for finding the closest point in the intersection of convex sets

Francisco Javier Aragón Artacho; Rubén Campoy

In this paper we present a new iterative projection method for finding the closest point in the intersection of convex sets to any arbitrary point in a Hilbert space. This method, termed AAMR for averaged alternating modified reflections, can be viewed as an adequate modification of the Douglas–Rachford method that yields a solution to the best approximation problem. Under a constraint qualification at the point of interest, we show strong convergence of the method. In fact, the so-called strong CHIP fully characterizes the convergence of the AAMR method for every point in the space. We report some promising numerical experiments where we compare the performance of AAMR against other projection methods for finding the closest point in the intersection of pairs of finite dimensional subspaces.
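A rough numerical sketch of an averaged alternating modified reflections step as I read it from this abstract: a convex combination of the identity with the composition of two β-scaled reflections, applied to the sets shifted by the query point. The operator form, the parameters α and β, the two half-plane sets, and the shadow-sequence recovery of the closest point are all assumptions on my part.

```python
import numpy as np

# Hypothetical instance: A = {x : x[0] <= 0}, B = {x : x[1] <= 0} in R^2,
# so the closest point of A ∩ B to q can be checked by hand.
def proj_A(x):
    return np.array([min(x[0], 0.0), x[1]])

def proj_B(x):
    return np.array([x[0], min(x[1], 0.0)])

def aamr_closest_point(q, alpha=0.5, beta=0.7, iters=500):
    """Sketch of an AAMR-style scheme: averaged alternating modified reflections
    on the shifted sets A - q and B - q; the shadow q + P_{A-q}(x_k) is taken as
    the approximation of the closest point of A ∩ B to q (an assumption here)."""
    pA = lambda z: proj_A(z + q) - q      # projection onto the shifted set A - q
    pB = lambda z: proj_B(z + q) - q      # projection onto the shifted set B - q
    x = np.zeros(2)
    for _ in range(iters):
        rA = 2 * beta * pA(x) - x         # beta-scaled reflection in A - q
        rB = 2 * beta * pB(rA) - rA       # beta-scaled reflection in B - q
        x = (1 - alpha) * x + alpha * rB  # averaged step
    return q + pA(x)                      # shadow point

q = np.array([1.0, -3.0])
print(aamr_closest_point(q))  # expected: approximately [0, -3]
```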


Set-valued Analysis | 2007

A New and Self-Contained Proof of Borwein's Norm Duality Theorem

Francisco Javier Aragón Artacho



Optimization Methods & Software | 2018

The cyclic Douglas–Rachford algorithm with r-sets-Douglas–Rachford operators

Francisco Javier Aragón Artacho; Yair Censor; Aviv Gibali


Collaboration


Dive into Francisco Javier Aragón Artacho's collaborations.

Top Co-Authors

Matthew K. Tam
University of Göttingen

Asen L. Dontchev
American Mathematical Society

David H. Bailey
Lawrence Berkeley National Laboratory

Liangjin Yao
University of Newcastle