Publication


Featured research published by Lorenzo Stella.


Conference on Decision and Control | 2014

Douglas-Rachford splitting: Complexity estimates and accelerated variants

Panagiotis Patrinos; Lorenzo Stella; Alberto Bemporad

We propose a new approach for analyzing convergence of the Douglas-Rachford splitting (DRS) method for solving convex composite optimization problems. The approach is based on a continuously differentiable function, the Douglas-Rachford Envelope (DRE), whose stationary points correspond to the solutions of the original (possibly nonsmooth) problem. By proving the equivalence between DRS and a scaled gradient method applied to the DRE, results from smooth unconstrained optimization are employed to analyze the convergence properties of DRS, to tune the method, and to derive an accelerated version of it.
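
The abstract builds on the classical Douglas-Rachford splitting iteration. As a point of reference, below is a minimal sketch of that baseline iteration for minimizing f(x) + g(x); the proximal maps, the step size gamma, the relaxation parameter lam, and the LASSO example are assumptions for illustration, and the DRE-based tuning and accelerated variant described in the paper are not reproduced here.

```python
# A minimal, hedged sketch of the classical Douglas-Rachford splitting (DRS)
# iteration for minimizing f(x) + g(x). The envelope-based analysis and the
# accelerated variant from the paper are NOT reproduced here.
import numpy as np

def douglas_rachford(prox_f, prox_g, s0, gamma=1.0, lam=1.0, max_iter=500, tol=1e-8):
    """Run DRS given the proximal maps of f and g (callables of (v, gamma))."""
    s = np.asarray(s0, dtype=float)
    for _ in range(max_iter):
        x = prox_f(s, gamma)              # x = prox_{gamma f}(s)
        z = prox_g(2 * x - s, gamma)      # z = prox_{gamma g}(2x - s)
        s = s + lam * (z - x)             # relaxed update of the governing sequence
        if np.linalg.norm(z - x) <= tol:  # fixed-point residual as a simple stopping test
            break
    return x

# Assumed example problem: min 0.5*||Ax - b||^2 + mu*||x||_1 (LASSO).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A, b, mu, gamma = rng.standard_normal((20, 50)), rng.standard_normal(20), 0.1, 1.0
    # prox of the least-squares term: solve (I + gamma A^T A) x = v + gamma A^T b
    prox_f = lambda v, g: np.linalg.solve(np.eye(50) + g * A.T @ A, v + g * A.T @ b)
    # prox of the l1 term: soft-thresholding with threshold gamma * mu
    prox_g = lambda v, g: np.sign(v) * np.maximum(np.abs(v) - g * mu, 0.0)
    x_opt = douglas_rachford(prox_f, prox_g, np.zeros(50), gamma)
```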


Computational Optimization and Applications | 2017

Forward–backward quasi-Newton methods for nonsmooth optimization problems

Lorenzo Stella; Andreas Themelis; Panagiotis Patrinos

The forward–backward splitting method (FBS) for minimizing a nonsmooth composite function can be interpreted as a (variable-metric) gradient method over a continuously differentiable function which we call the forward–backward envelope (FBE). This allows algorithms for smooth unconstrained optimization to be extended and applied to nonsmooth (possibly constrained) problems. Since the FBE can be computed by simply evaluating forward–backward steps, the resulting methods rely on a black-box oracle similar to that of FBS. We propose an algorithmic scheme that enjoys the same global convergence properties as FBS when the problem is convex, or when the objective function possesses the Kurdyka–Łojasiewicz property at its critical points. Moreover, when quasi-Newton directions are used, the proposed method achieves superlinear convergence provided that the usual second-order sufficiency conditions on the FBE hold at the limit point of the generated sequence. Such conditions translate into milder requirements on the original function involving generalized second-order differentiability. We show that BFGS fits our framework and that the limited-memory variant L-BFGS is well suited for large-scale problems, greatly outperforming FBS and its accelerated version in practice, as well as ADMM and other problem-specific solvers. The analysis of superlinear convergence is based on an extension of the Dennis–Moré theorem to the proposed algorithmic scheme.
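
For context, the forward–backward envelope referenced above can be evaluated at the cost of a single forward–backward step. The sketch below is a minimal, assumed implementation of that evaluation; the function handles f, grad_f, g, prox_g and the step size gamma are placeholders supplied by the caller, and the paper's quasi-Newton scheme itself is not reproduced.

```python
# A minimal sketch of evaluating the forward-backward envelope (FBE) for
# minimizing f(x) + g(x), assuming f is smooth (gradient grad_f) and g has an
# inexpensive proximal map prox_g. All function handles are assumptions.
import numpy as np

def fb_step(x, grad_f, prox_g, gamma):
    """One forward-backward (proximal-gradient) step T_gamma(x)."""
    return prox_g(x - gamma * grad_f(x), gamma)

def fbe(x, f, grad_f, g, prox_g, gamma):
    """FBE_gamma(x) = f(x) - (gamma/2)*||grad_f(x)||^2 + g^gamma(x - gamma*grad_f(x)),
    where g^gamma is the Moreau envelope of g; it needs only one forward-backward step."""
    gfx = grad_f(x)
    y = x - gamma * gfx            # forward (gradient) step
    z = prox_g(y, gamma)           # backward (proximal) step, i.e. T_gamma(x)
    moreau_g = g(z) + np.dot(z - y, z - y) / (2 * gamma)
    return f(x) - (gamma / 2) * np.dot(gfx, gfx) + moreau_g
```

A quasi-Newton method such as L-BFGS can then be run on this smooth surrogate, with a safeguard that retains the global convergence of plain FBS, which is the strategy the abstract describes.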


Conference on Decision and Control | 2016

New primal-dual proximal algorithm for distributed optimization

Puya Latafat; Lorenzo Stella; Panagiotis Patrinos

We consider a network of agents, each with its own private cost consisting of the sum of two possibly nonsmooth convex functions, one of which is composed with a linear operator. At every iteration each agent performs local calculations and communicates only with its neighbors. The goal is to minimize the aggregate of the private cost functions and reach consensus over a graph. We propose a primal-dual algorithm based on Asymmetric Forward-Backward-Adjoint (AFBA), a new operator splitting technique recently introduced by two of the authors. Our algorithm includes the method of Chambolle and Pock as a special case and has a linear convergence rate when the cost functions are piecewise linear-quadratic. We show that our distributed algorithm is easy to implement, requiring neither matrix inversions nor inner loops. We demonstrate through computational experiments how the choice of the algorithm's parameter can allow larger step sizes and yield better performance.
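
Since the abstract notes that the method contains the algorithm of Chambolle and Pock as a special case, a minimal sketch of that special case for min_x f(x) + g(Lx) may help fix ideas; the proximal maps, the operator L, and the step sizes tau and sigma are assumptions, and the distributed AFBA-based method itself is not reproduced here.

```python
# A minimal sketch of the Chambolle-Pock primal-dual iteration for
# min_x f(x) + g(L x), cited in the abstract as a special case of the proposed
# AFBA-based algorithm. prox_f, prox_g_conj (prox of the conjugate g*), the
# matrix L and the step sizes tau, sigma are assumptions supplied by the caller.
import numpy as np

def chambolle_pock(prox_f, prox_g_conj, L, x0, y0, tau, sigma, max_iter=500):
    """Requires tau * sigma * ||L||^2 <= 1 for convergence."""
    x, y = np.asarray(x0, dtype=float), np.asarray(y0, dtype=float)
    for _ in range(max_iter):
        x_new = prox_f(x - tau * L.T @ y, tau)                   # primal proximal step
        y = prox_g_conj(y + sigma * L @ (2 * x_new - x), sigma)  # dual step at extrapolated point
        x = x_new
    return x, y
```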


arXiv: Optimization and Control | 2014

Forward-backward truncated Newton methods for convex composite optimization

Panagiotis Patrinos; Lorenzo Stella; Alberto Bemporad


SIAM Journal on Optimization | 2018

Forward-backward envelope for the sum of two nonconvex functions: further properties and nonmonotone line-search algorithms

Andreas Themelis; Lorenzo Stella; Panagiotis Patrinos


arXiv: Optimization and Control | 2014

Forward-backward truncated Newton methods for large-scale convex composite optimization

Panagiotis Patrinos; Lorenzo Stella; Alberto Bemporad


Conference on Decision and Control | 2017

A simple and efficient algorithm for nonlinear model predictive control

Lorenzo Stella; Andreas Themelis; Pantelis Sopasakis; Panagiotis Patrinos


IEEE Transactions on Automatic Control | 2018

Newton-type alternating minimization algorithm for convex optimization

Lorenzo Stella; Andreas Themelis; Panagiotis Patrinos


arXiv: eess.SP | 2017

Proximal Gradient Algorithms: Applications in Signal Processing

Niccolò Antonello; Lorenzo Stella; Panos Patrinos; Marc Moonen; Toon van Waterschoot


arXiv | 2017

Douglas-Rachford splitting and ADMM for nonconvex optimization: new convergence results and accelerated versions

Andreas Themelis; Lorenzo Stella; Panagiotis Patrinos

Collaboration


Dive into Lorenzo Stella's collaborations.

Top Co-Authors

Andreas Themelis (Katholieke Universiteit Leuven)
Panagiotis Patrinos (Katholieke Universiteit Leuven)
Alberto Bemporad (IMT Institute for Advanced Studies Lucca)
Niccolò Antonello (Katholieke Universiteit Leuven)
Puya Latafat (Katholieke Universiteit Leuven)
Toon van Waterschoot (Katholieke Universiteit Leuven)
Pantelis Sopasakis (IMT Institute for Advanced Studies Lucca)