
Publication


Featured research published by Quoc Tran-Dinh.


SIAM Journal on Control and Optimization | 2014

Computational Complexity of Inexact Gradient Augmented Lagrangian Methods: Application to Constrained MPC

Valentin Nedelcu; Ion Necoara; Quoc Tran-Dinh

We study the computational complexity certification of inexact gradient augmented Lagrangian methods for solving convex optimization problems with complicated constraints. We solve the augmented Lagrangian dual problem that arises from the relaxation of complicating constraints with gradient and fast gradient methods based on inexact first order information. Moreover, since the exact solution of the augmented Lagrangian primal problem is hard to compute in practice, we solve this problem up to some given inner accuracy. We derive relations between the inner and the outer accuracy of the primal and dual problems and we give a full convergence rate analysis for both gradient and fast gradient algorithms. We provide estimates on the primal and dual suboptimality and on primal feasibility violation of the generated approximate primal and dual solutions. Our analysis relies on the Lipschitz property of the dual function and on inexact dual gradients. We also discuss implementation aspects of the proposed algor...
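
The mechanism described here is easy to see in code. Below is a minimal sketch of an inexact augmented Lagrangian loop for the template min f(x) subject to Ax = b, assuming only a gradient oracle for f; the function names, step sizes, and fixed inner iteration budget are illustrative stand-ins, not the certified scheme analyzed in the paper:

```python
import numpy as np

def inexact_augmented_lagrangian(f_grad, A, b, x0, rho=1.0, alpha=0.05,
                                 inner_iters=100, outer_iters=200):
    """Illustrative inexact AL loop for min f(x) s.t. Ax = b: the inner
    augmented Lagrangian subproblem is solved only approximately (a fixed
    number of gradient steps), so the dual update uses an inexact gradient.
    alpha must be below 2/L for the inner objective's Lipschitz constant L."""
    x, y = x0.astype(float).copy(), np.zeros(A.shape[0])
    for _ in range(outer_iters):
        # Inner loop: approximately minimize
        #   f(x) + <y, Ax - b> + (rho/2) * ||Ax - b||^2  over x.
        for _ in range(inner_iters):
            grad_x = f_grad(x) + A.T @ (y + rho * (A @ x - b))
            x = x - alpha * grad_x
        # Outer loop: dual ascent; Ax - b is only an *inexact* gradient of
        # the augmented dual because the inner problem was not solved exactly.
        y = y + rho * (A @ x - b)
    return x, y

# Usage sketch with f(x) = 0.5 * ||x||^2, so f_grad is the identity.
A = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([1.0, 2.0])
x, y = inexact_augmented_lagrangian(lambda x: x, A, b, x0=np.zeros(2))
```

The point mirrored from the abstract: the dual step consumes the residual Ax - b produced by an approximate inner solve, i.e., an inexact dual gradient, which is why the inner and outer accuracies must be related to certify the overall complexity.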


Computational Optimization and Applications | 2017

Adaptive smoothing algorithms for nonsmooth composite convex minimization

Quoc Tran-Dinh

We propose an adaptive smoothing algorithm based on Nesterov’s smoothing technique in Nesterov (Math Prog 103(1):127–152, 2005) for solving “fully” nonsmooth composite convex optimization problems. Our method combines Nesterov’s accelerated proximal gradient scheme with a new homotopy strategy for the smoothness parameter. By an appropriate choice of smoothing functions, we develop a new algorithm that has the $\mathcal{O}\left(\frac{1}{\varepsilon}\right)$-worst-case iteration-complexity while preserving the same complexity-per-iteration as in Nesterov’s method and allowing one to automatically update the smoothness parameter at each iteration. Then, we customize our algorithm to solve four special cases that cover various applications. We also specify our algorithm to solve constrained convex optimization problems and show its convergence guarantee on a primal sequence of iterates. We demonstrate our algorithm through three numerical examples and compare it with other related algorithms.
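
As a concrete, hedged illustration of the smoothing-plus-homotopy idea (not the paper's algorithm), the sketch below smooths the l1-norm with the Huber function, runs an accelerated gradient loop on the smoothed problem min 0.5||Ax - b||^2 + lam*||x||_1, and shrinks the smoothness parameter as iterations proceed; the schedule beta_k = 1/k is a generic stand-in for the paper's derived update rule:

```python
import numpy as np

def huber_grad(x, beta):
    """Gradient of the Huber smoothing of |x| (Nesterov smoothing of the
    l1-norm): smooth with a (1/beta)-Lipschitz gradient, and within
    beta/2 of |x| everywhere."""
    return np.clip(x / beta, -1.0, 1.0)

def adaptive_smoothing_l1(A, b, lam=1.0, iters=500):
    """Illustrative accelerated gradient loop in which the smoothness
    parameter beta_k is decreased on the fly (homotopy)."""
    L_A = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the data term
    x = z = np.zeros(A.shape[1])
    t = 1.0
    for k in range(1, iters + 1):
        beta = 1.0 / k                        # illustrative homotopy schedule
        L = L_A + lam / beta                  # Lipschitz constant of smoothed objective
        grad = A.T @ (A @ z - b) + lam * huber_grad(z, beta)
        x_new = z - grad / L                  # gradient step on the smoothed problem
        t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
        z = x_new + ((t - 1) / t_new) * (x_new - x)  # Nesterov momentum
        x, t = x_new, t_new
    return x

# Usage sketch on a random instance.
x = adaptive_smoothing_l1(np.random.randn(20, 50), np.random.randn(20))
```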


SIAM Journal on Optimization | 2018

A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization

Quoc Tran-Dinh; Olivier Fercoq; Volkan Cevher

We propose a new first-order primal-dual optimization framework for a convex optimization template with broad applications. Our optimization algorithms feature optimal convergence guarantees under a variety of common structure assumptions on the problem template. Our analysis relies on a novel combination of three classic ideas applied to the primal-dual gap function: smoothing, acceleration, and homotopy. The algorithms due to the new approach achieve the best-known convergence rates, in particular when the template consists of only nonsmooth functions. We also outline a restart strategy for the acceleration to significantly enhance the practical performance. We demonstrate relations with the augmented Lagrangian method and show how to exploit strongly convex objectives with rigorous convergence rate guarantees. We provide numerical evidence with two examples and illustrate that the new methods can outperform the state-of-the-art, including Chambolle-Pock and the alternating direction method of multipliers.
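
A loose sketch of the three ingredients the abstract names, smoothing, acceleration, and homotopy, on the specific instance min ||x||_1 subject to Ax = b (one member of the template, not the paper's general algorithm): adding (beta/2)||x||^2 to the primal makes the dual smooth, accelerated gradient ascent runs on that smoothed dual, and beta is decreased over the iterations:

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def smoothed_primal_dual_l1(A, b, iters=300):
    """Sketch of smoothing + acceleration + homotopy for
    min ||x||_1 s.t. Ax = b via its smoothed dual."""
    m, n = A.shape
    normA2 = np.linalg.norm(A, 2) ** 2
    y = y_prev = np.zeros(m)
    t, x = 1.0, np.zeros(n)
    for k in range(1, iters + 1):
        beta = 1.0 / k                            # illustrative homotopy schedule
        t_new = 0.5 * (1 + np.sqrt(1 + 4 * t * t))
        v = y + ((t - 1) / t_new) * (y - y_prev)  # dual extrapolation (acceleration)
        # Closed-form minimizer of ||x||_1 + (beta/2)||x||^2 + <v, Ax - b>:
        x = -soft_threshold(A.T @ v, 1.0) / beta
        grad = A @ x - b                          # gradient of the smoothed dual
        y_prev, y = y, v + (beta / normA2) * grad # ascent step 1/L, with L = ||A||^2/beta
        t = t_new
    return x, y

# Usage sketch on a random instance.
x, y = smoothed_primal_dual_l1(np.random.randn(10, 30), np.random.randn(10))
```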


Mathematical Programming | 2018

Self-concordant inclusions: a unified framework for path-following generalized Newton-type algorithms

Quoc Tran-Dinh; Tianxiao Sun; Shu Lu

We study a class of monotone inclusions called “self-concordant inclusions” which covers three fundamental convex optimization formulations as special cases. We develop a new generalized Newton-type framework to solve this inclusion. Our framework subsumes three schemes, full-step, damped-step, and path-following methods, as specific instances, while allowing one to use inexact computation to form generalized Newton directions. We prove the local quadratic convergence of both the full-step and damped-step algorithms. Then, we propose a new two-phase inexact path-following scheme for solving this monotone inclusion which possesses an $\mathcal{O}(\sqrt{\nu}\log(1/\varepsilon))$-worst-case iteration-complexity.
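
The $\mathcal{O}(\sqrt{\nu}\log(1/\varepsilon))$ bound quoted above is the classical path-following rate, where ν is the barrier parameter: each iteration increases the homotopy parameter t by a factor 1 + O(1/√ν), so driving the gap ν/t below ε takes O(√ν log(ν/ε)) Newton steps. Below is a textbook-style sketch on the toy instance min <c, x> over x > 0 with the barrier -Σ log x_i (assuming c > 0 so the problem is bounded); this is not the paper's generalized Newton framework for inclusions:

```python
import numpy as np

def path_following_lp_interior(c, x0, eps=1e-6):
    """Short-step path-following sketch for min <c, x> over x > 0 with the
    self-concordant barrier F(x) = -sum(log x_i), barrier parameter nu = n.
    Each step: one damped Newton step on t*<c, x> + F(x), then increase t
    by a factor 1 + O(1/sqrt(nu)). x0 must have strictly positive entries."""
    x = x0.astype(float).copy()
    nu = float(len(c))
    t = 1.0
    while nu / t > eps:                       # duality-gap bound nu/t on the path
        g = t * c - 1.0 / x                   # gradient of t<c,x> - sum log x_i
        H_inv = x * x                         # inverse Hessian: diag(x_i^2)
        dx = -H_inv * g                       # Newton direction
        lam = np.sqrt(np.sum(g * g * H_inv))  # Newton decrement
        x = x + dx / (1.0 + lam)              # damped step: keeps x > 0
        t *= 1.0 + 0.1 / np.sqrt(nu)         # short-step barrier update
    return x

# Usage sketch: the minimizer over x > 0 is approached as x -> 0.
x = path_following_lp_interior(c=np.array([1.0, 2.0]), x0=np.ones(2))
```

The damped step has length lam/(1 + lam) < 1 in the local norm, so it stays inside the Dikin ellipsoid and preserves feasibility, which is the standard argument behind the geometric growth of t.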



Optimization Letters | 2017

Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization

Andrei Patrascu; Ion Necoara; Quoc Tran-Dinh


Mathematics of Operations Research | 2018

A Single-Phase, Proximal Path-Following Framework

Quoc Tran-Dinh; Anastasios Kyrillidis; Volkan Cevher



Computational Optimization and Applications | 2018

Proximal alternating penalty algorithms for nonsmooth constrained convex optimization

Quoc Tran-Dinh



International Conference on Neural Information Processing | 2016

Simplicial Nonnegative Matrix Tri-factorization: Fast Guaranteed Parallel Algorithm

Quoc Tran-Dinh; Tu Bao Ho



Neural Information Processing Systems | 2015

A universal primal-dual convex optimization framework

Alp Yurtsever; Quoc Tran-Dinh; Volkan Cevher


arXiv: Optimization and Control | 2015

Smooth Alternating Direction Methods for Nonsmooth Constrained Convex Optimization

Quoc Tran-Dinh; Volkan Cevher


Collaboration


Dive into Quoc Tran-Dinh's collaborations.

Top Co-Authors

Volkan Cevher
École Polytechnique Fédérale de Lausanne

Tianxiao Sun
University of North Carolina at Chapel Hill

Ion Necoara
Politehnica University of Bucharest

Alp Yurtsever
École Polytechnique Fédérale de Lausanne

Gábor Pataki
University of North Carolina at Chapel Hill

Shu Lu
University of North Carolina at Chapel Hill

Andrei Patrascu
Politehnica University of Bucharest