
Publication


Featured research published by Yurii Nesterov.


Mathematical Programming | 2005

Smooth minimization of non-smooth functions

Yurii Nesterov

In this paper we propose a new approach for constructing efficient schemes for non-smooth convex optimization. It is based on a special smoothing technique, which can be applied to functions with explicit max-structure. Our approach can be considered as an alternative to black-box minimization. From the viewpoint of efficiency estimates, we manage to improve the traditional bounds on the number of iterations of the gradient schemes from $O\left({1 \over \epsilon^2}\right)$ to $O\left({1 \over \epsilon}\right)$, keeping basically the complexity of each iteration unchanged.
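A toy illustration of the max-structure smoothing idea (a sketch, not the paper's full scheme; variable names are ours): the μ-smoothed absolute value obtained from |t| = max over |u| ≤ 1 of u·t, with a quadratic prox term, is the Huber function, whose gradient is (1/μ)-Lipschitz, so plain gradient descent applies to the smoothed objective:

```python
import numpy as np

# mu-smoothing of |t| = max_{|u| <= 1} u*t with a quadratic prox term:
# f_mu(t) = max_{|u| <= 1} (u*t - mu*u**2/2)  -- the Huber function.
def smooth_abs(x, mu):
    a = np.abs(x)
    return np.where(a <= mu, x**2 / (2 * mu), a - mu / 2)

def smooth_abs_grad(x, mu):
    # the gradient is (1/mu)-Lipschitz: clip(x/mu, -1, 1)
    return np.clip(x / mu, -1.0, 1.0)

# minimize the smoothed version of f(x) = ||x - b||_1 by gradient descent
b = np.array([1.0, -2.0, 3.0])
mu = 1e-3
x = np.zeros(3)
step = mu                        # 1/L with L = 1/mu
for _ in range(10000):
    x = x - step * smooth_abs_grad(x - b, mu)
```

Smaller μ gives a tighter approximation of the non-smooth function but a larger Lipschitz constant, hence smaller steps; the paper's contribution is choosing μ to balance these two effects.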


SIAM Journal on Optimization | 2012

Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems

Yurii Nesterov

In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly enough, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method, and its accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very large size.
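A minimal sketch of random coordinate descent on a strongly convex quadratic (illustrative data, not the paper's huge-scale setting): each iteration picks one coordinate at random and takes a cheap one-dimensional gradient step scaled by that coordinate's own Lipschitz constant.

```python
import numpy as np

rng = np.random.default_rng(0)
# quadratic f(x) = 0.5 * x^T A x - b^T x with A positive definite
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)
x = np.zeros(n)
Lii = np.diag(A)                 # coordinate-wise Lipschitz constants

for _ in range(20000):
    i = rng.integers(n)          # uniformly random coordinate
    g_i = A[i] @ x - b[i]        # partial derivative along coordinate i
    x[i] -= g_i / Lii[i]         # one-dimensional update

x_star = np.linalg.solve(A, b)
```

In a genuinely huge-scale setting the partial derivative would be maintained incrementally rather than recomputed from a full row-vector product; this toy version just demonstrates the update rule.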


Mathematical Programming | 2009

Primal-dual subgradient methods for convex problems

Yurii Nesterov

In this paper we present a new approach for constructing subgradient schemes for different types of nonsmooth problems with convex structure. Our methods are primal-dual since they are always able to generate a feasible approximation to the optimum of an appropriately formulated dual problem. Besides other advantages, this useful feature provides the methods with a reliable stopping criterion. The proposed schemes differ from the classical approaches (divergent series methods, mirror descent methods) by the presence of two control sequences. The first sequence is responsible for aggregating the support functions in the dual space, and the second one establishes a dynamically updated scale between the primal and dual spaces. This additional flexibility allows us to guarantee boundedness of the sequence of primal test points even in the case of an unbounded feasible set (however, we always assume the uniform boundedness of subgradients). We present variants of subgradient schemes for nonsmooth convex minimization, minimax problems, saddle point problems, variational inequalities, and stochastic optimization. In all situations our methods are proved to be optimal from the viewpoint of worst-case black-box lower complexity bounds.
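A minimal Euclidean sketch of the two-sequence idea (illustrative problem and constants, not the paper's general scheme): one sequence aggregates subgradients in the dual space, while a growing scale sequence β_k maps the aggregate back to a primal test point.

```python
import numpy as np

# minimize f(x) = ||x - b||_1 by simple dual averaging:
# z aggregates subgradients (dual space); beta_k sets the primal scale.
b = np.array([0.3, -0.7])
z = np.zeros_like(b)
x = np.zeros_like(b)
running = np.zeros_like(b)
N = 5000
for k in range(1, N + 1):
    g = np.sign(x - b)           # a subgradient of ||x - b||_1 at x
    z = z + g                    # aggregation in the dual space
    beta = np.sqrt(k)            # dynamically updated scale
    x = -z / beta                # argmin_x <z, x> + (beta/2) * ||x||^2
    running += x
x_avg = running / N              # averaged primal test point
```

The averaged point x_avg approaches the minimizer b at the usual O(1/sqrt(k)) nonsmooth rate; the individual iterates merely oscillate around it.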


Mathematical Programming | 2013

Gradient methods for minimizing composite functions

Yurii Nesterov

In this paper we analyze several new methods for solving optimization problems with the objective function formed as a sum of two terms: one is smooth and given by a black-box oracle, and another is a simple general convex function with known structure. Despite the absence of good properties of the sum, such problems, both in convex and nonconvex cases, can be solved with efficiency typical for the first part of the objective. For convex problems of the above structure, we consider primal and dual variants of the gradient method with convergence rate $O\left({1 \over k}\right)$, and an accelerated multistep version with convergence rate $O\left({1 \over k^2}\right)$.
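A minimal sketch of the composite setting in its popular ℓ1 special case (data and names are illustrative): a gradient step on the smooth least-squares term followed by the closed-form prox of the simple term, i.e. soft-thresholding.

```python
import numpy as np

# composite objective f(x) = g(x) + h(x):
# g(x) = 0.5 * ||A x - b||^2 (smooth), h(x) = lam * ||x||_1 (simple).
rng = np.random.default_rng(1)
A = rng.standard_normal((40, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
lam = 0.1
L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of grad g

def soft_threshold(v, t):
    # prox of t * ||.||_1: shrink each coordinate toward zero by t
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(20)
for _ in range(2000):
    grad = A.T @ (A @ x - b)     # gradient of the smooth part only
    x = soft_threshold(x - grad / L, lam / L)
```

Each iteration costs one gradient of the smooth part plus a cheap closed-form prox, which is why the whole scheme inherits the efficiency of the smooth part; the accelerated multistep variant adds a momentum sequence on top of the same prox step.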


SIAM Journal on Optimization | 1998

Primal-Dual Interior-Point Methods for Self-Scaled Cones

Yurii Nesterov; Michael J. Todd


Archive | 2000

Squared Functional Systems and Optimization Problems

Yurii Nesterov



Mathematical Programming | 1995

New variants of bundle methods

Claude Lemaréchal; Arkadii Nemirovskii; Yurii Nesterov


Optimization Methods & Software | 1998

Semidefinite relaxation and nonconvex quadratic optimization

Yurii Nesterov



Mathematical Programming | 2006

Cubic regularization of Newton method and its global performance

Yurii Nesterov; Boris T. Polyak


SIAM Journal on Optimization | 2005

Excessive Gap Technique in Nonsmooth Convex Minimization

Yurii Nesterov


Collaboration


Dive into Yurii Nesterov's collaborations.

Top Co-Authors

Alexander Gasnikov
Moscow Institute of Physics and Technology

Arkadii Nemirovskii
Technion – Israel Institute of Technology

Vincent D. Blondel
Université catholique de Louvain

François Glineur
Université catholique de Louvain

Olivier Devolder
Université catholique de Louvain

Paul Van Dooren
University College London

Pavel Dvurechensky
Moscow Institute of Physics and Technology

André de Palma
Cergy-Pontoise University