Publication


Featured research published by Pavel Dvurechensky.


Journal of Optimization Theory and Applications | 2016

Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle

Pavel Dvurechensky; Alexander Gasnikov

In this paper, we introduce new methods for convex optimization problems with a stochastic inexact oracle. Our first method is an extension of the Intermediate Gradient Method proposed by Devolder, Glineur and Nesterov for problems with a deterministic inexact oracle. Our method can be applied to problems with a composite objective function and both deterministic and stochastic inexactness of the oracle, and allows using a non-Euclidean setup. We estimate the rate of convergence in terms of the expectation of the non-optimality gap and provide a way to control the probability of large deviations from this rate. We also introduce two modifications of this method for strongly convex problems. For the first modification, we estimate the rate of convergence for the non-optimality gap expectation and, for the second, we provide a bound for the probability of large deviations from the rate of convergence in terms of the expectation of the non-optimality gap. All the rates lead to complexity estimates for the proposed methods, which, up to a multiplicative constant, coincide with the lower complexity bound for the considered class of convex composite optimization problems with a stochastic inexact oracle.
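The notion of a stochastic inexact oracle can be made concrete with a toy sketch. This is not the paper's Intermediate Gradient Method, only an illustration under assumed noise and bias levels: the oracle returns a gradient estimate carrying both a zero-mean stochastic error and a small deterministic bias, and a plain averaged stochastic gradient loop then converges up to an accuracy floor set by the bias.

```python
import numpy as np

rng = np.random.default_rng(0)

# Quadratic test objective f(x) = 0.5 * ||x - x_star||^2.
x_star = np.array([1.0, -2.0, 0.5])

def inexact_stochastic_oracle(x, noise=0.1, bias=0.01):
    """Gradient estimate corrupted by zero-mean noise (stochastic
    inexactness) plus a small deterministic perturbation (oracle bias)."""
    g = x - x_star                                 # exact gradient
    g = g + noise * rng.standard_normal(x.shape)   # stochastic error
    g = g + bias                                   # deterministic inexactness
    return g

# Averaged stochastic gradient descent: the running average damps the
# zero-mean noise, while the bias limits the attainable accuracy.
x = np.zeros(3)
avg = np.zeros(3)
for k in range(1, 2001):
    x = x - (1.0 / k**0.5) * inexact_stochastic_oracle(x)
    avg += (x - avg) / k           # running mean of the iterates

print(np.linalg.norm(avg - x_star))  # small, but floored by the bias
```

The averaging step is what separates the two error sources: the stochastic part vanishes in the mean, while the deterministic bias accumulates and caps the final accuracy, which is the qualitative picture behind the two inexactness parameters in the abstract.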


arXiv: Optimization and Control | 2016

Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints

Alexey Chernov; Pavel Dvurechensky; Alexander Gasnikov

In this paper we consider a class of optimization problems with a strongly convex objective function and a feasible set given by the intersection of a simple convex set with a set defined by a number of linear equality and inequality constraints. Many optimization problems in applications can be stated in this form, examples being entropy-linear programming, ridge regression, the elastic net, and regularized optimal transport. We extend the Fast Gradient Method applied to the dual problem in order to make it primal-dual, so that it allows one not only to solve the dual problem but also to construct a nearly optimal and nearly feasible solution of the primal problem. We also prove a theorem on the convergence rate of the proposed algorithm in terms of the objective function and the infeasibility of the linear constraints.
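The dual approach described above can be sketched on a toy instance. The following is an illustrative accelerated dual ascent with primal averaging, not the exact algorithm or step sizes from the paper: it minimizes 0.5*||x||^2 subject to Ax = b by running Nesterov's fast gradient method on the (smooth, concave) dual and averaging the primal minimizers of the Lagrangian, which become nearly feasible.

```python
import numpy as np

rng = np.random.default_rng(1)

# min 0.5*||x||^2  s.t.  A x = b  -- a strongly convex toy instance.
A = rng.standard_normal((3, 6))
b = rng.standard_normal(3)

def x_of(lam):
    """Minimizer over x of the Lagrangian 0.5*||x||^2 + lam @ (A x - b)."""
    return -A.T @ lam

# Lipschitz constant of the dual gradient: ||A||^2 (strong convexity mu = 1).
L = np.linalg.norm(A, 2) ** 2

# Nesterov's fast gradient method on the dual; the primal answer is a
# weighted average of the Lagrangian minimizers x(lam).
lam = y = np.zeros(3)
x_avg, weight = np.zeros(6), 0.0
for k in range(500):
    grad = A @ x_of(y) - b                    # dual gradient (ascent direction)
    lam_new = y + grad / L
    y = lam_new + k / (k + 3) * (lam_new - lam)
    lam = lam_new
    x_avg = (weight * x_avg + (k + 1) * x_of(y)) / (weight + k + 1)
    weight += k + 1

print(np.linalg.norm(A @ x_avg - b))  # constraint infeasibility -> 0
```

Averaging with weights growing like k+1 is the standard device that makes the scheme primal-dual: the dual iterates alone certify the dual value, while the averaged primal point inherits near-feasibility from the smallness of the averaged dual gradients.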


Journal of Optimization Theory and Applications | 2015

Primal-Dual Methods for Solving Infinite-Dimensional Games

Pavel Dvurechensky; Yurii Nesterov; Vladimir Spokoiny

In this paper, we show that infinite-dimensional differential games with a simple objective functional can be solved in a finite-dimensional dual form in the space of dual multipliers for the constraints related to the end points of the trajectories. The primal solutions can easily be reconstructed by appropriate dual subgradient schemes. The suggested schemes are justified by a worst-case complexity analysis.


arXiv: Optimization and Control | 2014

Stochastic Intermediate Gradient Method for Convex Problems with Inexact Stochastic Oracle

Pavel Dvurechensky; Alexander Gasnikov


International Conference on Machine Learning | 2018

Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn's Algorithm

Pavel Dvurechensky; Alexander Gasnikov; Alexey Kroshnin
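For context, the comparison in this title concerns entropically regularized optimal transport. Below is a minimal Sinkhorn sketch in its standard textbook form, with an assumed regularization strength and problem size; it is the baseline being compared against, not the paper's accelerated gradient method. Sinkhorn alternately rescales the rows and columns of a Gibbs kernel until the transport plan's marginals match the prescribed distributions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Entropic optimal transport between discrete distributions r and c with
# cost matrix C; gamma is the entropic regularization strength (assumed).
n = 5
C = rng.random((n, n))
r = np.full(n, 1.0 / n)
c = np.full(n, 1.0 / n)
gamma = 0.1

K = np.exp(-C / gamma)          # Gibbs kernel
u = np.ones(n)

# Sinkhorn iterations: alternate scalings so the plan's marginals match r, c.
for _ in range(2000):
    v = c / (K.T @ u)
    u = r / (K @ v)

P = u[:, None] * K * v[None, :]  # transport plan diag(u) K diag(v)

print(np.abs(P.sum(axis=1) - r).max(), np.abs(P.sum(axis=0) - c).max())
```

Each iteration costs a matrix-vector product; the title's claim is about how the number of such iterations scales with the target accuracy, where accelerated gradient descent on the dual can have a better dependence than this alternating-scaling scheme.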


Neural Information Processing Systems | 2016

Learning Supervised PageRank with Gradient-Based and Gradient-Free Optimization Methods

Lev Bogolubsky; Pavel Dvurechensky; Alexander Gasnikov; Gleb Gusev; Yurii Nesterov; A. M. Raigorodskii; Aleksey Tikhonov; Maksim Zhukovskii


arXiv: Optimization and Control | 2015

Gradient and gradient-free methods for stochastic convex optimization with inexact oracle

Alexander Gasnikov; Pavel Dvurechensky; Dmitry Kamzolov


arXiv: Optimization and Control | 2014

Stochastic gradient methods with inexact oracle

Alexander Gasnikov; Pavel Dvurechensky; Yurii Nesterov


arXiv: Optimization and Control | 2018

An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization

Pavel Dvurechensky; Alexander Gasnikov; Eduard Gorbunov


arXiv: Optimization and Control | 2018

Distributed Computation of Wasserstein Barycenters over Networks

César A. Uribe; Darina Dvinskikh; Pavel Dvurechensky; Alexander Gasnikov; Angelia Nedic

Collaboration


Dive into Pavel Dvurechensky's collaborations.

Top Co-Authors

Alexander Gasnikov | Moscow Institute of Physics and Technology
Yurii Nesterov | Catholic University of Leuven
Alexey Chernov | Moscow Institute of Physics and Technology
Anastasia Lagunovskaya | Moscow Institute of Physics and Technology
Eduard Gorbunov | Moscow Institute of Physics and Technology
Vladimir Spokoiny | Humboldt University of Berlin
A. M. Raigorodskii | Moscow Institute of Physics and Technology
Anastasia Bayandina | Moscow Institute of Physics and Technology