Pavel Dvurechensky
Moscow Institute of Physics and Technology
Publications
Featured research published by Pavel Dvurechensky.
Journal of Optimization Theory and Applications | 2016
Pavel Dvurechensky; Alexander Gasnikov
In this paper, we introduce new methods for convex optimization problems with a stochastic inexact oracle. Our first method is an extension of the Intermediate Gradient Method proposed by Devolder, Glineur and Nesterov for problems with a deterministic inexact oracle. Our method can be applied to problems with a composite objective function, with both deterministic and stochastic inexactness of the oracle, and allows the use of a non-Euclidean setup. We estimate the rate of convergence in terms of the expectation of the non-optimality gap and provide a way to control the probability of large deviations from this rate. We also introduce two modifications of this method for strongly convex problems. For the first modification, we estimate the rate of convergence for the expectation of the non-optimality gap and, for the second, we provide a bound on the probability of large deviations from the rate of convergence in terms of the expectation of the non-optimality gap. All the rates lead to complexity estimates for the proposed methods, which up to a multiplicative constant coincide with the lower complexity bound for the considered class of convex composite optimization problems with a stochastic inexact oracle.
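As a rough illustration of the composite stochastic setting described above (not the paper's Intermediate Gradient Method), the Python sketch below runs a plain proximal stochastic gradient method on an l1-composite objective with a noisy gradient oracle and returns the averaged iterate; the function names and the toy least-squares instance are assumptions made only for this example.

import numpy as np

def noisy_gradient(grad_f, x, sigma, rng):
    """Stochastic inexact first-order oracle: true gradient plus Gaussian noise."""
    return grad_f(x) + sigma * rng.standard_normal(x.shape)

def soft_threshold(v, t):
    """Proximal step for the composite term t * ||x||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def stochastic_composite_gd(grad_f, x0, step, lam, sigma, iters, seed=0):
    """Proximal stochastic gradient method for f(x) + lam * ||x||_1
    with a noisy gradient oracle; returns the averaged iterate."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    avg = np.zeros_like(x0)
    for _ in range(iters):
        g = noisy_gradient(grad_f, x, sigma, rng)
        x = soft_threshold(x - step * g, step * lam)
        avg += x
    return avg / iters

# Toy instance: least squares with l1 regularization under a noisy oracle.
A = np.array([[1.0, 0.0], [0.0, 2.0]])
b = np.array([1.0, -1.0])
grad_f = lambda x: A.T @ (A @ x - b)
x_hat = stochastic_composite_gd(grad_f, np.zeros(2), step=0.1, lam=0.05,
                                sigma=0.01, iters=2000)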
arXiv: Optimization and Control | 2016
Alexey Chernov; Pavel Dvurechensky; Alexander Gasnikov
In this paper we consider a class of optimization problems with a strongly convex objective function and a feasible set given by the intersection of a simple convex set with a set defined by a number of linear equality and inequality constraints. Many optimization problems in applications can be stated in this form; examples include entropy-linear programming, ridge regression, the elastic net, and regularized optimal transport. We extend the Fast Gradient Method applied to the dual problem to make it primal-dual, so that it not only solves the dual problem but also constructs a nearly optimal and nearly feasible solution of the primal problem. We also prove a theorem on the convergence rate of the proposed algorithm in terms of both the objective function and the infeasibility of the linear constraints.
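The entropy-linear programming case admits a compact illustration of the primal-dual idea: the inner minimum over the simplex has a closed softmax form, the dual is maximized by gradient steps, and an approximate primal solution is recovered by averaging the inner minimizers. The Python sketch below is a simplified, non-accelerated version of this scheme, not the paper's Fast Gradient Method; the function name and the toy instance are assumptions made for the example.

import numpy as np

def softmax(v):
    """Numerically stable softmax over a vector."""
    w = np.exp(v - v.max())
    return w / w.sum()

def dual_gradient_entropic_lp(c, A, b, gamma, step, iters):
    """Plain dual gradient ascent for
        min_{x in simplex} <c, x> + gamma * sum_i x_i log x_i   s.t.  A x = b.
    The inner minimum has a closed softmax form, and an approximate primal
    solution is recovered by averaging those inner minimizers."""
    lam = np.zeros(A.shape[0])
    x_avg = np.zeros(A.shape[1])
    for _ in range(iters):
        x = softmax(-(c + A.T @ lam) / gamma)   # primal response x(lam)
        lam = lam + step * (A @ x - b)          # ascent step on the concave dual
        x_avg += x
    return x_avg / iters, lam

# Toy instance: two linear constraints on a 3-dimensional simplex variable.
c = np.array([1.0, 2.0, 0.5])
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
b = np.array([0.6, 0.7])
x_hat, lam_hat = dual_gradient_entropic_lp(c, A, b, gamma=0.5, step=0.1, iters=5000)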
Journal of Optimization Theory and Applications | 2015
Pavel Dvurechensky; Yurii Nesterov; Vladimir Spokoiny
In this paper, we show that infinite-dimensional differential games with a simple objective functional can be solved in a finite-dimensional dual form in the space of dual multipliers for the constraints related to the end points of the trajectories. The primal solutions can be easily reconstructed by appropriate dual subgradient schemes. The suggested schemes are justified by a worst-case complexity analysis.
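The reconstruction step can be illustrated on a generic linearly constrained problem: run subgradient ascent on the finite-dimensional dual and average the inner minimizers to obtain an approximately feasible primal point. The Python sketch below shows only this dual subgradient / primal averaging ingredient, not the paper's reduction for differential games; the function name and the toy box-constrained instance are assumptions made for the example.

import numpy as np

def dual_subgradient_box_lp(c, a, b, iters, step0=1.0):
    """Dual subgradient ascent with ergodic primal averaging for
        min_{x in [0,1]^n} <c, x>   s.t.  <a, x> = b.
    The inner minimum over the box is attained at a vertex, the dual is
    piecewise linear, and an approximately feasible primal point is
    recovered by averaging the inner minimizers."""
    lam = 0.0
    x_avg = np.zeros_like(c)
    for k in range(iters):
        x = (c + lam * a < 0).astype(float)       # inner minimizer over the box
        g = a @ x - b                             # subgradient of the dual at lam
        lam = lam + (step0 / np.sqrt(k + 1)) * g  # ascent with diminishing steps
        x_avg += x
    return x_avg / iters, lam

# Toy instance with a single coupling constraint.
c = np.array([1.0, -2.0, 0.5, -0.3])
a = np.array([1.0, 1.0, 1.0, 1.0])
x_hat, lam_hat = dual_subgradient_box_lp(c, a, b=3.0, iters=20000)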
arXiv: Optimization and Control | 2014
Pavel Dvurechensky; Alexander Gasnikov
international conference on machine learning | 2018
Pavel Dvurechensky; Alexander Gasnikov; Alexey Kroshnin
neural information processing systems | 2016
Lev Bogolubsky; Pavel Dvurechensky; Alexander Gasnikov; Gleb Gusev; Yurii Nesterov; A. M. Raigorodskii; Aleksey Tikhonov; Maksim Zhukovskii
arXiv: Optimization and Control | 2015
Alexander Gasnikov; Pavel Dvurechensky; Dmitry Kamzolov
arXiv: Optimization and Control | 2014
Alexander Gasnikov; Pavel Dvurechensky; Yurii Nesterov
arXiv: Optimization and Control | 2018
Pavel Dvurechensky; Alexander Gasnikov; Eduard Gorbunov
arXiv: Optimization and Control | 2018
César A. Uribe; Darina Dvinskikh; Pavel Dvurechensky; Alexander Gasnikov; Angelia Nedic