
Publications


Featured research published by Alexander Gasnikov.


Journal of Optimization Theory and Applications | 2016

Stochastic Intermediate Gradient Method for Convex Problems with Stochastic Inexact Oracle

Pavel Dvurechensky; Alexander Gasnikov

In this paper, we introduce new methods for convex optimization problems with stochastic inexact oracle. Our first method is an extension of the Intermediate Gradient Method proposed by Devolder, Glineur and Nesterov for problems with deterministic inexact oracle. Our method can be applied to problems with composite objective function, both deterministic and stochastic inexactness of the oracle, and allows the use of a non-Euclidean setup. We estimate the rate of convergence in terms of the expectation of the non-optimality gap and provide a way to control the probability of large deviations from this rate. We also introduce two modifications of this method for strongly convex problems. For the first modification, we estimate the rate of convergence for the non-optimality gap expectation and, for the second, we provide a bound for the probability of large deviations from the rate of convergence in terms of the expectation of the non-optimality gap. All the rates lead to complexity estimates for the proposed methods that, up to a multiplicative constant, coincide with the lower complexity bound for the considered class of convex composite optimization problems with stochastic inexact oracle.
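The stochastic-inexact-oracle setting can be illustrated with a much simpler scheme than the Intermediate Gradient Method itself: plain stochastic gradient descent with iterate averaging, where a hypothetical oracle returns the true gradient corrupted by a small deterministic bias (the inexactness) plus zero-mean noise. All names and constants below are illustrative, not from the paper.

```python
import random

def noisy_oracle(x, bias=0.01, sigma=0.1):
    """Stochastic inexact oracle for f(x) = x^2: returns the true
    gradient 2*x plus a deterministic bias and zero-mean noise."""
    return 2.0 * x + bias + random.gauss(0.0, sigma)

def averaged_sgd(x0, steps=2000, eta=0.05):
    """Plain SGD with iterate averaging: averaging damps the
    zero-mean noise, while the oracle bias limits final accuracy."""
    x, avg = x0, 0.0
    for k in range(steps):
        x -= eta * noisy_oracle(x)
        avg += (x - avg) / (k + 1)   # running average of iterates
    return avg

random.seed(0)
print(averaged_sgd(5.0))  # near the minimizer 0 of f(x) = x^2
```

The averaged iterate settles at a distance from the optimum governed by the oracle bias, which mirrors the role the inexactness parameter plays in the convergence rates above.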


International Journal of Computer Mathematics | 2012

Selected mathematical problems of traffic flow theory

Alexander P. Buslaev; Alexander Gasnikov; Marina V. Yashina

Despite the long history of traffic flow theory, many of its aspects remain unsettled, provoke debate and criticism, and require further research. The main reasons for this are the high complexity of the object of study and the difficulty of formalizing its behaviour as a complex socio-technical system. This article considers mathematical statements of problems and some analytical results, ranging from follow-the-leader models to large-scale network models. For the follow-the-leader model, the corresponding class of nonlinear systems of ordinary differential equations is presented. Existence conditions for bounded motion under certain restrictions are obtained. The physical concepts behind models of mathematical physics for traffic flow are discussed, and mathematical results obtained by Soviet mathematicians are presented. Selected open problems are introduced, related to new mixed traffic models that combine deterministic and stochastic approaches, that is, classical dynamics and random walks. An approach to traffic flow modelling on networks is also discussed.
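A follow-the-leader chain of the kind described can be sketched as a small ODE simulation. The optimal-velocity (tanh) form and all constants below are common illustrative choices, not necessarily the exact class of systems analyzed in the article.

```python
import math

def optimal_velocity(gap):
    """Preferred speed as a function of headway (illustrative
    tanh form, increasing in the gap, zero at gap = 0-ish)."""
    return math.tanh(gap - 2.0) + math.tanh(2.0)

def simulate(n=5, steps=4000, dt=0.01, a=1.0):
    """Euler integration of a follow-the-leader chain: each car
    relaxes its speed toward a velocity dictated by the gap to
    the car ahead; the lead car sees a fixed virtual gap."""
    x = [4.0 * i for i in range(n)]   # positions, car n-1 leads
    v = [0.0] * n
    for _ in range(steps):
        for i in range(n - 1):
            gap = x[i + 1] - x[i]
            v[i] += dt * a * (optimal_velocity(gap) - v[i])
        v[-1] += dt * a * (optimal_velocity(4.0) - v[-1])
        for i in range(n):
            x[i] += dt * v[i]
    return x

print(simulate())  # positions stay ordered: no collisions here
```

Bounded motion in this toy run corresponds to the gaps settling near the equilibrium headway of the velocity function, a simple instance of the existence conditions the article discusses.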


arXiv: Optimization and Control | 2016

Fast Primal-Dual Gradient Method for Strongly Convex Minimization Problems with Linear Constraints

Alexey Chernov; Pavel Dvurechensky; Alexander Gasnikov

In this paper we consider a class of optimization problems with a strongly convex objective function and a feasible set given by the intersection of a simple convex set with a set defined by a number of linear equality and inequality constraints. Many optimization problems in applications can be stated in this form, examples being entropy-linear programming, ridge regression, the elastic net, and regularized optimal transport. We extend the Fast Gradient Method applied to the dual problem in order to make it primal-dual, so that it not only solves the dual problem but also constructs a nearly optimal and nearly feasible solution of the primal problem. We also prove a theorem about the convergence rate of the proposed algorithm in terms of the objective function and the infeasibility of the linear constraints.
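The underlying mechanism can be sketched on a toy problem, using plain (non-accelerated) dual gradient ascent rather than the Fast Gradient Method of the paper: for a strongly convex objective the dual is smooth, the Lagrangian minimizer gives a primal candidate, and the dual gradient is exactly the constraint residual.

```python
def dual_gradient(a, b, steps=200, t=0.1):
    """Dual gradient ascent for: min 0.5*||x||^2  s.t.  <a, x> = b.
    The Lagrangian minimizer is x(lam) = -lam * a, and the dual
    gradient is the primal constraint residual <a, x(lam)> - b."""
    lam = 0.0
    for _ in range(steps):
        x = [-lam * ai for ai in a]               # primal candidate
        residual = sum(ai * xi for ai, xi in zip(a, x)) - b
        lam += t * residual                       # ascent on the dual
    return [-lam * ai for ai in a]

x = dual_gradient([1.0, 2.0], 5.0)
print(x)  # -> approx [1.0, 2.0], the projection of 0 onto <a,x>=5
```

As the dual variable converges, the primal candidates become nearly feasible and nearly optimal, which is the property the paper's accelerated primal-dual construction guarantees with an explicit rate.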


Automation and Remote Control | 2016

Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex

Alexander Gasnikov; Anastasia Lagunovskaya; Ilnura Usmanova; Fedor A. Fedorenko

In this paper we propose a modification of the mirror descent method for non-smooth stochastic convex optimization problems on the unit simplex. The problems considered differ from the classical setting in that only realizations of the function values are available. Our purpose is to derive the convergence rate of the proposed method and to determine the level of noise that does not significantly affect the convergence rate.
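The base scheme being modified, entropic mirror descent on the simplex (multiplicative weights), can be sketched in its deterministic, exact-gradient form; the paper's stochastic, function-value-only setting is not reproduced here, and the objective and step size below are illustrative.

```python
import math

def mirror_descent_simplex(c, steps=300, eta=0.1):
    """Entropic mirror descent for min <c, x> over the unit simplex.
    With the entropy prox-function the update is a multiplicative
    reweighting by exp(-eta * gradient), then renormalization."""
    n = len(c)
    x = [1.0 / n] * n
    for _ in range(steps):
        w = [xi * math.exp(-eta * ci) for xi, ci in zip(x, c)]
        s = sum(w)
        x = [wi / s for wi in w]          # stays on the simplex
    return x

x = mirror_descent_simplex([3.0, 1.0, 2.0])
print(x)  # mass concentrates on coordinate 1, the smallest cost
```

The multiplicative update keeps iterates on the simplex for free, which is why mirror descent with the entropy setup is the natural starting point for simplex-constrained problems like the ones above.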


Automation and Remote Control | 2017

Stochastic online optimization. Single-point and multi-point non-linear multi-armed bandits. Convex and strongly-convex case

Alexander Gasnikov; Ekaterina Krymova; Anastasia Lagunovskaya; Ilnura Usmanova; Fedor A. Fedorenko

In this paper a gradient-free modification of the mirror descent method for convex stochastic online optimization problems is proposed. The crucial assumption in the problem setting is that function realizations are observed with small noise. The aim of this paper is to derive the convergence rate of the proposed methods and to determine the noise level that does not significantly affect the convergence rate.
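The multi-point idea behind such gradient-free methods can be sketched with a two-point estimator: probe the function along a random direction and use a symmetric difference quotient as a surrogate gradient. The objective, step sizes, and plain gradient-descent outer loop below are illustrative, not the paper's method.

```python
import math, random

def two_point_grad(f, x, delta=1e-3):
    """Two-point gradient-free estimate: for a random unit direction
    e, n * (f(x + delta*e) - f(x - delta*e)) / (2*delta) * e is an
    unbiased-in-expectation surrogate for the gradient of smooth f."""
    n = len(x)
    e = [random.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(ei * ei for ei in e))
    e = [ei / norm for ei in e]                  # uniform direction
    xp = [xi + delta * ei for xi, ei in zip(x, e)]
    xm = [xi - delta * ei for xi, ei in zip(x, e)]
    scale = n * (f(xp) - f(xm)) / (2 * delta)
    return [scale * ei for ei in e]

def f(x):                                        # toy objective
    return sum(xi * xi for xi in x)

random.seed(1)
x = [3.0, -2.0]
for _ in range(3000):
    g = two_point_grad(f, x)
    x = [xi - 0.01 * gi for xi, gi in zip(x, g)]
print(x)  # drifts toward the minimizer (0, 0)
```

Only function values are queried, which is exactly the bandit-style feedback model of the paper; two-point probing is what keeps the variance of the surrogate gradient manageable.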


Optimization Methods & Software | 2018

A universal modification of the linear coupling method

Sergey Guminov; Alexander Gasnikov; Anton Anikin; Alexander Gornov

In the late sixties, N. Shor and B. Polyak independently proposed optimal first-order methods for solving non-smooth convex optimization problems. In 1982 A. Nemirovski proposed optimal first-order methods for solving smooth convex optimization problems, which utilized auxiliary line search. In 1985 A. Nemirovski and Yu. Nesterov proposed a parametric family of optimal first-order methods for solving convex optimization problems with intermediate smoothness. In 2013 Yu. Nesterov proposed a universal gradient method which combined all the good properties of the previous methods, except the possibility of using auxiliary line search. In practice, auxiliary line search typically improves performance on many tasks. In this paper, we propose what appears to be the first method for non-smooth convex optimization that allows the use of a line search procedure. Moreover, it is based on the universal gradient method, which does not require any a priori information about the actual degree of smoothness of the problem. Numerical experiments demonstrate that the proposed method is, in some cases, considerably faster than Nesterov's universal gradient method.
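The universality mechanism mentioned above can be sketched as a backtracking step in the spirit of Nesterov's universal gradient method: double a local "Lipschitz constant" until an inexact quadratic upper model holds at the trial point, without knowing the smoothness level in advance. The toy objective and constants are illustrative, and this is not the paper's universal linear coupling method.

```python
def universal_gradient_step(f, grad, x, L, eps):
    """One backtracking step: double L until the quadratic upper
    model with slack eps/2 holds at the gradient trial point y."""
    g = grad(x)
    gg = sum(gi * gi for gi in g)
    while True:
        y = [xi - gi / L for xi, gi in zip(x, g)]
        # model value at y: f(x) + <g, y-x> + (L/2)||y-x||^2 + eps/2
        if f(y) <= f(x) - gg / (2 * L) + eps / 2:
            return y, L / 2          # be optimistic next iteration
        L *= 2

def f(x):
    return sum(xi * xi for xi in x)

def grad(x):
    return [2.0 * xi for xi in x]

x, L = [4.0, -1.0], 0.1
for _ in range(60):
    x, L = universal_gradient_step(f, grad, x, max(L, 1e-3), 1e-6)
print(x)  # approaches the minimizer (0, 0)
```

The eps slack is what lets the same adaptive rule cover smooth, Hölder-smooth, and non-smooth objectives; halving L after each accepted step plays the role of the restarted estimate in the universal method.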


Numerical Analysis and Applications | 2017

About the Power Law of the PageRank Vector Component Distribution. Part 1. Numerical Methods for Finding the PageRank Vector

Alexander Gasnikov; E. V. Gasnikova; P. E. Dvurechensky; A. A. M. Mohammed; E. O. Chernousova

In Part 1 of this paper we consider the web-page ranking problem, also known as the problem of finding the PageRank vector, or the Google problem. We discuss the link between this problem and the ergodic theorem and describe different numerical methods to solve this problem together with their theoretical background, such as Markov chain Monte Carlo and equilibrium in a macrosystem.
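The most basic numerical method for the PageRank vector, power iteration on the damped link matrix, can be sketched as follows; the tiny four-page graph and damping factor are illustrative, not data from the paper.

```python
def pagerank(links, d=0.85, iters=100):
    """Power iteration for PageRank: repeatedly apply the damped
    link operator to a probability vector; by the ergodic theorem
    it approaches the chain's stationary distribution."""
    n = len(links)
    p = [1.0 / n] * n
    for _ in range(iters):
        new = [(1.0 - d) / n] * n
        for i, outs in enumerate(links):
            if outs:
                share = d * p[i] / len(outs)
                for j in outs:
                    new[j] += share
            else:                      # dangling page: spread evenly
                for j in range(n):
                    new[j] += d * p[i] / n
        p = new
    return p

# tiny 4-page web: links[i] lists the pages that page i points to
ranks = pagerank([[1, 2], [2], [0], [2]])
print(ranks)  # page 2, cited by everyone, gets the highest rank
```

Each iteration preserves total probability mass, so the iterates stay on the simplex and converge to the PageRank vector; the paper's discussion covers faster alternatives such as Markov chain Monte Carlo.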


arXiv: Optimization and Control | 2017

Optimal Algorithms for Distributed Optimization

César A. Uribe; Soomin Lee; Alexander Gasnikov; Angelia Nedic


arXiv: Optimization and Control | 2014

Stochastic Intermediate Gradient Method for Convex Problems with Inexact Stochastic Oracle

Pavel Dvurechensky; Alexander Gasnikov


international conference on machine learning | 2018

Computational Optimal Transport: Complexity by Accelerated Gradient Descent Is Better Than by Sinkhorn's Algorithm

Pavel Dvurechensky; Alexander Gasnikov; Alexey Kroshnin

Collaboration


Dive into Alexander Gasnikov's collaborations.

Top Co-Authors

Pavel Dvurechensky (Moscow Institute of Physics and Technology)
Yurii Nesterov (Catholic University of Leuven)
Anastasia Lagunovskaya (Moscow Institute of Physics and Technology)
Anton Anikin (Russian Academy of Sciences)
Alexander Gornov (Russian Academy of Sciences)
Alexey Chernov (Moscow Institute of Physics and Technology)
Eduard Gorbunov (Moscow Institute of Physics and Technology)
Ilnura Usmanova (Moscow Institute of Physics and Technology)
Anastasia Bayandina (Moscow Institute of Physics and Technology)
Ekaterina Krymova (Russian Academy of Sciences)