Arkadii S. Nemirovski
Georgia Institute of Technology
Publications
Featured research published by Arkadii S. Nemirovski.
SIAM Journal on Optimization | 2008
Arkadii S. Nemirovski; Anatoli Juditsky; Guanghui Lan; Alexander Shapiro
In this paper we consider optimization problems where the objective function is given in the form of an expectation. A basic difficulty in solving such stochastic optimization problems is that the involved multidimensional integrals (expectations) cannot be computed with high accuracy. The aim of this paper is to compare two computational approaches based on Monte Carlo sampling techniques, namely, the stochastic approximation (SA) and the sample average approximation (SAA) methods. Both approaches have a long history. Current opinion is that the SAA method can efficiently use a specific (say, linear) structure of the considered problem, while the SA approach is a crude subgradient method which often performs poorly in practice. We intend to demonstrate that a properly modified SA approach can be competitive with, and even significantly outperform, the SAA method for a certain class of convex stochastic problems. We extend the analysis to convex-concave stochastic saddle point problems and present (in our opinion highly encouraging) results of numerical experiments.
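The "properly modified SA" of this paper is built on a constant stepsize and averaging of the iterates. As a minimal illustrative sketch (the toy problem, stepsize constant, and sample size below are assumptions for the example, not taken from the paper), projected SGD with iterate averaging looks like:

```python
import random

def robust_sa(sample, grad, x0, lo, hi, n_steps, step):
    """Projected SGD with a constant stepsize; returns the averaged iterate."""
    x, avg = x0, 0.0
    for t in range(1, n_steps + 1):
        xi = sample()
        g = grad(x, xi)                      # noisy (sub)gradient at x
        x = min(max(x - step * g, lo), hi)   # project onto the interval [lo, hi]
        avg += (x - avg) / t                 # running average of the iterates
    return avg

# Toy problem: minimize f(x) = E[(x - xi)^2] / 2 over [0, 2],
# with xi ~ Uniform[0, 2]; the minimizer is the mean, x* = 1.
random.seed(0)
n = 10_000
x_hat = robust_sa(
    sample=lambda: random.uniform(0.0, 2.0),
    grad=lambda x, xi: x - xi,               # stochastic gradient of f
    x0=0.0, lo=0.0, hi=2.0,
    n_steps=n, step=1.0 / n ** 0.5,          # constant O(1/sqrt(N)) stepsize
)
```

Averaging is what makes the constant-stepsize scheme robust: individual iterates keep oscillating at the noise level, while their average settles near the minimizer.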
Mathematical Programming | 2011
Anatoli Juditsky; Arkadii S. Nemirovski
We discuss necessary and sufficient conditions for a sensing matrix to be "s-good", i.e., to allow for exact ℓ1-recovery of sparse signals with s nonzero entries when no measurement noise is present. We then express the error bounds for imperfect ℓ1-recovery (nonzero measurement noise, a nearly s-sparse signal, a near-optimal solution of the optimization problem yielding the ℓ1-recovery) in terms of the characteristics underlying these conditions. Further, we demonstrate (and this is the principal result of the paper) that these characteristics, although difficult to evaluate, lead to verifiable sufficient conditions for exact sparse ℓ1-recovery and to efficiently computable upper bounds on those s for which a given sensing matrix is s-good. We also establish instructive links between our approach and basic concepts of compressed sensing theory, such as the Restricted Isometry and Restricted Eigenvalue properties.
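The paper's verifiable certificates are more refined than the classical ones, but the flavor of an "efficiently computable sufficient condition" can be sketched with the standard mutual-coherence bound: exact ℓ1-recovery of every s-sparse signal is guaranteed whenever s < (1 + 1/μ(A))/2, where μ(A) is the largest normalized inner product between distinct columns. (The tiny matrix and the coherence bound below are textbook material used for illustration, not the paper's construction.)

```python
import math

def coherence(cols):
    """Mutual coherence of a matrix given as a list of its column vectors."""
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))
    # Normalize each column to unit Euclidean norm.
    unit = [[a / math.sqrt(dot(c, c)) for a in c] for c in cols]
    return max(abs(dot(unit[i], unit[j]))
               for i in range(len(unit)) for j in range(i + 1, len(unit)))

# Columns of an illustrative 2x3 sensing matrix.
A_cols = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
mu = coherence(A_cols)
# Verifiable sufficient condition for s-goodness: s < (1 + 1/mu) / 2.
s_max = math.floor((1.0 + 1.0 / mu) / 2.0)
print(mu, s_max)  # mu ≈ 0.7071, s_max = 1
```

The point of the comparison is that μ(A) is computable in O(n²m) time for an m×n matrix, whereas the Restricted Isometry constants it stands in for are NP-hard to evaluate; the paper's contribution is sharper bounds of the same computable kind.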
arXiv: Optimization and Control | 2011
Anatoli Juditsky; Arkadii S. Nemirovski; Claire Tauvel
In this paper we consider iterative methods for stochastic variational inequalities (s.v.i.) with monotone operators. Our basic assumption is that the operator possesses both smooth and nonsmooth components. Further, only noisy observations of the problem data are available. We develop a novel Stochastic Mirror-Prox (SMP) algorithm for solving s.v.i. and show that with a convenient stepsize strategy it attains the optimal rates of convergence with respect to the problem parameters. We apply the SMP algorithm to stochastic composite minimization and describe particular applications to the stochastic semidefinite feasibility problem and to eigenvalue minimization.
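SMP is a stochastic version of the mirror-prox (extragradient) scheme: take an extrapolation step, then update using the operator evaluated at the extrapolated point. A minimal deterministic Euclidean sketch on the bilinear saddle point min_x max_y xy, whose monotone operator is F(x, y) = (y, -x) (the operator, stepsize, and iteration count below are illustrative assumptions, not the paper's setup):

```python
def extragradient(F, z0, step, n_steps):
    """Euclidean extragradient (mirror-prox with the Euclidean prox):
    extrapolate with the operator at z, then update with the operator
    at the extrapolated point."""
    x, y = z0
    for _ in range(n_steps):
        gx, gy = F(x, y)
        xh, yh = x - step * gx, y - step * gy   # extrapolation (half) step
        gx, gy = F(xh, yh)
        x, y = x - step * gx, y - step * gy     # update from the original point
    return x, y

# Monotone operator of min_x max_y x*y: F(x, y) = (d/dx, -d/dy) = (y, -x).
# Plain gradient descent-ascent spirals outward on this problem;
# the extragradient iterates contract toward the saddle point (0, 0).
F = lambda x, y: (y, -x)
x, y = extragradient(F, z0=(1.0, 1.0), step=0.5, n_steps=200)
```

For this linear skew-symmetric operator one can check the contraction directly: each extragradient step multiplies the iterate norm by sqrt(1 - step² + step⁴) < 1, which is why the extrapolation step is essential.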
Mathematical Programming | 2015
Zaid Harchaoui; Anatoli Juditsky; Arkadii S. Nemirovski
Motivated by some applications in signal processing and machine learning, we consider two convex optimization problems where, given a cone K, a norm ‖·‖ …
Mathematical Programming | 2011
Anatoli Juditsky; Fatma Kilinc Karzan; Arkadii S. Nemirovski
Annals of Statistics | 2012
Anatoli Juditsky; Fatma Kilinc Karzan; Arkadii S. Nemirovski; Boris T. Polyak
Mathematical Programming | 2013
Anatoli Juditsky; Fatma Kilinc Karzan; Arkadii S. Nemirovski
Bernoulli | 2015
Anatoli Juditsky; Arkadii S. Nemirovski
Annals of Statistics | 2000
Anatoli Juditsky; Arkadii S. Nemirovski
Machine Learning | 2013
Elmar Diederichs; Anatoli Juditsky; Arkadii S. Nemirovski; Vladimir Spokoiny