Olga Klopp
CEREMADE
Publications
Featured research published by Olga Klopp.
Electronic Journal of Statistics | 2011
Olga Klopp
In this paper we consider the trace regression model. Assume that we observe a small set of entries or linear combinations of entries of an unknown matrix A_0 corrupted by noise. We propose a new rank penalized estimator of A_0. For this estimator we establish a general oracle inequality for the prediction error, both in probability and in expectation. We also prove upper bounds for the rank of our estimator. Then we apply our general results to the problems of matrix completion and matrix regression. In these cases our estimator has a particularly simple form: it is obtained by hard thresholding of the singular values of a matrix constructed from the observations.

Probability Theory and Related Fields | 2017
Olga Klopp; Karim Lounici; Alexandre B. Tsybakov

Electronic Journal of Statistics | 2015
Olga Klopp

Probability Theory and Related Fields | 2018
Olga Klopp; Nicolas Verzelen

Problems of Information Transmission | 2015
Olga Klopp; Alexandre B. Tsybakov
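The estimator described in the abstract reduces to hard thresholding of singular values. A minimal illustrative sketch (the threshold tau here is a plain tuning parameter, not the paper's calibrated choice):

```python
import numpy as np

def hard_threshold_svd(Y, tau):
    """Keep only the singular values of Y that exceed tau.

    Sketch of a rank penalized estimator of the kind described above:
    the estimate is obtained by hard thresholding the singular values
    of a matrix built from the observations. The threshold tau is an
    assumed tuning parameter (the paper calibrates it to the noise).
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_kept = np.where(s > tau, s, 0.0)  # hard thresholding step
    return U @ np.diag(s_kept) @ Vt

# toy usage: a rank-1 signal observed with small additive noise
rng = np.random.default_rng(0)
A0 = np.outer(rng.normal(size=20), rng.normal(size=15))
Y = A0 + 0.01 * rng.normal(size=A0.shape)
A_hat = hard_threshold_svd(Y, tau=1.0)
# the noise-level singular values fall below tau, so A_hat is low rank
```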
Annals of Statistics | 2015
Olga Klopp; Marianna Pensky
This paper considers the problem of estimation of a low-rank matrix when most of its entries are not observed and some of the observed entries are corrupted. The observations are noisy realizations of a sum of a low-rank matrix, which we wish to estimate, and a second matrix having a complementary sparse structure such as elementwise sparsity or columnwise sparsity. We analyze a class of estimators obtained as solutions of a constrained convex optimization problem combining the nuclear norm penalty and a convex relaxation penalty for the sparse constraint. Our assumptions allow for simultaneous presence of random and deterministic patterns in the sampling scheme. We establish rates of convergence for the low-rank component from partial and corrupted observations in the presence of noise and we show that these rates are minimax optimal up to logarithmic factors.
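Estimators of this kind are usually computed by proximal methods; as a hedged sketch (not the authors' algorithm), the two proximal maps such a solver alternates between are singular value soft thresholding for the nuclear norm and entrywise soft thresholding for the l_1 relaxation of elementwise sparsity:

```python
import numpy as np

def prox_nuclear(Z, lam):
    """Prox of lam * nuclear norm: soft-threshold the singular values."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt

def prox_l1(Z, lam):
    """Prox of lam * sum_ij |Z_ij|: entrywise soft thresholding."""
    return np.sign(Z) * np.maximum(np.abs(Z) - lam, 0.0)

# toy usage: one proximal step on each component of a low-rank + sparse sum
Z = np.outer([1.0, 2.0], [3.0, 1.0]) + np.array([[0.0, 5.0], [0.0, 0.0]])
L_step = prox_nuclear(Z, lam=0.5)   # pushes toward low rank
S_step = prox_l1(Z, lam=0.5)        # pushes toward entrywise sparsity
```

The solver loop, step sizes, and penalty weights are deliberately omitted; they depend on the sampling scheme and noise level analyzed in the paper.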
Electronic Journal of Statistics | 2013
Olga Klopp; Marianna Pensky
We consider the matrix completion problem, where the aim is to estimate a large data matrix for which only a relatively small random subset of its entries is observed. Quite popular approaches to the matrix completion problem are iterative thresholding methods. In spite of their empirical success, the theoretical guarantees of such iterative thresholding methods are poorly understood. The goal of this paper is to provide strong theoretical guarantees, similar to those obtained for nuclear-norm penalization methods and one-step thresholding methods, for an iterative thresholding algorithm which is a modification of the softImpute algorithm. An important consequence of our result is the exact minimax optimal rates of convergence for the matrix completion problem, which were known until now only up to a logarithmic factor.
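A minimal sketch of a softImpute-style iteration, assuming the simplest "fill in the missing entries, then soft-threshold the singular values" loop (the paper's modification and its theoretical calibration of lam are omitted; lam is a hypothetical tuning parameter here):

```python
import numpy as np

def soft_impute(Y, mask, lam, n_iter=100):
    """SoftImpute-style matrix completion (illustrative sketch).

    Y    : observed matrix (values where mask is False are ignored)
    mask : boolean array, True where the entry is observed
    lam  : soft-thresholding level (assumed tuning parameter)
    """
    X = np.zeros_like(Y)
    for _ in range(n_iter):
        # fill the unobserved entries with the current estimate...
        Z = np.where(mask, Y, X)
        # ...then soft-threshold the singular values of the filled matrix
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        X = U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt
    return X

# toy usage: complete a rank-1 matrix from roughly 60% of its entries
rng = np.random.default_rng(0)
Y = np.outer([1.0, 2.0, 3.0, 4.0], [1.0, -1.0, 2.0])
mask = rng.random(Y.shape) < 0.6
X_hat = soft_impute(Y, mask, lam=0.05)
```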
Bernoulli | 2014
Olga Klopp
Consider the twin problems of estimating the connection probability matrix of an inhomogeneous random graph and the graphon of a W-random graph. We establish the minimax estimation rates with respect to the cut metric for classes of block constant matrices and step function graphons. Surprisingly, our results imply that, from the minimax point of view, the raw data, that is, the adjacency matrix of the observed graph, is already optimal, and more involved procedures cannot improve the convergence rates for this metric. This phenomenon contrasts with optimal rates of convergence with respect to other classical distances for graphons, such as the l_1 distance.

Annals of Statistics | 2017
Olga Klopp; Alexandre B. Tsybakov; Nicolas Verzelen

Neural Information Processing Systems | 2014
Jean Lafond; Olga Klopp; Eric Moulines; Joseph Salmon
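The inhomogeneous random graph setting discussed above can be illustrated by sampling an adjacency matrix from a block constant connection probability matrix (a toy stand-in for a step function graphon); the point of the result is that, in cut distance, the sampled adjacency matrix A is itself already a rate-optimal estimator of Theta:

```python
import numpy as np

def sample_graph(Theta, rng):
    """Sample a symmetric adjacency matrix A with P(A_ij = 1) = Theta_ij."""
    n = Theta.shape[0]
    coins = rng.random((n, n)) < Theta      # independent edge indicators
    A = np.triu(coins, 1)                   # keep the strict upper triangle
    return (A + A.T).astype(float)          # symmetrize, no self-loops

# toy block constant Theta with two communities of three nodes each
n = 6
Theta = np.full((n, n), 0.1)
Theta[:3, :3] = Theta[3:, 3:] = 0.8
A = sample_graph(Theta, np.random.default_rng(0))
```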