Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Olga Klopp is active.

Publication


Featured research published by Olga Klopp.


Electronic Journal of Statistics | 2011

Rank penalized estimators for high-dimensional matrices

Olga Klopp

In this paper we consider the trace regression model. Assume that we observe a small set of entries or linear combinations of entries of an unknown matrix A_0 corrupted by noise. We propose a new rank penalized estimator of A_0. For this estimator we establish a general oracle inequality for the prediction error both in probability and in expectation. We also prove upper bounds for the rank of our estimator. Then, we apply our general results to the problems of matrix completion and matrix regression. In these cases our estimator has a particularly simple form: it is obtained by hard thresholding of the singular values of a matrix constructed from the observations.
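
In the matrix completion case, the estimator described above amounts to hard thresholding of the singular values of a matrix built from the observations. The following is a minimal numpy sketch of that idea, assuming uniform sampling and a hand-picked threshold lam; neither of these choices reproduces the paper's exact tuning.

```python
import numpy as np

def hard_threshold_svd(X, lam):
    """Keep only singular values above the threshold lam (illustrative choice).

    X is a proxy matrix built from the observations, e.g. the zero-filled
    observed matrix rescaled by the inverse sampling probability.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_kept = np.where(s > lam, s, 0.0)          # hard thresholding of singular values
    return U @ np.diag(s_kept) @ Vt

# Toy matrix completion example: observe roughly 50% of a rank-2 matrix.
rng = np.random.default_rng(0)
A0 = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 40))   # unknown low-rank matrix
mask = rng.random(A0.shape) < 0.5                          # observed entries
X = np.where(mask, A0, 0.0) / 0.5                          # unbiased proxy under uniform sampling
A_hat = hard_threshold_svd(X, lam=20.0)                    # lam chosen by hand here
print(np.linalg.norm(A_hat - A0) / np.linalg.norm(A0))     # relative estimation error
```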


Probability Theory and Related Fields | 2017

Robust matrix completion

Olga Klopp; Karim Lounici; Alexandre B. Tsybakov

This paper considers the problem of estimation of a low-rank matrix when most of its entries are not observed and some of the observed entries are corrupted. The observations are noisy realizations of a sum of a low-rank matrix, which we wish to estimate, and a second matrix having a complementary sparse structure such as elementwise sparsity or columnwise sparsity. We analyze a class of estimators obtained as solutions of a constrained convex optimization problem combining the nuclear norm penalty and a convex relaxation penalty for the sparse constraint. Our assumptions allow for simultaneous presence of random and deterministic patterns in the sampling scheme. We establish rates of convergence for the low-rank component from partial and corrupted observations in the presence of noise and we show that these rates are minimax optimal up to logarithmic factors.
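
The estimators here combine a nuclear-norm penalty for the low-rank part with a convex relaxation (for instance an entrywise l_1 norm) for the sparse corruption. The sketch below is one generic way to attack such an objective, via proximal gradient steps; the penalty weights, step size and iteration count are illustrative assumptions, not the paper's constrained formulation or constants.

```python
import numpy as np

def svt(M, tau):
    """Singular value soft-thresholding: the proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def robust_completion(Y, mask, lam_nuc=1.0, lam_l1=0.1, step=0.5, iters=200):
    """Proximal gradient steps on L (low-rank part) and S (sparse corruption).

    Y holds the observed entries (zeros elsewhere); mask marks observed entries.
    lam_nuc, lam_l1, step and iters are illustrative, not the paper's choices.
    """
    L = np.zeros_like(Y, dtype=float)
    S = np.zeros_like(Y, dtype=float)
    for _ in range(iters):
        R = mask * (L + S - Y)                    # gradient of the squared loss on observed entries
        L = svt(L - step * R, step * lam_nuc)     # prox of the nuclear norm
        T = S - step * R
        S = np.sign(T) * np.maximum(np.abs(T) - step * lam_l1, 0.0)  # prox of the l_1 norm
    return L, S
```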


Electronic Journal of Statistics | 2015

Matrix completion by singular value thresholding: Sharp bounds

Olga Klopp

We consider the matrix completion problem where the aim is to estimate a large data matrix for which only a relatively small random subset of its entries is observed. Quite popular approaches to the matrix completion problem are iterative thresholding methods. In spite of their empirical success, the theoretical guarantees of such iterative thresholding methods are poorly understood. The goal of this paper is to provide strong theoretical guarantees, similar to those obtained for nuclear-norm penalization methods and one-step thresholding methods, for an iterative thresholding algorithm which is a modification of the softImpute algorithm. An important consequence of our result is the exact minimax optimal rates of convergence for the matrix completion problem, which were known until now only up to a logarithmic factor.
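
The softImpute pattern analysed here alternates between imputing the missing entries with the current estimate and soft-thresholding the singular values of the completed matrix. Below is a bare-bones sketch of that loop with a fixed, hand-picked threshold; it is not the paper's specific modification or tuning.

```python
import numpy as np

def soft_impute(Y, mask, lam=1.0, iters=100):
    """softImpute-style iteration for matrix completion (illustrative parameters).

    Y: matrix carrying the observed entries (arbitrary values elsewhere);
    mask: boolean array marking observed entries; lam: soft threshold.
    """
    A = np.zeros_like(Y, dtype=float)
    for _ in range(iters):
        filled = np.where(mask, Y, A)                       # impute missing entries with current estimate
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        A = U @ np.diag(np.maximum(s - lam, 0.0)) @ Vt      # soft-threshold the singular values
    return A
```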


Probability Theory and Related Fields | 2018

Optimal graphon estimation in cut distance

Olga Klopp; Nicolas Verzelen

Consider the twin problems of estimating the connection probability matrix of an inhomogeneous random graph and the graphon of a W-random graph. We establish the minimax estimation rates with respect to the cut metric for classes of block constant matrices and step function graphons. Surprisingly, our results imply that, from the minimax point of view, the raw data, that is, the adjacency matrix of the observed graph, is already optimal and more involved procedures cannot improve the convergence rates for this metric. This phenomenon contrasts with optimal rates of convergence with respect to other classical distances for graphons such as the l_1
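
To make the objects concrete: a W-random graph draws latent positions uniformly on [0, 1] and connects nodes independently with probabilities given by the graphon, and the abstract's message is that, for the cut metric, the raw adjacency matrix is already a rate-optimal estimate of the connection probability matrix. A small simulation sketch follows, with an arbitrary illustrative graphon W that is not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

def W(x, y):
    """Illustrative graphon on [0, 1]^2 (an arbitrary choice for the sketch)."""
    return 0.25 + 0.5 * x * y

# Sample a W-random graph: latent positions, connection probabilities, adjacency.
xi = rng.random(n)
Theta = W(xi[:, None], xi[None, :])                 # connection probability matrix
A = (rng.random((n, n)) < Theta).astype(float)
A = np.triu(A, 1)
A = A + A.T                                         # symmetric adjacency, no self-loops

# "Raw data" estimator of Theta with respect to the cut metric: the adjacency matrix itself.
Theta_hat = A
```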


Problems of Information Transmission | 2015

Estimation of matrices with row sparsity

Olga Klopp; Alexandre B. Tsybakov



Annals of Statistics | 2015

Sparse high-dimensional varying coefficient model: Nonasymptotic minimax study

Olga Klopp; Marianna Pensky



Electronic Journal of Statistics | 2013

Non-asymptotic approach to varying coefficient model

Olga Klopp; Marianna Pensky



Bernoulli | 2014

Noisy low-rank matrix completion with general sampling distribution

Olga Klopp



Annals of Statistics | 2017

Oracle inequalities for network models and sparse graphon estimation

Olga Klopp; Alexandre B. Tsybakov; Nicolas Verzelen


Neural Information Processing Systems | 2014

Probabilistic low-rank matrix completion on finite alphabets

Jean Lafond; Olga Klopp; Eric Moulines; Joseph Salmon


Collaboration


Dive into Olga Klopp's collaborations.

Top Co-Authors

Nicolas Verzelen

Institut national de la recherche agronomique

Marianna Pensky

University of Central Florida

Geneviève Robin

Chicago Metropolitan Agency for Planning

Jean Lafond

Institut Mines-Télécom
