Alexandre B. Tsybakov
ENSAE ParisTech
Publications
Featured research published by Alexandre B. Tsybakov.
Annals of Statistics | 2009
Peter J. Bickel; Ya'acov Ritov; Alexandre B. Tsybakov
We show that, under a sparsity scenario, the Lasso estimator and the Dantzig selector exhibit similar behavior. For both methods, we derive, in parallel, oracle inequalities for the prediction risk in the general nonparametric regression model, as well as bounds on the l_p estimation loss for 1 ≤ p ≤ 2 in the linear model when the number of variables can be much larger than the sample size.
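For orientation, the two estimators compared in this abstract are usually written as follows; these are standard formulations given as a point of reference, and the paper's exact normalization of the tuning parameter may differ. For a response vector Y, a design matrix X with n rows, and a tuning parameter \lambda > 0,

\hat{\beta}^{\mathrm{Lasso}} = \arg\min_{\beta} \; \frac{1}{n}\,\|Y - X\beta\|_2^2 + 2\lambda \|\beta\|_1,
\qquad
\hat{\beta}^{\mathrm{Dantzig}} = \arg\min \Big\{ \|\beta\|_1 \;:\; \big\| \tfrac{1}{n} X^{\top}(Y - X\beta) \big\|_\infty \le \lambda \Big\}.

Both penalize or constrain the l_1 norm of the coefficient vector, which is the common mechanism behind their similar behavior under sparsity.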
Archive | 2008
Alexandre B. Tsybakov
This is a concise text developed from lecture notes and ready to be used for a course at the graduate level. The main idea is to introduce the fundamental concepts of the theory while keeping the exposition suitable for a first approach to the field. Therefore, the results are not always given in the most general form but rather under assumptions that lead to shorter or more elegant proofs. The book has three chapters. Chapter 1 presents basic nonparametric regression and density estimators and analyzes their properties. Chapter 2 is devoted to a detailed treatment of minimax lower bounds. Chapter 3 develops more advanced topics: Pinsker's theorem, oracle inequalities, Stein shrinkage, and sharp minimax adaptivity. This book will be useful for researchers and graduate students interested in theoretical aspects of smoothing techniques. Many important and useful results on optimal and adaptive estimation are provided. As one of the leading mathematical statisticians working in nonparametrics, the author is an authority on the subject.
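As a concrete illustration of the "basic nonparametric regression and density estimators" mentioned for Chapter 1 (stated here from general knowledge of the topic rather than quoted from the book), the kernel density estimator and the Nadaraya-Watson regression estimator take the form

\hat{p}_n(x) = \frac{1}{n h} \sum_{i=1}^{n} K\!\left(\frac{X_i - x}{h}\right),
\qquad
\hat{f}_n(x) = \frac{\sum_{i=1}^{n} Y_i \, K\!\left(\frac{X_i - x}{h}\right)}{\sum_{i=1}^{n} K\!\left(\frac{X_i - x}{h}\right)},

where K is a kernel and h > 0 a bandwidth; analyzing the bias-variance trade-off of such estimators is the kind of property a first chapter on the subject studies.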
Electronic Journal of Statistics | 2007
Florentina Bunea; Alexandre B. Tsybakov; Marten H. Wegkamp
This paper studies oracle properties of l1-penalized least squares in the nonparametric regression setting with random design. We show that the penalized least squares estimator satisfies sparsity oracle inequalities, i.e., bounds in terms of the number of non-zero components of the oracle vector. The results are valid even when the dimension of the model is (much) larger than the sample size and the regression matrix is not positive definite. They can be applied to high-dimensional linear regression, to nonparametric adaptive regression estimation and to the problem of aggregation of arbitrary estimators. AMS 2000 subject classifications: Primary 62G08; secondary 62C20, 62G05, 62G20. Keywords and phrases: sparsity, oracle inequalities, Lasso, penalized least squares, nonparametric regression, dimension reduction, aggregation, mutual coherence, adaptive estimation.
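As a schematic illustration of what a sparsity oracle inequality looks like (a hedged sketch under unspecified assumptions, not the paper's exact statement): for an estimator \hat{f} built from a dictionary f_1, ..., f_M, such a bound typically has the form

\|\hat{f} - f\|^2 \le C \inf_{\lambda} \left\{ \|f_\lambda - f\|^2 + c\, \frac{M(\lambda)\,\log M}{n} \right\},
\qquad f_\lambda = \sum_{j=1}^{M} \lambda_j f_j, \quad M(\lambda) = \#\{\, j : \lambda_j \neq 0 \,\},

where the constants and the exact remainder term depend on the assumptions; the point is that the bound scales with the number of non-zero components of the oracle vector rather than with the total dimension M.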
Annals of Statistics | 2011
Vladimir Koltchinskii; Karim Lounici; Alexandre B. Tsybakov
This paper deals with the trace regression model where n entries or linear combinations of entries of an unknown m_1 × m_2 matrix A_0 corrupted by noise are observed. We propose a new nuclear norm penalized estimator of A_0.
Journal of Econometrics | 1997
Wolfgang Karl Härdle; Alexandre B. Tsybakov
Conference on Learning Theory | 2003
Alexandre B. Tsybakov
Annals of Statistics | 2011
Karim Lounici; Massimiliano Pontil; Sara van de Geer; Alexandre B. Tsybakov
Annals of Statistics | 2011
Philippe Rigollet; Alexandre B. Tsybakov
Annals of Statistics | 2007
Jean-Yves Audibert; Alexandre B. Tsybakov
Test | 2006
Peter J. Bickel; Bo Li; Alexandre B. Tsybakov; Sara van de Geer; Bin Yu; Teófilo Valdés; Carlos Rivero; Jianqing Fan; Aad van der Vaart
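Referring back to the trace regression abstract above (Annals of Statistics | 2011, Koltchinskii, Lounici and Tsybakov): a nuclear norm penalized estimator is commonly written in the schematic form below, given here for orientation as a standard formulation rather than a quotation from the paper,

\hat{A} = \arg\min_{A \in \mathbb{R}^{m_1 \times m_2}} \; \frac{1}{n} \sum_{i=1}^{n} \big( Y_i - \langle X_i, A \rangle \big)^2 + \lambda \|A\|_*,

where \langle X_i, A \rangle = \mathrm{tr}(X_i^{\top} A) are the observed linear combinations of entries of A_0, \|A\|_* is the nuclear norm (the sum of the singular values of A), and \lambda > 0 is a tuning parameter.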