Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Jean-Michel Loubes is active.

Publication


Featured research published by Jean-Michel Loubes.


Annals of Statistics | 2004

Least angle regression

Bradley Efron; Trevor Hastie; Iain M. Johnstone; Robert Tibshirani; Hemant Ishwaran; Keith Knight; Jean-Michel Loubes; Pascal Massart; David Madigan; Greg Ridgeway; Saharon Rosset; J. Zhu; Robert A. Stine; Berwin A. Turlach; Sanford Weisberg

DISCUSSION OF “LEAST ANGLE REGRESSION” BY EFRON ET AL.
By Jean-Michel Loubes and Pascal Massart, Université Paris-Sud

The issue of model selection has drawn the attention of both applied and theoretical statisticians for a long time. Indeed, there has been an enormous range of contribution in model selection proposals, including work by Akaike (1973), Mallows (1973), Foster and George (1994), Birgé and Massart (2001a) and Abramovich, Benjamini, Donoho and Johnstone (2000). Over the last decade, modern computer-driven methods have been developed such as All Subsets, Forward Selection, Forward Stagewise or Lasso. Such methods are useful in the setting of the standard linear model, where we observe noisy data and wish to predict the response variable using only a few covariates, since they provide automatically linear models that fit the data. The procedure described in this paper is, on the one hand, numerically very efficient and, on the other hand, very general, since, with slight modifications, it enables us to recover the estimates given by the Lasso and Stagewise.

1. Estimation procedure. The “LARS” method is based on a recursive procedure selecting, at each step, the covariates having largest absolute correlation with the response y. In the case of an orthogonal design, the estimates can then be viewed as an l…

DISCUSSION OF “LEAST ANGLE REGRESSION” BY EFRON ET AL.
By Berwin A. Turlach, University of Western Australia

I would like to begin by congratulating the authors (referred to below as EHJT) for their interesting paper in which they propose a new variable selection method (LARS) for building linear models and show how their new method relates to other methods that have been proposed recently. I found the paper to be very stimulating and found the additional insight that it provides about the Lasso technique to be of particular interest.

My comments center around the question of how we can select linear models that conform with the marginality principle [Nelder (1977, 1994) and McCullagh and Nelder (1989)]; that is, the response surface is invariant under scaling and translation of the explanatory variables in the model. Recently one of my interests was to explore whether the Lasso technique or the nonnegative garrote [Breiman (1995)] could be modified such that it incorporates the marginality principle. However, it does not seem to be a trivial matter to change the criteria that these techniques minimize in such a way that the marginality principle is incorporated in a satisfactory manner. On the other hand, it seems to be straightforward to modify the LARS technique to incorporate this principle. In their paper, EHJT address this issue somewhat in passing when they suggest toward the end of Section 3 that one first fit main effects only and interactions in a second step to control the order in which variables are allowed to enter the model. However, such a two-step procedure may have a somewhat less than optimal behavior as the following, admittedly artificial, example shows. Assume we have a vector of explanatory variables X = (X…

The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable.
Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived: (1) A simple modification of the LARS algorithm implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS modification efficiently implements Forward Stagewise linear regression, another promising new model selection method; this connection explains the similar numerical results previously observed for the Lasso and Stagewise, and helps us understand the properties of both methods, which are seen as constrained versions of the simpler LARS algorithm. (3) A simple approximation for the degrees of freedom of a LARS estimate is available, from which we derive a Cp estimate of prediction error; this allows a principled choice among the range of possible LARS estimates. LARS and its variants are computationally efficient: the paper describes a publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates.
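
As a concrete illustration of the path property described in the abstract, the following sketch computes every Lasso breakpoint with the LARS modification via scikit-learn's lars_path; the library, the simulated data and all parameter values are assumptions made here for illustration, not material from the paper.

import numpy as np
from sklearn.linear_model import lars_path

# Illustrative only: simulated regression data with a sparse truth.
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [3.0, -2.0, 1.5]          # only the first three covariates matter
y = X @ beta + 0.5 * rng.standard_normal(n)

# method="lasso" applies the LARS modification that returns the whole
# Lasso regularization path in a single pass.
alphas, active, coefs = lars_path(X, y, method="lasso")
print("order in which covariates enter:", active)
print("coefficients at the least-regularized end:", coefs[:, -1].round(2))

On data like this the relevant covariates typically enter the path first, which is the behavior a Cp-type stopping rule is meant to exploit.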


Electronic Journal of Statistics | 2007

Semi-parametric estimation of shifts

Fabrice Gamboa; Jean-Michel Loubes; Elie Maza

We observe a large number of functions differing from each other only by a translation parameter. While the main pattern is unknown, we propose to estimate the shift parameters using M-estimators. Fourier transform enables to transform this statistical problem into a semi-parametric framework. We study the convergence of the estimator and provide its asymptotic behavior. Moreover, we use the method in the applied case of velocity curve forecasting.
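
A minimal sketch of the underlying idea, not the authors' estimator: a translation between two sampled curves can be read off the peak of their circular cross-correlation, computed through the Fourier transform (NumPy only; the data are simulated here for illustration).

import numpy as np

def estimate_shift(f, g):
    """Return the circular shift (in samples) that best aligns g with f."""
    # Cross-correlation via the convolution theorem.
    corr = np.fft.ifft(np.fft.fft(f) * np.conj(np.fft.fft(g))).real
    return int(np.argmax(corr))

t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
pattern = np.sin(t) + 0.3 * np.sin(3 * t)
true_shift = 40
noisy_copy = np.roll(pattern, true_shift) \
    + 0.05 * np.random.default_rng(1).standard_normal(t.size)

print(estimate_shift(noisy_copy, pattern))  # close to 40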


Journal of Mathematical Imaging and Vision | 2009

Statistical M-Estimation and Consistency in Large Deformable Models for Image Warping

Jérémie Bigot; Sébastien Gadat; Jean-Michel Loubes

The problem of defining appropriate distances between shapes or images and modeling the variability of natural images by group transformations is at the heart of modern image analysis. A current trend is the study of probabilistic and statistical aspects of deformation models, and the development of consistent statistical procedure for the estimation of template images. In this paper, we consider a set of images randomly warped from a mean template which has to be recovered. For this, we define an appropriate statistical parametric model to generate random diffeomorphic deformations in two-dimensions. Then, we focus on the problem of estimating the mean pattern when the images are observed with noise. This problem is challenging both from a theoretical and a practical point of view. M-estimation theory enables us to build an estimator defined as a minimizer of a well-tailored empirical criterion. We prove the convergence of this estimator and propose a gradient descent algorithm to compute this M-estimator in practice. Simulations of template extraction and an application to image clustering and classification are also provided.
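
A toy analogue of the M-estimation step only, with none of the paper's deformable-model machinery: a robust template of noisy one-dimensional signals, defined as the minimizer of an empirical Huber criterion and computed by plain gradient descent (NumPy only; everything here is an illustrative assumption).

import numpy as np

def huber_grad(r, delta=1.0):
    # Gradient of the Huber loss with respect to the residual r.
    return np.where(np.abs(r) <= delta, r, delta * np.sign(r))

rng = np.random.default_rng(2)
template = np.sin(np.linspace(0, 2 * np.pi, 128))
# Noisy observations of the template, a few of them heavily corrupted.
obs = template + 0.1 * rng.standard_normal((50, 128))
obs[:5] += 3.0  # outliers

estimate = obs.mean(axis=0)           # initialize at the plain mean
for _ in range(500):                  # gradient descent on the empirical criterion
    grad = huber_grad(estimate - obs).mean(axis=0)
    estimate -= 0.5 * grad

print(float(np.abs(estimate - template).max()))  # small despite the outliers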


Electronic Journal of Statistics | 2008

Adaptive complexity regularization for linear inverse problems

Jean-Michel Loubes; Carenne Ludeña

We tackle the problem of building adaptive estimation procedures for ill-posed inverse problems. For general regularization methods depending on tuning parameters, we construct a penalized method that selects the optimal smoothing sequence without prior knowledge of the regularity of the function to be estimated. We provide for such estimators oracle inequalities and optimal rates of convergence. This penalized approach is applied to Tikhonov regularization and to regularization by projection.
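
A hedged sketch of the Tikhonov case only, with the tuning parameter chosen by a simple discrepancy rule rather than the paper's penalized criterion; the operator, noise level and grid are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)
n = 60
# Ill-conditioned forward operator (a discretized integration/smoothing matrix).
A = np.tril(np.ones((n, n))) / n
x_true = np.sin(np.linspace(0, 3, n))
sigma = 0.01
y = A @ x_true + sigma * rng.standard_normal(n)

def tikhonov(A, y, lam):
    # x_lam = argmin ||A x - y||^2 + lam ||x||^2
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y)

# Discrepancy rule: keep the largest lambda whose residual reaches the noise level.
lambdas = np.logspace(-8, 0, 50)
for lam in lambdas[::-1]:
    x_hat = tikhonov(A, y, lam)
    if np.linalg.norm(A @ x_hat - y) <= np.sqrt(n) * sigma:
        break

print(float(lam), float(np.linalg.norm(x_hat - x_true)))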


Bernoulli | 2015

Distribution's template estimate with Wasserstein metrics

Emmanuel Boissard; Thibault Le Gouic; Jean-Michel Loubes

In this paper we tackle the problem of comparing distributions of random variables and defining a mean pattern between a sample of random events. Using barycenters of measures in the Wasserstein space, we propose an iterative version as an estimation of the mean distribution. Moreover, when the distributions are a common measure warped by a centered random operator, then the barycenter enables to recover this distribution template.
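
A minimal one-dimensional sketch of the barycenter idea, not the paper's general procedure: for real-valued distributions the Wasserstein barycenter can be obtained by averaging quantile functions, here over simulated samples (NumPy only; the data and grid are assumptions).

import numpy as np

rng = np.random.default_rng(4)
# Samples drawn from randomly shifted and rescaled versions of a common template law.
samples = [rng.normal(loc=rng.uniform(-1, 1), scale=rng.uniform(0.8, 1.2), size=500)
           for _ in range(20)]

levels = np.linspace(0.005, 0.995, 200)
# Quantile function of each sample, then their pointwise average.
barycenter_quantiles = np.mean([np.quantile(s, levels) for s in samples], axis=0)

print(barycenter_quantiles[[0, 100, -1]].round(2))  # rough support of the template estimate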


Electronic Journal of Statistics | 2012

Non asymptotic minimax rates of testing in signal detection with heterogeneous variances

Béatrice Laurent; Jean-Michel Loubes; Clément Marteau

The aim of this paper is to establish non-asymptotic minimax rates of testing for goodness-of-fit hypotheses in a heteroscedastic setting. More precisely, we deal with sequences $(Y_j)_{j\in J}$ of independent Gaussian random variables, having mean…
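
The abstract above breaks off mid-sentence; the heteroscedastic Gaussian sequence model it presumably refers to is the standard one below (an assumed reconstruction for the reader's convenience, not text quoted from the paper).

% Assumed reconstruction of the standard heteroscedastic Gaussian sequence model
\[
  Y_j = \theta_j + \sigma_j\,\varepsilon_j, \qquad
  \varepsilon_j \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0,1), \qquad j \in J,
\]
where the variances $\sigma_j^2$ may differ across coordinates, and the goodness-of-fit problem consists in testing $H_0\colon \theta = \theta^0$ against alternatives separated from $\theta^0$ in $\ell_2$-norm.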


Mathematical Methods of Statistics | 2009

Estimation of the distribution of random shifts deformation

Ismael Castillo; Jean-Michel Loubes



Electronic Journal of Statistics | 2010

Nonparametric estimation of covariance functions by model selection

Jérémie Bigot; Rolando J. Biscay; Jean-Michel Loubes; Lilian Muñiz-Alvarez



Statistics and Computing | 2011

Non parametric estimation of the structural expectation of a stochastic increasing function

Jean-François Dupuy; Jean-Michel Loubes; Elie Maza



IEEE Transactions on Intelligent Transportation Systems | 2016

Review and Perspective for Distance-Based Clustering of Vehicle Trajectories

Philippe C. Besse; Brendan Guillouet; Jean-Michel Loubes; Francois Royer


Collaboration


Dive into Jean-Michel Loubes's collaborations.

Top Co-Authors

Fabrice Gamboa
Institut de Mathématiques de Toulouse

Elie Maza
University of Toulouse

Philippe Besse
Institut national des sciences appliquées de Toulouse

Jérémie Bigot
National Polytechnic Institute of Toulouse

Hélène Lescornel
Institut de Mathématiques de Toulouse

Brendan Guillouet
Institut de Mathématiques de Toulouse

François Bachoc
Institut de Mathématiques de Toulouse

Paul Rochet
University of Toulouse