Publication


Featured research published by Magalie Fromont.


Machine Learning | 2007

Model selection by bootstrap penalization for classification

Magalie Fromont

We consider the binary classification problem. Given an i.i.d. sample drawn from the distribution of an \mathcal{X} × {0,1}-valued random pair, we propose to estimate the so-called Bayes classifier by minimizing the sum of the empirical classification error and a penalty term based on Efron's or i.i.d. weighted bootstrap samples of the data. We obtain exponential inequalities for such bootstrap-type penalties, which allow us to derive non-asymptotic properties for the corresponding estimators. In particular, we prove that these estimators achieve the global minimax risk over sets of functions built from Vapnik-Chervonenkis classes. The obtained results generalize those of Koltchinskii (2001) and Bartlett et al. (2002) for Rademacher penalties, which can thus be seen as special examples of bootstrap-type penalties. To illustrate this, we carry out an experimental study in which we compare the different methods for an intervals model selection problem.
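The selection rule sketched in this abstract, "empirical classification error plus a bootstrap-based penalty", can be written down in a few lines. The Python snippet below is only a hypothetical toy version: the class of threshold classifiers, the nested grids, and the exact penalty formula (a supremum of bootstrap-weighted risk gaps) are illustrative choices, not the estimator analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def stump_losses(x, y, thresholds):
    """0-1 losses of the candidate classifiers 1{x > t}, one column per
    threshold t (a toy VC class standing in for the paper's models)."""
    preds = (x[:, None] > thresholds[None, :]).astype(int)
    return (preds != y[:, None]).astype(float)           # shape (n, n_candidates)

def bootstrap_penalty(losses, n_boot=200):
    """Efron-bootstrap analogue of a Rademacher penalty: average, over
    resampling weights, of the largest gap between the bootstrap-weighted
    and the original empirical risks over the class (a simplified form)."""
    n = losses.shape[0]
    pen = 0.0
    for _ in range(n_boot):
        w = rng.multinomial(n, np.ones(n) / n) / n        # Efron bootstrap weights
        pen += np.max(np.abs((w - 1.0 / n) @ losses))     # sup_f |(P_n^W - P_n) loss_f|
    return pen / n_boot

def select_classifier(x, y, model_grids):
    """Pick, over nested grids of thresholds, the model minimizing
    'empirical risk of its best classifier + bootstrap penalty'."""
    best = None
    for grid in model_grids:
        losses = stump_losses(x, y, grid)
        crit = losses.mean(axis=0).min() + bootstrap_penalty(losses)
        if best is None or crit < best[0]:
            best = (crit, grid[np.argmin(losses.mean(axis=0))])
    return best[1]                                        # selected threshold
```

Here model_grids would be a list of threshold grids of increasing size, standing in very loosely for the nested models compared in the experimental study.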


Annals of Statistics | 2006

Adaptive goodness-of-fit tests in a density model

Magalie Fromont; Béatrice Laurent

Given an i.i.d. sample drawn from a density f, we propose to test that f equals some prescribed density f_0 or that f belongs to some translation/scale family. We introduce a multiple testing procedure based on an estimation of the L^2-distance between f and f_0 or between f and the parametric family that we consider. For each sample size n, our test has level of significance α. In the case of simple hypotheses, we prove that our test is adaptive: it achieves the optimal rates of testing established by Ingster [J. Math. Sci. 99 (2000) 1110-1119] over various classes of smooth functions simultaneously. As for composite hypotheses, we obtain similar results up to a logarithmic factor. We carry out a simulation study to compare our procedures with the Kolmogorov-Smirnov tests, or with goodness-of-fit tests proposed by Bickel and Ritov [in Nonparametric Statistics and Related Topics (1992) 51-57] and by Kallenberg and Ledwina [Ann. Statist. 23 (1995) 1594-1608].
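For a concrete, if crude, picture of such a procedure, the snippet below tests uniformity on [0,1] by aggregating plug-in estimates of the squared L^2 distance over several projection dimensions. The cosine basis, the Monte Carlo calibration, and the Bonferroni correction are simplifications introduced for this sketch only; they are not the calibration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def fourier_stats(x, D_list):
    """Plug-in estimates of the squared L^2 distance between the density of x
    and the uniform density f0 on [0,1]: sums of squares of the first D
    empirical cosine coefficients (these coefficients are all zero under f0)."""
    j = np.arange(1, max(D_list) + 1)
    coeffs = np.sqrt(2) * np.cos(np.pi * j[None, :] * x[:, None]).mean(axis=0)
    return np.array([np.sum(coeffs[:D] ** 2) for D in D_list])

def adaptive_gof_test(x, D_list=(1, 2, 4, 8, 16), alpha=0.05, n_mc=2000):
    """Multiple test of H0: 'f = f0 (uniform)': reject if, for at least one
    candidate dimension D, the statistic exceeds its Monte Carlo quantile;
    a Bonferroni correction keeps the overall level at most alpha."""
    n = len(x)
    obs = fourier_stats(x, D_list)
    null = np.array([fourier_stats(rng.uniform(size=n), D_list) for _ in range(n_mc)])
    thresholds = np.quantile(null, 1 - alpha / len(D_list), axis=0)
    return bool((obs > thresholds).any())
```

Aggregating over several dimensions D is what gives the test its adaptive flavor: no single smoothness level has to be chosen in advance.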


Conference on Learning Theory | 2006

Functional classification with margin conditions

Magalie Fromont; Christine Tuleau

Let (X,Y) be an \mathcal{X} × {0,1}-valued random pair and consider a sample (X_1,Y_1),...,(X_n,Y_n) drawn from the distribution of (X,Y). We aim at constructing from this sample a classifier, that is, a function which would predict the value of Y from the observation of X. The special case where \mathcal{X} is a functional space is of particular interest due to the so-called curse of dimensionality. In a recent paper, Biau et al. [1] propose to filter the X_i's in the Fourier basis and to apply the classical k-Nearest Neighbor rule to the first d coefficients of the expansion. The selection of both k and d is made automatically via a penalized criterion. We extend this study, and note that the penalty used by Biau et al. is too heavy when we consider the minimax point of view under some margin type assumptions. We prove that using a penalty of smaller order or equal to zero is preferable both in theory and practice. Our experimental study furthermore shows that the introduction of a small-order penalty stabilizes the selection process, while preserving rather good performances.
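A rough implementation of the pipeline described above (Fourier filtering of the curves, then k-Nearest Neighbors on the first d coefficients, with (k, d) chosen by a penalized criterion) might look as follows. The hold-out split, the grids, and the penalty shape penalty * (d + k) / n are hypothetical choices made for this sketch; setting penalty=0.0 loosely mimics the "zero penalty" option discussed in the abstract, not the exact criterion of the paper.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fourier_filter(curves, d):
    """Project each discretized curve (one row per observation) onto its
    first d Fourier coefficients via the FFT."""
    coeffs = np.fft.rfft(curves, axis=1)
    return np.column_stack([coeffs[:, :d].real, coeffs[:, :d].imag])

def select_and_fit(curves, y, d_grid=(2, 4, 8, 16), k_grid=(1, 3, 5, 7, 11),
                   penalty=0.0, val_fraction=0.5, seed=0):
    """Choose (d, k) by minimizing hold-out classification error plus an
    optional penalty term, then refit k-NN on all data with that choice."""
    rng = np.random.default_rng(seed)
    n = len(y)
    idx = rng.permutation(n)
    split = int(n * (1 - val_fraction))
    train, val = idx[:split], idx[split:]
    best = None
    for d in d_grid:
        features = fourier_filter(curves, d)
        for k in k_grid:
            knn = KNeighborsClassifier(n_neighbors=k).fit(features[train], y[train])
            err = np.mean(knn.predict(features[val]) != y[val]) + penalty * (d + k) / n
            if best is None or err < best[0]:
                best = (err, d, k)
    _, d, k = best
    return d, k, KNeighborsClassifier(n_neighbors=k).fit(fourier_filter(curves, d), y)
```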


Annales de l'Institut Henri Poincaré, Probabilités et Statistiques | 2011

Adaptive tests of homogeneity for a Poisson process

Magalie Fromont; Béatrice Laurent; Patricia Reynaud-Bouret

We propose to test the homogeneity of a Poisson process observed on a finite interval. In this framework, we first provide lower bounds for the uniform separation rates in \mathbb{L}^2 norm over classical Besov bodies and weak Besov bodies. Surprisingly, the obtained lower bounds over weak Besov bodies coincide with the minimax estimation rates over such classes. Then we construct non-asymptotic and nonparametric testing procedures that are adaptive in the sense that they achieve, up to a possible logarithmic factor, the optimal uniform separation rates over various Besov bodies simultaneously. These procedures are based on model selection and thresholding methods. We finally complete our theoretical study with a Monte Carlo evaluation of the power of our tests under various alternatives.
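The key device behind such homogeneity tests can be illustrated with a small conditional Monte Carlo procedure: given the number N of observed points, a homogeneous Poisson process places N i.i.d. uniform points on the interval, so null quantiles can be simulated exactly. The dyadic-count statistics and the Bonferroni aggregation below are simplifications for illustration only, not the model selection and thresholding procedures constructed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def dyadic_statistics(times, T, J_list):
    """For each resolution J, sum of squared differences between the numbers
    of points falling in adjacent dyadic bins of [0, T]; under a homogeneous
    process, adjacent bins share the same intensity, so each difference has
    mean zero."""
    stats = []
    for J in J_list:
        counts, _ = np.histogram(times, bins=2 ** J, range=(0.0, T))
        diffs = counts[0::2] - counts[1::2]
        stats.append(np.sum(diffs ** 2))
    return np.array(stats)

def homogeneity_test(times, T, J_list=(1, 2, 3, 4, 5), alpha=0.05, n_mc=5000):
    """Reject homogeneity if, at some resolution J, the observed statistic
    exceeds its conditional Monte Carlo quantile (simulated given N, with a
    Bonferroni correction over the resolutions)."""
    N = len(times)
    obs = dyadic_statistics(times, T, J_list)
    null = np.array([dyadic_statistics(rng.uniform(0.0, T, size=N), T, J_list)
                     for _ in range(n_mc)])
    thresholds = np.quantile(null, 1 - alpha / len(J_list), axis=0)
    return bool((obs > thresholds).any())
```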


Conference on Learning Theory | 2004

Model Selection by Bootstrap Penalization for Classification

Magalie Fromont



Annals of Statistics | 2015

Bootstrap and permutation tests of independence for point processes

Mélisande Albert; Yann Bouret; Magalie Fromont; Patricia Reynaud-Bouret



Annals of Statistics | 2013

The two-sample problem for Poisson processes: adaptive tests with a non-asymptotic wild bootstrap approach

Magalie Fromont; Béatrice Laurent; Patricia Reynaud-Bouret



Annals of Statistics | 2016

Family-Wise Separation Rates for multiple testing

Magalie Fromont; Matthieu Lerasle; Patricia Reynaud-Bouret



Conference on Learning Theory | 2012

Kernels Based Tests with Non-asymptotic Bootstrap Approaches for Two-sample Problems

Magalie Fromont; Béatrice Laurent; Matthieu Lerasle; Patricia Reynaud-Bouret



ESAIM: Probability and Statistics | 2006

Adaptive tests for periodic signal detection with applications to laser vibrometry

Magalie Fromont; Céline Lévy-Leduc


Collaboration


Dive into Magalie Fromont's collaborations.

Top Co-Authors


Patricia Reynaud-Bouret

Centre national de la recherche scientifique


Béatrice Laurent

Institut de Mathématiques de Toulouse


Matthieu Lerasle

Centre national de la recherche scientifique


Mélisande Albert

University of Nice Sophia Antipolis


Yann Bouret

Centre national de la recherche scientifique
