Yves Rozenholc
Paris Descartes University
Publications
Featured research published by Yves Rozenholc.
arXiv | 2011
Markus Reiß; Yves Rozenholc; Charles A. Cuenod
A nonparametric procedure for quantile regression, or more generally nonparametric M-estimation, is proposed which is completely data-driven and adapts locally to the regularity of the regression function. This is achieved by considering, at each point, M-estimators over different local neighbourhoods and by a local model selection procedure based on sequential testing. Non-asymptotic risk bounds are obtained, which yield rate-optimality for large-sample asymptotics under weak conditions. Simulations for different univariate median regression models show good finite-sample properties, also in comparison to traditional methods. The approach is the basis for denoising CT scans in cancer research.
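A minimal numpy sketch of the pointwise scheme described above, assuming a Lepski-type stopping rule: local medians are computed over growing neighbourhoods, and the enlargement stops as soon as a new estimate leaves the band of an earlier, smaller-window one. The bandwidth grid, the critical constant and the MAD-based scale are illustrative choices, not the paper's calibration.

```python
import numpy as np

def adaptive_local_median(x, y, x0, bandwidths, crit=1.0):
    """Pointwise adaptive median at x0 via a Lepski-type stopping rule:
    enlarge the neighbourhood while the new local median stays inside the
    band of every previously accepted, smaller-window estimate."""
    accepted, kept = None, []
    for h in sorted(bandwidths):
        sel = np.abs(x - x0) <= h
        n = int(sel.sum())
        if n < 5:                                    # too few points, skip
            continue
        m = np.median(y[sel])
        mad = np.median(np.abs(y[sel] - m)) + 1e-12  # robust scale
        if any(abs(m - mk) > crit * sk / np.sqrt(nk) for mk, sk, nk in kept):
            break                                    # significant change: stop
        kept.append((m, mad, n))
        accepted = m
    return accepted

# toy usage: median regression under heavy-tailed noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 400)
y = np.sin(6 * x) + 0.3 * rng.standard_t(df=2, size=400)
grid = np.linspace(0, 1, 50)
fit = [adaptive_local_median(x, y, x0, [0.02, 0.05, 0.1, 0.2, 0.4]) for x0 in grid]
```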
Forensic Science International | 2009
Françoise Tilotta; Frédéric J. P. Richard; Joan Alexis Glaunès; Maxime Berar; Servane Gey; Stéphane Verdeille; Yves Rozenholc; Jean-François Gaudy
This paper is devoted to the construction of a complete database intended to improve the implementation and evaluation of automated facial reconstruction. This growing database is currently composed of 85 head CT scans of healthy European subjects aged 20-65 years. It also includes the triangulated surfaces of the face and the skull of each subject. These surfaces are extracted from the CT scans using an original combination of image-processing techniques presented in the paper. In addition, a set of 39 referenced anatomical skull landmarks was located manually on each scan. Using the geometrical information provided by the triangulated surfaces, we compute facial soft-tissue depths at each landmark position. We report the average thickness values at each landmark and compare our measurements to those of the traditional charts of [J. Rhine, C.E. Moore, Facial Tissue Thickness of American Caucasoids, Maxwell Museum of Anthropology, Albuquerque, New Mexico, 1982] and of several recent in vivo studies [M.H. Manhein, G.A. Listi, R.E. Barsley, et al., In vivo facial tissue depth measurements for children and adults, Journal of Forensic Sciences 45 (1) (2000) 48-60; S. De Greef, P. Claes, D. Vandermeulen, et al., Large-scale in vivo Caucasian facial soft tissue thickness database for craniofacial reconstruction, Forensic Science International 159S (2006) S126-S146; R. Helmer, Schädelidentifizierung durch elektronische Bildmischung, Kriminalistik Verlag, Heidelberg, 1984].
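As an illustration of the thickness computation, here is a hedged numpy sketch that approximates soft-tissue depth at each skull landmark by the distance to the nearest face-surface vertex. The paper's measurements rely on the full triangulated geometry, so nearest-vertex is only a simplification, and all names and data below are hypothetical.

```python
import numpy as np

def tissue_depth(landmarks, face_vertices):
    """Approximate soft-tissue depth at each skull landmark as the distance
    to the nearest vertex of the triangulated face surface. Nearest-vertex is
    a simplification of a true point-to-surface distance."""
    diffs = landmarks[:, None, :] - face_vertices[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=2))     # landmark-to-vertex distances
    return dists.min(axis=1)                      # one depth per landmark

# toy usage with synthetic coordinates (39 landmarks, as in the protocol)
rng = np.random.default_rng(1)
skull_landmarks = rng.normal(size=(39, 3))        # hypothetical positions
face_mesh_vertices = rng.normal(size=(5000, 3)) * 1.1
depths = tissue_depth(skull_landmarks, face_mesh_vertices)
print(depths.mean())                              # average thickness analogue
```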
Bernoulli | 2007
Fabienne Comte; Valentine Genon-Catalot; Yves Rozenholc
We consider a one-dimensional diffusion process (X_t) observed at n + 1 discrete times with regular sampling interval Δ. Assuming that (X_t) is strictly stationary, we propose nonparametric estimators of the drift and diffusion coefficients obtained by a penalized least squares approach. Our estimators belong to a finite-dimensional function space whose dimension is selected by a data-driven method. We provide non-asymptotic risk bounds for the estimators. When the sampling interval tends to zero while the number of observations and the length of the observation time interval tend to infinity, we show that our estimators reach the minimax optimal rates of convergence. Numerical results based on exact simulations of diffusion processes are given for several examples of models and highlight the quality of our estimation algorithms.
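The following sketch conveys the flavour of this procedure under simplifying assumptions: the drift is regressed on piecewise-constant spaces built from the normalized increments, and the dimension is selected by a penalized contrast. The penalty shape c·s²·D/n and the constant c are illustrative stand-ins for the paper's calibrated penalty.

```python
import numpy as np

def drift_regressogram(X, delta, max_dim=30, c=2.0):
    """Penalized least-squares drift estimate from discretely observed
    diffusion data: regress the normalized increments (X_{i+1}-X_i)/delta on
    X_i over piecewise-constant spaces, choosing the dimension D that
    minimizes contrast(D) + c*s2*D/n (penalty shape and c are illustrative)."""
    x, y = X[:-1], np.diff(X) / delta
    n, s2, best = len(x), np.var(y), None
    for D in range(1, max_dim + 1):
        edges = np.linspace(x.min(), x.max(), D + 1)
        idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, D - 1)
        means = np.array([y[idx == j].mean() if (idx == j).any() else 0.0
                          for j in range(D)])
        crit = np.mean((y - means[idx]) ** 2) + c * s2 * D / n
        if best is None or crit < best[0]:
            best = (crit, edges, means)
    return best[1], best[2]                  # bin edges and drift values

# toy usage: Ornstein-Uhlenbeck path via Euler scheme; true drift is -0.5*x
rng = np.random.default_rng(2)
delta, n = 0.01, 20000
X = np.zeros(n + 1)
for i in range(n):
    X[i + 1] = X[i] - 0.5 * X[i] * delta + 0.3 * np.sqrt(delta) * rng.normal()
edges, bhat = drift_regressogram(X, delta)
```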
Annals of the Institute of Statistical Mathematics | 2004
Fabienne Comte; Yves Rozenholc
In this paper, we present a new algorithm to estimate a regression function in a fixed-design regression model, by piecewise (standard and trigonometric) polynomials computed with an automatic choice of the knots of the subdivision and of the degrees of the polynomials on each sub-interval. First we give the theoretical background underlying the method: the theoretical performance of our penalized least-squares estimator is based on non-asymptotic evaluations of a mean-square type risk. Then we explain how the algorithm is built and possibly accelerated (to handle large numbers of observations), how the penalty term is chosen and why it contains constants requiring an empirical calibration. Lastly, a comparison with some well-known or recent wavelet methods is made: this brings out that our algorithm behaves in a very competitive way in terms of denoising and of compression.
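Below is a toy version of the knot-and-degree selection idea, restricted to dyadic equal partitions and standard polynomials; the penalized criterion and its constants are illustrative, not the paper's calibrated penalty.

```python
import numpy as np

def piecewise_poly_fit(x, y, max_splits=5, max_deg=4, c=2.0):
    """Toy knot-and-degree selection by penalized least squares: try dyadic
    partitions into 2**k equal intervals, pick a polynomial degree per piece,
    and keep the partition minimizing RSS/n + c*sigma2*(#coefficients)/n.
    Assumes x is sorted; c and the noise estimate are illustrative."""
    n = len(x)
    sigma2 = np.var(np.diff(y)) / 2           # crude noise-level estimate
    best = None
    for k in range(max_splits + 1):
        D = 2 ** k
        edges = np.linspace(x.min(), x.max(), D + 1)
        idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, D - 1)
        rss, dim, pieces = 0.0, 0, []
        for j in range(D):
            xj, yj = x[idx == j], y[idx == j]
            if len(xj) == 0:                  # empty piece: nothing to fit
                continue
            best_piece = None                 # degree chosen piece by piece
            for d in range(min(max_deg, len(xj) - 1) + 1):
                coef = np.polyfit(xj, yj, d)
                r = np.sum((yj - np.polyval(coef, xj)) ** 2)
                score = r + c * sigma2 * (d + 1)
                if best_piece is None or score < best_piece[0]:
                    best_piece = (score, r, d + 1, coef)
            _, r, dj, coef = best_piece
            rss += r; dim += dj; pieces.append(coef)
        crit = rss / n + c * sigma2 * dim / n
        if best is None or crit < best[0]:
            best = (crit, edges, pieces)
    return best[1], best[2]                   # selected knots and coefficients

# toy usage on a function with a kink
rng = np.random.default_rng(5)
x = np.sort(rng.uniform(0, 1, 500))
y = np.where(x < 0.5, 2 * x, np.sin(12 * x)) + 0.1 * rng.normal(size=500)
edges, pieces = piecewise_poly_fit(x, y)
```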
Stochastic Processes and their Applications | 2002
Fabienne Comte; Yves Rozenholc
In this paper, we study the problem of nonparametric estimation of the mean and variance functions b and σ² in the model X_{i+1} = b(X_i) + σ(X_i)ε_{i+1}. For this purpose, we consider a collection of finite-dimensional linear spaces. We estimate b using a mean-squares estimator built on a data-driven selected linear space among the collection. Then an analogous procedure estimates σ², using a possibly different collection of models. Both data-driven choices are performed via the minimization of penalized mean-squares contrasts. The penalty functions are random so as not to depend on unknown variance-type quantities. In all cases, we state non-asymptotic risk bounds in empirical norm for our estimators and we show that they are both adaptive in the minimax sense over a large class of Besov balls. Lastly, we give the results of intensive simulation experiments which show the good performance of our estimators.
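A compact sketch of the two-step scheme, reusing the same regressogram device as in the diffusion example above: first b is fitted on the pairs (X_i, X_{i+1}), then σ² on the squared residuals. A deterministic penalty replaces the paper's random penalty for simplicity.

```python
import numpy as np

def mean_and_variance_regressograms(X, max_dim=25, c=2.0):
    """Two-step sketch: fit b on the pairs (X_i, X_{i+1}) by a penalized
    regressogram, then fit sigma^2 on the squared residuals with the same
    data-driven dimension selection. The deterministic penalty c*s2*D/n
    stands in for the paper's random penalty."""
    def regressogram(x, y):
        n, s2, best = len(x), np.var(y), None
        for D in range(1, max_dim + 1):
            edges = np.linspace(x.min(), x.max(), D + 1)
            idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, D - 1)
            m = np.array([y[idx == j].mean() if (idx == j).any() else 0.0
                          for j in range(D)])
            crit = np.mean((y - m[idx]) ** 2) + c * s2 * D / n
            if best is None or crit < best[0]:
                best = (crit, edges, m, m[idx])
        return best

    x, y = X[:-1], X[1:]
    _, b_edges, b_vals, b_fit = regressogram(x, y)              # estimate b
    _, s_edges, s2_vals, _ = regressogram(x, (y - b_fit) ** 2)  # estimate sigma^2
    return (b_edges, b_vals), (s_edges, s2_vals)

# toy usage: autoregression with state-dependent volatility
rng = np.random.default_rng(6)
X = np.zeros(3000)
for i in range(2999):
    X[i + 1] = 0.6 * X[i] + np.sqrt(0.1 + 0.2 * X[i] ** 2) * rng.normal()
(b_edges, b_vals), (s_edges, s2_vals) = mean_and_variance_regressograms(X)
```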
Journal of Statistical Computation and Simulation | 2007
Fabienne Comte; Yves Rozenholc
We consider the problem of estimating the density g of identically distributed variables X_i from a sample Z_1, …, Z_n, where Z_i = X_i + σε_i, i = 1, …, n, and σε_i is a noise independent of X_i with known density σ⁻¹ f_ε(·/σ). We numerically study the adaptive estimators constructed by the model selection procedure described by Comte et al. [2006, Penalized contrast estimator for density deconvolution, Canadian Journal of Statistics, 37(3)]. We illustrate their properties in various contexts and test their robustness (misspecification of errors, dependency and so on). Comparisons are made with deconvolution kernel estimators. It appears that our estimation algorithm, based on a fast procedure, performs very well in all contexts.
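For intuition, here is a hedged sketch of a spectral-cutoff deconvolution estimator, assuming Laplace noise so that the noise characteristic function never vanishes. The fixed cutoff plays the role of the smoothing parameter that the paper's model selection procedure would choose from the data.

```python
import numpy as np

def deconvolve_density(Z, sigma, x_grid, cutoff=8.0, n_t=801):
    """Spectral-cutoff deconvolution: estimate the characteristic function of
    Z empirically, divide by the known Laplace noise characteristic function
    phi_eps(t) = 1/(1 + sigma^2 t^2), and Fourier-invert up to |t| <= cutoff.
    The fixed cutoff stands in for a data-driven smoothing choice."""
    t = np.linspace(-cutoff, cutoff, n_t)
    ecf = np.exp(1j * t[:, None] * Z[None, :]).mean(axis=1)  # empirical cf of Z
    phi_eps = 1.0 / (1.0 + (sigma * t) ** 2)                 # scaled Laplace noise cf
    dt = t[1] - t[0]
    f = np.real(np.exp(-1j * np.outer(x_grid, t)) @ (ecf / phi_eps)) * dt / (2 * np.pi)
    return np.clip(f, 0.0, None)                             # a density is nonnegative

# toy usage: Gaussian signal observed through Laplace noise
rng = np.random.default_rng(3)
X = rng.normal(0.0, 1.0, 2000)
Z = X + 0.4 * rng.laplace(0.0, 1.0, 2000)
fhat = deconvolve_density(Z, sigma=0.4, x_grid=np.linspace(-4, 4, 200))
```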
Finance and Stochastics | 2010
Fabienne Comte; Valentine Genon-Catalot; Yves Rozenholc
Consider discrete-time observations (X_{ℓδ})_{1≤ℓ≤n+1} of the process X satisfying dX_t = √(V_t) dB_t, with V a one-dimensional positive diffusion process independent of the Brownian motion B. For both the drift and the diffusion coefficient of the unobserved diffusion V, we propose nonparametric least squares estimators, and provide bounds for their risk. Estimators are chosen among a collection of functions belonging to a finite-dimensional space whose dimension is selected by a data-driven procedure. Implementation on simulated data illustrates how the method works.
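A short simulation sketch of this observation scheme, with an illustrative CIR choice for V (parameters hypothetical): block-wise realized variances of X yield the noisy proxies of V on which the nonparametric regression step can then be run.

```python
import numpy as np

# Simulate the observation scheme: X integrates an unobserved positive
# volatility process V (here CIR, a hypothetical choice) against a Brownian
# motion; block-wise realized variances of X then serve as noisy proxies of V.
rng = np.random.default_rng(4)
delta, n = 0.001, 100000
kappa, theta, xi = 2.0, 0.5, 0.3             # illustrative CIR parameters
V = np.empty(n + 1); V[0] = theta
for i in range(n):                           # Euler scheme, reflected at 0
    V[i + 1] = abs(V[i] + kappa * (theta - V[i]) * delta
                   + xi * np.sqrt(V[i] * delta) * rng.normal())
dX = np.sqrt(V[:-1] * delta) * rng.normal(size=n)    # increments of X
block = 100
V_proxy = (dX.reshape(-1, block) ** 2).sum(axis=1) / (block * delta)
# V_proxy[k] approximates V on the k-th block; regress on it as in the paper.
```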
The Journal of Nuclear Medicine | 2016
Angel Torrado-Carvajal; J. L. Herraiz; Eduardo Alcain; Antonio S. Montemayor; Lina Garcia-Cañamaque; Juan Antonio Hernández-Tamames; Yves Rozenholc; Norberto Malpica
Attenuation correction in hybrid PET/MR scanners is still a challenging task. This paper describes a methodology for synthesizing a pseudo-CT volume from a single T1-weighted volume, thus allowing us to create accurate attenuation correction maps. Methods: We propose a fast pseudo-CT volume generation from a patient-specific MR T1-weighted image using a groupwise patch-based approach and an MRI–CT atlas dictionary. For every voxel in the input MR image, we compute the similarity of the patch containing that voxel to the patches of all MR images in the database that lie in a certain anatomic neighborhood. The pseudo-CT volume is obtained as a local weighted linear combination of the CT values of the corresponding patches. The algorithm was implemented on a graphics processing unit (GPU). Results: We evaluated our method both qualitatively and quantitatively for PET/MR correction. The approach performed successfully in all cases considered. We compared the SUVs of the PET image obtained after attenuation correction using the patient-specific CT volume and using the corresponding computed pseudo-CT volume. The patient-specific correlation between SUVs obtained with both methods was high (R² = 0.9980, P < 0.0001), and the Bland–Altman test showed that the average of the differences was low (0.0006 ± 0.0594). A region-of-interest analysis was also performed. The correlations between SUVmean and SUVmax for every region were high (R² = 0.9989, P < 0.0001, and R² = 0.9904, P < 0.0001, respectively). Conclusion: The results indicate that our method can accurately approximate the patient-specific CT volume and serves as a potential solution for accurate attenuation correction in hybrid PET/MR systems. The quality of the corrected PET scan using our pseudo-CT volume is comparable to having acquired a patient-specific CT scan, thus improving the results obtained with the ultrashort-echo-time-based attenuation correction maps currently used in the scanner. The GPU implementation substantially decreases computational time, making the approach suitable for real applications.
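To make the patch mechanism concrete, here is a heavily simplified 2D numpy sketch (the paper works on registered 3D volumes on a GPU): each voxel of the pseudo-CT is a similarity-weighted average of atlas CT values. Patch size, search window and bandwidth are illustrative assumptions.

```python
import numpy as np

def pseudo_ct(mr, atlas_mrs, atlas_cts, patch=3, search=2, h=0.1):
    """Groupwise patch-based pseudo-CT, reduced to 2D for brevity: each output
    voxel is a similarity-weighted average of atlas CT values, with Gaussian
    weights comparing the input MR patch to atlas MR patches in a small search
    neighbourhood. Plain loops: the real method runs on 3D volumes on a GPU."""
    r, s = patch // 2, search
    H, W = mr.shape
    out = np.zeros((H, W))
    for i in range(r + s, H - r - s):
        for j in range(r + s, W - r - s):
            p = mr[i - r:i + r + 1, j - r:j + r + 1]   # input patch
            num = den = 0.0
            for a_mr, a_ct in zip(atlas_mrs, atlas_cts):
                for di in range(-s, s + 1):
                    for dj in range(-s, s + 1):
                        q = a_mr[i + di - r:i + di + r + 1,
                                 j + dj - r:j + dj + r + 1]
                        w = np.exp(-np.sum((p - q) ** 2) / h)
                        num += w * a_ct[i + di, j + dj]
                        den += w
            out[i, j] = num / den
    return out

# toy usage with random images standing in for registered MR/CT atlas pairs
rng = np.random.default_rng(7)
atlas_mrs = [rng.random((32, 32)) for _ in range(3)]
atlas_cts = [1000 * m + rng.normal(size=(32, 32)) for m in atlas_mrs]
pct = pseudo_ct(rng.random((32, 32)), atlas_mrs, atlas_cts)
```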
Computational Statistics & Data Analysis | 2013
Yves Rozenholc; Gregory Nuel
Forensic Science International | 2010
Françoise Tilotta; Joan Alexis Glaunès; Frédéric J. P. Richard; Yves Rozenholc