Jérémie Bigot
University of Bordeaux
Publications
Featured research published by Jérémie Bigot.
IEEE Transactions on Information Theory | 2016
Jérémie Bigot; Claire Boyer; Pierre Weiss
Compressed sensing is a theory that guarantees the exact recovery of sparse signals from a small number of linear projections. The sampling schemes suggested by current compressed sensing theories are often of little practical relevance, since they cannot be implemented on real acquisition systems. In this paper, we study a new random sampling approach that consists of projecting the signal over blocks of sensing vectors. A typical example is the case of blocks made of horizontal lines in the 2-D Fourier plane. We provide theoretical results on the number of blocks that are sufficient for exact sparse signal reconstruction. This number depends on two properties named intra- and inter-support block coherence. We then show that our bounds coincide with the best known results in a series of examples, including Gaussian measurements and isolated measurements. We also show that the result is sharp when used with specific blocks in time-frequency bases, in the sense that the minimum number of blocks required to reconstruct sparse signals cannot be improved, up to a multiplicative logarithmic factor. The proposed results provide good insight into the possibilities and limits of block compressed sensing in imaging devices, such as magnetic resonance imaging, radio-interferometry, or ultrasound imaging.
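A minimal sketch of the block-sampling idea described above, assuming the paper's running example of blocks made of horizontal lines in the 2-D Fourier plane. This is not the authors' code; the image size, block count, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 64           # n x n image, so there are n horizontal-line blocks
n_blocks = 16    # number of blocks (full lines) actually acquired

# Draw blocks uniformly at random without replacement.
lines = rng.choice(n, size=n_blocks, replace=False)

# The sampling mask selects every frequency on the chosen lines:
# one "block of sensing vectors" per line.
mask = np.zeros((n, n), dtype=bool)
mask[lines, :] = True

image = rng.standard_normal((n, n))
measurements = np.fft.fft2(image)[mask]   # acquired Fourier samples

print(measurements.shape)   # n_blocks * n frequencies measured at once
```

The point of the block structure is visible here: a single random draw commits the scanner to a whole line of frequencies, which is what real acquisition hardware can actually implement.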
SIAM Journal on Imaging Sciences | 2014
Claire Boyer; Pierre Weiss; Jérémie Bigot
Reducing acquisition time is of fundamental importance in various imaging modalities. The concept of variable density sampling provides a nice framework to achieve this. It was justified recently from a theoretical point of view in the compressed sensing (CS) literature. Unfortunately, the sampling schemes suggested by current CS theories may not be relevant, since they do not take the acquisition constraints into account (for example, continuity of the acquisition trajectory in magnetic resonance imaging (MRI)). In this paper, we propose a numerical method to perform variable density sampling with block constraints. Our main contribution is to propose a new way to draw the blocks in order to mimic CS strategies based on isolated measurements. The basic idea is to minimize a tailored dissimilarity measure between a probability distribution defined on the set of isolated measurements and a probability distribution defined on a set of blocks of measurements. This problem turns out to be convex and solvable in high dimension. Our second contribution is an efficient minimization algorithm based on Nesterov's accelerated gradient descent in metric spaces. We carefully study the choice of the metric and of the prox function, and show that the optimal choice may depend on the type of blocks under consideration. Finally, we show that we can obtain better MRI reconstruction results using our sampling schemes than standard strategies such as equiangularly distributed radial lines.
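An illustrative sketch of variable density sampling with line blocks, not the authors' algorithm: instead of solving the convex dissimilarity-minimization problem from the paper, we simply draw horizontal-line blocks with probabilities proportional to a target density that concentrates on low frequencies. The density and all names are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 64                                   # n candidate lines in k-space
freq = np.abs(np.arange(n) - n // 2)     # distance to the k-space center
pi = 1.0 / (1.0 + freq)                  # variable density (assumed form)
pi /= pi.sum()

n_lines = 12
# Low-frequency lines are drawn more often, mimicking the isolated-
# measurement CS densities the paper's block distribution is fitted to.
lines = rng.choice(n, size=n_lines, replace=False, p=pi)

mask = np.zeros((n, n), dtype=bool)
mask[np.sort(lines), :] = True
print(f"sampling ratio: {mask.mean():.3f}")  # fraction of k-space acquired
```

The paper's contribution is precisely to replace this ad hoc per-line density with the block distribution that best mimics an isolated-measurement scheme.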
Electronic Journal of Statistics | 2013
Jérémie Bigot; Sébastien Gadat; Thierry Klein; Clément Marteau
In this paper, we consider the problem of nonparametrically estimating a mean pattern intensity λ from the observation of n independent and non-homogeneous Poisson processes N_1,…,N_n on the interval [0,1]. This problem arises when data (counts) are collected independently from n individuals according to similar Poisson processes. We show that estimating this intensity is a deconvolution problem for which the density of the random shifts plays the role of the convolution operator. In an asymptotic setting where the number n of observed trajectories tends to infinity, we derive upper and lower bounds for the minimax quadratic risk over Besov balls. Non-linear thresholding in a Meyer wavelet basis is used to derive an adaptive estimator of the intensity. The proposed estimator is shown to achieve a near-minimax rate of convergence. This rate depends both on the smoothness of the intensity function and on the density of the random shifts, which makes a connection between the classical deconvolution problem in nonparametric statistics and the estimation of a mean intensity from observations of independent Poisson processes.
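A toy sketch of the estimation pattern above, with two simplifications we flag loudly: a one-level Haar transform stands in for the Meyer basis, and the random-shift deconvolution step is ignored. Only the "average the counts, then threshold in a wavelet basis" idea is kept; the intensity, threshold, and names are ours.

```python
import numpy as np

rng = np.random.default_rng(2)

m, n_proc = 128, 200                      # bins on [0,1], number of processes
t = (np.arange(m) + 0.5) / m
intensity = 5.0 + 4.0 * np.sin(2 * np.pi * t)   # toy mean intensity

# Binned counts of n_proc independent (unshifted) Poisson processes.
counts = rng.poisson(intensity / m, size=(n_proc, m))
y = counts.mean(axis=0) * m               # raw empirical intensity estimate

# One-level orthonormal Haar transform.
approx = (y[0::2] + y[1::2]) / np.sqrt(2)
detail = (y[0::2] - y[1::2]) / np.sqrt(2)

# Soft-threshold the detail coefficients (threshold chosen ad hoc here;
# the paper calibrates it to achieve adaptivity over Besov balls).
thr = 1.0
detail = np.sign(detail) * np.maximum(np.abs(detail) - thr, 0.0)

# Inverse Haar transform gives the smoothed intensity estimate.
est = np.empty(m)
est[0::2] = (approx + detail) / np.sqrt(2)
est[1::2] = (approx - detail) / np.sqrt(2)

print(f"raw RMSE: {np.sqrt(np.mean((y - intensity) ** 2)):.3f}, "
      f"thresholded RMSE: {np.sqrt(np.mean((est - intensity) ** 2)):.3f}")
```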
Annales de l'Institut Henri Poincaré, Probabilités et Statistiques | 2017
Jérémie Bigot; Raúl Gouet; Thierry Klein; Alfredo López
We introduce the method of Geodesic Principal Component Analysis (GPCA) on the space of probability measures on the line, with finite second moment, endowed with the Wasserstein metric. We discuss the advantages of this approach over a standard functional PCA of probability densities in the Hilbert space of square-integrable functions. We establish the consistency of the method by showing that the empirical GPCA converges to its population counterpart as the sample size tends to infinity. A key property in the study of GPCA is the isometry between the Wasserstein space and a closed convex subset of the space of square-integrable functions, with respect to an appropriate measure. Therefore, we consider the general problem of PCA in a closed convex subset of a separable Hilbert space, which serves as a basis for the analysis of GPCA and is also of interest in its own right. We provide illustrative examples on simple statistical models to show the benefits of this approach for data analysis. The method is also applied to a real dataset of population pyramids.
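A sketch of the key isometry the abstract mentions: a measure on the line with finite second moment is represented by its quantile function, and Wasserstein distances become L2 distances between quantile functions. Below we run a plain PCA on quantile functions, which is a simplification of GPCA (it ignores the convexity constraint); the location-scale toy data and all names are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

n, k = 50, 200                 # n measures, each on k quantile levels
levels = (np.arange(k) + 0.5) / k

# Toy data: Gaussian measures with random means and scales. Their quantile
# functions are m_i + s_i * Phi^{-1}(levels); we approximate Phi^{-1} by
# empirical quantiles of a large Gaussian sample to stay numpy-only.
z = np.quantile(rng.standard_normal(100_000), levels)
means = rng.uniform(-2, 2, size=n)
scales = rng.uniform(0.5, 2.0, size=n)
Q = means[:, None] + scales[:, None] * z[None, :]   # quantile functions

# The Wasserstein barycenter of measures on the line is simply the mean
# quantile function; PCA of the centered quantile functions follows.
barycenter = Q.mean(axis=0)
U, s, Vt = np.linalg.svd(Q - barycenter, full_matrices=False)

explained = s ** 2 / np.sum(s ** 2)
print(f"variance explained by 2 components: {explained[:2].sum():.4f}")
```

Because a location-scale family spans only a shift direction and a scale direction in quantile space, two components capture essentially all of the Wasserstein variability here.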
SIAM Journal on Imaging Sciences | 2013
Jérémie Bigot; Raúl Gouet; Alfredo López
We describe a method for analyzing the shape variability of images, called geometric PCA. Our approach is based on the use of deformation operators to model the geometric variability of images around a reference mean pattern. This leads to a new algorithm for estimating shape variability. Numerical experiments on real images are presented to highlight the benefits of this approach. The consistency of the procedure is also analyzed in statistical deformable models.
SIAM Journal on Scientific Computing | 2018
Elsa Cazelles; Vivien Seguy; Jérémie Bigot; Marco Cuturi; Nicolas Papadakis
This paper is concerned with the statistical analysis of datasets whose elements are random histograms. For the purpose of learning principal modes of variation from such data, we consider the issue of computing the principal component analysis (PCA) of histograms with respect to the 2-Wasserstein distance between probability measures. To this end, we propose comparing the methods of log-PCA and geodesic PCA in the Wasserstein space as introduced in [J. Bigot et al., Ann. Inst. Henri Poincare Probab. Stat., 53 (2017), pp. 1--26; V. Seguy and M. Cuturi, Principal geodesic analysis for probability measures under the optimal transport metric, in Advances in Neural Information Processing Systems 28, C. Cortes, N. Lawrence, D. Lee, M. Sugiyama, and R. Garnett, eds., Curran Associates, Inc., Red Hook, NY, 2015, pp. 3294--3302]. Geodesic PCA involves solving a nonconvex optimization problem. To solve it approximately, we propose a novel forward-backward algorithm. This allows us to give a detailed comparison bet...
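One illustrative ingredient of a forward-backward scheme of the kind described above, not the paper's full algorithm: in quantile-function coordinates, the constraint set for histograms on the line is the closed convex set of nondecreasing functions, and the backward (proximal) step is the Euclidean projection onto it, computable by pool-adjacent-violators (PAVA). The objective and all names below are our simplified assumptions.

```python
import numpy as np

def project_monotone(y):
    """Euclidean projection of y onto nondecreasing sequences (PAVA)."""
    y = np.asarray(y, dtype=float)
    vals, wts, cnt = [], [], []
    for v in y:
        vals.append(float(v)); wts.append(1.0); cnt.append(1)
        # Pool adjacent blocks while their means violate monotonicity.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            vals[-2] = (wts[-2] * vals[-2] + wts[-1] * vals[-1]) / w
            wts[-2] = w; cnt[-2] += cnt[-1]
            vals.pop(); wts.pop(); cnt.pop()
    return np.repeat(vals, cnt)

# Forward-backward iteration for min_x 0.5 * ||x - y||^2 over monotone x:
# a gradient step on the smooth term, then the projection. (This toy
# objective converges immediately; the loop just mimics the scheme.)
y = np.array([1.0, 3.0, 2.0, 5.0, 4.0, 4.5])
x = y.copy()
step = 0.5
for _ in range(20):
    x = project_monotone(x - step * (x - y))   # forward then backward
print(x)
```

The printed sequence is the closest nondecreasing function to y, i.e. a valid quantile function; geodesic PCA runs steps of this forward/backward form on a harder, nonconvex objective.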
Electronic Journal of Statistics | 2018
Jérémie Bigot; Raúl Gouet; Thierry Klein; Alfredo López
This paper is focused on the statistical analysis of probability measures ν_1,…,ν_n on R ...
International Conference on Geometric Science of Information | 2017
Elsa Cazelles; Jérémie Bigot; Nicolas Papadakis
Archive | 2013
Jérémie Bigot; Thierry Klein
Applied and Computational Harmonic Analysis | 2017
Claire Boyer; Jérémie Bigot; Pierre Weiss