
Publication


Featured research published by Yann Traonmilin.


International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2017

Compressive K-means

Nicolas Keriven; Nicolas Tremblay; Yann Traonmilin; Rémi Gribonval

The Lloyd-Max algorithm is a classical approach to K-means clustering. Unfortunately, its cost becomes prohibitive as the training dataset grows large. We propose a compressive version of K-means (CKM) that estimates cluster centers from a sketch, i.e., from a drastically compressed representation of the training dataset. We demonstrate empirically that CKM performs similarly to Lloyd-Max for a sketch size proportional to the number of centroids times the ambient dimension, and independent of the size of the original dataset. Given the sketch, the computational complexity of CKM is also independent of the size of the dataset. Unlike Lloyd-Max, which requires several replicates, CKM is almost insensitive to initialization. For a large dataset of 10^7 data points, we show that CKM can run two orders of magnitude faster than five replicates of Lloyd-Max, with similar clustering performance on artificial data. Finally, CKM achieves lower classification errors on handwritten digit classification.
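The sketching step behind CKM can be illustrated with averaged random Fourier features. The minimal NumPy sketch below (an illustration with assumed dimensions and frequencies, not the authors' implementation) shows the key property claimed above: the sketch has a fixed size m, and datasets of very different sizes drawn around the same centroids yield nearly identical sketches.

```python
import numpy as np

def sketch(X, W):
    """Empirical sketch: average random Fourier features of X (n x d).
    W (m x d) holds random frequencies; the sketch size m does not
    depend on the number of data points n."""
    return np.exp(1j * X @ W.T).mean(axis=0)   # shape (m,)

rng = np.random.default_rng(0)
d, m = 2, 20
W = rng.normal(size=(m, d))

# Two datasets of very different sizes, drawn around the same centroids
centroids = np.array([[0.0, 0.0], [5.0, 5.0]])
X_small = centroids[rng.integers(0, 2, 1_000)] + 0.1 * rng.normal(size=(1_000, d))
X_large = centroids[rng.integers(0, 2, 100_000)] + 0.1 * rng.normal(size=(100_000, d))

z_small, z_large = sketch(X_small, W), sketch(X_large, W)
print(np.linalg.norm(z_small - z_large))   # small: both sketches concentrate
```

Recovering the centroids from the sketch (the CKM decoder) is a separate non-convex estimation step not shown here.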


arXiv: Information Theory | 2017

Compressed Sensing in Hilbert Spaces

Yann Traonmilin; Gilles Puy; Rémi Gribonval; Mike E. Davies

In many linear inverse problems, we want to estimate an unknown vector belonging to a high-dimensional (or infinite-dimensional) space from few linear measurements. To overcome the ill-posed nature of such problems, we use a low-dimensionality assumption on the unknown vector: it belongs to a low-dimensional model set. The question then arises of whether it is possible to recover such a vector from few measurements and, if so, how to perform the recovery. We describe a general framework in which appropriately chosen random measurements guarantee that recovery is possible, and a way to study the performance of recovery methods that minimize a regularization function under a data-fit constraint.
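In finite dimension, a standard instance of this framework takes the model set to be k-sparse vectors and the regularizer to be the l1 norm. The NumPy/SciPy sketch below (dimensions and seed are illustrative assumptions) minimizes the regularizer under an exact data-fit constraint (basis pursuit) by recasting it as a linear program:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, m, k = 30, 20, 3            # ambient dimension, measurements, sparsity

# k-sparse unknown and a Gaussian random measurement operator
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true

# Basis pursuit: minimize ||x||_1 subject to A x = y, recast as a
# linear program via the split x = u - v with u, v >= 0.
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]
print(np.linalg.norm(x_hat - x_true))   # near zero when recovery succeeds
```

With enough Gaussian measurements relative to the sparsity, recovery is exact with high probability, in line with the RIP-based guarantees discussed above.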


International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2017

Phase unmixing: Multichannel source separation with magnitude constraints

Antoine Deleforge; Yann Traonmilin

We consider the problem of estimating the phases of K mixed complex signals from a multichannel observation, when the mixing matrix and the signal magnitudes are known. This problem can be cast as a non-convex quadratically constrained quadratic program, which is NP-hard in general. We propose three approaches to tackle it: a heuristic method, an alternating minimization method, and a convex relaxation into a semi-definite program. The last two approaches are shown to outperform the oracle multichannel Wiener filter in under-determined informed source separation tasks, using simulated and speech signals. The convex relaxation approach yields the best results, including the potential for exact source separation in under-determined settings.
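As a rough illustration of the non-convex program, one can run projected gradient descent on the least-squares fit, projecting back onto the unit-modulus constraint after each step. The NumPy sketch below is a simple local method under assumed dimensions, not necessarily one of the paper's three approaches:

```python
import numpy as np

def phase_unmix(y, A, r, iters=500):
    """Projected gradient for  min_u ||A (r * u) - y||^2  s.t. |u_k| = 1.
    y: (M,) complex observation, A: (M, K) mixing matrix,
    r: (K,) known non-negative source magnitudes."""
    rng = np.random.default_rng(0)
    u = np.exp(1j * rng.uniform(0, 2 * np.pi, A.shape[1]))
    step = 1.0 / np.linalg.norm(A * r, 2) ** 2   # 1 / Lipschitz constant
    for _ in range(iters):
        grad = r * (A.conj().T @ (A @ (r * u) - y))
        u = u - step * grad
        u = u / np.abs(u)                        # project to unit modulus
    return u

rng = np.random.default_rng(1)
M, K = 4, 3
A = rng.normal(size=(M, K)) + 1j * rng.normal(size=(M, K))
r = rng.uniform(0.5, 2.0, size=K)
u_true = np.exp(1j * rng.uniform(0, 2 * np.pi, size=K))
y = A @ (r * u_true)                             # noiseless informed mixture

u_hat = phase_unmix(y, A, r)
print(np.linalg.norm(A @ (r * u_hat) - y))       # residual after descent
```

Being a local method on a non-convex set, this can get trapped in poor stationary points; the semi-definite relaxation mentioned above is what enables globally reliable recovery.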


Information Theory Workshop (ITW) | 2016

A framework for low-complexity signal recovery and its application to structured sparsity

Yann Traonmilin; Rémi Gribonval

We study the problem of estimating an unknown vector from noisy underdetermined observations, with recovery guarantees. In such a context, a regularity model on the unknown is needed to obtain recovery guarantees. We show that recovery of generic models (cones) can be guaranteed by minimizing an arbitrary regularizer subject to a data-fit constraint (generalized robust basis pursuit) under a restricted isometry property (RIP) hypothesis on the observations. In the classical cases of sparse vector and low-rank matrix recovery, our framework yields sharp recovery guarantees. For the more refined model of structured sparsity in levels, our framework extends and improves existing RIP recovery guarantees.


Applied and Computational Harmonic Analysis | 2016

Stable recovery of low-dimensional cones in Hilbert spaces: One RIP to rule them all

Yann Traonmilin; Rémi Gribonval


arXiv: Machine Learning | 2017

Compressive Statistical Learning with Random Feature Moments

Rémi Gribonval; Gilles Blanchard; Nicolas Keriven; Yann Traonmilin


SPARS Workshop | 2017

Spikes super-resolution with random Fourier sampling

Yann Traonmilin; Nicolas Keriven; Rémi Gribonval; Gilles Blanchard


Springer US | 2018

Compressed Sensing and Its Applications

Yann Traonmilin; Gilles Puy; Rémi Gribonval; Mike E. Davies


Archive | 2018

Is the 1-norm the best convex sparse regularization?

Yann Traonmilin; Samuel Vaiter; Rémi Gribonval


SPARS 2017 - Signal Processing with Adaptive Sparse Structured Representations Workshop | 2017

Random Moments for Sketched Mixture Learning

Nicolas Keriven; Rémi Gribonval; Gilles Blanchard; Yann Traonmilin

Collaboration


Dive into Yann Traonmilin's collaborations.

Top Co-Authors


Gilles Puy

École Polytechnique Fédérale de Lausanne


Nicolas Tremblay

Centre national de la recherche scientifique


Pierre Vandergheynst

École Polytechnique Fédérale de Lausanne


Samuel Vaiter

Centre national de la recherche scientifique
