Publication


Featured research published by Jeremy P. Vila.


Asilomar Conference on Signals, Systems and Computers | 2011

Expectation-maximization Bernoulli-Gaussian approximate message passing

Jeremy P. Vila; Philip Schniter

The approximate message passing (AMP) algorithm originally proposed by Donoho, Maleki, and Montanari yields a computationally attractive solution to the usual ℓ1-regularized least-squares problem faced in compressed sensing, whose solution is known to be robust to the signal distribution. When the signal is drawn i.i.d. from a marginal distribution that is not least-favorable, better performance can be attained using a Bayesian variation of AMP. The latter, however, assumes that the distribution is perfectly known. In this paper, we navigate the space between these two extremes by modeling the signal as i.i.d. Bernoulli-Gaussian (BG) with unknown prior sparsity, mean, and variance, and the noise as zero-mean Gaussian with unknown variance, and we simultaneously reconstruct the signal while learning the prior signal and noise parameters. To accomplish this task, we embed the BG-AMP algorithm within an expectation-maximization (EM) framework. Numerical experiments confirm the excellent performance of our proposed EM-BG-AMP on a range of signal types.
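
The following is a minimal numpy sketch of the idea described above: a simplified AMP recursion with a Bernoulli-Gaussian MMSE denoiser, interleaved with closed-form EM updates of the prior parameters. The AMP form, the initializations, and the omission of the noise-variance update are simplifying assumptions for illustration, not the paper's exact GAMP-based algorithm.

```python
import numpy as np

def bg_denoise(r, tau2, lam, theta, phi):
    """MMSE denoiser for r = x + N(0, tau2) under a Bernoulli-Gaussian prior
    p(x) = (1-lam)*delta(x) + lam*N(x; theta, phi). Returns the posterior mean,
    variance, activity probabilities, and active-component moments."""
    act = np.exp(-0.5 * (r - theta) ** 2 / (phi + tau2)) / np.sqrt(phi + tau2)
    off = np.exp(-0.5 * r ** 2 / tau2) / np.sqrt(tau2)
    pi = lam * act / (lam * act + (1 - lam) * off + 1e-300)   # P(x_n != 0 | r_n)
    gamma = (r * phi + theta * tau2) / (phi + tau2)           # E[x_n | r_n, active]
    nu = phi * tau2 / (phi + tau2)                            # var[x_n | r_n, active]
    xhat = pi * gamma
    xvar = pi * (nu + gamma ** 2) - xhat ** 2
    return xhat, xvar, pi, gamma, nu

def em_bg_amp(y, A, iters=50):
    """Simplified EM-BG-AMP loop (sketch): AMP with an Onsager correction,
    followed by EM updates of (lam, theta, phi). The initial values below are
    crude guesses (assumptions), not the initialization from the paper."""
    M, N = A.shape
    lam, theta, phi = 0.1, 0.0, np.sum(y ** 2) / (0.1 * N)    # assumed initialization
    xhat, xvar = np.zeros(N), np.zeros(N)
    z = y.copy()
    tau2 = np.mean(y ** 2)
    for _ in range(iters):
        z = y - A @ xhat + z * np.sum(xvar) / (M * tau2)      # residual + Onsager term
        tau2 = np.sum(z ** 2) / M                             # effective noise variance
        r = xhat + A.T @ z                                    # pseudo-measurements r ~ x + N(0, tau2)
        xhat, xvar, pi, gamma, nu = bg_denoise(r, tau2, lam, theta, phi)
        # EM (M-step) updates of the Bernoulli-Gaussian prior parameters
        lam = np.mean(pi)
        theta = np.sum(pi * gamma) / (np.sum(pi) + 1e-12)
        phi = np.sum(pi * (nu + (gamma - theta) ** 2)) / (np.sum(pi) + 1e-12)
    return xhat
```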


International Conference on Acoustics, Speech, and Signal Processing | 2015

Adaptive damping and mean removal for the generalized approximate message passing algorithm

Jeremy P. Vila; Philip Schniter; Sundeep Rangan; Florent Krzakala; Lenka Zdeborová

The generalized approximate message passing (GAMP) algorithm is an efficient method of MAP or approximate-MMSE estimation of x observed from a noisy version of the transform coefficients z = Ax. In fact, for large zero-mean i.i.d. sub-Gaussian A, GAMP is characterized by a state evolution whose fixed points, when unique, are optimal. For generic A, however, GAMP may diverge. In this paper, we propose adaptive-damping and mean-removal strategies that aim to prevent divergence. Numerical results demonstrate significantly enhanced robustness to non-zero-mean, rank-deficient, column-correlated, and ill-conditioned A.
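
As a rough illustration of the two stabilization ideas mentioned in the abstract, here is a hedged numpy sketch: a generic adaptive-damping wrapper (shrink the step when a surrogate cost increases) and a simple mean-removal rewrite of the model. Neither follows the exact schedule or parameterization used in the paper.

```python
import numpy as np

def adaptively_damped_fixed_point(update, cost, x0, iters=100,
                                  beta=1.0, beta_min=0.05, shrink=0.5, grow=1.1):
    """Generic adaptive-damping wrapper (sketch): take a damped step
    x <- (1-beta)*x + beta*update(x), accept it only if a surrogate cost does
    not increase, and otherwise shrink the damping factor beta and retry."""
    x, c = x0, cost(x0)
    for _ in range(iters):
        x_prop = (1.0 - beta) * x + beta * update(x)
        c_prop = cost(x_prop)
        if c_prop <= c:
            x, c = x_prop, c_prop
            beta = min(1.0, grow * beta)        # cautiously undo the damping
        else:
            beta = max(beta_min, shrink * beta) # reject the step, damp harder
    return x

def remove_mean(A):
    """Simple mean removal (sketch): split A = A0 + mu*1*1^T with A0 zero-mean,
    so that y = A0 x + mu*(1^T x)*1 + w. The scalar s = 1^T x becomes an extra
    unknown handled by the augmented operator [A0, mu*1] (and tied back to x,
    e.g., via a noiseless pseudo-measurement)."""
    mu = A.mean()
    A0 = A - mu
    A_aug = np.hstack([A0, mu * np.ones((A.shape[0], 1))])
    return A_aug
```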


Conference on Information Sciences and Systems | 2012

Expectation-maximization Gaussian-mixture approximate message passing

Jeremy P. Vila; Philip Schniter

When recovering a sparse signal from noisy compressive linear measurements, the distribution of the signal's non-zero coefficients can have a profound effect on recovery mean-squared error (MSE). If this distribution were a priori known, then one could use computationally efficient approximate message passing (AMP) techniques for nearly minimum-MSE (MMSE) recovery. In practice, however, the distribution is unknown, motivating the use of robust algorithms like LASSO, which is nearly minimax optimal, at the cost of significantly larger MSE for non-least-favorable distributions. As an alternative, we propose an empirical-Bayesian technique that simultaneously learns the signal distribution while MMSE-recovering the signal, according to the learned distribution, using AMP. In particular, we model the non-zero distribution as a Gaussian mixture and learn its parameters through expectation maximization, using AMP to implement the expectation step. Numerical experiments on a wide range of signal classes confirm the state-of-the-art performance of our approach, in both reconstruction error and runtime, in the high-dimensional regime, for most (but not all) sensing operators.
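
To make the prior and its learning rule concrete, below is a hedged numpy sketch of the scalar MMSE denoiser for a Bernoulli Gaussian-mixture prior together with the corresponding EM parameter updates; the surrounding AMP loop would look like the EM-BG-AMP sketch earlier, and the exact updates in the paper may differ in detail.

```python
import numpy as np

def gm_denoise_and_em(r, tau2, lam, omega, theta, phi):
    """MMSE denoiser and EM updates for the Bernoulli Gaussian-mixture prior
    p(x) = (1-lam)*delta(x) + lam * sum_l omega[l]*N(x; theta[l], phi[l]),
    applied to AMP pseudo-measurements r = x + N(0, tau2). Sketch of the E/M
    steps only; omega, theta, phi are length-L arrays."""
    r = r[:, None]                                   # (N,1) against mixture axis (L,)
    s2 = phi + tau2
    # unnormalized posterior weights of the L "active" components vs. the spike
    w_act = lam * omega * np.exp(-0.5 * (r - theta) ** 2 / s2) / np.sqrt(2 * np.pi * s2)
    w_off = (1 - lam) * np.exp(-0.5 * r ** 2 / tau2) / np.sqrt(2 * np.pi * tau2)
    resp = w_act / (w_act.sum(axis=1, keepdims=True) + w_off + 1e-300)   # (N,L)
    gamma = (r * phi + theta * tau2) / s2            # per-component posterior means
    nu = phi * tau2 / s2                             # per-component posterior variances
    xhat = (resp * gamma).sum(axis=1)                # MMSE estimate of x
    # EM (M-step) updates of the prior parameters
    lam_new = resp.sum(axis=1).mean()
    omega_new = resp.sum(axis=0) / (resp.sum() + 1e-12)
    theta_new = (resp * gamma).sum(axis=0) / (resp.sum(axis=0) + 1e-12)
    phi_new = (resp * (nu + (gamma - theta_new) ** 2)).sum(axis=0) / (resp.sum(axis=0) + 1e-12)
    return xhat, lam_new, omega_new, theta_new, phi_new
```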


IEEE Transactions on Signal Processing | 2014

An Empirical-Bayes Approach to Recovering Linearly Constrained Non-Negative Sparse Signals

Jeremy P. Vila; Philip Schniter

We consider the recovery of an (approximately) sparse signal from noisy linear measurements, in the case that the signal is a priori known to be non-negative and obeys certain linear equality constraints. For this, we propose a novel empirical-Bayes approach that combines the Generalized Approximate Message Passing (GAMP) algorithm with the expectation-maximization (EM) algorithm. To enforce both sparsity and non-negativity, we employ an i.i.d. Bernoulli non-negative Gaussian mixture (NNGM) prior and perform approximate minimum mean-squared error (MMSE) recovery of the signal using sum-product GAMP. To learn the NNGM parameters, we use the EM algorithm with a suitable initialization. Meanwhile, the linear equality constraints are enforced by augmenting GAMP's linear observation model with noiseless pseudo-measurements. Numerical experiments demonstrate the state-of-the-art mean-squared error and runtime of our approach.
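
The pseudo-measurement trick described above can be illustrated structurally: append the equality constraints to the observation model and assign them (near-)zero noise variance. The helper below is only an illustration of that idea, with hypothetical variable names, not code from the paper.

```python
import numpy as np

def augment_with_equality_constraints(A, y, B, c, noise_var, eps=0.0):
    """Append linear equality constraints B x = c to the observation model as
    pseudo-measurements with (near-)zero noise variance eps, so that a
    GAMP-style solver enforces them jointly with the noisy data y = A x + w."""
    A_aug = np.vstack([A, B])                         # augmented measurement operator
    y_aug = np.concatenate([y, c])                    # augmented observations
    var_aug = np.concatenate([np.full(len(y), noise_var),
                              np.full(len(c), eps)])  # per-row noise variances
    return A_aug, y_aug, var_aug

# Example: a simplex constraint (non-negative signal summing to one) adds the
# single pseudo-measurement 1^T x = 1:
#   B = np.ones((1, N)); c = np.array([1.0])
```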


International Conference on Acoustics, Speech, and Signal Processing | 2015

Generalized approximate message passing for cosparse analysis compressive sensing

Mark Borgerding; Philip Schniter; Jeremy P. Vila; Sundeep Rangan

In cosparse analysis compressive sensing (CS), one seeks to estimate a non-sparse signal vector from noisy sub-Nyquist linear measurements by exploiting the knowledge that a given linear transform of the signal is cosparse, i.e., has sufficiently many zeros. We propose a novel approach to cosparse analysis CS based on the generalized approximate message passing (GAMP) algorithm. Unlike other AMP-based approaches to this problem, ours works with a wide range of analysis operators and regularizers. In addition, we propose a novel ℓ0-like soft-thresholder based on MMSE denoising for a spike-and-slab distribution with an infinite-variance slab. Numerical experiments on synthetic and practical datasets demonstrate advantages over existing AMP-based, greedy, and reweighted-ℓ1 approaches.
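
One structural reading of a GAMP formulation for this problem (an interpretation of the abstract, not the paper's exact construction) is to stack the measurement and analysis operators into a single generalized-linear model: measurement rows of the transform output are tied to the data through a Gaussian likelihood, while analysis rows carry a sparsity-promoting penalty such as the classical ℓ1 soft-thresholder, which the proposed ℓ0-like MMSE thresholder would replace.

```python
import numpy as np

def stack_analysis_model(Phi, Omega):
    """Stack the measurement operator Phi (M x N) and analysis operator
    Omega (P x N) into one transform z = [Phi; Omega] x. The boolean mask marks
    which rows of z correspond to the noisy measurements; the remaining rows
    carry the cosparsity-promoting penalty."""
    A = np.vstack([Phi, Omega])
    is_measurement = np.r_[np.ones(Phi.shape[0], dtype=bool),
                           np.zeros(Omega.shape[0], dtype=bool)]
    return A, is_measurement

def soft_threshold(v, t):
    """Standard l1 soft-thresholder, shown here as the familiar baseline that a
    more aggressive l0-like thresholder would replace."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
```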


Proceedings of SPIE | 2013

Hyperspectral image unmixing via bilinear generalized approximate message passing

Jeremy P. Vila; Philip Schniter; Joseph Meola

In hyperspectral unmixing, the objective is to decompose an electromagnetic spectral dataset measured over M spectral bands and T pixels, into N constituent material spectra (or “endmembers”) with corresponding spatial abundances. In this paper, we propose a novel approach to hyperspectral unmixing (i.e., joint estimation of endmembers and abundances) based on loopy belief propagation. In particular, we employ the bilinear generalized approximate message passing algorithm (BiG-AMP), a recently proposed belief-propagation-based approach to matrix factorization, in a “turbo” framework that enables the exploitation of spectral coherence in the endmembers, as well as spatial coherence in the abundances. In conjunction, we propose an expectation-maximization (EM) technique that can be used to automatically tune the prior statistics assumed by turbo BiG-AMP. Numerical experiments on synthetic and real-world data confirm the state-of-the-art performance of our approach.
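
The bilinear model underlying the unmixing problem can be written Y ≈ S A, with endmember spectra S (M x N) and abundances A (N x T) whose columns are non-negative and sum to one. The snippet below simply generates synthetic data from that model; the sizes and distributions are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def synth_unmixing_data(M=50, T=400, N=4, noise_std=0.01, rng=None):
    """Draw data from the linear-mixing model Y = S @ A + W assumed in
    hyperspectral unmixing: non-negative endmember spectra S (M x N) and
    abundance columns of A (N x T) lying on the probability simplex."""
    rng = np.random.default_rng(rng)
    S = np.abs(rng.normal(size=(M, N)))            # endmember spectra (non-negative)
    A = rng.dirichlet(np.ones(N), size=T).T        # abundance columns on the simplex
    Y = S @ A + noise_std * rng.normal(size=(M, T))
    return Y, S, A
```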


IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing | 2013

An empirical-Bayes approach to recovering linearly constrained non-negative sparse signals

Jeremy P. Vila; Philip Schniter

We propose two novel approaches for the recovery of an (approximately) sparse signal from noisy linear measurements in the case that the signal is a priori known to be non-negative and obey given linear equality constraints, such as a simplex signal. This problem arises in, e.g., hyperspectral imaging, portfolio optimization, density estimation, and certain cases of compressive imaging. Our first approach solves a linearly constrained non-negative version of LASSO using the max-sum version of the generalized approximate message passing (GAMP) algorithm, where we consider both quadratic and absolute loss, and where we propose a novel approach to tuning the LASSO regularization parameter via the expectation maximization (EM) algorithm. Our second approach is based on the sum-product version of the GAMP algorithm, where we propose the use of a Bernoulli non-negative Gaussian-mixture signal prior and a Laplacian likelihood and propose an EM-based approach to learning the underlying statistical parameters. In both approaches, the linear equality constraints are enforced by augmenting GAMP's generalized-linear observation model with noiseless pseudo-measurements. Extensive numerical experiments demonstrate the state-of-the-art performance of our proposed approaches.
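
As one concrete example of what an EM-based tuning rule can look like, the snippet below gives the standard EM M-step for the rate of an i.i.d. Laplacian prior, using posterior expectations of |x_n| (which a GAMP posterior can approximate). This is a generic identity offered for illustration only; the paper's actual rule for tuning the LASSO regularization parameter may differ.

```python
import numpy as np

def em_laplace_rate(expected_abs_x):
    """EM M-step for the rate alpha of an i.i.d. Laplacian prior
    p(x_n) = (alpha/2) * exp(-alpha * |x_n|): maximizing the expected
    complete-data log-likelihood gives alpha = N / sum_n E[|x_n|]."""
    expected_abs_x = np.asarray(expected_abs_x)
    return expected_abs_x.size / np.sum(expected_abs_x)
```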


IEEE Transactions on Computational Imaging | 2015

Hyperspectral Unmixing Via Turbo Bilinear Approximate Message Passing

Jeremy P. Vila; Philip Schniter; Joseph Meola

The goal of hyperspectral unmixing is to decompose an electromagnetic spectral dataset measured over M spectral bands and T pixels into N constituent material spectra (or “end-members”) with corresponding spatial abundances. In this paper, we propose a novel approach to hyperspectral unmixing based on loopy belief propagation (BP) that enables the exploitation of spectral coherence in the end-members and spatial coherence in the abundances. In particular, we partition the factor graph into spectral coherence, spatial coherence, and bilinear subgraphs, and pass messages between them using a “turbo” approach. To perform message passing within the bilinear subgraph, we employ the bilinear generalized approximate message passing algorithm (BiG-AMP), a recently proposed belief-propagation-based approach to matrix factorization. Furthermore, we propose an expectation-maximization (EM) strategy to tune the prior parameters and a model-order selection strategy to select the number of materials N. Numerical experiments conducted with both synthetic and real-world data show favorable unmixing performance relative to existing methods.
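
The "turbo" schedule described above, which alternates message passing between the spectral-coherence, spatial-coherence, and bilinear subgraphs, can be sketched generically as below; this is only a structural illustration of the scheduling idea, with placeholder module functions, not the paper's factor-graph implementation.

```python
def turbo_schedule(modules, rounds=10, messages=None):
    """Generic turbo-style schedule (sketch): repeatedly run inference inside
    each subgraph module, feeding each module's extrinsic output to the next
    module as its prior input."""
    for _ in range(rounds):
        for run_module in modules:
            messages = run_module(messages)   # each module consumes/produces extrinsic info
    return messages
```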


IEEE Transactions on Signal Processing | 2013

Expectation-Maximization Gaussian-Mixture Approximate Message Passing

Jeremy P. Vila; Philip Schniter


Archive | 2015

Empirical-Bayes Approaches to Recovery of Structured Sparse Signals via Approximate Message Passing

Jeremy P. Vila

Collaboration


Dive into Jeremy P. Vila's collaborations.

Top Co-Authors

Joseph Meola
Air Force Research Laboratory

Florent Krzakala
École Normale Supérieure

Lenka Zdeborová
Centre national de la recherche scientifique