Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Joel A. Tropp is active.

Publication


Featured research published by Joel A. Tropp.


IEEE Transactions on Information Theory | 2007

Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit

Joel A. Tropp; Anna C. Gilbert

This paper demonstrates theoretically and empirically that a greedy algorithm called orthogonal matching pursuit (OMP) can reliably recover a signal with m nonzero entries in dimension d given O(m ln d) random linear measurements of that signal. This is a massive improvement over previous results, which require O(m^2) measurements. The new results for OMP are comparable with recent results for another approach called basis pursuit (BP). In some settings, the OMP algorithm is faster and easier to implement, so it is an attractive alternative to BP for signal recovery problems.
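The greedy recovery scheme the abstract describes can be sketched in a few lines of NumPy. This is a minimal illustration of OMP under the abstract's setting (an m-sparse signal in dimension d, roughly O(m ln d) Gaussian measurements), not the authors' reference implementation; the function and variable names are our own.

```python
import numpy as np

def omp(Phi, y, m):
    """Orthogonal matching pursuit: greedily recover an m-sparse x
    from measurements y = Phi @ x (illustrative sketch)."""
    residual = y.copy()
    support = []
    x_hat = np.zeros(Phi.shape[1])
    coeffs = np.array([])
    for _ in range(m):
        # Greedy step: pick the column most correlated with the residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        support.append(j)
        # Orthogonalization: least-squares fit on the selected columns.
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat

# Demo: d = 256, m = 4 nonzeros, n on the order of m ln d measurements.
rng = np.random.default_rng(0)
d, m = 256, 4
n = 8 * m * int(np.log(d))
Phi = rng.standard_normal((n, d)) / np.sqrt(n)
x = np.zeros(d)
x[rng.choice(d, m, replace=False)] = rng.standard_normal(m)
y = Phi @ x
x_rec = omp(Phi, y, m)
```

With this level of oversampling, the recovered `x_rec` matches `x` to numerical precision with overwhelming probability, which is the regime the paper's theory covers.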


IEEE Transactions on Information Theory | 2004

Greed is good: algorithmic results for sparse approximation

Joel A. Tropp

This article presents new results on using a greedy algorithm, orthogonal matching pursuit (OMP), to solve the sparse approximation problem over redundant dictionaries. It provides a sufficient condition under which both OMP and Donoho's basis pursuit (BP) paradigm can recover the optimal representation of an exactly sparse signal. It leverages this theory to show that both OMP and BP succeed for every sparse input signal from a wide class of dictionaries. These quasi-incoherent dictionaries offer a natural generalization of incoherent dictionaries, and the cumulative coherence function is introduced to quantify the level of incoherence. This analysis unifies all the recent results on BP and extends them to OMP. Furthermore, the paper develops a sufficient condition under which OMP can identify atoms from an optimal approximation of a nonsparse signal. From there, it argues that OMP is an approximation algorithm for the sparse problem over a quasi-incoherent dictionary. That is, for every input signal, OMP calculates a sparse approximant whose error is only a small factor worse than the minimal error that can be attained with the same number of terms.
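The cumulative coherence function (also called the Babel function) mentioned in the abstract is easy to compute for a concrete dictionary. The sketch below is our own illustration: for a dictionary with unit-norm atoms, it reports the largest total correlation between any one atom and its m strongest competitors.

```python
import numpy as np

def cumulative_coherence(D, m):
    """Cumulative coherence mu_1(m) of a dictionary D with unit-norm
    columns: the maximum, over atoms, of the sum of that atom's m
    largest correlations with the other atoms (illustrative sketch)."""
    G = np.abs(D.T @ D)        # magnitudes of pairwise inner products
    np.fill_diagonal(G, 0.0)   # exclude each atom's self-correlation
    # For each atom, sum its m largest correlations with other atoms.
    top_m = -np.sort(-G, axis=1)[:, :m]
    return top_m.sum(axis=1).max()

# Demo: a random dictionary of 64 unit-norm atoms in dimension 32.
rng = np.random.default_rng(0)
D = rng.standard_normal((32, 64))
D /= np.linalg.norm(D, axis=0)   # normalize atoms
mu1 = cumulative_coherence(D, 1)  # m = 1 recovers the usual mutual coherence
```

Note that mu_1(1) is the classical mutual coherence, and mu_1 grows with m; the paper's sufficient condition for recovery of m-sparse signals is stated in terms of such cumulative coherence values.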


Siam Review | 2011

Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions

Nathan Halko; Per-Gunnar Martinsson; Joel A. Tropp

Low-rank matrix approximations, such as the truncated singular value decomposition and the rank-revealing QR decomposition, play a central role in data analysis and scientific computing. This work surveys and extends recent research which demonstrates that randomization offers a powerful tool for performing low-rank matrix approximation. These techniques exploit modern computational architectures more fully than classical methods and open the possibility of dealing with truly massive data sets. This paper presents a modular framework for constructing randomized algorithms that compute partial matrix decompositions. These methods use random sampling to identify a subspace that captures most of the action of a matrix. The input matrix is then compressed—either explicitly or implicitly—to this subspace, and the reduced matrix is manipulated deterministically to obtain the desired low-rank factorization. In many cases, this approach beats its classical competitors in terms of accuracy, robustness, and/or speed. These claims are supported by extensive numerical experiments and a detailed error analysis. The specific benefits of randomized techniques depend on the computational environment. Consider the model problem of finding the k dominant components of the singular value decomposition of an m × n matrix. (i) For a dense input matrix, randomized algorithms require O(mn log(k)) floating-point operations (flops), in contrast to O(mnk).
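The two-stage scheme the abstract describes (randomly sample a subspace capturing the action of the matrix, then factor the compressed matrix deterministically) can be sketched as follows. This is a minimal illustration with a Gaussian test matrix and a small oversampling parameter p, not the paper's reference implementation.

```python
import numpy as np

def randomized_svd(A, k, p=10, seed=0):
    """Two-stage randomized SVD sketch: random range finder,
    then a deterministic SVD of the compressed matrix."""
    rng = np.random.default_rng(seed)
    # Stage 1: sample the range of A with a random Gaussian test matrix.
    Omega = rng.standard_normal((A.shape[1], k + p))
    Q, _ = np.linalg.qr(A @ Omega)     # orthonormal basis for the sample
    # Stage 2: compress A to the subspace and factor the small matrix.
    B = Q.T @ A                        # (k + p) x n reduced matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ Ub                         # lift back to the original space
    return U[:, :k], s[:k], Vt[:k]

# Demo on an exactly rank-k matrix, where the range is captured exactly.
rng = np.random.default_rng(1)
m, n, k = 200, 100, 5
A = rng.standard_normal((m, k)) @ rng.standard_normal((k, n))
U, s, Vt = randomized_svd(A, k)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
```

For an exactly rank-k input the relative error `err` is at the level of machine precision; for matrices with decaying spectra the paper's error analysis bounds the loss relative to the best rank-k approximation.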


IEEE Transactions on Information Theory | 2006

Just relax: convex programming methods for identifying sparse signals in noise

Joel A. Tropp



Signal Processing | 2006

Algorithms for simultaneous sparse approximation: part I: Greedy pursuit

Joel A. Tropp; Anna C. Gilbert; M. Strauss



Proceedings of the IEEE | 2010

Computational Methods for Sparse Solution of Linear Inverse Problems

Joel A. Tropp; Stephen J. Wright



IEEE Transactions on Information Theory | 2010

Beyond Nyquist: Efficient Sampling of Sparse Bandlimited Signals

Joel A. Tropp; Jason N. Laska; Marco F. Duarte; Justin K. Romberg; Richard G. Baraniuk



Foundations of Computational Mathematics | 2012

User-Friendly Tail Bounds for Sums of Random Matrices

Joel A. Tropp



Communications of The ACM | 2010

CoSaMP: iterative signal recovery from incomplete and inaccurate samples

Deanna Needell; Joel A. Tropp



Signal Processing | 2006

Algorithms for simultaneous sparse approximation: part II: Convex relaxation

Joel A. Tropp


Collaboration


Dive into Joel A. Tropp's collaborations.

Top Co-Authors

Inderjit S. Dhillon (University of Texas at Austin)
Robert W. Heath (University of Texas at Austin)
Michael B. McCoy (California Institute of Technology)
Ati Sharma (University of Southampton)
Beverley McKeon (California Institute of Technology)
M. Strauss (University of Michigan)
Richard Y. Chen (California Institute of Technology)