Publications

Featured research published by Michael B. McCoy.


Electronic Journal of Statistics | 2011

Two proposals for robust PCA using semidefinite programming

Michael B. McCoy; Joel A. Tropp

The performance of principal component analysis suffers badly in the presence of outliers. This paper proposes two novel approaches for robust principal component analysis based on semidefinite programming. The first method, maximum mean absolute deviation rounding, seeks directions of large spread in the data while damping the effect of outliers. The second method produces a low-leverage decomposition of the data that attempts to form a low-rank model for the data by separating out corrupted observations. This paper also presents efficient computational methods for solving these semidefinite programs. Numerical experiments confirm the value of these new techniques.
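
The exact semidefinite programs behind these two proposals are laid out in the paper itself; as a hedged illustration of robust PCA by convex optimization in the same spirit, the sketch below uses a standard nuclear-norm-plus-column-sparse decomposition (often called outlier pursuit) in cvxpy, which likewise separates a low-rank model from a handful of corrupted observations. The data sizes and the trade-off parameter lam are arbitrary choices for the demo, not values from the paper.

```python
# Illustrative stand-in for robust PCA via convex optimization: split a data
# matrix into a low-rank part plus a column-sparse part that absorbs
# outlying observations. This is the generic "outlier pursuit" program, not
# the paper's specific semidefinite relaxations.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, r = 30, 60, 3                               # ambient dim, samples, true rank
Y = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
Y[:, :5] += 10 * rng.standard_normal((n, 5))      # corrupt a few observations

L = cp.Variable((n, m))                           # low-rank model of the inliers
C = cp.Variable((n, m))                           # column-sparse outlier component
lam = 0.8                                         # trade-off parameter (assumed)
objective = cp.Minimize(cp.normNuc(L) + lam * cp.sum(cp.norm(C, 2, axis=0)))
cp.Problem(objective, [L + C == Y]).solve()

print("singular values of L:", np.round(np.linalg.svd(L.value, compute_uv=False)[:6], 2))
```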


Foundations of Computational Mathematics | 2015

Robust Computation of Linear Models by Convex Relaxation

Gilad Lerman; Michael B. McCoy; Joel A. Tropp; Teng Zhang

Consider a data set of vector-valued observations that consists of noisy inliers, which are explained well by a low-dimensional subspace, along with some number of outliers. This work describes a convex optimization problem, called REAPER, that can reliably fit a low-dimensional model to this type of data. This approach parameterizes linear subspaces using orthogonal projectors and uses a relaxation of the set of orthogonal projectors to reach the convex formulation. The paper provides an efficient algorithm for solving the REAPER problem, and it documents numerical experiments that confirm that REAPER can dependably find linear structure in synthetic and natural data. In addition, when the inliers lie near a low-dimensional subspace, there is a rigorous theory that describes when REAPER can approximate this subspace.
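
For concreteness, a minimal sketch of a REAPER-style convex program is given below: optimize over the relaxed set of orthogonal projectors (0 ⪯ P ⪯ I, trace P = d), penalize each observation's distance to its projection, then round the solution to a d-dimensional subspace. The paper's own solver is an iteratively reweighted scheme that scales far better than this direct cvxpy formulation; the dimensions and noise levels here are assumptions made for the demo.

```python
# Sketch of a REAPER-style program: the variable P ranges over the convex
# hull of rank-d orthogonal projectors, and the objective sums the distances
# from each data point to its image under P. Rounding P to its top-d
# eigenvectors yields the subspace estimate.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
n, d, m = 10, 2, 80
basis = np.linalg.qr(rng.standard_normal((n, d)))[0]        # true subspace
X = basis @ rng.standard_normal((d, m)) + 0.01 * rng.standard_normal((n, m))
X[:, :10] = 5 * rng.standard_normal((n, 10))                # add outliers

P = cp.Variable((n, n), symmetric=True)
constraints = [P >> 0, np.eye(n) - P >> 0, cp.trace(P) == d]
residuals = cp.norm(X - P @ X, 2, axis=0)                   # distance to projection
cp.Problem(cp.Minimize(cp.sum(residuals)), constraints).solve()

# Round the relaxed solution to a genuine d-dimensional subspace.
eigvecs = np.linalg.eigh(P.value)[1]
estimate = eigvecs[:, -d:]                                  # top-d eigenvectors
```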


Foundations of Computational Mathematics | 2014

Sharp Recovery Bounds for Convex Demixing, with Applications

Michael B. McCoy; Joel A. Tropp

Demixing refers to the challenge of identifying two structured signals given only the sum of the two signals and prior information about their structures. Examples include the problem of separating a signal that is sparse with respect to one basis from a signal that is sparse with respect to a second basis, and the problem of decomposing an observed matrix into a low-rank matrix plus a sparse matrix. This paper describes and analyzes a framework, based on convex optimization, for solving these demixing problems, and many others. This work introduces a randomized signal model that ensures that the two structures are incoherent, i.e., generically oriented. For an observation from this model, this approach identifies a summary statistic that reflects the complexity of a particular signal. The difficulty of separating two structured, incoherent signals depends only on the total complexity of the two structures. Some applications include (1) demixing two signals that are sparse in mutually incoherent bases, (2) decoding spread-spectrum transmissions in the presence of impulsive errors, and (3) removing sparse corruptions from a low-rank matrix. In each case, the theoretical analysis of the convex demixing method closely matches its empirical behavior.
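
As a hedged illustration of example (1), the sketch below demixes a signal that is sparse in the standard basis from one that is sparse in the discrete cosine basis, given only their sum, by minimizing one l1 norm subject to a bound on the other. The bases, problem sizes, and the oracle bound on the l1 norm of the second component are assumptions made for the demo rather than details taken from the paper.

```python
# Demixing illustration: z = x0 + Q @ y0, where x0 is sparse in the standard
# basis and y0 is sparse in the (orthonormal) DCT basis Q. Recover the pair
# by constrained l1 minimization, using an assumed side-information bound on
# the l1 norm of y0.
import numpy as np
import cvxpy as cp
from scipy.fft import dct

rng = np.random.default_rng(2)
n = 128
Q = dct(np.eye(n), norm="ortho", axis=0)          # orthonormal DCT matrix

x0 = np.zeros(n); x0[rng.choice(n, 5, replace=False)] = rng.standard_normal(5)
y0 = np.zeros(n); y0[rng.choice(n, 5, replace=False)] = rng.standard_normal(5)
z = x0 + Q @ y0                                   # observed superposition

x, y = cp.Variable(n), cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(x)),
                  [x + Q @ y == z, cp.norm1(y) <= np.linalg.norm(y0, 1)])
prob.solve()

print("demixing error:", float(np.linalg.norm(x.value - x0)))
```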


IEEE Signal Processing Magazine | 2014

Convexity in Source Separation: Models, geometry, and algorithms

Michael B. McCoy; Volkan Cevher; Quoc Tran Dinh; Afsaneh Asaei; Luca Baldassarre

Source separation, or demixing, is the process of extracting multiple components entangled within a signal. Contemporary signal processing presents a host of difficult source separation problems, from interference cancellation to background subtraction, blind deconvolution, and even dictionary learning. Despite the recent progress in each of these applications, advances in high-throughput sensor technology place demixing algorithms under pressure to accommodate extremely high-dimensional signals, separate an ever larger number of sources, and cope with more sophisticated signal and mixing models. These difficulties are exacerbated by the need for real-time action in automated decision-making systems.


Discrete and Computational Geometry | 2014

From Steiner Formulas for Cones to Concentration of Intrinsic Volumes

Michael B. McCoy; Joel A. Tropp

The intrinsic volumes of a convex cone are geometric functionals that return basic structural information about the cone. Recent research has demonstrated that conic intrinsic volumes are valuable for understanding the behavior of random convex optimization problems. This paper develops a systematic technique for studying conic intrinsic volumes using methods from probability. At the heart of this approach is a general Steiner formula for cones. This result converts questions about the intrinsic volumes into questions about the projection of a Gaussian random vector onto the cone, which can then be resolved using tools from Gaussian analysis. The approach leads to new identities and bounds for the intrinsic volumes of a cone, including a near-optimal concentration inequality.
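
One standard summary of the intrinsic volumes obtained through this Gaussian-projection viewpoint is the statistical dimension δ(C) = E‖Π_C(g)‖², where g is a standard Gaussian vector and Π_C is the metric projection onto the cone. The toy computation below (an illustration, not code from the paper) checks this identity numerically for the nonnegative orthant, whose projection is coordinatewise clipping and whose statistical dimension equals n/2.

```python
# Monte Carlo check of the Gaussian-projection viewpoint: estimate the
# statistical dimension E||Pi_C(g)||^2 for the nonnegative orthant, where the
# metric projection is coordinatewise clipping and the exact answer is n/2.
import numpy as np

rng = np.random.default_rng(3)
n, trials = 64, 20000
g = rng.standard_normal((trials, n))
proj = np.clip(g, 0.0, None)                      # projection onto R^n_+
estimate = np.mean(np.sum(proj**2, axis=1))       # Monte Carlo estimate

print(f"estimated statistical dimension: {estimate:.2f} (exact: {n / 2})")
```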


arXiv: Information Theory | 2014

Living on the edge: phase transitions in convex programs with random data

Dennis Amelunxen; Martin Lotz; Michael B. McCoy; Joel A. Tropp


2013

Living on the edge: A geometric theory of phase transitions in convex optimization

Dennis Amelunxen; Martin Lotz; Michael B. McCoy; Joel A. Tropp


Archive | 2012

Sharp recovery bounds for convex deconvolution, with applications

Michael B. McCoy; Joel A. Tropp


Archive | 2012

Robust computation of linear models, or How to find a needle in a haystack

Gilad Lerman; Michael B. McCoy; Joel A. Tropp; Teng Zhang


arXiv: Information Theory | 2013

The Achievable Performance of Convex Demixing

Michael B. McCoy; Joel A. Tropp

Collaboration

Dive into Michael B. McCoy's collaborations.

Top Co-Authors

Joel A. Tropp (California Institute of Technology)
Gilad Lerman (University of Minnesota)
Teng Zhang (University of Minnesota)
Afsaneh Asaei (Idiap Research Institute)
Luca Baldassarre (École Polytechnique Fédérale de Lausanne)
Volkan Cevher (École Polytechnique Fédérale de Lausanne)
Quoc Tran Dinh (Katholieke Universiteit Leuven)
Martin Lotz (University of Paderborn)