Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Nam H. Nguyen is active.

Publication


Featured research published by Nam H. Nguyen.


IEEE Transactions on Signal Processing | 2012

Fast and Efficient Compressive Sensing Using Structurally Random Matrices

Thong T. Do; Lu Gan; Nam H. Nguyen; Trac D. Tran

This paper introduces a new framework for constructing fast and efficient sensing matrices for practical compressive sensing, called the Structurally Random Matrix (SRM). In the proposed framework, we pre-randomize the sensing signal by scrambling its sample locations or flipping its sample signs, then apply a fast transform to the randomized samples, and finally subsample the resulting transform coefficients to obtain the sensing measurements. SRM is highly relevant for large-scale, real-time compressive sensing applications because its computation is fast and it supports block-based processing. In addition, we show that SRM has theoretical sensing performance comparable to that of completely random sensing matrices. Numerical simulation results verify the validity of the theory and illustrate the promising potential of the proposed sensing framework.
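The three SRM steps described above (pre-randomize, fast transform, subsample) can be sketched in a few lines. Below is a minimal illustration assuming a DCT as the fast transform and sign flipping as the randomizer; the names (srm_measure, sign_flips, picks) and the dimensions are illustrative, not taken from the paper.

```python
# Minimal sketch of the three SRM sensing steps, assuming a DCT as the fast
# transform; all names and sizes are illustrative choices.
import numpy as np
from scipy.fft import dct

def srm_measure(x, m, rng):
    """Sense x with a Structurally Random Matrix: randomize, transform, subsample."""
    n = x.size
    # Step 1: pre-randomize the signal, here by flipping sample signs
    # (scrambling sample locations is the other option mentioned in the abstract).
    sign_flips = rng.choice([-1.0, 1.0], size=n)
    x_rand = sign_flips * x
    # Step 2: apply a fast orthonormal transform to the randomized samples.
    coeffs = dct(x_rand, norm="ortho")
    # Step 3: subsample m of the n transform coefficients uniformly at random,
    # scaled so the measurement energy stays comparable.
    picks = rng.choice(n, size=m, replace=False)
    y = np.sqrt(n / m) * coeffs[picks]
    return y, (sign_flips, picks)

rng = np.random.default_rng(0)
x = np.zeros(256)
x[rng.choice(256, size=8, replace=False)] = rng.standard_normal(8)  # sparse test signal
y, operator = srm_measure(x, m=64, rng=rng)
print(y.shape)  # (64,)
```

Because the randomization is a diagonal sign matrix and the transform is fast, the whole operator can also be applied blockwise, which is what makes this structure attractive for large-scale and streaming settings.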


IEEE Transactions on Information Theory | 2013

Exact Recoverability From Dense Corrupted Observations via ℓ1-Minimization

Nam H. Nguyen; Trac D. Tran

This paper confirms a surprising phenomenon first observed by Wright under a different setting: given m highly corrupted measurements y = A_Ω x* + e*, where A_Ω is a submatrix whose rows are selected uniformly at random from the rows of an orthogonal matrix A and e* is an unknown sparse error vector whose nonzero entries may be unbounded, we show that with high probability ℓ1-minimization can recover the sparse signal of interest x* exactly from only m = C μ² k (log n)² measurements, where k is the number of nonzero components of x* and μ = n max_{i,j} A_{ij}², even if a significant fraction of the measurements are corrupted. We further guarantee that stable recovery is possible when the measurements are polluted by both gross sparse and small dense errors: y = A_Ω x* + e* + ν, where ν is small dense noise with bounded energy. Numerous simulation results under various settings are also presented to verify the validity of the theory as well as to illustrate the promising potential of the proposed framework.
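A minimal sketch of the recovery program implied by the abstract: minimize ||x||_1 + ||e||_1 subject to y = A_Ω x + e, written as a linear program via the usual positive/negative variable splitting. The paper may weight or constrain the two terms differently; the helper l1_recover and the problem sizes below are illustrative assumptions.

```python
# Hedged sketch: recover a sparse signal from densely corrupted measurements by
# solving   min ||x||_1 + ||e||_1   s.t.   y = A_omega @ x + e   as an LP.
import numpy as np
from scipy.optimize import linprog

def l1_recover(A_omega, y):
    m, n = A_omega.shape
    # Variables: [x+, x-, e+, e-], all nonnegative (linprog's default bounds).
    c = np.ones(2 * n + 2 * m)  # objective = ||x||_1 + ||e||_1
    A_eq = np.hstack([A_omega, -A_omega, np.eye(m), -np.eye(m)])
    res = linprog(c, A_eq=A_eq, b_eq=y, method="highs")
    x = res.x[:n] - res.x[n:2 * n]
    e = res.x[2 * n:2 * n + m] - res.x[2 * n + m:]
    return x, e

rng = np.random.default_rng(1)
n, m, k = 128, 96, 5
A, _ = np.linalg.qr(rng.standard_normal((n, n)))    # an orthogonal matrix A
A_omega = A[rng.choice(n, size=m, replace=False)]   # m rows chosen uniformly at random
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
e_true = np.zeros(m)
e_true[rng.choice(m, 8, replace=False)] = 10 * rng.standard_normal(8)  # gross corruptions
y = A_omega @ x_true + e_true
x_hat, e_hat = l1_recover(A_omega, y)
print(np.max(np.abs(x_hat - x_true)))  # near zero when exact recovery holds
```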


IEEE Transactions on Information Theory | 2017

Linear Convergence of Stochastic Iterative Greedy Algorithms With Sparse Constraints

Nam H. Nguyen; Deanna Needell; Tina Woolf

Motivated by recent work on stochastic gradient descent methods, we develop two stochastic variants of greedy algorithms for possibly non-convex optimization problems with sparsity constraints. We prove linear convergence (sometimes called exponential convergence) in expectation to the solution within a specified tolerance. This generalized framework is specialized to the problems of sparse signal recovery in compressed sensing and low-rank matrix recovery, giving methods with provable convergence guarantees that often outperform their deterministic counterparts. We also analyze the settings where gradients and projections can only be computed approximately, and prove that the methods are robust to these approximations. We include many numerical experiments, which align with the theoretical analysis and demonstrate these improvements in several different settings.
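A minimal sketch in the spirit of the stochastic greedy methods described above, for the compressed-sensing specialization: each iteration takes a gradient step on a randomly sampled block of measurements and then hard-thresholds back to the k-sparse set. The step size, block size, and iteration count are illustrative choices, not the paper's tuned parameters.

```python
# Stochastic block-gradient iterations with hard thresholding (a sketch, not the
# paper's exact algorithm): per iteration, use only a random block of rows.
import numpy as np

def hard_threshold(z, k):
    """Keep the k largest-magnitude entries of z, zero the rest."""
    out = np.zeros_like(z)
    idx = np.argsort(np.abs(z))[-k:]
    out[idx] = z[idx]
    return out

def stochastic_iht(A, y, k, n_iters=1500, step=0.5, block_size=64, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    for _ in range(n_iters):
        rows = rng.choice(m, size=block_size, replace=False)     # random block of measurements
        A_b, y_b = A[rows], y[rows]
        grad_est = (m / block_size) * A_b.T @ (A_b @ x - y_b)    # stochastic gradient estimate
        x = hard_threshold(x - step * grad_est, k)               # greedy projection onto k-sparse set
    return x

rng = np.random.default_rng(2)
m, n, k = 256, 512, 8
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                                   # noiseless measurements
x_hat = stochastic_iht(A, y, k)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))   # relative recovery error
```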


IEEE Transactions on Signal Processing | 2016

Collaborative Multi-Sensor Classification Via Sparsity-Based Representation

Minh Dao; Nam H. Nguyen; Nasser M. Nasrabadi; Trac D. Tran

In this paper, we propose a general collaborative sparse representation framework for multi-sensor classification that simultaneously accounts for the correlations and the complementary information between heterogeneous sensors while enforcing joint sparsity within each sensor's observations. We also robustify our models to deal with the presence of sparse noise and low-rank interference signals. Specifically, we demonstrate that incorporating the noise or interference signal as a low-rank component in our models is essential in a multi-sensor classification problem when multiple co-located sources/sensors simultaneously record the same physical event. We further extend our frameworks to kernelized models, which rely on sparsely representing a test sample in terms of all the training samples in a feature space induced by a kernel function. A fast and efficient algorithm based on the alternating direction method is proposed, and its convergence to an optimal solution is guaranteed. Extensive experiments are conducted on several real multi-sensor data sets, and the results are compared with conventional classifiers to verify the effectiveness of the proposed methods.
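The framework above builds on sparse-representation classification: a test sample is coded over the pooled training samples and assigned to the class whose atoms give the smallest reconstruction residual. The sketch below shows only that single-sensor residual rule, using a plain orthogonal matching pursuit coder; the joint-sparsity, low-rank interference, and kernel extensions of the paper are not included, and all names here are illustrative.

```python
# Sparse-representation classification sketch: code the test sample over the
# training dictionary, then classify by the smallest per-class residual.
import numpy as np

def omp(D, y, k):
    """Greedy sparse coding: pick k atoms of D that best explain y."""
    resid, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ resid))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        resid = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

def classify(D, labels, y, k=3):
    """Assign y to the class whose training atoms reconstruct it best."""
    x = omp(D, y, k)
    classes = np.unique(labels)
    residuals = [np.linalg.norm(y - D[:, labels == c] @ x[labels == c]) for c in classes]
    return classes[int(np.argmin(residuals))]

rng = np.random.default_rng(3)
d, per_class = 64, 20
D = rng.standard_normal((d, 2 * per_class))
D /= np.linalg.norm(D, axis=0)                       # unit-norm training atoms
labels = np.repeat([0, 1], per_class)
y = D[:, [2, 5, 9]] @ np.array([1.0, -0.7, 0.4])     # built from class-0 atoms
y += 0.01 * rng.standard_normal(d)                   # small dense noise
print(classify(D, labels, y))                        # expected: 0
```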


IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing | 2013

Sparse image super-resolution via superset selection and pruning

Nam H. Nguyen; Laurent Demanet

This note extends the superset method for sparse signal recovery from bandlimited measurements to the two-dimensional case. The algorithm leverages translation-invariance of the Fourier basis functions by constructing a Hankel tensor, and identifying the signal subspace from its range space. In the noisy case, this method determines a superset which then needs to undergo pruning. The method displays reasonable robustness to noise, and unlike ℓ1 minimization, always succeeds in the noiseless case.
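A one-dimensional sketch of the Hankel/range-space idea the abstract describes: form a Hankel matrix from the low-frequency Fourier measurements, take its top-k singular vectors as the signal subspace, and keep as a superset the grid locations whose Fourier steering vectors lie (nearly) in that subspace. The grid size, bandwidth, and superset size are illustrative assumptions; in the noisy case a pruning step (for example, least squares restricted to the superset) would follow, as the abstract notes.

```python
# Superset selection from bandlimited measurements, 1-D sketch: Hankel matrix,
# signal subspace from its range, MUSIC-style scores over the grid.
import numpy as np

def superset_from_hankel(f_hat, n, k, superset_size):
    """f_hat: first m low-pass Fourier coefficients of an n-point k-sparse signal."""
    m = f_hat.size
    L = m // 2
    # Hankel matrix of the measurements; its rank is (generically) the sparsity k.
    H = np.array([[f_hat[a + b] for b in range(m - L)] for a in range(L)])
    U, _, _ = np.linalg.svd(H)
    U_k = U[:, :k]                                       # estimated signal subspace
    scores = np.empty(n)
    for t in range(n):
        a = np.exp(-2j * np.pi * np.arange(L) * t / n)   # Fourier steering vector at location t
        a /= np.linalg.norm(a)
        # Distance of the steering vector from the signal subspace (small on the true support).
        scores[t] = np.linalg.norm(a - U_k @ (U_k.conj().T @ a))
    return np.sort(np.argsort(scores)[:superset_size])   # smallest scores form the superset

rng = np.random.default_rng(4)
n, k, m = 128, 4, 40
support = np.sort(rng.choice(n, size=k, replace=False))
x = np.zeros(n)
x[support] = rng.standard_normal(k)
f_hat = np.fft.fft(x)[:m]                                # bandlimited (low-pass) measurements
superset = superset_from_hankel(f_hat, n, k, superset_size=8)
print(support, superset)                                 # the true support should sit inside the superset
```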


IEEE Transactions on Information Theory | 2013

Robust Lasso With Missing and Grossly Corrupted Observations

Nam H. Nguyen; Trac D. Tran


Symposium on the Theory of Computing | 2009

A fast and efficient algorithm for low-rank approximation of a matrix

Nam H. Nguyen; Thong T. Do; Trac D. Tran


International Conference on Information Fusion | 2011

Robust multi-sensor classification via joint sparse representation

Nam H. Nguyen; Nasser M. Nasrabadi; Trac D. Tran


arXiv: Information Theory | 2015


Laurent Demanet; Nam H. Nguyen


arXiv: Information Theory | 2013


Laurent Demanet; Deanna Needell; Nam H. Nguyen

Collaboration


Dive into Nam H. Nguyen's collaborations.

Top Co-Authors

Trac D. Tran (Johns Hopkins University)
Thong T. Do (Johns Hopkins University)
Laurent Demanet (Massachusetts Institute of Technology)
Deanna Needell (Claremont McKenna College)
Lu Gan (Brunel University London)
Minh Dao (Johns Hopkins University)
Yi Chen (Johns Hopkins University)
Igor Melnyk (University of Minnesota)
Petros Drineas (Rensselaer Polytechnic Institute)