Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Laming Chen is active.

Publication


Featured research published by Laming Chen.


IEEE Transactions on Signal Processing | 2014

The Convergence Guarantees of a Non-Convex Approach for Sparse Recovery

Laming Chen; Yuantao Gu

In the area of sparse recovery, numerous studies suggest that non-convex penalties may induce better sparsity than convex ones, but the corresponding non-convex algorithms have so far lacked convergence guarantees from the initial solution to the global optimum. This paper aims to provide performance guarantees of a non-convex approach for sparse recovery. Specifically, the concept of weak convexity is incorporated into a class of sparsity-inducing penalties to characterize the non-convexity. Borrowing the idea of the projected subgradient method, an algorithm is proposed to solve the non-convex optimization problem. In addition, a uniform approximate projection is adopted in the projection step to make the algorithm computationally tractable for large-scale problems. The convergence analysis is provided in the noisy scenario. It is shown that if the non-convexity of the penalty is below a threshold (which is inversely proportional to the distance between the initial solution and the sparse signal), the recovery error of the solution is linear in both the step size and the noise term. Numerical simulations are implemented to test the performance of the proposed approach and verify the theoretical analysis.
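A minimal sketch of the iteration described above, assuming a log-sum penalty as the weakly convex, sparsity-inducing penalty and an exact projection onto {x : Ax = y} in place of the paper's uniform approximate projection; both choices are illustrative, not the paper's exact construction.

```python
import numpy as np

def log_sum_subgradient(x, eps=0.1):
    # Subgradient of the non-convex log-sum penalty sum_i log(1 + |x_i| / eps).
    return np.sign(x) / (eps + np.abs(x))

def projected_subgradient(A, y, step=1e-2, iters=2000, eps=0.1):
    # Start from the minimum-norm least-squares solution of A x = y.
    A_pinv = np.linalg.pinv(A)
    x = A_pinv @ y
    for _ in range(iters):
        x = x - step * log_sum_subgradient(x, eps)  # subgradient step on the penalty
        x = x - A_pinv @ (A @ x - y)                # project back onto {x : A x = y}
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m, k = 50, 20, 3
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
    y = A @ x_true
    print("recovery error:", np.linalg.norm(projected_subgradient(A, y) - x_true))
```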


IEEE Transactions on Signal Processing | 2013

On the Performance Bound of Sparse Estimation With Sensing Matrix Perturbation

Yujie Tang; Laming Chen; Yuantao Gu

This paper focuses on sparse estimation in the situation where both the sensing matrix and the measurement vector are corrupted by additive Gaussian noise. The performance bound of sparse estimation is analyzed and discussed in depth. Two types of lower bounds, the constrained Cramér-Rao bound (CCRB) and the Hammersley-Chapman-Robbins bound (HCRB), are discussed. It is shown that the situation with sensing matrix perturbation is more complex than the one with only measurement noise. For the CCRB, a closed-form expression is derived, which demonstrates a gap between the maximal and non-maximal support cases. It is also revealed that a gap lies between the CCRB and the MSE of the oracle pseudoinverse estimator, but it approaches zero asymptotically as the problem dimensions tend to infinity. For the tighter HCRB, although a simple expression is difficult to obtain for a general sensing matrix, a closed-form expression in the unit sensing matrix case is derived for a qualitative study of the performance bound. It is shown that the gap between the maximal and non-maximal cases is eliminated for the HCRB. Numerical simulations are performed to verify the theoretical results in this paper.
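A small Monte Carlo sketch of the oracle pseudoinverse estimator referenced above, i.e. least squares restricted to the true support while both the sensing matrix and the measurements carry additive Gaussian perturbations; the dimensions and noise levels are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def oracle_mse(m=30, n=60, k=4, sigma_e=0.05, sigma_n=0.05, trials=500, seed=0):
    rng = np.random.default_rng(seed)
    errs = []
    for _ in range(trials):
        A = rng.standard_normal((m, n)) / np.sqrt(m)
        support = rng.choice(n, k, replace=False)
        x = np.zeros(n)
        x[support] = rng.standard_normal(k)
        E = sigma_e * rng.standard_normal((m, n))   # sensing-matrix perturbation
        noise = sigma_n * rng.standard_normal(m)    # measurement noise
        y = (A + E) @ x + noise
        x_hat = np.zeros(n)
        # Oracle pseudoinverse estimator: least squares on the true support,
        # using the nominal (unperturbed) sensing matrix.
        x_hat[support] = np.linalg.pinv(A[:, support]) @ y
        errs.append(np.sum((x_hat - x) ** 2))
    return float(np.mean(errs))

print("empirical oracle MSE:", oracle_mse())
```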


IEEE Transactions on Signal Processing | 2012

Proof of Convergence and Performance Analysis for Sparse Recovery via Zero-Point Attracting Projection

Xiaohan Wang; Yuantao Gu; Laming Chen

A recursive algorithm named zero-point attracting projection (ZAP) was recently proposed for sparse signal reconstruction. Compared with reference algorithms, ZAP demonstrates good performance in recovery precision and robustness. However, no theoretical analysis of the algorithm, not even a proof of its convergence, has been available. In this work, a rigorous proof of the convergence of ZAP is provided and the condition for convergence is put forward. Based on the theoretical analysis, it is further proved that ZAP is unbiased and can approach the sparse solution to any extent with a proper choice of step size. Furthermore, the case of inaccurate measurements in the noisy scenario is also discussed. It is proved that the disturbance power linearly reduces the recovery precision, which is predictable but not preventable. The reconstruction deviation of -compressible signal is also provided. Finally, numerical simulations are performed to verify the theoretical analysis.
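A minimal sketch of a ZAP-style iteration: a zero-point attracting (sign) step followed by projection back onto the affine solution space {x : Ax = y}; the ℓ1 attractor and the fixed step size are common choices used here only for illustration.

```python
import numpy as np

def zap(A, y, step=5e-3, iters=5000):
    A_pinv = np.linalg.pinv(A)
    x = A_pinv @ y                     # start from the minimum-norm solution
    for _ in range(iters):
        x = x - step * np.sign(x)      # zero-point attraction
        x = x - A_pinv @ (A @ x - y)   # project back onto {x : A x = y}
    return x
```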


International Conference on Acoustics, Speech, and Signal Processing | 2012

Robustness of orthogonal matching pursuit for multiple measurement vectors in noisy scenario

Jie Ding; Laming Chen; Yuantao Gu

In this paper, we consider the orthogonal matching pursuit (OMP) algorithm for the multiple measurement vectors (MMV) problem. The robustness of OMPMMV is studied under general perturbations, in which both the measurement vectors and the sensing matrix are corrupted by additive noise. The main result shows that although exact recovery of the sparse solutions is unrealistic in the noisy scenario, recovery of the support set of the solutions is guaranteed under suitable conditions. Specifically, a sufficient condition is derived that guarantees exact recovery of the sparse solutions in the noiseless scenario.
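A minimal sketch of OMP for multiple measurement vectors (often called SOMP): each iteration adds the atom whose correlations with the residuals have the largest row norm and then performs a joint least-squares update; stopping after exactly k iterations is an assumption made for illustration.

```python
import numpy as np

def omp_mmv(A, Y, k):
    """A: (m, n) sensing matrix, Y: (m, L) measurement vectors, k: sparsity level."""
    n = A.shape[1]
    support = []
    R = Y.copy()
    for _ in range(k):
        corr = A.T @ R                                    # (n, L) correlations
        idx = int(np.argmax(np.linalg.norm(corr, axis=1)))
        support.append(idx)                               # pick the jointly strongest atom
        X_s, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ X_s                       # joint residual update
    X = np.zeros((n, Y.shape[1]))
    X[support, :] = X_s
    return X, sorted(support)
```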


IEEE Transactions on Signal Processing | 2013

Oracle-Order Recovery Performance of Greedy Pursuits With Replacement Against General Perturbations

Laming Chen; Yuantao Gu

Applying the theory of compressive sensing in practice always requires taking different kinds of perturbations into consideration. In this paper, the recovery performance of greedy pursuits with replacement for sparse recovery is analyzed when both the measurement vector and the sensing matrix are contaminated with additive perturbations. Specifically, greedy pursuits with replacement include three algorithms, compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), and iterative hard thresholding (IHT), in which the support estimate is evaluated and updated in each iteration. Based on the restricted isometry property, a unified form of the error bounds of these recovery algorithms is derived under general perturbations for compressible signals. The results reveal that the recovery performance is stable against both perturbations. In addition, these bounds are compared with that of oracle recovery: the least squares solution with the locations of some of the largest entries in magnitude known a priori. The comparison shows that the error bounds of these algorithms differ only in coefficients from the lower bound of oracle recovery for certain signals and perturbations, which reveals that oracle-order recovery performance of greedy pursuits with replacement is guaranteed. Numerical simulations are performed to verify the conclusions.
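A minimal sketch of iterative hard thresholding (IHT), the simplest of the three greedy pursuits with replacement named above: a gradient step on the residual followed by keeping only the K largest entries in magnitude; the conservative constant step size is an illustrative choice.

```python
import numpy as np

def hard_threshold(x, k):
    out = np.zeros_like(x)
    keep = np.argsort(np.abs(x))[-k:]          # indices of the K largest magnitudes
    out[keep] = x[keep]
    return out

def iht(A, y, k, iters=200):
    mu = 1.0 / np.linalg.norm(A, 2) ** 2       # conservative step size from the spectral norm
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x + mu * A.T @ (y - A @ x), k)
    return x
```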


International Conference on Acoustics, Speech, and Signal Processing | 2014

The Convergence Guarantees of a Non-Convex Approach for Sparse Recovery Using Regularized Least Squares

Laming Chen; Yuantao Gu

Existing literature suggests that sparsity is more likely to be induced with non-convex penalties, but the corresponding algorithms usually suffer from multiple local minima. In this paper, we introduce a class of sparsity-inducing penalties and provide the convergence guarantees of a non-convex approach for sparse recovery using regularized least squares. Theoretical analysis demonstrates that under certain conditions, if the non-convexity of the penalty is below a threshold (which is inversely proportional to the distance between the initialization and the sparse signal), the sparse signal can be stably recovered. Numerical simulations are implemented to verify the theoretical results and to compare the performance of this approach with reference algorithms.
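A minimal sketch of regularized least squares with a non-convex, sparsity-inducing penalty, solved by plain (sub)gradient descent from a least-squares initialization; the log-sum penalty and the step-size rule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def regularized_ls(A, y, lam=0.05, eps=0.1, iters=3000):
    step = 1.0 / np.linalg.norm(A, 2) ** 2               # step size from the spectral norm of A
    x = np.linalg.pinv(A) @ y                            # least-squares initialization
    for _ in range(iters):
        grad_fit = A.T @ (A @ x - y)                     # gradient of 0.5 * ||A x - y||^2
        grad_pen = lam * np.sign(x) / (eps + np.abs(x))  # subgradient of the log-sum penalty
        x = x - step * (grad_fit + grad_pen)
    return x
```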


International Conference on Acoustics, Speech, and Signal Processing | 2013

Backtracking matching pursuit with supplement set of arbitrary size

Laming Chen; Yuantao Gu

The idea of backtracking has been incorporated into matching pursuit algorithms for sparse recovery, for example, subspace pursuit (SP) and compressive sampling matching pursuit (CoSaMP), to improve the recovery performance. In each iteration, a supplement set of size K or 2K is added to the candidate set to re-evaluate the reliability of the indices and then discard the unreliable ones, where K is the sparsity level of the original sparse signal. Yet the optimal choice of the size of the supplement set remains unclear. This paper aims to provide a comprehensive analysis of this choice. The optimality is twofold: performance guarantees and computational complexity. In two theorems, we provide theoretical guarantees for a supplement set of arbitrary size, together with the computational complexity needed for perfect recovery. Numerical simulations demonstrate that a moderate size, such as 0.25K, results in computational efficiency without loss of recovery quality.
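A minimal sketch of a subspace-pursuit-style backtracking iteration in which the supplement set has arbitrary size s = ceil(alpha * K); alpha = 0.25 mirrors the moderate size mentioned above, and details such as the stopping rule are simplified for illustration.

```python
import numpy as np

def backtracking_mp(A, y, K, alpha=0.25, iters=30):
    n = A.shape[1]
    s = max(1, int(np.ceil(alpha * K)))                  # supplement set size
    support = np.argsort(np.abs(A.T @ y))[-K:]           # initial candidate set
    for _ in range(iters):
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ x_s
        supplement = np.argsort(np.abs(A.T @ r))[-s:]    # s new candidates from the residual
        merged = np.union1d(support, supplement)
        x_m, *_ = np.linalg.lstsq(A[:, merged], y, rcond=None)
        support = merged[np.argsort(np.abs(x_m))[-K:]]   # keep the K most reliable indices
    x = np.zeros(n)
    x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x[support] = x_s
    return x
```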


International Conference on Acoustics, Speech, and Signal Processing | 2013

From least squares to sparse: A non-convex approach with guarantee

Laming Chen; Yuantao Gu

This paper aims to provide theoretical guarantees for sparse recovery via non-convex optimization. It is shown that the sparse signal is the unique locally optimal solution within a neighborhood that contains the least squares solution, provided the sparsity-inducing penalties are not too non-convex. The idea of the projected subgradient method is generalized to solve this non-convex optimization problem. A uniform approximate projection is applied in the projection step to make the algorithm more computationally tractable. The theoretical convergence analysis of the proposed method, approximate projected generalized gradient (APGG), is performed in the noisy scenario. The result reveals that if the non-convexity of the penalties is below a threshold, the bound on the recovery error is linear in both the noise bound and the step size. Numerical simulations are performed to test the performance of APGG and verify its theoretical analysis.
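A minimal sketch contrasting the exact projection onto {x : Ax = y} with a uniform approximate projection implemented as a few gradient steps on the residual, the kind of replacement that keeps APGG tractable for large-scale problems; the number of inner steps is an illustrative assumption.

```python
import numpy as np

def exact_projection(A, A_pinv, y, x):
    # Orthogonal projection onto the affine space {x : A x = y}.
    return x - A_pinv @ (A @ x - y)

def approximate_projection(A, y, x, inner_steps=3):
    # A few gradient steps on 0.5 * ||A x - y||^2 instead of a full pseudoinverse solve.
    mu = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(inner_steps):
        x = x - mu * A.T @ (A @ x - y)
    return x
```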


International Conference on Acoustics, Speech, and Signal Processing | 2015

Dynamic zero-point attracting projection for time-varying sparse signal recovery

Jiawei Zhou; Laming Chen; Yuantao Gu

Sparse signal recovery in the static case has been well studied under the framework of compressive sensing (CS), while in recent years more attention has been paid to the dynamic case. In this paper, inspired by the idea of modified-CS with partially known support and based on a non-convex optimization approach, we propose the dynamic zero-point attracting projection (DZAP) algorithm to efficiently recover slowly time-varying sparse signals. By exploiting the temporal correlation within the signal structure and incorporating an effective method for predicting the future signal from previous recoveries, DZAP achieves high-precision recovery with fewer measurements or a larger sparsity level, as demonstrated by simulations on both synthetic and real data and by comparison with other state-of-the-art reference algorithms.
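A minimal sketch of the warm-start idea behind DZAP: each time step initializes the recovery from a prediction based on the previous estimate (here simply the previous estimate itself, a placeholder for the paper's prediction method) and refines it with ZAP-style iterations.

```python
import numpy as np

def zap_from(A, y, x0, step=5e-3, iters=1000):
    A_pinv = np.linalg.pinv(A)
    x = x0 - A_pinv @ (A @ x0 - y)          # project the warm start onto {x : A x = y}
    for _ in range(iters):
        x = x - step * np.sign(x)
        x = x - A_pinv @ (A @ x - y)
    return x

def dzap(A, Y_sequence):
    """Y_sequence: iterable of measurement vectors y_t of a slowly time-varying sparse signal."""
    x = np.zeros(A.shape[1])
    estimates = []
    for y_t in Y_sequence:
        x = zap_from(A, y_t, x0=x)          # warm start from the previous recovery
        estimates.append(x.copy())
    return estimates
```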


International Conference on Acoustics, Speech, and Signal Processing | 2015

Local and global optimality of LP minimization for sparse recovery

Laming Chen; Yuantao Gu

In solving the problem of sparse recovery, non-convex techniques have received much more attention than ever before, among which the most widely used is ℓp minimization with p ∈ (0, 1). It has been shown that the global optimality of ℓp minimization is guaranteed under weaker conditions than convex ℓ1 minimization, but little attention has been paid to local optimality, which is also significant since practical non-convex approaches can only attain local optima. In this work, we derive a tight condition guaranteeing the local optimality of ℓp minimization. For practical purposes, we study the performance of an approximated version of ℓp minimization and show that its global optimality is equivalent to that of ℓp minimization when the penalty approaches the ℓp "norm". Simulations are implemented to show the recovery performance of the approximated optimization in sparse recovery.
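A minimal sketch of one standard way to attack (an approximation of) ℓp minimization, namely iteratively reweighted least squares with a smoothed ℓp penalty; this is offered only as an illustration and is not claimed to be the approximated version analyzed in the paper.

```python
import numpy as np

def irls_lp(A, y, p=0.5, eps=1e-3, iters=50):
    x = np.linalg.pinv(A) @ y                 # least-squares initialization
    for _ in range(iters):
        w = (x ** 2 + eps) ** (p / 2 - 1)     # weights from the smoothed lp penalty
        W_inv = np.diag(1.0 / w)
        # Weighted minimum-norm solution of A x = y under the current weights.
        x = W_inv @ A.T @ np.linalg.solve(A @ W_inv @ A.T, y)
    return x
```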

Collaboration


Dive into Laming Chen's collaborations.

Top Co-Authors

Petros T. Boufounos

Mitsubishi Electric Research Laboratories
