Publication


Featured research published by Yilun Chen.


IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2009

Sparse LMS for system identification

Yilun Chen; Yuantao Gu; Alfred O. Hero

We propose a new approach to adaptive system identification when the system model is sparse. The approach applies ℓ1 relaxation, common in compressive sensing, to improve the performance of LMS-type adaptive methods. This results in two new algorithms, the zero-attracting LMS (ZA-LMS) and the reweighted zero-attracting LMS (RZA-LMS). The ZA-LMS is derived by incorporating an ℓ1-norm penalty on the coefficients into the quadratic LMS cost function, which generates a zero attractor in the LMS iteration. The zero attractor promotes sparsity in the taps during the filtering process, and therefore accelerates convergence when identifying sparse systems. We prove that the ZA-LMS can achieve lower mean square error than the standard LMS. To further improve the filtering performance, the RZA-LMS is developed using a reweighted zero attractor. Numerically, the performance of the RZA-LMS is superior to that of the ZA-LMS. Experiments demonstrate the advantages of the proposed filters in both convergence rate and steady-state behavior under sparsity assumptions on the true coefficient vector. The RZA-LMS is also shown to be robust as the number of non-zero taps increases.
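As a rough illustration of the ZA-LMS update described above (a sketch, not the authors' code), the following adds the sign-based zero attractor to a plain LMS loop; the step size `mu`, attractor strength `rho`, and the toy sparse system are illustrative choices, not values from the paper.

```python
import numpy as np

def za_lms(x, d, n_taps, mu=0.01, rho=1e-4):
    """Zero-attracting LMS: standard LMS step plus an l1 (sign) zero attractor."""
    w = np.zeros(n_taps)
    for k in range(n_taps - 1, len(x)):
        u = x[k - n_taps + 1:k + 1][::-1]   # regressor, most recent sample first
        e = d[k] - w @ u                    # a priori estimation error
        w += mu * e * u - rho * np.sign(w)  # LMS update + zero attractor
    return w

# Usage: identify a sparse 16-tap FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.zeros(16); h[[2, 9]] = [1.0, -0.5]           # sparse true system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
w = za_lms(x, d, 16)
```

The zero attractor only shifts each tap by `rho` per iteration, so it biases the large taps slightly while pulling near-zero taps to zero faster than plain LMS would.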


IEEE Transactions on Signal Processing | 2010

Shrinkage Algorithms for MMSE Covariance Estimation

Yilun Chen; Ami Wiesel; Yonina C. Eldar; Alfred O. Hero

We address covariance estimation in the sense of minimum mean-squared error (MMSE) when the samples are Gaussian distributed. Specifically, we consider shrinkage methods which are suitable for high dimensional problems with a small number of samples (large p, small n). First, we improve on the Ledoit-Wolf (LW) method by conditioning on a sufficient statistic. By the Rao-Blackwell theorem, this yields a new estimator called RBLW, whose mean-squared error dominates that of LW for Gaussian variables. Second, to further reduce the estimation error, we propose an iterative approach which approximates the clairvoyant shrinkage estimator. Convergence of this iterative method is established and a closed-form expression for the limit is determined, which is referred to as the oracle approximating shrinkage (OAS) estimator. Both the RBLW and OAS estimators have simple expressions and are easily implemented. Although the two methods are developed from different perspectives, their structure is identical up to specified constants. The RBLW estimator provably dominates the LW method for Gaussian samples. Numerical simulations demonstrate that the OAS approach can perform even better than RBLW, especially when n is much less than p. We also demonstrate the performance of these techniques in the context of adaptive beamforming.
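A minimal sketch of the OAS shrinkage step, assuming zero-mean samples for simplicity; the shrinkage target is taken as the scaled identity (tr(S)/p)·I, and the coefficient below follows the OAS closed form, clipped to [0, 1].

```python
import numpy as np

def oas_covariance(X):
    """OAS shrinkage estimate (1 - rho)*S + rho*(tr(S)/p)*I for zero-mean rows of X."""
    n, p = X.shape
    S = X.T @ X / n                 # sample covariance (known zero mean)
    tr_S = np.trace(S)
    tr_S2 = np.sum(S * S)           # tr(S @ S) for symmetric S
    num = (1.0 - 2.0 / p) * tr_S2 + tr_S ** 2
    den = (n + 1.0 - 2.0 / p) * (tr_S2 - tr_S ** 2 / p)
    rho = 1.0 if den <= 0 else min(1.0, num / den)
    return (1.0 - rho) * S + rho * (tr_S / p) * np.eye(p), rho

# Usage: a "large p, small n" setting where the sample covariance is singular.
rng = np.random.default_rng(1)
X = rng.standard_normal((10, 40))   # n = 10 samples in p = 40 dimensions
Sigma_oas, rho = oas_covariance(X)
```

When n is far below p, the data-driven coefficient pushes close to 1, so the estimate leans heavily on the well-conditioned identity target.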


IEEE Transactions on Signal Processing | 2011

Robust Shrinkage Estimation of High-Dimensional Covariance Matrices

Yilun Chen; Ami Wiesel; Alfred O. Hero

We address high dimensional covariance estimation for elliptically distributed samples, which are also known as spherically invariant random vectors (SIRV) or compound-Gaussian processes. Specifically, we consider shrinkage methods that are suitable for high dimensional problems with a small number of samples (large p, small n). We start from a classical robust covariance estimator [Tyler (1987)], which is distribution-free within the family of elliptical distributions but inapplicable when n < p. Using a shrinkage coefficient, we regularize Tyler's fixed-point iterations. We prove that, for all n and p, the proposed fixed-point iterations converge to a unique limit regardless of the initial condition. Next, we propose a simple, closed-form and data-dependent choice for the shrinkage coefficient, which is based on a minimum mean squared error framework. Simulations demonstrate that the proposed method achieves low estimation error and is robust to heavy-tailed samples. Finally, as a real-world application we demonstrate the performance of the proposed technique in the context of activity/intrusion detection using a wireless sensor network.
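The regularized fixed-point iteration can be sketched as follows. The shrinkage coefficient `rho` is fixed by hand here, whereas the paper derives a data-driven choice; the trace normalization reflects that Tyler's estimator only identifies the covariance up to scale.

```python
import numpy as np

def regularized_tyler(X, rho, n_iter=100):
    """Shrinkage-regularized Tyler iteration:
    Sigma <- (1-rho)*(p/n)*sum_i x_i x_i^T / (x_i^T Sigma^{-1} x_i) + rho*I,
    followed by trace normalization."""
    n, p = X.shape
    Sigma = np.eye(p)
    for _ in range(n_iter):
        inv = np.linalg.inv(Sigma)
        q = np.einsum('ij,jk,ik->i', X, inv, X)       # x_i^T Sigma^{-1} x_i
        S = (p / n) * (X / q[:, None]).T @ X          # reweighted scatter matrix
        Sigma = (1.0 - rho) * S + rho * np.eye(p)
        Sigma *= p / np.trace(Sigma)                  # fix the (arbitrary) scale
    return Sigma

# Usage: heavy-tailed compound-Gaussian samples with n = 30 < p = 50,
# where the unregularized Tyler iteration would not apply.
rng = np.random.default_rng(2)
tau = rng.standard_exponential(30)                    # random texture per sample
X = np.sqrt(tau)[:, None] * rng.standard_normal((30, 50))
Sigma = regularized_tyler(X, rho=0.5)
```

The identity term keeps every iterate well conditioned even though n < p, which is exactly why the plain Tyler iteration fails in this regime.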


IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2010

Modulated wideband converter with non-ideal lowpass filters

Yilun Chen; Moshe Mishali; Yonina C. Eldar; Alfred O. Hero

We investigate the impact of using non-ideal lowpass filters in the modulated wideband converter (MWC), a recent sub-Nyquist sampling system for sparse wideband analog signals. We begin by deriving a perfect reconstruction condition for general lowpass filters, which coincides with the well-known Nyquist inter-symbol interference (ISI) criterion in communication theory. We then propose to compensate for the non-ideal lowpass filters using a digital FIR correction scheme. The proposed solution is validated by experimental results.
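As a generic illustration of digital FIR correction (not the paper's exact design), one can fit an FIR equalizer g by least squares so that the cascade h * g approximates a pure delay; the non-ideal filter `h`, the equalizer length, and the delay below are all illustrative choices.

```python
import numpy as np

def fir_equalizer(h, L=32, delay=16):
    """Least-squares FIR correction: find g of length L such that
    conv(h, g) approximates a unit impulse delayed by `delay` samples."""
    m = len(h) + L - 1
    C = np.zeros((m, L))                 # convolution matrix: C @ g == np.convolve(h, g)
    for j in range(L):
        C[j:j + len(h), j] = h
    target = np.zeros(m)
    target[delay] = 1.0                  # desired cascade response: pure delay
    g, *_ = np.linalg.lstsq(C, target, rcond=None)
    return g

# Usage: correct a crude minimum-phase-like "lowpass" prototype.
h = np.array([0.5, 0.3, 0.15, 0.05])
g = fir_equalizer(h)
cascade = np.convolve(h, g)
```

After correction, the cascade is close to a delayed impulse, i.e. it approximately satisfies the zero-ISI condition that the reconstruction analysis requires.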


IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) | 2009

Shrinkage estimation of high dimensional covariance matrices

Yilun Chen; Ami Wiesel; Alfred O. Hero

We address covariance estimation under mean-squared loss in the Gaussian setting. Specifically, we consider shrinkage methods which are suitable for high dimensional problems with a small number of samples (large p, small n). First, we improve on the Ledoit-Wolf (LW) method by conditioning on a sufficient statistic via the Rao-Blackwell theorem, obtaining a new estimator RBLW whose mean-squared error dominates that of LW under the Gaussian model. Second, to further reduce the estimation error, we propose an iterative approach which approximates the clairvoyant shrinkage estimator. Convergence of this iterative method is proven and a closed-form expression for the limit is determined, which is called the oracle approximating shrinkage (OAS) estimator. Both of the proposed estimators have simple expressions and are easy to compute. Although the two methods are developed from different perspectives, their structure is identical up to specific constants. The RBLW estimator provably dominates the LW method, and numerical simulations demonstrate that the OAS estimator performs even better, especially when n is much less than p.
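A minimal sketch of the RBLW estimator under the same zero-mean Gaussian setup, again shrinking toward the scaled identity (tr(S)/p)·I. The coefficient uses the RBLW closed form clipped to [0, 1]; treat the exact constants as an illustrative reconstruction rather than a faithful transcription of the paper.

```python
import numpy as np

def rblw_covariance(X):
    """Rao-Blackwellized Ledoit-Wolf (RBLW) shrinkage toward (tr(S)/p)*I,
    assuming zero-mean Gaussian rows of X."""
    n, p = X.shape
    S = X.T @ X / n
    tr_S, tr_S2 = np.trace(S), np.sum(S * S)
    rho = ((n - 2.0) / n * tr_S2 + tr_S ** 2) / ((n + 2.0) * (tr_S2 - tr_S ** 2 / p))
    rho = min(1.0, max(0.0, rho))
    return (1.0 - rho) * S + rho * (tr_S / p) * np.eye(p), rho

# Usage: n = 12 Gaussian samples in p = 30 dimensions.
rng = np.random.default_rng(3)
X = rng.standard_normal((12, 30))
Sigma_rblw, rho = rblw_covariance(X)
```

Like OAS, the estimator is a single closed-form expression; the two differ only in the constants multiplying the trace terms.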


IEEE Transactions on Signal Processing | 2012

Recursive ℓ1,∞ Group Lasso

Yilun Chen; Alfred O. Hero

We introduce a recursive adaptive group lasso algorithm for real-time penalized least squares prediction that produces a time sequence of optimal sparse predictor coefficient vectors. At each time index the proposed algorithm computes an exact update of the optimal ℓ1,∞-penalized recursive least squares (RLS) predictor. Each update requires solving a convex but nondifferentiable optimization problem. We develop an online homotopy method to reduce the computational complexity. Numerical simulations demonstrate that the proposed algorithm outperforms the ℓ1-regularized RLS algorithm for a group sparse system identification problem and has lower implementation complexity than direct group lasso solvers.
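The paper's online homotopy update is not reproduced here, but the underlying ℓ1,∞-penalized least squares objective can be illustrated with a generic batch proximal-gradient solver. The prox of the per-group ℓ∞ norm is computed via Moreau decomposition as the residual of a projection onto the ℓ1 ball; all problem sizes and the penalty weight are illustrative.

```python
import numpy as np

def project_l1_ball(v, z):
    """Euclidean projection of v onto the l1 ball of radius z (Duchi et al. style)."""
    if np.sum(np.abs(v)) <= z:
        return v
    u = np.sort(np.abs(v))[::-1]
    css = np.cumsum(u)
    k = np.nonzero(u * np.arange(1, len(v) + 1) > css - z)[0][-1]
    theta = (css[k] - z) / (k + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def prox_linf(v, lam):
    """prox of lam*||.||_inf, via Moreau decomposition with the l1 ball."""
    return v - project_l1_ball(v, lam)

def group_linf_ls(A, b, groups, lam, step, n_iter=500):
    """Proximal gradient for min_w 0.5*||A w - b||^2 + lam * sum_g ||w_g||_inf."""
    w = np.zeros(A.shape[1])
    for _ in range(n_iter):
        v = w - step * (A.T @ (A @ w - b))    # gradient step on the quadratic term
        for g in groups:                      # group-wise prox of the l_inf penalty
            v[g] = prox_linf(v[g], step * lam)
        w = v
    return w

# Usage: group-sparse recovery with 4 groups of 5 coefficients, one group active.
rng = np.random.default_rng(4)
A = rng.standard_normal((100, 20))
groups = [np.arange(i, i + 5) for i in range(0, 20, 5)]
w_true = np.zeros(20); w_true[:5] = 1.0
b = A @ w_true + 0.01 * rng.standard_normal(100)
step = 1.0 / np.linalg.norm(A, 2) ** 2
w = group_linf_ls(A, b, groups, lam=5.0, step=step)
```

The ℓ∞ prox zeroes out an entire group whenever the group's update falls inside the scaled ℓ1 ball, which is what produces group-level sparsity.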


IEEE Sensor Array and Multichannel Signal Processing Workshop (SAM) | 2010

Robust shrinkage estimation of high-dimensional covariance matrices

Yilun Chen; Ami Wiesel; Alfred O. Hero

We address high dimensional covariance estimation for elliptically distributed samples. Specifically, we consider shrinkage methods that are suitable for high dimensional problems with a small number of samples (large p, small n). We start from a classical robust covariance estimator [Tyler (1987)], which is distribution-free within the family of elliptical distributions but inapplicable when n < p. Using a shrinkage coefficient, we regularize Tyler's fixed-point iteration. We derive the minimum mean-squared-error shrinkage coefficient in closed form. The closed-form expression is a function of the unknown true covariance and cannot be implemented in practice; instead, we propose a plug-in estimate to approximate it. Simulations demonstrate that the proposed method achieves low estimation error and is robust to heavy-tailed samples.


Asilomar Conference on Signals, Systems and Computers | 2011

Shrinkage Fisher information embedding of high dimensional feature distributions

Xu Chen; Yilun Chen; Alfred O. Hero

In this paper, we introduce a dimensionality reduction method that can be applied to clustering of high dimensional empirical distributions. The proposed approach is based on a stabilized information-geometric representation of the feature distributions. The problem of dimensionality reduction on spaces of distribution functions arises in many applications, including hyperspectral imaging, document clustering, and classifying flow cytometry data. Our method is a shrinkage regularized version of the Fisher information distance, which we call shrinkage FINE (sFINE), implemented by Steinian shrinkage estimation of the matrix of Kullback-Leibler distances between feature distributions. The proposed method computes similarities using the shrinkage regularized Fisher information distance between probability density functions (PDFs) of the data features, then applies Laplacian eigenmaps on the derived similarity matrix to accomplish the embedding and perform clustering. The shrinkage regularization controls the trade-off between bias and variance and is especially well-suited for clustering empirical probability distributions of high-dimensional data sets. We also show significant gains in clustering performance on both a UCI data set and a spam data set. Finally, we demonstrate the superiority of embedding and clustering distributional data using sFINE compared to other state-of-the-art methods such as non-parametric information clustering, support vector machines (SVM) and sparse K-means.
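A toy sketch of the pipeline described above, with two simplifying assumptions: each empirical distribution is approximated by a univariate Gaussian (so the symmetrized KL divergence has a closed form), and the Stein-type shrinkage is a simple pull of the divergence matrix toward its off-diagonal mean. The paper's actual construction is more general than this.

```python
import numpy as np

def sym_kl_gauss(m1, v1, m2, v2):
    """Symmetrized KL divergence between two univariate Gaussians."""
    return 0.5 * ((v1 + (m1 - m2) ** 2) / v2 + (v2 + (m1 - m2) ** 2) / v1 - 2.0)

def sfine_embedding(samples, n_dims=2, shrink=0.2):
    # Fit a Gaussian to each empirical distribution (illustrative simplification).
    m = np.array([s.mean() for s in samples])
    v = np.array([s.var() + 1e-9 for s in samples])
    n = len(samples)
    D = np.array([[sym_kl_gauss(m[i], v[i], m[j], v[j]) for j in range(n)]
                  for i in range(n)])
    # Stein-type shrinkage of the divergence matrix toward its off-diagonal mean.
    mean_off = D[~np.eye(n, dtype=bool)].mean()
    D = (1.0 - shrink) * D + shrink * mean_off * (1.0 - np.eye(n))
    # Laplacian eigenmaps on the induced similarity matrix W = exp(-D).
    W = np.exp(-D)
    deg = W.sum(axis=1)
    L = np.diag(deg) - W
    Ln = np.diag(deg ** -0.5) @ L @ np.diag(deg ** -0.5)   # normalized Laplacian
    vals, vecs = np.linalg.eigh(Ln)
    return np.diag(deg ** -0.5) @ vecs[:, 1:1 + n_dims]    # skip the trivial eigenvector

# Usage: 20 empirical distributions drawn from two distinct populations.
rng = np.random.default_rng(5)
samples = [rng.normal(0, 1, 200) for _ in range(10)] + \
          [rng.normal(3, 0.5, 200) for _ in range(10)]
Y = sfine_embedding(samples)
```

In the embedded space the two populations separate along the leading coordinate, so an off-the-shelf clustering step afterwards becomes trivial.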


arXiv: Methodology | 2010

Robust shrinkage estimation of high-dimensional covariance matrices

Yilun Chen; Ami Wiesel; Alfred O. Hero


IEEE International Conference on Communications (ICC) | 2009

Revealing social networks of spammers through spectral clustering

Kevin S. Xu; Mark Kliger; Yilun Chen; Peter J. Woolf; Alfred O. Hero

Collaboration


An overview of Yilun Chen's collaborations.

Top Co-Authors

Ami Wiesel
Hebrew University of Jerusalem

Yonina C. Eldar
Technion – Israel Institute of Technology

Kevin S. Xu
University of Michigan

Xu Chen
University of Illinois at Chicago

Mark Kliger
Ben-Gurion University of the Negev

Moshe Mishali
Technion – Israel Institute of Technology