Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Michael Zibulevsky is active.

Publication


Featured research published by Michael Zibulevsky.


Neural Computation | 2001

Blind Source Separation by Sparse Decomposition in a Signal Dictionary

Michael Zibulevsky; Barak A. Pearlmutter

The blind source separation problem is to extract the underlying source signals from a set of linear mixtures, where the mixing matrix is unknown. This situation is common in acoustics, radio, medical signal and image processing, hyperspectral imaging, and other areas. We suggest a two-stage separation process: a priori selection of a possibly overcomplete signal dictionary (for instance, a wavelet frame or a learned dictionary) in which the sources are assumed to be sparsely representable, followed by unmixing the sources by exploiting their sparse representability. We consider the general case of more sources than mixtures, but also derive a more efficient algorithm in the case of a nonovercomplete dictionary and an equal number of sources and mixtures. Experiments with artificial signals and musical sounds demonstrate significantly better separation than other known techniques.
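With the mixing matrix known (or estimated), the unmixing step for more sources than mixtures reduces to per-sample ℓ1 minimization under the constraint As = x. A minimal numpy-only sketch of that step, not the paper's algorithm: since the ℓ1-minimal solution of an underdetermined system lies at a vertex of the feasible set, a toy solver can enumerate all bases of M columns; the `l1_unmix` name and the 2×3 mixing matrix are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

def l1_unmix(A, x, tol=1e-12):
    """Toy min ||s||_1 s.t. A s = x: enumerate basic solutions (LP vertices).
    Fine for tiny sizes; a real implementation would use an LP solver."""
    m, n = A.shape
    best, best_l1 = None, np.inf
    for supp in combinations(range(n), m):
        sub = A[:, supp]
        if abs(np.linalg.det(sub)) < tol:
            continue                          # singular basis, skip
        s = np.zeros(n)
        s[list(supp)] = np.linalg.solve(sub, x)
        if np.abs(s).sum() < best_l1:
            best, best_l1 = s, np.abs(s).sum()
    return best

# Two mixtures of three sources (more sources than mixtures), known mixing matrix.
A = np.array([[1.0, 0.0, 0.7],
              [0.0, 1.0, 0.7]])
A /= np.linalg.norm(A, axis=0)                # unit-norm columns
s_true = np.array([0.0, 2.0, 0.0])            # a sparse source vector
x = A @ s_true
s_hat = l1_unmix(A, x)                        # recovers s_true
```

Enumeration costs C(N, M) solves per data point, which is why the paper's low-dimensional LP formulation matters for anything beyond toy sizes.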


Signal Processing | 2001

Underdetermined blind source separation using sparse representations

Pau Bofill; Michael Zibulevsky

The scope of this work is the separation of N sources from M linear mixtures when the underlying system is underdetermined, that is, when M < N. If the input distribution is sparse, the mixing matrix can be estimated either by external optimization or by clustering and, given the mixing matrix, a minimal ℓ1-norm representation of the sources can be obtained by solving a low-dimensional linear programming problem for each of the data points. Yet, when the signals per se do not satisfy this assumption, sparsity can still be achieved by realizing the separation in a sparser transformed domain. The approach is illustrated here for M = 2. In this case we estimate both the number of sources and the mixing matrix by the maxima of a potential function along the circle of unit length, and we obtain the minimal ℓ1-norm representation of each data point by a linear combination of the pair of basis vectors that enclose it. Several experiments with music and speech signals show that their time-domain representation is not sparse enough. Yet, excellent results were obtained using their short-time Fourier transform, including the separation of up to six sources from two mixtures.
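For M = 2, the mixing directions appear as peaks of the angular density of the data points. The sketch below is illustrative only, using a plain angle histogram with greedy peak picking rather than the paper's potential-function estimator; all sizes, the sparsity level, and the thresholds are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, T = 3, 2, 2000                        # sources, mixtures, samples (toy sizes)
S = rng.laplace(size=(N, T)) * (rng.random((N, T)) < 0.1)    # sparse sources
true_angles = np.array([0.2, 1.0, 2.3])
A = np.vstack([np.cos(true_angles), np.sin(true_angles)])    # unit-norm columns
X = A @ S                                   # the two observed mixtures

# A data point with a single active source lies exactly along one mixing column,
# so its direction, folded into [0, pi), clusters at the column angles.
keep = np.linalg.norm(X, axis=0) > 0.5
theta = np.mod(np.arctan2(X[1, keep], X[0, keep]), np.pi)
hist, edges = np.histogram(theta, bins=180, range=(0.0, np.pi))
centers = 0.5 * (edges[:-1] + edges[1:])

# Greedy peak picking with simple angular non-max suppression.
est = []
for i in np.argsort(hist)[::-1]:
    if all(min(abs(centers[i] - e), np.pi - abs(centers[i] - e)) > 0.2 for e in est):
        est.append(centers[i])
    if len(est) == N:
        break
est = np.sort(est)                          # close to [0.2, 1.0, 2.3]
```

In the noiseless case the single-source points give exact angles, so three sharp histogram peaks sit at the true directions; the paper's smooth potential function handles the noisy, smeared case.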


IEEE Transactions on Signal Processing | 2010

Double Sparsity: Learning Sparse Dictionaries for Sparse Signal Approximation

Ron Rubinstein; Michael Zibulevsky; Michael Elad

An efficient and flexible dictionary structure is proposed for sparse and redundant signal representation. The proposed sparse dictionary is based on a sparsity model of the dictionary atoms over a base dictionary, and takes the form D = ΦA, where Φ is a fixed base dictionary and A is sparse. The sparse dictionary provides efficient forward and adjoint operators, has a compact representation, and can be effectively trained from given example data. In this way, the sparse structure bridges the gap between implicit dictionaries, which have efficient implementations yet lack adaptability, and explicit dictionaries, which are fully adaptable but inefficient and costly to deploy. In this paper, we discuss the advantages of sparse dictionaries and present an efficient algorithm for training them. We demonstrate the advantages of the proposed structure for 3-D image denoising.
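The structural point is that D never has to be formed densely: applying D or its adjoint costs one sparse multiply by A plus one application of the base dictionary. A small numpy sketch of that structure; the sizes, the DCT base, and the 3-nonzero atoms are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 16, 32, 3                  # signal dim, number of atoms, nonzeros per atom

# Base dictionary: an orthonormal DCT-II matrix, one common choice of fixed base.
j = np.arange(n)
Phi = np.sqrt(2.0 / n) * np.cos(np.pi * (j[:, None] + 0.5) * j[None, :] / n)
Phi[:, 0] /= np.sqrt(2.0)

# A: each atom is a k-sparse combination of base atoms, so D = Phi @ A.
A = np.zeros((n, m))
for c in range(m):
    idx = rng.choice(n, size=k, replace=False)
    A[idx, c] = rng.standard_normal(k)
D = Phi @ A

# Forward and adjoint operators need only Phi and the sparse A, never dense D.
s = rng.standard_normal(m)
y = rng.standard_normal(n)
assert np.allclose(D @ s, Phi @ (A @ s))        # forward: sparse multiply, then Phi
assert np.allclose(D.T @ y, A.T @ (Phi.T @ y))  # adjoint: Phi^T, then sparse multiply
```

When Phi itself has a fast transform (FFT-based DCT), both operators run in far less time than a dense n×m multiply, which is what makes the learned dictionary cheap to deploy.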


IEEE Signal Processing Magazine | 2010

L1-L2 Optimization in Signal and Image Processing

Michael Zibulevsky; Michael Elad

Sparse, redundant representations offer a powerful emerging model for signals. This model approximates a data source as a linear combination of few atoms from a prespecified and overcomplete dictionary. Often such models are fit to data by solving mixed ℓ1-ℓ2 convex optimization problems. Iterative-shrinkage algorithms constitute a new family of highly effective numerical methods for handling these problems, surpassing traditional optimization techniques. In this article, we give a broad view of this group of methods, derive some of them, show accelerations based on sequential subspace optimization (SESOP), the fast iterative soft-thresholding algorithm (FISTA), and the conjugate gradient (CG) method, present a performance comparison, and discuss their potential in various applications, such as compressed sensing, computed tomography, and deblurring.
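As a concrete instance, the core iteration pairs a gradient step on the quadratic term with entrywise soft thresholding, and FISTA adds a momentum term on top. A minimal numpy sketch for min over s of 0.5||y - Ds||² + λ||s||₁; the problem sizes, λ, and iteration count below are arbitrary assumptions.

```python
import numpy as np

def soft(z, t):
    """Entrywise soft threshold: the proximal map of t * ||.||_1."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def fista(D, y, lam, n_iter=500):
    """FISTA for min_s 0.5 * ||y - D s||^2 + lam * ||s||_1."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    s = z = np.zeros(D.shape[1])
    t = 1.0
    for _ in range(n_iter):
        s_next = soft(z + D.T @ (y - D @ z) / L, lam / L)   # shrinkage step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = s_next + (t - 1.0) / t_next * (s_next - s)      # momentum step
        s, t = s_next, t_next
    return s

rng = np.random.default_rng(0)
D = rng.standard_normal((20, 50)) / np.sqrt(20)   # redundant random dictionary
s_true = np.zeros(50)
s_true[[3, 17, 40]] = [2.0, -1.5, 1.0]
y = D @ s_true                                    # noiseless sparse observation
s_hat = fista(D, y, lam=0.01)                     # near s_true, up to shrinkage bias
```

Dropping the momentum step (z = s_next, t fixed) gives plain ISTA, which converges at O(1/k) versus FISTA's O(1/k²) in objective value.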


IEEE Transactions on Information Theory | 2008

On the Uniqueness of Nonnegative Sparse Solutions to Underdetermined Systems of Equations

Alfred M. Bruckstein; Michael Elad; Michael Zibulevsky

An underdetermined linear system of equations Ax = b with the nonnegativity constraint x ≥ 0 is considered. It is shown that for matrices A with a row span intersecting the positive orthant, if this problem admits a sufficiently sparse solution, it is necessarily unique. The bound on the required sparsity depends on a coherence property of the matrix A. This coherence measure can be improved by applying a conditioning stage to A, thereby strengthening the claimed result. The obtained uniqueness theorem relies on an extended theoretical analysis of the ℓ0-ℓ1 equivalence, developed here as well, which considers a matrix A with arbitrary column norms and an arbitrary monotone element-wise concave penalty replacing the ℓ1-norm objective function. Finally, from a numerical point of view, a greedy algorithm (a variant of matching pursuit) is presented that is guaranteed to find this sparse solution. It is further shown how this algorithm can benefit from well-designed conditioning of A.
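The flavor of such a greedy scheme can be sketched as follows. This is a generic nonnegative variant of orthogonal matching pursuit, not the paper's exact algorithm, and the identity-plus-DCT dictionary and 2-sparse signal are illustrative assumptions chosen for their low mutual coherence.

```python
import numpy as np

def greedy_nonneg_pursuit(A, b, k):
    """Greedy pursuit for A x = b with x >= 0: repeatedly add the atom most
    positively correlated with the residual, then refit on the support."""
    x = np.zeros(A.shape[1])
    support = []
    r = b.copy()
    for _ in range(k):
        corr = A.T @ r
        i = int(np.argmax(corr))
        if corr[i] <= 0:
            break                             # no atom can reduce the residual
        support.append(i)
        coef, *_ = np.linalg.lstsq(A[:, support], b, rcond=None)
        x[:] = 0.0
        x[support] = np.maximum(coef, 0.0)    # crude nonnegativity projection
        r = b - A @ x
    return x

# Underdetermined system with low mutual coherence: identity plus orthonormal DCT.
n = 16
j = np.arange(n)
Phi = np.sqrt(2.0 / n) * np.cos(np.pi * (j[:, None] + 0.5) * j[None, :] / n)
Phi[:, 0] /= np.sqrt(2.0)
A = np.hstack([np.eye(n), Phi])               # 16 equations, 32 unknowns

x_true = np.zeros(2 * n)
x_true[2], x_true[20] = 2.0, 0.5              # sparse nonnegative solution
x_hat = greedy_nonneg_pursuit(A, A @ x_true, k=2)   # recovers x_true exactly
```

With coherence about 1/√8 between the two sub-dictionaries, a 2-sparse nonnegative solution is well inside the uniqueness regime, so the greedy scheme locks onto the correct support in two steps.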


SIAM Journal on Optimization | 1997

Penalty/Barrier Multiplier Methods for Convex Programming Problems

Aharon Ben-Tal; Michael Zibulevsky

We study a class of methods for solving convex programs, which are based on nonquadratic augmented Lagrangians for which the penalty parameters are functions of the multipliers. This gives rise to Lagrangians which are nonlinear in the multipliers. Each augmented Lagrangian is specified by a choice of a penalty function φ and a penalty-updating function π.
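To make the setup concrete, here is a minimal sketch of one classical member of this family, the exponential-penalty multiplier method, applied to a toy quadratic program. The specific problem, step sizes, and iteration counts are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy convex program: minimize (x1 - 2)^2 + (x2 - 1)^2  s.t.  x1 + x2 <= 2.
def f_grad(x):
    return np.array([2.0 * (x[0] - 2.0), 2.0 * (x[1] - 1.0)])

def g(x):
    return x[0] + x[1] - 2.0         # constraint written as g(x) <= 0

g_grad = np.array([1.0, 1.0])

def exp_multiplier_method(x, u=0.5, p=1.0, outer=30, inner=500, step=0.05):
    """Exponential-penalty multiplier method, one classical nonquadratic
    augmented Lagrangian: minimize f(x) + p*u*(exp(g(x)/p) - 1) over x,
    then update the multiplier u <- u * exp(g(x)/p)."""
    for _ in range(outer):
        for _ in range(inner):        # inner minimization by gradient descent
            w = u * np.exp(g(x) / p)  # derivative of the penalty term w.r.t. g
            x = x - step * (f_grad(x) + w * g_grad)
        u = u * np.exp(g(x) / p)      # multiplier update
    return x, u

x_opt, u_opt = exp_multiplier_method(np.zeros(2))
# x_opt approaches (1.5, 0.5); u_opt approaches 1.0, the KKT multiplier.
```

Unlike the classical quadratic augmented Lagrangian, the exponential penalty is smooth and keeps the multiplier estimates strictly positive, which is the kind of structural property the paper's φ/π framework analyzes in general.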


International Journal of Imaging Systems and Technology | 2005

Sparse ICA for blind separation of transmitted and reflected images

Alexander M. Bronstein; Michael M. Bronstein; Michael Zibulevsky; Yehoshua Y. Zeevi



Proceedings of SPIE | 2007

A Wide-Angle View at Iterated Shrinkage Algorithms

Michael Elad; Boaz Matalon; Joseph Shtok; Michael Zibulevsky



IEEE Transactions on Image Processing | 2005

Blind deconvolution of images using optimal sparse representations

Michael M. Bronstein; Alexander M. Bronstein; Michael Zibulevsky; Yehoshua Y. Zeevi



Computer Vision and Pattern Recognition | 2006

Image Denoising with Shrinkage and Redundant Representations

Michael Elad; Boaz Matalon; Michael Zibulevsky


Collaboration


Dive into Michael Zibulevsky's collaborations.

Top Co-Authors

Alexander M. Bronstein (Technion – Israel Institute of Technology)
Yehoshua Y. Zeevi (Technion – Israel Institute of Technology)
Michael Elad (Technion – Israel Institute of Technology)
Irad Yavneh (Technion – Israel Institute of Technology)
Pavel Kisilev (Technion – Israel Institute of Technology)
Eliyahu Osherovich (Technion – Israel Institute of Technology)
Joseph Shtok (Technion – Israel Institute of Technology)
Boaz Matalon (Technion – Israel Institute of Technology)