Publication


Featured research published by Shoham Sabach.


Mathematical Programming | 2014

Proximal alternating linearized minimization for nonconvex and nonsmooth problems

Jérôme Bolte; Shoham Sabach; Marc Teboulle

We introduce a proximal alternating linearized minimization (PALM) algorithm for solving a broad class of nonconvex and nonsmooth minimization problems. Building on the powerful Kurdyka–Łojasiewicz property, we derive a self-contained convergence analysis framework and establish that each bounded sequence generated by PALM globally converges to a critical point. Our approach allows us to analyze various classes of nonconvex-nonsmooth problems and related nonconvex proximal forward–backward algorithms with semi-algebraic problem data, the latter property being shared by many functions arising in a wide variety of fundamental applications. A by-product of our framework also shows that our results are new even in the convex setting. As an illustration of the results, we derive a new and simple globally convergent algorithm for solving the sparse nonnegative matrix factorization problem.
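To make the alternating structure concrete, the following is a minimal NumPy sketch of a PALM-style loop for the sparse nonnegative matrix factorization illustration mentioned in the abstract; the regularization weight lam, the 1.1 step-size safety factor, and the fixed iteration count are illustrative assumptions, not the paper's exact scheme.

```python
# A minimal PALM-style sketch for sparse NMF:
#   min_{X,Y >= 0}  0.5 * ||M - X @ Y||_F^2 + lam * ||X||_1
# lam, the step-size margin, and the iteration count are assumptions for illustration.
import numpy as np

def palm_sparse_nmf(M, r, lam=0.1, iters=500, seed=0):
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.random((m, r))
    Y = rng.random((r, n))
    for _ in range(iters):
        # Block X: proximal gradient step on H(X, Y) = 0.5 * ||M - X @ Y||_F^2.
        cx = 1.1 * max(np.linalg.norm(Y @ Y.T, 2), 1e-8)   # block Lipschitz constant, with margin
        Z = X - ((X @ Y - M) @ Y.T) / cx
        X = np.maximum(Z - lam / cx, 0.0)                   # prox of lam*||.||_1 plus nonnegativity
        # Block Y: the same step with the roles of the blocks exchanged.
        cy = 1.1 * max(np.linalg.norm(X.T @ X, 2), 1e-8)
        Y = np.maximum(Y - (X.T @ (X @ Y - M)) / cy, 0.0)   # prox is projection onto Y >= 0
    return X, Y
```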


Numerical Functional Analysis and Optimization | 2010

Two Strong Convergence Theorems for a Proximal Method in Reflexive Banach Spaces

Simeon Reich; Shoham Sabach

Two strong convergence theorems for a proximal method for finding common zeroes of maximal monotone operators in reflexive Banach spaces are established. Both theorems take into account possible computational errors.


SIAM Journal on Optimization | 2011

Iterative Methods for Solving Systems of Variational Inequalities in Reflexive Banach Spaces

Gábor Kassay; Simeon Reich; Shoham Sabach

We prove strong convergence theorems for three iterative algorithms which approximate solutions to systems of variational inequalities for mappings of monotone type. All the theorems are set in reflexive Banach spaces and take into account possible computational errors.


Fixed-Point Algorithms for Inverse Problems in Science and Engineering, ISBN 978-1-4419-9568-1, pp. 301–316 | 2011

Existence and Approximation of Fixed Points of Bregman Firmly Nonexpansive Mappings in Reflexive Banach Spaces

Simeon Reich; Shoham Sabach

We study the existence and approximation of fixed points of Bregman firmly nonexpansive mappings in reflexive Banach spaces.


Journal of Optimization Theory and Applications | 2015

Weiszfeld's Method: Old and New Results

Amir Beck; Shoham Sabach

In 1937, the 16-year-old Hungarian mathematician Endre Weiszfeld, in a seminal paper, devised a method for solving the Fermat–Weber location problem—a problem whose origins can be traced back to the seventeenth century. Weiszfeld’s method stirred up an enormous amount of research in the optimization and location communities, and it is still discussed and used to this day. In this paper, we review both the past and the ongoing research on Weiszfeld’s method. The existing results are presented in a self-contained and concise manner—some are derived by new and simplified techniques. We also establish two new results using modern tools of optimization. First, we establish a non-asymptotic sublinear rate of convergence of Weiszfeld’s method, and second, using an exact smoothing technique, we present a modification of the method with a proven better rate of convergence.
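For readers unfamiliar with the method, below is a minimal sketch of the classical Weiszfeld iteration for the Fermat–Weber problem, minimizing the weighted sum of distances to a set of anchor points; the tolerance and the simple stopping rule used when an iterate lands on an anchor point are assumptions made for illustration.

```python
# A minimal sketch of the classical Weiszfeld iteration for
#   min_x  sum_i w_i * ||x - a_i||_2
# The tolerance and the anchor-coincidence stopping rule are illustrative assumptions.
import numpy as np

def weiszfeld(A, w=None, iters=200, tol=1e-10):
    A = np.asarray(A, dtype=float)            # anchor points, shape (m, d)
    m, d = A.shape
    w = np.ones(m) if w is None else np.asarray(w, dtype=float)
    x = (w[:, None] * A).sum(0) / w.sum()     # start at the weighted centroid
    for _ in range(iters):
        dist = np.linalg.norm(A - x, axis=1)
        if np.any(dist < tol):                # iterate coincides with an anchor: stop
            break
        coef = w / dist
        x_new = (coef[:, None] * A).sum(0) / coef.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```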


SIAM Journal on Imaging Sciences | 2015

Proximal Heterogeneous Block Implicit-Explicit Method and Application to Blind Ptychographic Diffraction Imaging

Robert Hesse; D. Russell Luke; Shoham Sabach; Matthew K. Tam

We propose a general alternating minimization algorithm for nonconvex optimization problems with separable structure and nonconvex coupling between blocks of variables. To fix our ideas, we apply the methodology to the problem of blind ptychographic imaging. Compared to other schemes in the literature, our approach differs in two ways: (i) it is posed within a clear mathematical framework with practical verifiable assumptions, and (ii) under the given assumptions, it is provably convergent to critical points. A numerical comparison of our proposed algorithm with the current state of the art on simulated and experimental data validates our approach and points toward directions for further improvement.


Operations Research Letters | 2015

A simple algorithm for a class of nonsmooth convex-concave saddle-point problems

Yoel Drori; Shoham Sabach; Marc Teboulle

We introduce a novel algorithm for solving a class of structured nonsmooth convex-concave saddle-point problems involving a smooth function and a sum of finitely many bilinear terms and nonsmooth functions. The proposed method is simple and proven to converge globally to a saddle point with an O(1/ε) efficiency estimate. We demonstrate its usefulness for tackling a broad class of minimization models with a finite sum of composite nonsmooth functions.


SIAM Journal on Optimization | 2011

Products of Finitely Many Resolvents of Maximal Monotone Mappings in Reflexive Banach Spaces

Shoham Sabach

We propose two algorithms for finding (common) zeros of finitely many maximal monotone mappings in reflexive Banach spaces. These algorithms are based on the Bregman distance related to a well-chosen convex function and improve previous results. Finally, we mention two applications of our algorithms for solving equilibrium problems and convex feasibility problems.


SIAM Journal on Imaging Sciences | 2016

Inertial proximal alternating linearized minimization (iPALM) for nonconvex and nonsmooth problems

Thomas Pock; Shoham Sabach

In this paper we study nonconvex and nonsmooth optimization problems with semialgebraic data, where the vector of variables is split into several blocks. The objective consists of one smooth function of the entire vector of variables plus a sum of nonsmooth functions, one for each block. We analyze an inertial version of the proximal alternating linearized minimization algorithm and prove its global convergence to a critical point of the objective function at hand. We illustrate our theoretical findings by presenting numerical experiments on blind image deconvolution, sparse nonnegative matrix factorization, and dictionary learning, which demonstrate the viability and effectiveness of the proposed method.
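As a rough illustration of the inertial idea, here is a sketch of a single-block iPALM-style update in which the block is extrapolated before the proximal gradient step; it uses one inertial parameter (a simplification of a two-parameter scheme), and the nonnegativity prox and the value alpha = 0.5 are assumptions chosen for illustration. In a full loop, such a step would replace each plain proximal gradient step of PALM, with the previous iterate tracked per block.

```python
# A sketch of one inertial block update in the spirit of iPALM.
# The single inertial parameter alpha and the nonnegativity prox are illustrative assumptions.
import numpy as np

def inertial_block_step(x, x_prev, grad, L, alpha=0.5):
    """Return the updated block and the new "previous" iterate.

    grad : callable returning the partial gradient of the smooth coupling term
           with respect to this block, evaluated at the extrapolated point.
    L    : a Lipschitz constant of that partial gradient.
    """
    y = x + alpha * (x - x_prev)      # inertial extrapolation using the previous iterate
    z = y - grad(y) / L               # gradient step taken at the extrapolated point
    return np.maximum(z, 0.0), x      # prox (here: projection onto the nonnegative orthant)
```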


SIAM Journal on Optimization | 2015

The Cyclic Block Conditional Gradient Method for Convex Optimization Problems

Amir Beck; Edouard Pauwels; Shoham Sabach

In this paper we study the convex problem of optimizing the sum of a smooth function and a compactly supported nonsmooth term with a specific separable form. We analyze the block version of the generalized conditional gradient method when the blocks are chosen in a cyclic order. A global sublinear rate of convergence is established for two different stepsize strategies commonly used in this class of methods. Numerical comparisons of the proposed method to both the classical conditional gradient algorithm and its random block version demonstrate the effectiveness of the cyclic block update rule.
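As an illustration of the cyclic update rule, the sketch below runs a block conditional-gradient (Frank–Wolfe) sweep on the least-squares objective 0.5 * ||Ax - b||^2, with each block constrained to the unit simplex; the block partition, the simplex constraints, and the 2/(k+2) stepsize are assumptions chosen for concreteness, not the paper's exact setting.

```python
# A minimal cyclic block conditional-gradient (Frank-Wolfe) sketch for
#   min 0.5 * ||A @ x - b||^2  with each block of x constrained to the unit simplex.
# The block partition and the 2/(k+2) stepsize are illustrative assumptions.
import numpy as np

def cyclic_block_cg(A, b, blocks, iters=200):
    x = np.zeros(A.shape[1])
    for idx in blocks:
        x[idx[0]] = 1.0                     # start feasible: a vertex of each block's simplex
    for k in range(iters):
        for idx in blocks:                  # sweep over the blocks in a fixed cyclic order
            grad = A.T @ (A @ x - b)
            p = np.zeros(len(idx))
            p[np.argmin(grad[idx])] = 1.0   # linear minimization oracle over the simplex
            step = 2.0 / (k + 2.0)          # a standard conditional-gradient stepsize
            x[idx] = (1 - step) * x[idx] + step * p
    return x
```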

Collaboration


Dive into Shoham Sabach's collaborations.

Top Co-Authors

Simeon Reich, Technion – Israel Institute of Technology
Amir Beck, Technion – Israel Institute of Technology
Yonina C. Eldar, Technion – Israel Institute of Technology
Mordechai Segev, Technion – Israel Institute of Technology
Oren Cohen, Technion – Israel Institute of Technology