Shoham Sabach
Technion – Israel Institute of Technology
Publications
Featured research published by Shoham Sabach.
Mathematical Programming | 2014
Jérôme Bolte; Shoham Sabach; Marc Teboulle
We introduce a proximal alternating linearized minimization (PALM) algorithm for solving a broad class of nonconvex and nonsmooth minimization problems. Building on the powerful Kurdyka–Łojasiewicz property, we derive a self-contained convergence analysis framework and establish that each bounded sequence generated by PALM globally converges to a critical point. Our approach allows us to analyze various classes of nonconvex-nonsmooth problems and related nonconvex proximal forward–backward algorithms with semi-algebraic problem data, the latter property being shared by many functions arising in a wide variety of fundamental applications. A by-product of our framework also shows that our results are new even in the convex setting. As an illustration of the results, we derive a new and simple globally convergent algorithm for solving the sparse nonnegative matrix factorization problem.
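For a concrete feel of the scheme, here is a minimal NumPy sketch of the PALM iteration applied to plain (non-sparse, for simplicity) nonnegative matrix factorization: each block is updated by a proximal gradient step with the 1/L step-size rule, where the proximal map of the nonnegativity indicator is the projection onto the nonnegative orthant. All names and defaults are illustrative, not taken from the paper.

```python
import numpy as np

def palm_nmf(M, r, iters=500, seed=0):
    """Sketch of PALM for min 0.5*||M - X @ Y||_F^2 s.t. X, Y >= 0.

    Each block update is a projected gradient step with step size
    1/L_block, where L_block is the Lipschitz constant of that block's
    partial gradient. Names and defaults are illustrative.
    """
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = rng.random((m, r))
    Y = rng.random((r, n))
    for _ in range(iters):
        # Block X: the partial gradient is (XY - M) Y^T, with
        # Lipschitz constant ||Y Y^T||_2 (spectral norm).
        L_x = max(np.linalg.norm(Y @ Y.T, 2), 1e-12)
        X = np.maximum(X - ((X @ Y - M) @ Y.T) / L_x, 0.0)
        # Block Y: the partial gradient is X^T (XY - M).
        L_y = max(np.linalg.norm(X.T @ X, 2), 1e-12)
        Y = np.maximum(Y - (X.T @ (X @ Y - M)) / L_y, 0.0)
    return X, Y
```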
Numerical Functional Analysis and Optimization | 2010
Simeon Reich; Shoham Sabach
Two strong convergence theorems for a proximal method for finding common zeros of maximal monotone operators in reflexive Banach spaces are established. Both theorems take into account possible computational errors.
Siam Journal on Optimization | 2011
Gábor Kassay; Simeon Reich; Shoham Sabach
We prove strong convergence theorems for three iterative algorithms which approximate solutions to systems of variational inequalities for mappings of monotone type. All the theorems are set in reflexive Banach spaces and take into account possible computational errors.
Fixed-Point Algorithms for Inverse Problems in Science and Engineering, ISBN 978-1-4419-9568-1, pp. 301–316 | 2011
Simeon Reich; Shoham Sabach
We study the existence and approximation of fixed points of Bregman firmly nonexpansive mappings in reflexive Banach spaces.
Journal of Optimization Theory and Applications | 2015
Amir Beck; Shoham Sabach
In 1937, the 16-year-old Hungarian mathematician Endre Weiszfeld, in a seminal paper, devised a method for solving the Fermat–Weber location problem—a problem whose origins can be traced back to the seventeenth century. Weiszfeld’s method stirred up an enormous amount of research in the optimization and location communities, and is still discussed and used to this day. In this paper, we review both the past and the ongoing research on Weiszfeld’s method. The existing results are presented in a self-contained and concise manner—some are derived by new and simplified techniques. We also establish two new results using modern tools of optimization. First, we establish a non-asymptotic sublinear rate of convergence of Weiszfeld’s method, and second, using an exact smoothing technique, we present a modification of the method with a proven better rate of convergence.
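For reference, here is a minimal NumPy sketch of the classical Weiszfeld iteration itself (the paper's smoothed variant with the improved rate is not reproduced here). The safeguard for an iterate landing exactly on an anchor point, where the classical step is undefined, is a standard caveat of the method.

```python
import numpy as np

def weiszfeld(anchors, weights=None, iters=200, tol=1e-10):
    """Classical Weiszfeld iteration for the Fermat-Weber problem
    min_x sum_i w_i * ||x - a_i||. Illustrative sketch only."""
    A = np.asarray(anchors, dtype=float)
    w = np.ones(len(A)) if weights is None else np.asarray(weights, float)
    x = A.mean(axis=0)  # a common starting point
    for _ in range(iters):
        d = np.linalg.norm(A - x, axis=1)
        if np.any(d < tol):          # iterate hit an anchor point;
            return A[np.argmin(d)]   # the classical step is undefined there
        inv = w / d
        x_new = (A * inv[:, None]).sum(axis=0) / inv.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```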
Siam Journal on Imaging Sciences | 2015
Robert Hesse; D. Russell Luke; Shoham Sabach; Matthew K. Tam
We propose a general alternating minimization algorithm for nonconvex optimization problems with separable structure and nonconvex coupling between blocks of variables. To fix our ideas, we apply the methodology to the problem of blind ptychographic imaging. Compared to other schemes in the literature, our approach differs in two ways: (i) it is posed within a clear mathematical framework with practical, verifiable assumptions, and (ii) under the given assumptions, it is provably convergent to critical points. A numerical comparison of our proposed algorithm with the current state of the art on simulated and experimental data validates our approach and points toward directions for further improvement.
Operations Research Letters | 2015
Yoel Drori; Shoham Sabach; Marc Teboulle
We introduce a novel algorithm for solving a class of structured nonsmooth convex-concave saddle-point problems involving a smooth function and a sum of finitely many bilinear terms and nonsmooth functions. The proposed method is simple and proven to globally converge to a saddle-point with an O(1/ε) efficiency estimate. We demonstrate its usefulness for tackling a broad class of minimization models with a finite sum of composite nonsmooth functions.
Siam Journal on Optimization | 2011
Shoham Sabach
We propose two algorithms for finding (common) zeros of finitely many maximal monotone mappings in reflexive Banach spaces. These algorithms are based on the Bregman distance related to a well-chosen convex function and improve previous results. Finally, we mention two applications of our algorithms for solving equilibrium problems and convex feasibility problems.
Siam Journal on Imaging Sciences | 2016
Thomas Pock; Shoham Sabach
In this paper we study nonconvex and nonsmooth optimization problems with semialgebraic data, where the vector of variables is split into several blocks. The objective consists of one smooth function of all the variables plus a sum of nonsmooth functions, one for each block separately. We analyze an inertial version of the proximal alternating linearized minimization algorithm and prove its global convergence to a critical point of the objective function at hand. We illustrate our theoretical findings by presenting numerical experiments on blind image deconvolution, sparse nonnegative matrix factorization, and dictionary learning, which demonstrate the viability and effectiveness of the proposed method.
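To make the inertial idea concrete, here is a hedged NumPy sketch of an inertial PALM step, again on nonnegative matrix factorization: before each block's projected gradient step, the iterate is extrapolated with momentum. The momentum parameter beta and the step sizes below are illustrative, not the parameter choices analyzed in the paper.

```python
import numpy as np

def ipalm_nmf(M, r, iters=500, beta=0.5, seed=0):
    """Sketch of an inertial PALM iteration for
    min 0.5*||M - X @ Y||_F^2 s.t. X, Y >= 0. Illustrative only."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    X = X_prev = rng.random((m, r))
    Y = Y_prev = rng.random((r, n))
    for _ in range(iters):
        # Inertial extrapolation on block X, then a projected
        # gradient step evaluated at the extrapolated point.
        Zx = X + beta * (X - X_prev)
        L_x = max(np.linalg.norm(Y @ Y.T, 2), 1e-12)
        X_prev, X = X, np.maximum(Zx - ((Zx @ Y - M) @ Y.T) / L_x, 0.0)
        # Same pattern for block Y.
        Zy = Y + beta * (Y - Y_prev)
        L_y = max(np.linalg.norm(X.T @ X, 2), 1e-12)
        Y_prev, Y = Y, np.maximum(Zy - (X.T @ (X @ Zy - M)) / L_y, 0.0)
    return X, Y
```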
Siam Journal on Optimization | 2015
Amir Beck; Edouard Pauwels; Shoham Sabach
In this paper we study the convex problem of optimizing the sum of a smooth function and a compactly supported nonsmooth term with a specific separable form. We analyze the block version of the generalized conditional gradient method when the blocks are chosen in a cyclic order. A global sublinear rate of convergence is established for two different stepsize strategies commonly used in this class of methods. Numerical comparisons of the proposed method to both the classical conditional gradient algorithm and its random block version demonstrate the effectiveness of the cyclic block update rule.
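As an illustration of the cyclic update rule, here is a small NumPy sketch on a toy problem in which each block's nonsmooth term is the indicator of a unit simplex, so the block's generalized linear oracle reduces to picking the minimizing vertex. The problem data, the two-block structure, and the predefined 2/(k+2) stepsize indexing are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy problem: f(x) = 0.5 * ||A1 @ x1 + A2 @ x2 - b||^2 with each
# block x_i constrained to the unit simplex. All data is synthetic.
rng = np.random.default_rng(0)
A = [rng.standard_normal((20, 5)) for _ in range(2)]
b = rng.standard_normal(20)
x = [np.full(5, 0.2), np.full(5, 0.2)]  # start at the simplex centers

def simplex_lmo(g):
    """Linear minimization oracle over the unit simplex: the minimizer
    is the standard basis vector of the smallest gradient entry."""
    p = np.zeros_like(g)
    p[np.argmin(g)] = 1.0
    return p

k = 0
for _ in range(300):
    for i in range(2):                        # cyclic order over blocks
        resid = A[0] @ x[0] + A[1] @ x[1] - b
        g = A[i].T @ resid                    # partial gradient, block i
        p = simplex_lmo(g)
        x[i] += 2.0 / (k + 2.0) * (p - x[i])  # predefined stepsize rule
        k += 1
```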