Paul Tseng
University of Washington
Publications
Featured research published by Paul Tseng.
Journal of Optimization Theory and Applications | 2001
Paul Tseng
We study the convergence properties of a (block) coordinate descent method applied to minimize a nondifferentiable (nonconvex) function f(x1, . . . , xN) with certain separability and regularity properties. Assuming that f is continuous on a compact level set, the subsequence convergence of the iterates to a stationary point is shown when either f is pseudoconvex in every pair of coordinate blocks from among N-1 coordinate blocks or f has at most one minimum in each of N-2 coordinate blocks. If f is quasiconvex and hemivariate in every coordinate block, then the assumptions of continuity of f and compactness of the level set may be relaxed further. These results are applied to derive new (and old) convergence results for the proximal minimization algorithm, an algorithm of Arimoto and Blahut, and an algorithm of Han. They are applied also to a problem of blind source separation.
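The cyclic block-minimization scheme analyzed in this abstract can be sketched in a few lines. Below is a minimal Python illustration on a two-block convex quadratic; the objective, the two-block split, and the exact per-block argmin formulas are illustrative assumptions, not taken from the paper.

```python
def block_coordinate_descent(argmin_blocks, x, n_sweeps=100):
    """Cyclically and exactly minimize over each coordinate block
    (Gauss-Seidel style), holding the other blocks fixed."""
    for _ in range(n_sweeps):
        for i, argmin_i in enumerate(argmin_blocks):
            x[i] = argmin_i(x)
    return x

# Illustrative objective: f(x1, x2) = (x1 - 1)^2 + (x2 + 2)^2 + x1*x2,
# a smooth strictly convex function with coupled blocks
# (unique minimizer at (8/3, -10/3)).
m1 = lambda x: (2.0 - x[1]) / 2.0   # exact argmin over x1 with x2 fixed
m2 = lambda x: (-4.0 - x[0]) / 2.0  # exact argmin over x2 with x1 fixed
x = block_coordinate_descent([m1, m2], [0.0, 0.0])
# x converges to the minimizer (8/3, -10/3)
```

Exact per-block minimization on a smooth strictly convex function is the easy case; the paper's contribution is precisely the harder nondifferentiable, nonconvex setting, where convergence needs the pseudoconvexity or quasiconvexity assumptions stated in the abstract.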
SIAM Journal on Control and Optimization | 2000
Paul Tseng
We consider the forward-backward splitting method for finding a zero of the sum of two maximal monotone mappings. This method is known to converge when the inverse of the forward mapping is strongly monotone. We propose a modification to this method, in the spirit of the extragradient method for monotone variational inequalities, under which the method converges assuming only the forward mapping is (Lipschitz) continuous on some closed convex subset of its domain. The modification entails an additional forward step and a projection step at each iteration. Applications of the modified method to decomposition in convex programming and monotone variational inequalities are discussed.
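The modification described here, an extra forward step after the forward-backward step, can be sketched compactly: z = J_B(x - lam*A(x)), then x+ = z - lam*(A(z) - A(x)). A minimal Python sketch follows, assuming a linear monotone forward mapping and the normal cone of the nonnegative orthant as the backward mapping (both illustrative choices); the additional projection step onto a closed convex subset is omitted here by taking that subset to be the whole space.

```python
import numpy as np

def modified_forward_backward(A, resolvent_B, x0, lam=0.5, n_iters=2000):
    """Forward-backward splitting with an extra forward (correction)
    step, for finding a zero of A + B. The extra step allows convergence
    when A is merely monotone and Lipschitz continuous, without strong
    monotonicity of its inverse. Sketch only: fixed step, no stopping test."""
    x = x0
    for _ in range(n_iters):
        z = resolvent_B(x - lam * A(x))   # forward step, then backward step
        x = z - lam * (A(z) - A(x))       # extra forward (correction) step
    return x

# Illustrative problem: A(x) = Mx + q with M skew-symmetric, so A is
# monotone and Lipschitz but NOT strongly monotone; B is the normal cone
# of the nonnegative orthant, whose resolvent is projection onto x >= 0.
M = np.array([[0.0, 1.0], [-1.0, 0.0]])
q = np.array([-1.0, 1.0])
A = lambda x: M @ x + q
resolvent_B = lambda x: np.maximum(x, 0.0)

x = modified_forward_backward(A, resolvent_B, np.zeros(2))
# x approaches the solution (1, 1), where A(x) = 0
```

The skew-symmetric choice of M is deliberate: it is exactly the kind of mapping for which the unmodified forward-backward method can fail, while the corrected iteration above still converges for step sizes below the reciprocal of the Lipschitz constant.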
SIAM Journal on Control and Optimization | 1991
Paul Tseng
Recently, Han and Lou proposed a highly parallelizable decomposition algorithm for minimizing a strongly convex cost over the intersection of closed convex sets. It is shown that their algorithm is in fact a special case of a splitting algorithm analyzed by Gabay for finding a zero of the sum of two maximal monotone operators. Gabay's convergence analysis for the splitting algorithm is sharpened, and new applications of this algorithm to variational inequalities, convex programming, and the solution of linear complementarity problems are proposed. For convex programs with a certain separable structure, a multiplier method that is closely related to the alternating direction method of multipliers of Gabay–Mercier and of Glowinski–Marrocco, but which uses both ordinary and augmented Lagrangians, is obtained.
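For background, the alternating-direction idea referenced here can be illustrated on the Han–Lou setting: a strongly convex cost minimized over an intersection of closed convex sets. The sketch below projects a point onto the intersection of two convex sets via a standard ADMM splitting; the particular sets, penalty parameter, and iteration count are illustrative assumptions, and this is not the paper's exact multiplier method.

```python
import numpy as np

def admm_intersection_projection(p, proj1, proj2, rho=1.0, n_iters=200):
    """Project p onto C1 intersect C2 by splitting
    min 0.5*||x - p||^2 + I_C1(x) + I_C2(z)  s.t.  x = z,
    then alternating x-, z-, and multiplier updates (scaled-dual ADMM)."""
    z = np.zeros_like(p)
    u = np.zeros_like(p)
    for _ in range(n_iters):
        # x-update: prox of the quadratic cost restricted to C1
        x = proj1((p + rho * (z - u)) / (1.0 + rho))
        # z-update: projection onto C2
        z = proj2(x + u)
        # multiplier (scaled dual) update
        u = u + x - z
    return x

# Illustrative sets: C1 = halfspace x1 + x2 <= 1, C2 = nonnegative orthant.
proj_halfspace = lambda v: v if v.sum() <= 1 else v - (v.sum() - 1) / 2.0
proj_orthant = lambda v: np.maximum(v, 0.0)
x = admm_intersection_projection(np.array([1.0, 1.0]),
                                 proj_halfspace, proj_orthant)
# x approaches (0.5, 0.5), the projection of (1, 1) onto the intersection
```

Each update touches only one set at a time, which is the source of the parallelizability the abstract mentions: with many sets, the per-set projections decouple.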
Journal of Optimization Theory and Applications | 1992
Zhi-Quan Luo; Paul Tseng
The coordinate descent method enjoys a long history in convex differentiable minimization. Surprisingly, very little is known about the convergence of the iterates generated by this method. Convergence typically requires restrictive assumptions such as that the cost function has bounded level sets and is in some sense strictly convex. In a recent work, Luo and Tseng showed that the iterates are convergent for the symmetric monotone linear complementarity problem, for which the cost function is convex quadratic, but not necessarily strictly convex, and does not necessarily have bounded level sets. In this paper, we extend these results to problems for which the cost function is the composition of an affine mapping with a strictly convex function which is twice differentiable in its effective domain. In addition, we show that the convergence is at least linear. As a consequence of this result, we obtain, for the first time, that the dual iterates generated by a number of existing methods for matrix balancing and entropy optimization are linearly convergent.
SIAM Journal on Optimization | 2002
Masao Fukushima; Zhi-Quan Luo; Paul Tseng
Smoothing functions have been much studied in the solution of optimization and complementarity problems with nonnegativity constraints. In this paper, we extend smoothing functions to problems in which the nonnegative orthant is replaced by the direct product of second-order cones. These smoothing functions include the Chen–Mangasarian class and the smoothed Fischer–Burmeister function. We study the Lipschitzian and differential properties of these functions and, in particular, we derive computable formulas for these functions and their Jacobians. These properties and formulas can then be used to develop and analyze noninterior continuation methods for solving the corresponding optimization and complementarity problems. In particular, we establish the existence and uniqueness of the Newton direction when the underlying mapping is monotone.
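For the scalar (nonnegative-orthant) case that this paper generalizes to second-order cones, a smoothed Fischer–Burmeister function can be sketched as follows. The 2*mu^2 smoothing convention used here is one common choice and may differ from the paper's exact normalization.

```python
import math

def smoothed_fb(a, b, mu):
    """Smoothed Fischer-Burmeister function, scalar case:
    phi_mu(a, b) = a + b - sqrt(a^2 + b^2 + 2*mu^2).
    For mu = 0 this is the plain FB function, whose zeros are exactly
    the complementary pairs a >= 0, b >= 0, a*b = 0; for mu != 0 the
    sqrt argument stays positive, so phi_mu is smooth everywhere."""
    return a + b - math.sqrt(a * a + b * b + 2.0 * mu * mu)

# At mu = 0, complementary pairs are zeros of phi:
val = smoothed_fb(3.0, 0.0, 0.0)   # -> 0.0
# Non-complementary pairs are not:
val2 = smoothed_fb(1.0, 1.0, 0.0)  # 2 - sqrt(2) > 0
```

A continuation method drives mu toward zero while applying Newton steps to the smoothed system, which is where the Jacobian formulas and the Newton-direction results of the abstract come in.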
SIAM Journal on Control and Optimization | 1996
Michael V. Solodov; Paul Tseng
We propose new methods for solving the variational inequality problem where the underlying function F is monotone. These methods may be viewed as projection-type methods in which the projection direction is modified by a strongly monotone mapping of the form I - \alpha F or, if …
Annals of Operations Research | 1993
Zhi-Quan Luo; Paul Tseng
SIAM Journal on Optimization | 2007
Paul Tseng
SIAM Journal on Optimization | 2007
Zhi-Quan Luo; Nicholas D. Sidiropoulos; Paul Tseng; Shuzhong Zhang
SIAM Journal on Matrix Analysis and Applications | 2013
Maryam Fazel; Ting Kei Pong; Defeng Sun; Paul Tseng