Matthew K. Tam
University of Göttingen
Publications
Featured research published by Matthew K. Tam.
Journal of Optimization Theory and Applications | 2014
Francisco Javier Aragón Artacho; Jonathan M. Borwein; Matthew K. Tam
We discuss recent positive experiences applying convex feasibility algorithms of Douglas–Rachford type to highly combinatorial and far-from-convex problems.
ANZIAM Journal | 2014
Francisco Javier Aragón Artacho; Jonathan M. Borwein; Matthew K. Tam
In this paper, we give general recommendations for successful application of the Douglas–Rachford reflection method to convex and nonconvex real matrix completion problems. These guidelines are demonstrated by various illustrative examples. doi:10.1017/S1446181114000145
Journal of Optimization Theory and Applications | 2014
Jonathan M. Borwein; Matthew K. Tam
In this paper, we present two Douglas–Rachford inspired iteration schemes which can be applied directly to N-set convex feasibility problems in Hilbert space. Our main results are weak convergence of the methods to a point whose nearest point projections onto each of the N sets coincide. For affine subspaces, convergence is in norm. Initial results from numerical experiments, comparing our methods to the classical (product-space) Douglas–Rachford scheme, are promising.
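The classical two-set Douglas–Rachford scheme that these methods build on iterates x⁺ = x + P_B(2P_A(x) − x) − P_A(x). A minimal sketch on a hypothetical toy instance (two lines through the origin in the plane, not the paper's experiments) shows the norm convergence the abstract describes for affine subspaces, with the shadow sequence P_A(x_k) approaching the intersection {0}:

```python
import numpy as np

def proj_line(x, d):
    """Project x onto the line through the origin spanned by direction d."""
    d = d / np.linalg.norm(d)
    return np.dot(x, d) * d

def dr_step(x, pA, pB):
    """One Douglas–Rachford step: x + P_B(2 P_A(x) - x) - P_A(x)."""
    a = pA(x)
    return x + pB(2.0 * a - x) - a

d1 = np.array([1.0, 0.0])      # set A: the x-axis
d2 = np.array([1.0, 1.0])      # set B: the line y = x
x = np.array([3.0, -2.0])      # arbitrary starting point
for _ in range(200):
    x = dr_step(x, lambda z: proj_line(z, d1), lambda z: proj_line(z, d2))

shadow = proj_line(x, d1)      # the "shadow" P_A(x_k) lies near A ∩ B = {0}
```

For two lines meeting at angle θ the governing iteration contracts linearly with rate cos θ, so 200 steps are far more than needed at θ = 45°.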
SIAM Journal on Imaging Sciences | 2015
Robert Hesse; D. Russell Luke; Shoham Sabach; Matthew K. Tam
We propose a general alternating minimization algorithm for nonconvex optimization problems with separable structure and nonconvex coupling between blocks of variables. To fix our ideas, we apply the methodology to the problem of blind ptychographic imaging. Compared to other schemes in the literature, our approach differs in two ways: (i) it is posed within a clear mathematical framework with practical verifiable assumptions, and (ii) under the given assumptions, it is provably convergent to critical points. A numerical comparison of our proposed algorithm with the current state of the art on simulated and experimental data validates our approach and points toward directions for further improvement.
Journal of Global Optimization | 2016
Francisco Javier Aragón Artacho; Jonathan M. Borwein; Matthew K. Tam
In recent times the Douglas–Rachford algorithm has been observed empirically to solve a variety of nonconvex feasibility problems including those of a combinatorial nature. For many of these problems current theory is not sufficient to explain this observed success and is mainly concerned with questions of local convergence. In this paper we analyze global behavior of the method for finding a point in the intersection of a half-space and a potentially non-convex set which is assumed to satisfy a well-quasi-ordering property or a property weaker than compactness. In particular, the special case in which the second set is finite is covered by our framework and provides a prototypical setting for combinatorial optimization problems.
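The half-space-meets-finite-set setting can be sketched in one dimension (a hypothetical toy instance, not taken from the paper: the half-space [0.5, ∞) and the finite set {−2, 1, 3}, whose intersection is {1, 3}). The Douglas–Rachford governing sequence reaches a fixed point whose shadow under the half-space projection lies in the intersection:

```python
def p_halfspace(x):
    """Project onto the half-space {x >= 0.5}."""
    return max(x, 0.5)

def p_finite(x, points=(-2.0, 1.0, 3.0)):
    """Project onto a finite set: the nearest point (ties broken by order)."""
    return min(points, key=lambda p: abs(p - x))

x = -1.0                       # starting point lying in neither set
for _ in range(10):
    # Douglas-Rachford step: x + P_D(2 P_C(x) - x) - P_C(x)
    x = x + p_finite(2.0 * p_halfspace(x) - x) - p_halfspace(x)

shadow = p_halfspace(x)        # lies in both sets once x is a fixed point
```

Tracing by hand, the iterates run −1 → −0.5 → 0 → 0.5 → 1 and then stay fixed at 1, a point of the intersection; the finite second set is exactly the prototypical combinatorial setting the abstract mentions.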
Mathematics of Operations Research | 2018
D. Russell Luke; Nguyen H. Thao; Matthew K. Tam
We develop a framework for quantitative convergence analysis of Picard iterations of expansive set-valued fixed point mappings. There are two key components of the analysis. The first is a natural generalization of single-valued averaged mappings to expansive, set-valued mappings that characterizes a type of strong calmness of the fixed point mapping. The second component of this analysis is an extension of the well-established notion of metric subregularity, or inverse calmness, of the mapping at fixed points. Convergence of expansive fixed point iterations is proved using these two properties, and quantitative estimates are a natural byproduct of the framework. To demonstrate the application of the theory, we prove for the first time a number of results showing local linear convergence of nonconvex cyclic projections for inconsistent (and consistent) feasibility problems, local linear convergence of the forward-backward algorithm for structured optimization without convexity, strong or otherwise, and local linear convergence of the Douglas–Rachford algorithm for structured nonconvex minimization. This theory includes earlier approaches for known results, convex and nonconvex, as special cases.
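The forward-backward algorithm analyzed above alternates a gradient ("forward") step on a smooth term with a proximal ("backward") step on a nonsmooth term. A minimal convex sketch, on hypothetical random data (ℓ1-regularized least squares, with the standard soft-thresholding proximity operator and step size 1/L), looks like:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([1.0, 0.0, -2.0, 0.0, 0.5])  # sparse ground truth
b = A @ x_true
lam = 0.1

def soft_threshold(v, t):
    """Proximity operator of t * ||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant of the gradient
x = np.zeros(5)
for _ in range(2000):
    grad = A.T @ (A @ x - b)                     # forward step on (1/2)||Ax - b||^2
    x = soft_threshold(x - grad / L, lam / L)    # backward step on lam * ||x||_1
```

Because the smooth part here is strongly convex on the range of A, the iteration converges linearly to the unique minimizer, illustrating in the simplest setting the kind of linear rate the paper establishes without convexity.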
SIAM Journal on Optimization | 2017
Jonathan M. Borwein; Guoyin Li; Matthew K. Tam
In this paper, we establish sublinear and linear convergence of fixed point iterations generated by averaged operators in a Hilbert space. Our results are achieved under a bounded Hölder regularity...
arXiv: Optimization and Control | 2017
Jonathan M. Borwein; Matthew K. Tam
The Douglas–Rachford reflection method is a general-purpose algorithm for solving the feasibility problem of finding a point in the intersection of finitely many sets. In this chapter, we demonstrate that, when applied to a specific problem, the method can benefit from problem-specific heuristics that exploit its special structure. In particular, we focus on the problem of protein conformation determination, formulated within the framework of matrix completion, as considered in a recent paper of the present authors.
Journal of Global Optimization | 2018
Minh N. Dao; Matthew K. Tam
The Douglas–Rachford projection algorithm is an iterative method used to find a point in the intersection of closed constraint sets. The algorithm has been experimentally observed to solve various nonconvex feasibility problems, an observation that current theory cannot sufficiently explain. In this paper, we prove convergence of the Douglas–Rachford algorithm in a potentially nonconvex setting. Our analysis relies on the existence of a Lyapunov-type functional whose convexity properties are not tantamount to convexity of the original constraint sets. Moreover, we provide various nonconvex examples in which our framework proves global convergence of the algorithm.
Set-valued and Variational Analysis | 2018
Florian Lauster; D. Russell Luke; Matthew K. Tam
We consider a class of monotone operators which are appropriate for symbolic representation and manipulation within a computer algebra system. Various structural properties of the class (e.g., closure under taking inverses, resolvents) are investigated as well as the role played by maximal monotonicity within the class. In particular, we show that there is a natural correspondence between our class of monotone operators and the subdifferentials of convex functions belonging to a class of convex functions deemed suitable for symbolic computation of Fenchel conjugates which were previously studied by Bauschke & von Mohrenschildt and by Borwein & Hamilton. A number of illustrative examples utilizing the introduced class of operators are provided including computation of proximity operators, recovery of a convex penalty function associated with the hard thresholding operator, and computation of superexpectations, superdistributions and superquantiles with specialization to risk measures.