2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP) | 2019

The Geometric Effects of Distributing Constrained Nonconvex Optimization Problems


Abstract


A variety of nonconvex machine learning problems have recently been shown to have benign geometric landscapes, in which there are no spurious local minima and all saddle points are strict saddles, i.e., points where the Hessian has at least one negative eigenvalue. For such problems, a variety of algorithms can converge to global minimizers. We present a general result relating the geometry of a centralized problem to that of its distributed extension; our result is novel in treating the scenario where the centralized problem obeys a manifold constraint, such as when the variables are normalized to the sphere. We show that the first- and second-order stationary points of the centralized and distributed problems are in one-to-one correspondence, implying that the distributed problem—in spite of its additional variables and constraints—can inherit the benign geometry of its centralized counterpart. We apply this result to show that the distributed matrix eigenvalue problem, multichannel blind deconvolution problem, and dictionary learning problem all enjoy benign geometric landscapes.
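The matrix eigenvalue problem mentioned above is the standard illustration of a benign landscape under a sphere constraint: minimizing f(x) = -x'Ax over the unit sphere has the top eigenvector as its only minimizer (up to sign), and every other eigenvector is a strict saddle of the Riemannian Hessian. The sketch below (an illustrative aside, not code from the paper; the function names and the random test matrix are our own) checks this numerically for a random symmetric matrix, using the standard formula for the Riemannian Hessian on the sphere, Hess f(x) = P(∇²f(x))P − (xᵀ∇f(x))P with tangent projector P = I − xxᵀ.

```python
import numpy as np

# Illustrative sketch (not from the paper): for f(x) = -x^T A x on the unit
# sphere with symmetric A, the top eigenvector is the global minimizer and
# every other eigenvector is a strict saddle of the Riemannian Hessian.

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                      # random symmetric matrix

eigvals, eigvecs = np.linalg.eigh(A)   # eigenvalues in ascending order
top = eigvecs[:, -1]                   # minimizer of -x^T A x on the sphere

def riemannian_hessian(A, x):
    """Riemannian Hessian of f(x) = -x^T A x on the unit sphere at x:
    P (-2A) P + 2 (x^T A x) P, with P = I - x x^T the tangent projector."""
    n = A.shape[0]
    P = np.eye(n) - np.outer(x, x)
    return P @ (-2 * A) @ P + 2 * (x @ A @ x) * P

# At the top eigenvector, the Hessian is positive semidefinite on the
# tangent space (a benign minimum); at any other eigenvector it has a
# strictly negative eigenvalue (a strict saddle).
saddle = eigvecs[:, 0]                 # eigenvector of the smallest eigenvalue
H_min = riemannian_hessian(A, top)
H_sad = riemannian_hessian(A, saddle)
print(np.linalg.eigvalsh(H_min).min() >= -1e-10)  # no negative curvature
print(np.linalg.eigvalsh(H_sad).min() < -1e-10)   # strict saddle direction
```

With distinct eigenvalues (which holds almost surely for a random symmetric matrix), the negative curvature at the saddle equals −2(λ_max − λ_min), so generic saddle-escaping methods can certify and exploit it.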

Pages 16-20
DOI 10.1109/CAMSAP45676.2019.9022447
Language English
Journal 2019 IEEE 8th International Workshop on Computational Advances in Multi-Sensor Adaptive Processing (CAMSAP)
