IEEE Transactions on Neural Networks and Learning Systems | 2019

Scalable Proximal Jacobian Iteration Method With Global Convergence Analysis for Nonconvex Unconstrained Composite Optimizations


Abstract


Recent studies have found that nonconvex relaxation functions usually perform better than their convex counterparts in $l_{0}$-norm and rank function minimization problems. However, due to the absence of convexity in these problems, developing efficient algorithms with convergence guarantees becomes very challenging. Inspired by the basic ideas of both the Jacobian alternating direction method of multipliers (JADMM) for solving linearly constrained problems with separable objectives and the proximal gradient methods (PGMs) for optimizing unconstrained problems with one variable, this paper focuses on extending the PGMs to proximal Jacobian iteration methods (PJIMs) for handling a family of nonconvex composite optimization problems with two splitting variables. To reduce the total computational complexity by decreasing the number of iterations, we devise an accelerated version of the PJIMs via the well-known Nesterov acceleration strategy and further extend both methods to multivariable cases. Most importantly, we provide a rigorous convergence analysis showing that the generated variable sequence globally converges to a critical point by exploiting the Kurdyka–Łojasiewicz (KŁ) property for a broad class of functions. Furthermore, we establish the linear and sublinear convergence rates of the obtained sequence in terms of objective function values. As a specific application to nonconvex sparse and low-rank recovery problems, several numerical experiments verify that the newly proposed algorithms not only retain fast convergence but also achieve high precision.
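
The proximal Jacobian iteration with optional Nesterov extrapolation described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: it assumes a two-block composite objective H(x, y) + f(x) + g(y) with a smooth coupling term H(x, y) = (1/2)||Ax + By - c||^2 and, for simplicity, convex l1 penalties whose proximal mappings are soft-thresholding; a nonconvex relaxation would substitute its own proximal operator. The matrices A and B, the vector c, the weights lam_f and lam_g, and the step-size and extrapolation choices below are all hypothetical.

# Illustrative sketch (assumptions as stated above, not the authors' code):
# a proximal Jacobian iteration for min_{x,y} H(x,y) + f(x) + g(y).
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (elementwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_jacobian(A, B, c, lam_f, lam_g, n_iter=500, accelerate=True):
    m, n1 = A.shape
    _, n2 = B.shape
    x, y = np.zeros(n1), np.zeros(n2)
    x_prev, y_prev = x.copy(), y.copy()
    # Step sizes from upper bounds on the blockwise Lipschitz constants of grad H.
    Lx = np.linalg.norm(A, 2) ** 2
    Ly = np.linalg.norm(B, 2) ** 2
    t_prev = 1.0
    for _ in range(n_iter):
        if accelerate:
            # Nesterov-style extrapolation of both blocks.
            t = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t_prev ** 2))
            w = (t_prev - 1.0) / t
            xe, ye = x + w * (x - x_prev), y + w * (y - y_prev)
            t_prev = t
        else:
            xe, ye = x, y
        r = A @ xe + B @ ye - c          # shared residual at the (extrapolated) point
        x_prev, y_prev = x, y
        # Jacobian (parallel) proximal gradient updates: both blocks are
        # computed from the same previous iterates, so they are independent.
        x = soft_threshold(xe - (A.T @ r) / Lx, lam_f / Lx)
        y = soft_threshold(ye - (B.T @ r) / Ly, lam_g / Ly)
    return x, y

Because both blocks are updated from the same previous iterates, the two proximal steps do not depend on each other and could run concurrently; this is what distinguishes the Jacobian scheme from a Gauss-Seidel (alternating) one.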

Volume 30
Pages 2825-2839
DOI 10.1109/TNNLS.2018.2885699
Language English
Journal IEEE Transactions on Neural Networks and Learning Systems
