Bingsheng He
Nanjing University
Publications
Featured research published by Bingsheng He.
SIAM Journal on Numerical Analysis | 2012
Bingsheng He; Xiaoming Yuan
Alternating direction methods (ADMs) have been well studied in the literature, and they have found many efficient applications in various fields. In this note, we focus on the Douglas-Rachford ADM scheme proposed by Glowinski and Marrocco, and we aim at providing a simple approach to estimating its convergence rate in terms of the iteration number. The linearized version of this ADM scheme, which is known as the split inexact Uzawa method in the image processing literature, is also discussed.
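For context, the scheme studied here is the classical two-block ADM. In standard notation (the symbols f, g, A, B, b, the penalty \beta and the multiplier \lambda are generic and are not taken from this abstract), it solves \min f(x) + g(y) subject to Ax + By = b by alternating two subproblems and a multiplier update:
x^{k+1} \in \arg\min_x \{ f(x) + \tfrac{\beta}{2}\|Ax + By^k - b - \lambda^k/\beta\|^2 \},
y^{k+1} \in \arg\min_y \{ g(y) + \tfrac{\beta}{2}\|Ax^{k+1} + By - b - \lambda^k/\beta\|^2 \},
\lambda^{k+1} = \lambda^k - \beta\,(Ax^{k+1} + By^{k+1} - b).
In the linearized version (the split inexact Uzawa method mentioned above), a proximal term is typically added to the x-subproblem so that the quadratic coupling through A drops out and the subproblem reduces to a simple proximal step of f.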
Mathematical Programming | 2002
Bingsheng He; Li-Zhi Liao; Deren Han; Hai Yang
The alternating direction method (ADM) is an effective method for solving a class of variational inequalities (VIs) when the proximal and penalty parameters in the sub-VI problems are properly selected. In this paper, we propose a new ADM that approximately solves two strongly monotone sub-VI problems in each iteration and allows the parameters to vary from iteration to iteration. The convergence of the proposed ADM is proved under quite mild assumptions and flexible parameter conditions.
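As background (standard definitions, not quoted from this abstract): a variational inequality VI(\Omega, F) asks for u^* \in \Omega such that
(u - u^*)^\top F(u^*) \ge 0 \quad \text{for all } u \in \Omega,
where \Omega is a closed convex set. F is monotone if (u - v)^\top (F(u) - F(v)) \ge 0 for all u, v \in \Omega, and strongly monotone if this quantity is at least \mu\|u - v\|^2 for some \mu > 0; a continuous, strongly monotone VI has a unique solution, which is why the sub-VI problems above are well posed.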
Mathematical Programming | 2016
Caihua Chen; Bingsheng He; Yinyu Ye; Xiaoming Yuan
The alternating direction method of multipliers (ADMM) is now widely used in many fields, and its convergence was proved when two blocks of variables are alternately updated. It is strongly desirable and practically valuable to extend ADMM directly to the case of a multi-block convex minimization problem whose objective function is the sum of more than two separable convex functions. However, the convergence of this direct extension had long been unknown: neither an affirmative convergence proof nor an example showing its divergence was available in the literature. In this paper we give a negative answer to this long-standing open question: the direct extension of ADMM is not necessarily convergent. We present a sufficient condition to ensure the convergence of the direct extension of ADMM, and give an example to show its divergence.
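To make the "direct extension" concrete, consider three blocks. In standard notation (the functions \theta_i, matrices A_i, right-hand side b, penalty \beta and multiplier \lambda are generic and not taken from this abstract), for \min \theta_1(x_1) + \theta_2(x_2) + \theta_3(x_3) subject to A_1x_1 + A_2x_2 + A_3x_3 = b, the direct extension sweeps through the blocks in Gauss-Seidel fashion and then updates the multiplier once:
x_1^{k+1} \in \arg\min_{x_1} \{ \theta_1(x_1) + \tfrac{\beta}{2}\|A_1x_1 + A_2x_2^k + A_3x_3^k - b - \lambda^k/\beta\|^2 \},
x_2^{k+1} \in \arg\min_{x_2} \{ \theta_2(x_2) + \tfrac{\beta}{2}\|A_1x_1^{k+1} + A_2x_2 + A_3x_3^k - b - \lambda^k/\beta\|^2 \},
x_3^{k+1} \in \arg\min_{x_3} \{ \theta_3(x_3) + \tfrac{\beta}{2}\|A_1x_1^{k+1} + A_2x_2^{k+1} + A_3x_3 - b - \lambda^k/\beta\|^2 \},
\lambda^{k+1} = \lambda^k - \beta\,(A_1x_1^{k+1} + A_2x_2^{k+1} + A_3x_3^{k+1} - b).
The paper's point is that, unlike the m=2 case, these updates alone do not guarantee convergence.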
Journal of Optimization Theory and Applications | 2000
Bingsheng He; Hai Yang; Shengli Wang
The alternating direction method is one of the attractive approaches for solving linearly constrained separable monotone variational inequalities. Experience with applications has shown that the number of iterations depends significantly on the penalty parameter for the system of linear constraint equations. While the penalty parameter is a constant in the original method, in this paper we present a modified alternating direction method that adjusts the penalty parameter at each iteration based on information from the current iterates. Preliminary numerical tests show that this self-adaptive adjustment technique is effective in practice.
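As an illustration of the idea of per-iteration penalty adjustment, here is a minimal sketch in Python. The toy consensus problem, the variable names, and the residual-balancing rule (enlarge beta when the primal residual dominates, shrink it when the dual residual dominates) are illustrative assumptions; the rule shown is a widely used heuristic and is not claimed to be the exact criterion proposed in this paper.

# Sketch of two-block ADMM with a self-adaptive penalty parameter.
# Toy problem (assumed for illustration):
#   minimize 0.5*||x - c||^2 + 0.5*||y - d||^2  subject to  x - y = 0.
import numpy as np

def admm_self_adaptive(c, d, beta=1.0, mu=10.0, tau=2.0, iters=200, tol=1e-10):
    x = np.zeros_like(c)
    y = np.zeros_like(d)
    lam = np.zeros_like(c)                      # multiplier for x - y = 0
    for _ in range(iters):
        y_old = y
        # x-step: argmin 0.5*||x - c||^2 + (beta/2)*||x - y - lam/beta||^2
        x = (c + beta * y + lam) / (1.0 + beta)
        # y-step: argmin 0.5*||y - d||^2 + (beta/2)*||x - y - lam/beta||^2
        y = (d + beta * x - lam) / (1.0 + beta)
        r = x - y                               # primal residual
        lam = lam - beta * r                    # multiplier update
        s = beta * (y - y_old)                  # dual residual (up to sign)
        # residual-balancing heuristic: keep the two residuals comparable
        if np.linalg.norm(r) > mu * np.linalg.norm(s):
            beta *= tau
        elif np.linalg.norm(s) > mu * np.linalg.norm(r):
            beta /= tau
        if np.linalg.norm(r) < tol and np.linalg.norm(s) < tol:
            break
    return x, y, beta

# Usage: the exact solution of the toy problem is x = y = (c + d) / 2.
c, d = np.array([1.0, 3.0]), np.array([2.0, -1.0])
x, y, beta = admm_self_adaptive(c, d)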
SIAM Journal on Optimization | 2012
Bingsheng He; Min Tao; Xiaoming Yuan
We consider the linearly constrained separable convex minimization problem whose objective function is separable into m individual convex functions with nonoverlapping variables. A Douglas–Rachford alternating direction method of multipliers (ADM) has been well studied in the literature for the special case of m=2. But the convergence of extending ADM to the general case of m\ge 3 is still open. In this paper, we show that the straightforward extension of ADM is valid for the general case of m\ge 3 if it is combined with a Gaussian back substitution procedure.
Applied Mathematics and Optimization | 1997
Bingsheng He
Journal of Optimization Theory and Applications | 2002
Bingsheng He; Li-Zhi Liao
Mathematical Programming | 1999
Bingsheng He
SIAM Journal on Imaging Sciences | 2012
Bingsheng He; Xiaoming Yuan
Mathematical Programming | 1994
Bingsheng He