Mingyi Hong
Iowa State University
Publication
Featured research published by Mingyi Hong.
SIAM Journal on Optimization | 2013
Meisam Razaviyayn; Mingyi Hong; Zhi-Quan Luo
The block coordinate descent (BCD) method is widely used for minimizing a continuous function f of several block variables. At each iteration of this method, a single block of variables is optimized, while the remaining variables are held fixed. To ensure the convergence of the BCD method, the subproblem of each block variable needs to be solved to its unique global optimum. Unfortunately, this requirement is often too restrictive for many practical scenarios. In this paper, we study an alternative inexact BCD approach which updates the variable blocks by successively minimizing a sequence of approximations of f which are either locally tight upper bounds of f or strictly convex local approximations of f. The main contributions of this work include the characterizations of the convergence conditions for a fairly wide class of such methods, especially for the cases where the objective functions are either nondifferentiable or nonconvex. Our results unify and extend the existing convergence results ...
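To make the scheme concrete, here is a minimal Python sketch of this inexact BCD idea, in the spirit of the block successive upper-bound minimization the abstract describes. The LASSO-type objective, the two-block split, and the helper names (soft_threshold, bsum) are illustrative assumptions, not the paper's actual setup; the point is that each block update minimizes a locally tight quadratic upper bound of f rather than solving the block subproblem to its exact global optimum.

```python
import numpy as np

# Hypothetical sketch: inexact BCD via successive upper-bound minimization.
# Problem (assumed for illustration):
#   minimize f(x) = 0.5 * ||A @ x - b||^2 + lam * ||x||_1  over two blocks.

def soft_threshold(v, t):
    """Prox of t * ||.||_1: closed-form minimizer of the quadratic majorizer."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bsum(A, b, lam=0.1, n_iters=200):
    n = A.shape[1]
    x = np.zeros(n)
    blocks = [np.arange(n // 2), np.arange(n // 2, n)]  # two variable blocks
    for _ in range(n_iters):
        for idx in blocks:
            # Block Lipschitz constant -> the quadratic surrogate is a
            # locally tight upper bound of the smooth part of f.
            L = np.linalg.norm(A[:, idx], 2) ** 2
            grad = A[:, idx].T @ (A @ x - b)  # gradient w.r.t. this block
            # Minimize the upper bound: one proximal step on the block,
            # instead of exactly solving the block subproblem.
            x[idx] = soft_threshold(x[idx] - grad / L, lam / L)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = bsum(A, b)
    print("objective:", 0.5 * np.sum((A @ x_hat - b) ** 2) + 0.1 * np.abs(x_hat).sum())
```

Because the quadratic term uses the block's Lipschitz constant, the surrogate upper-bounds f and is tight at the current iterate, which is the kind of condition on the approximations that the convergence analysis above requires.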
Mathematical Programming | 2017
Mingyi Hong; Zhi-Quan Luo
IEEE Journal on Selected Areas in Communications | 2013
Mingyi Hong; Ruoyu Sun; Hadi Baligh; Zhi-Quan Luo
SIAM Journal on Optimization | 2016
Mingyi Hong; Zhi-Quan Luo; Meisam Razaviyayn
IEEE Journal on Selected Areas in Communications | 2013
Qiang Li; Mingyi Hong; Hoi-To Wai; Ya-Feng Liu; Wing-Kin Ma; Zhi-Quan Luo
IEEE Transactions on Signal Processing | 2015
Tsung-Hui Chang; Mingyi Hong; Xiangfeng Wang
International Conference on Acoustics, Speech, and Signal Processing | 2015
Mingyi Hong; Zhi-Quan Luo; Meisam Razaviyayn
IEEE Transactions on Signal Processing | 2014
Wei Cheng Liao; Mingyi Hong; Ya-Feng Liu; Zhi-Quan Luo
IEEE Signal Processing Magazine | 2016
Mingyi Hong; Meisam Razaviyayn; Zhi-Quan Luo; Jong-Shi Pang
Mathematical Programming | 2017
Mingyi Hong; Xiangfeng Wang; Meisam Razaviyayn; Zhi-Quan Luo
We analyze the convergence rate of the alternating direction method of multipliers (ADMM) for minimizing the sum of two or more nonsmooth convex separable functions subject to linear constraints. Previous analysis of the ADMM typically assumes that the objective function is the sum of only two convex functions defined on two separable blocks of variables, even though the algorithm works well in numerical experiments for three or more blocks. Moreover, there has been no rate of convergence analysis for the ADMM without strong convexity in the objective function. In this paper we establish the global R-linear convergence of the ADMM for minimizing the sum of any number of convex separable functions, assuming that a certain error bound condition holds true and the dual stepsize is sufficiently small. Such an error bound condition is satisfied, for example, when the feasible set is a compact polyhedron and the objective function consists of a smooth strictly convex function composed with a linear mapping, and a nonsmooth ...
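As a concrete instance of the separable-objective-plus-linear-constraints template this abstract analyzes, here is a minimal Python sketch of two-block ADMM applied to LASSO. It is a standard textbook example, not taken from the paper; the penalty rho, the iteration count, and the function name admm_lasso are illustrative choices.

```python
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, n_iters=200):
    """Hypothetical sketch of two-block ADMM (scaled dual form) for
    minimize 0.5 * ||A x - b||^2 + lam * ||z||_1  s.t.  x - z = 0."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); y = np.zeros(n)  # y: scaled dual variable
    # The x-update solves a ridge-type linear system; form it once.
    M = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(n_iters):
        x = np.linalg.solve(M, Atb + rho * (z - y))       # smooth block
        z = np.sign(x + y) * np.maximum(np.abs(x + y) - lam / rho, 0.0)  # prox block
        y = y + x - z                                     # dual update
    return z
```

Each iteration alternates exact minimization over the two separable blocks followed by a dual step; the linear-convergence result above says that, under the stated error bound condition and a small enough dual stepsize, such iterations converge R-linearly even without strong convexity of the overall objective.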