TY - JOUR
T1 - A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
AU - Hong, Mingyi
AU - Chang, Tsung-Hui
AU - Wang, Xiangfeng
AU - Razaviyayn, Meisam
AU - Ma, Shiqian
AU - Luo, Zhi-Quan
N1 - Publisher Copyright:
© 2020 INFORMS Inst. for Operations Res. and the Management Sciences. All rights reserved.
PY - 2020/8
Y1 - 2020/8
N2 - Consider the problem of minimizing the sum of a smooth convex function and a separable nonsmooth convex function subject to linear coupling constraints. Problems of this form arise in many contemporary applications, including signal processing, wireless networking, and smart grid provisioning. Motivated by the huge size of these applications, we propose a new class of first-order primal-dual algorithms called the block successive upper-bound minimization method of multipliers (BSUM-M) to solve this family of problems. The BSUM-M updates the primal variable blocks successively by minimizing locally tight upper bounds of the augmented Lagrangian of the original problem, followed by a gradient-type update for the dual variable in closed form. We show that under certain regularity conditions, and when the primal block variables are updated in either a deterministic or a random fashion, the BSUM-M converges to a point in the set of optimal solutions. Moreover, in the absence of linear constraints and under similar conditions as in the previous result, we show that the randomized BSUM-M (which reduces to the randomized block successive upper-bound minimization method) converges at an asymptotically linear rate without relying on strong convexity.
AB - Consider the problem of minimizing the sum of a smooth convex function and a separable nonsmooth convex function subject to linear coupling constraints. Problems of this form arise in many contemporary applications, including signal processing, wireless networking, and smart grid provisioning. Motivated by the huge size of these applications, we propose a new class of first-order primal-dual algorithms called the block successive upper-bound minimization method of multipliers (BSUM-M) to solve this family of problems. The BSUM-M updates the primal variable blocks successively by minimizing locally tight upper bounds of the augmented Lagrangian of the original problem, followed by a gradient-type update for the dual variable in closed form. We show that under certain regularity conditions, and when the primal block variables are updated in either a deterministic or a random fashion, the BSUM-M converges to a point in the set of optimal solutions. Moreover, in the absence of linear constraints and under similar conditions as in the previous result, we show that the randomized BSUM-M (which reduces to the randomized block successive upper-bound minimization method) converges at an asymptotically linear rate without relying on strong convexity.
KW - alternating direction method of multipliers
KW - block successive upper-bound minimization
KW - randomized block coordinate descent
UR - https://www.scopus.com/pages/publications/85093860907
U2 - 10.1287/moor.2019.1010
DO - 10.1287/moor.2019.1010
M3 - Article
AN - SCOPUS:85093860907
SN - 0364-765X
VL - 45
SP - 833
EP - 861
JO - Mathematics of Operations Research
JF - Mathematics of Operations Research
IS - 3
ER -