## On the global and linear convergence of the generalized alternating direction method of multipliers (ADMM)

W. Deng and W. Yin

Published in Journal of Scientific Computing

## Overview

A large number of optimization problems can be reformulated into the form

$$\min_{x,\,y}\; f(x) + g(y) \quad \text{subject to} \quad Ax + By = b,$$

where $f$ and $g$ are extended-value convex functions. On problems of this form, a very effective approach is the alternating direction method of multipliers (ADM or ADMM), which solves a sequence of subproblems involving $f$ and $g$ one at a time. However, its effectiveness has not been matched by a provably fast rate of convergence; only sublinear rates such as $O(1/k)$ and $O(1/\sqrt{k})$ were established in the literature. In many common problems, one of the two objective functions is strictly convex and has Lipschitz continuous gradient. This paper shows that, under these assumptions along with certain rank conditions on $A$ and $B$, ADM converges globally at a linear rate. The above results also apply to generalized ADMs, which
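For reference, the classical ADM iteration on this problem can be written as follows; this is the standard form with penalty parameter $\beta > 0$ and multiplier $\lambda$, stated here as context rather than quoted from the paper:

```latex
\begin{aligned}
x^{k+1} &= \operatorname*{arg\,min}_{x}\; f(x)
  + \frac{\beta}{2}\Bigl\| Ax + By^{k} - b - \tfrac{\lambda^{k}}{\beta} \Bigr\|_2^2,\\
y^{k+1} &= \operatorname*{arg\,min}_{y}\; g(y)
  + \frac{\beta}{2}\Bigl\| Ax^{k+1} + By - b - \tfrac{\lambda^{k}}{\beta} \Bigr\|_2^2,\\
\lambda^{k+1} &= \lambda^{k} - \beta\,\bigl(Ax^{k+1} + By^{k+1} - b\bigr).
\end{aligned}
```

Each subproblem touches only one of the two objective functions, which is what makes the method attractive when $f$ and $g$ are individually simple.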
allow their subproblems to be solved inexactly in one of the following ways:

- **prox-linear:** good for $f$ or $g$ with an easy proximal operator, and avoids inverting $A^\top A$ or $B^\top B$
- **gradient descent:** use $\nabla f$ or $\nabla g$ instead of their proximal operators
- **fast approximate $A$ or $B$:** invert $\hat{A}$, where $\hat{A} \approx A$, instead of $A$; for example, invert a standard discrete Fourier operator instead of one defined on a nonstandard grid
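As a concrete illustration of the setting above, here is a minimal NumPy sketch of ADMM on the toy problem $\min_x \|x\|_1 + \tfrac{1}{2}\|y - c\|_2^2$ subject to $x - y = 0$ (so $A = I$, $B = -I$, $b = 0$). The quadratic term is strongly convex with Lipschitz gradient, so it falls under the paper's linear-convergence assumptions. The function names and the choice of test problem are mine, not the paper's:

```python
import numpy as np

def soft(v, t):
    # Soft-thresholding: the proximal operator of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1_quadratic(c, beta=1.0, iters=100):
    """ADMM (scaled-dual form) for min ||x||_1 + 0.5*||y - c||^2  s.t. x = y.

    Both subproblems have closed forms, so each iteration is cheap.
    """
    x = np.zeros_like(c)
    y = np.zeros_like(c)
    u = np.zeros_like(c)  # scaled dual variable (u = -lambda / beta)
    for _ in range(iters):
        x = soft(y - u, 1.0 / beta)              # x-subproblem: l1 prox
        y = (c + beta * (x + u)) / (1.0 + beta)  # y-subproblem: quadratic
        u = u + x - y                            # dual update
    return x

c = np.array([3.0, -0.5, 1.2, -2.0])
x_star = soft(c, 1.0)            # closed-form solution of the combined problem
x_admm = admm_l1_quadratic(c)
print(np.allclose(x_admm, x_star, atol=1e-6))
```

With `beta = 1`, the iterates approach the soft-thresholded solution geometrically, which is consistent with the linear rate the paper establishes for this class of problems.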
## Citation

W. Deng and W. Yin, "On the global and linear convergence of the generalized alternating direction method of multipliers," Journal of Scientific Computing, 66(3), 889–916, 2016.