Global Convergence of ADMM in Nonconvex Nonsmooth Optimization

Yu Wang, Wotao Yin, and Jinshan Zeng

Published in Journal of Scientific Computing


In this paper, we analyze the convergence of the alternating direction method of multipliers (ADMM) for minimizing a nonconvex and possibly nonsmooth objective function, φ(x_0, x_1, …, x_p, y), subject to coupled linear equality constraints. Our ADMM sequentially updates the primal variables x_0, x_1, …, x_p, y, followed by updating the dual variable. While x_0 is updated first and y is updated last (before the update to the dual variable) in each iteration, the order of x_1, …, x_p can be arbitrary and may vary from one iteration to another. We single out the first and last primal variables because their blocks receive special treatment in our analysis.
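The update pattern described above can be sketched on a minimal two-block instance with closed-form subproblems. The toy problem, function name, and parameter values below are illustrative assumptions, not taken from the paper:

```python
# Illustrative sketch (not from the paper): two-block ADMM with
# closed-form updates for the toy consensus problem
#   minimize 0.5*(x - a)^2 + 0.5*(y - b)^2  subject to  x - y = 0,
# whose solution is x = y = (a + b) / 2.

def admm_toy(a, b, beta=1.0, iters=200):
    """Run ADMM with penalty parameter beta on the toy problem above."""
    x = y = w = 0.0  # primal blocks x, y and dual variable (multiplier) w
    for _ in range(iters):
        # x-update: minimize the augmented Lagrangian over x (y, w fixed)
        x = (a - w + beta * y) / (1.0 + beta)
        # y-update: minimize the augmented Lagrangian over y (x, w fixed)
        y = (b + w + beta * x) / (1.0 + beta)
        # dual update: ascent step on the constraint residual x - y
        w = w + beta * (x - y)
    return x, y, w

x, y, _ = admm_toy(a=1.0, b=3.0)
print(round(x, 4), round(y, 4))  # both approach (a + b) / 2 = 2.0
```

Each primal block is updated by minimizing the augmented Lagrangian with the other blocks held fixed, which is what makes the per-iteration subproblems cheap compared to the joint minimization required by ALM.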

The developed convergence guarantee covers a variety of nonconvex functions such as

  • piecewise linear functions

  • ℓ_q quasi-norm, Schatten-q quasi-norm (0 < q < 1), and SCAD

  • constraints on compact smooth manifolds (e.g., spherical, Stiefel, and Grassmann manifolds)

  • complementarity constraints

By applying our analysis, we show, for the first time, that several ADMM algorithms applied to solve nonconvex models in statistical learning, optimization on manifolds, and matrix decomposition are guaranteed to converge.

Our results provide sufficient conditions for ADMM to converge on (convex or nonconvex) monotropic programs with three or more blocks, as they are special cases of our model.

Comparison with the augmented Lagrangian method (ALM, also known as the method of multipliers):

ADMM has been regarded as a variant of ALM. We present a simple example illustrating how ADMM converges while ALM diverges when the penalty parameter β is bounded (and ALM still diverges with a properly chosen unbounded β). As this example and the other analysis in this paper indicate, ADMM can be a better choice than ALM for some nonconvex nonsmooth problems: ADMM is not only easier to implement, but also more likely to converge in the concerned scenarios.


Y. Wang, W. Yin, and J. Zeng, Global Convergence of ADMM in Nonconvex Nonsmooth Optimization, Journal of Scientific Computing 78(1), 29-63, 2019.
