## Faster convergence rates of relaxed Peaceman-Rachford and ADMM under regularity assumptions

Damek Davis and Wotao Yin

This technical report is Part 2 of a two-part series. Part 1 gives weaker convergence rates without regularity assumptions.
## Overview

Splitting schemes are a class of powerful algorithms that solve complicated monotone inclusion and convex optimization problems built from many simpler pieces. They give rise to algorithms in which the simple pieces of the decomposition are processed individually. This leads to easily implementable and highly parallelizable algorithms, which often obtain nearly state-of-the-art performance. In this paper, we provide a comprehensive convergence rate analysis of the Douglas-Rachford splitting (DRS), Peaceman-Rachford splitting (PRS), and alternating direction method of multipliers (ADMM) algorithms under various regularity assumptions. Under strong convexity assumptions, we prove convergence rates for a sequence that converges strongly to a minimizer. Under Lipschitz differentiability assumptions, we prove that the best iterate of the relaxed PRS algorithm converges at least as quickly as the last iterate of the forward-backward splitting (FBS) algorithm, without any knowledge of the Lipschitz constant. In addition, if the Lipschitz constant of the derivative is known, we can choose implicit stepsize parameters such that the entire sequence of objective errors converges as quickly as the iterates of the FBS algorithm. Under mixed strong convexity and Lipschitz differentiability assumptions, we show that the relaxed PRS algorithm converges linearly. Finally, we prove that a relaxed PRS iteration applied to the feasibility problem converges linearly whenever the sets satisfy the standard (bounded) linear regularity assumption. One of the main consequences of our results is that
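To make the relaxed PRS iteration concrete, the following is a minimal sketch (not from the paper) on a toy problem of my own choosing: minimizing f(x) + g(x) with f(x) = ½‖x − a‖² and g the indicator of the box [0, 1]ⁿ, both of which have closed-form proximal operators. The relaxation parameter `lam` recovers DRS at 1/2 and full PRS at 1; the function and parameter names are illustrative.

```python
import numpy as np

def prox_f(z, gamma, a):
    # prox of gamma * f for f(x) = 0.5 * ||x - a||^2
    return (z + gamma * a) / (1.0 + gamma)

def prox_g(z):
    # prox of the indicator of [0, 1]^n = projection onto the box
    return np.clip(z, 0.0, 1.0)

def relaxed_prs(a, gamma=1.0, lam=0.5, iters=200):
    # Relaxed PRS: z+ = (1 - lam) z + lam * R_g(R_f(z)),
    # where R_f = 2 prox_f - I is the reflected proximal map.
    # This simplifies to z+ = z + 2 * lam * (y - x) below.
    z = np.zeros_like(a)
    for _ in range(iters):
        x = prox_f(z, gamma, a)   # first prox step
        y = prox_g(2 * x - z)     # prox at the reflected point
        z = z + 2 * lam * (y - x) # relaxed update
    return prox_f(z, gamma, a)

a = np.array([1.5, -0.3, 0.4])
x = relaxed_prs(a)
# For this problem the minimizer is the projection of a onto the box,
# i.e. [1.0, 0.0, 0.4].
```

With `lam=0.5` this is the Douglas-Rachford iteration; since f here is strongly convex with Lipschitz gradient, the mixed-regularity results described above predict linear convergence, which a few hundred iterations comfortably achieve on this example.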