Plug-and-Play Methods Provably Converge with Properly Trained Denoisers

Ernest Ryu, Jialin Liu, Sicheng Wang, Xiaohan Chen, Zhangyang Wang, Wotao Yin

International Conference on Machine Learning (ICML), 2019


Plug-and-play (PnP) is an optimization framework that integrates pre-trained deep networks (or other nonlinear operators) into ADMM and proximal optimization algorithms with provable convergence. It combines the advantages of deep learning and classic optimization.
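To make the idea concrete, here is a minimal numpy sketch of the PnP-ADMM iteration: the proximal step of the data-fidelity term alternates with a denoiser that replaces the regularizer's proximal operator. The quadratic fidelity term, the linear shrinkage "denoiser," and all parameter values below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def pnp_admm(prox_f, denoiser, x0, alpha=1.0, iters=100):
    """Plug-and-play ADMM: the prox of the data-fidelity term f alternates
    with a denoiser H that stands in for the regularizer's prox."""
    x = y = x0.copy()
    u = np.zeros_like(x0)
    for _ in range(iters):
        x = prox_f(y - u, alpha)   # data-fidelity proximal step
        y = denoiser(x + u)        # "plugged-in" denoiser step
        u = u + x - y              # dual (running residual) update
    return y

# Toy instance (hypothetical): f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
b = A @ rng.standard_normal(10)

def prox_f(v, alpha):
    # argmin_x 0.5*||A x - b||^2 + (1/(2*alpha)) * ||x - v||^2
    return np.linalg.solve(A.T @ A + np.eye(10) / alpha, A.T @ b + v / alpha)

denoiser = lambda v: 0.9 * v  # stand-in for a trained network; 0.9-Lipschitz

x_hat = pnp_admm(prox_f, denoiser, np.zeros(10))
```

In a real application the `denoiser` callable would wrap a pre-trained network; the rest of the loop is unchanged, which is what makes PnP "plug-and-play."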

PnP lets one use excellent pre-trained networks for tasks where there is insufficient data for end-to-end training. Although PnP has exhibited strong empirical results, theoretical analysis addressing even the basic question of convergence has been insufficient.

We establish convergence of PnP-FBS and PnP-ADMM with a constant stepsize (rather than using diminishing stepsizes). The nonlinear operator is required to have a certain Lipschitz condition. To meet this condition, we propose real spectral normalization (realSN), a technique for training deep learning-based denoisers to satisfy the proposed Lipschitz condition.
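The core mechanical ingredient of a Lipschitz constraint on a layer can be sketched as follows: estimate the layer's largest singular value by power iteration and rescale the weights so the layer is at most 1-Lipschitz. This dense-matrix version is only a sketch; realSN applies the normalization to the true convolution operator of a deep denoiser rather than to a flattened kernel matrix, and the function name and parameters here are illustrative.

```python
import numpy as np

def spectral_normalize(W, n_power_iters=100, target=1.0):
    """Estimate the spectral norm ||W||_2 by power iteration and rescale W
    so the corresponding linear layer is at most `target`-Lipschitz."""
    u = np.random.default_rng(1).standard_normal(W.shape[0])
    for _ in range(n_power_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v  # estimated largest singular value
    return W * min(1.0, target / sigma)

W = np.random.default_rng(2).standard_normal((64, 128))
W_sn = spectral_normalize(W)
```

Composing layers normalized this way bounds the Lipschitz constant of the whole network, which is the kind of condition the convergence theory requires of the plugged-in denoiser.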

Finally, we present experimental results on magnetic resonance imaging (MRI) to validate the theory. PnP is useful in medical imaging as we do not have a large amount of data for end-to-end training: we train the denoiser on natural images, and then “plug” it into the PnP framework to be applied to medical images.


In the above table, higher PSNR is better. BM3D is a nonlinear denoiser by Dabov et al. (2007). DnCNN is a convolutional neural network for denoising by Zhang et al. (2017). realSN is the proposed training technique for satisfying the Lipschitz condition. simpleCNN is a simple convolutional encoder-decoder network for denoising.

The table shows that PnP methods achieve significantly better MRI reconstruction performance.


E. Ryu, J. Liu, S. Wang, X. Chen, Z. Wang, and W. Yin, Plug-and-play methods provably converge with properly trained denoisers, International Conference on Machine Learning (ICML), Long Beach, CA, 2019.
