## The block-coordinate descent method for nonconvex optimization

Yangyang Xu and Wotao Yin

Published in

## Background

The block-coordinate update (BCU) method is a generalization of the following classic methods:

- alternating minimization (of a function of the form $f(x, y)$),
- alternating projection (to find a point in the intersection of two convex sets $\mathcal{X}$ and $\mathcal{Y}$ by alternately projecting onto $\mathcal{X}$ and $\mathcal{Y}$),
- (block) coordinate minimization (of a function of the form $f(x_1, \dots, x_s)$),
- (block) coordinate descent (of a function of the form $f(x_1, \dots, x_s)$).
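To make the coordinate-minimization idea above concrete, here is a minimal sketch (not the paper's algorithm) that cyclically minimizes a smooth quadratic $f(x) = \tfrac12 x^\top A x - b^\top x$ over one coordinate at a time, holding the others fixed; the function name and the test problem are illustrative.

```python
import numpy as np

def coordinate_minimization(A, b, x0, n_iters=100):
    """Cyclic coordinate minimization of f(x) = 0.5 x'Ax - b'x for SPD A.

    Each inner step solves d f / d x_i = 0 exactly over coordinate i,
    holding the other coordinates fixed.
    """
    x = x0.copy()
    n = len(x)
    for _ in range(n_iters):
        for i in range(n):
            # A[i] @ x - A[i, i] * x[i] collects the other coordinates'
            # contribution to the i-th partial derivative.
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = coordinate_minimization(A, b, np.zeros(2))
# x approaches the batch solution np.linalg.solve(A, b)
```

For strongly convex quadratics this cyclic scheme converges linearly; the nonconvex setting studied in the paper requires the stronger assumptions discussed below.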
BCU solves problems of the form $\min_{x} F(x_1, \dots, x_s) = f(x_1, \dots, x_s) + \sum_{i=1}^{s} r_i(x_i)$ by updating just one or a few blocks of variables at a time, rather than updating all the blocks together (the batch update).
The order of update can be cyclic, randomly shuffled, or greedy. The main advantage is that updating one or just a few blocks of variables is computationally much cheaper than the batch update. On the other hand, convergence requires more stringent conditions and typically takes more iterations. The update applied to each block can be exact minimization over the block, or it can take different inexact forms, such as:

- one or a few gradient descent steps,
- one or a few projected gradient descent steps,
- one or a few (preconditioned) CG steps,
- a prox-linear update,
- and more.
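As one example of an inexact block update, the following sketch applies a prox-linear step to each block of a regularized objective $f(x_1, \dots, x_s) + \sum_i r_i(x_i)$. Here $f$ is a toy quadratic and each $r_i$ is the indicator of the nonnegative orthant, so the prox step reduces to a projected gradient step; the block partition, the step size $1/L_i$, and the test problem are all illustrative assumptions, not the paper's setup.

```python
import numpy as np

def prox_linear_step(grad_i, x_i, L_i):
    """One prox-linear update of block i when r_i is the indicator of
    the nonnegative orthant: a gradient step followed by projection."""
    return np.maximum(x_i - grad_i / L_i, 0.0)

# Toy problem: f(x) = 0.5 * ||x - c||^2, split into two blocks.
c = np.array([1.0, -2.0, 3.0, -4.0])
blocks = [slice(0, 2), slice(2, 4)]
x = np.zeros(4)
for _ in range(50):
    for blk in blocks:
        grad = x[blk] - c[blk]              # block gradient of f
        x[blk] = prox_linear_step(grad, x[blk], L_i=1.0)
# x approaches max(c, 0) = [1, 0, 3, 0]
```

The prox-linear update only linearizes the smooth part $f$ at the current point, so each block subproblem stays cheap even when exact block minimization would be expensive.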
There is a tradeoff between the per-update complexity and the progress of overall minimization.

## Motivation and the Proposed Method

It is challenging to establish the global convergence of BCU on nonconvex problems. To establish global convergence, we assume a Kurdyka–Łojasiewicz (KL) property on the objective function.

## Applications

- Nonnegative matrix/tensor factorization
- Nonnegative matrix/tensor completion (reconstruction from incomplete observations)
- Hyperspectral data analysis
- Sparse dictionary learning
- Blind source separation
- Any multi-convex problems, where the feasible set and objective function are generally non-convex but convex in each block of variables
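For the first application, nonnegative matrix factorization, a hedged sketch of a BCU scheme is shown below: it alternates projected-gradient steps on the two blocks $X$ and $Y$ of $\min_{X \ge 0,\, Y \ge 0} \|M - XY\|_F^2$. The step sizes $1/\|YY^\top\|_2$ and $1/\|X^\top X\|_2$, the problem sizes, and the random data are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.random((20, 15))      # data matrix to factor
r = 5                         # target rank
X = rng.random((20, r))
Y = rng.random((r, 15))

for _ in range(200):
    # Block X: projected gradient step with step size 1 / ||Y Y^T||_2.
    L_x = np.linalg.norm(Y @ Y.T, 2)
    X = np.maximum(X - (X @ Y - M) @ Y.T / L_x, 0.0)
    # Block Y: projected gradient step with step size 1 / ||X^T X||_2.
    L_y = np.linalg.norm(X.T @ X, 2)
    Y = np.maximum(Y - X.T @ (X @ Y - M) / L_y, 0.0)

err = np.linalg.norm(M - X @ Y) / np.linalg.norm(M)
```

Note that the joint problem is nonconvex, but each block subproblem is convex, which is exactly the multi-convex structure listed above.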
## Tested problem sets

- Synthetic nonnegative matrices (factorization / completion)
- Synthetic nonnegative tensors (factorization / completion)
- CBCL and ORL image databases
- Hyperspectral data
## Citation