Sparse Optimization, July 2013

Instructor: Wotao YIN
Teaching assistants: Wei SHI and Kun YUAN
Time: 9:00 am - 12:00 noon (starting at 9:45 am on July 3)
Dates: July 3/4/5, 10/11/12, 17/18/19, and 24 (possibly extending to July 25/26)

Lectures (content and order are subject to change)

  1. Review of convex optimization

  2. Sparse optimization: basic formulations, conic programming approaches, applications

  3. Sparse recovery: universal conditions

  4. Sparse recovery: dual certificate and a non-universal condition

  5. First-order optimization methods:

    1. subgradient, gradient, proximal, and dual (a minimal proximal-gradient sketch follows this outline)

    2. operator splitting (forward-backward, Peaceman-Rachford, Douglas-Rachford), prox-linear, ADMM

    3. dual methods, part 1: dual (sub)gradient, linearized Bregman, dual smoothing, augmented Lagrangian, Bregman iteration, residual addback

    4. dual methods, part 2: ADMM, variants of ADMM, distributed ADMM, and applications (an ADMM sketch also follows this outline)

    5. parallel and distributed sparse optimization

    6. block-coordinate update

    7. greedy, homotopy

  6. First-order optimization convergence: sublinear, accelerated, and linear rates, extrapolation

  7. Non-convex optimization
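
To give a concrete taste of the proximal-gradient topic in lecture 5.1, below is a minimal sketch of ISTA (iterative soft-thresholding) applied to the LASSO problem, minimize (1/2)||Ax - b||_2^2 + lam*||x||_1. This is an illustration only, not course material; the problem sizes, lam, and iteration count are made-up choices.

    import numpy as np

    def soft_threshold(v, t):
        # Proximal operator of t*||.||_1 (soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def ista(A, b, lam, num_iters=500):
        # Step size 1/L with L = ||A||_2^2, the Lipschitz constant of the
        # gradient of the smooth term (1/2)||Ax - b||^2.
        L = np.linalg.norm(A, 2) ** 2
        x = np.zeros(A.shape[1])
        for _ in range(num_iters):
            grad = A.T @ (A @ x - b)                    # gradient step
            x = soft_threshold(x - grad / L, lam / L)   # proximal step
        return x

    # Illustrative use: recover a sparse vector from random measurements.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[[3, 27, 64]] = [1.5, -2.0, 0.8]
    x_hat = ista(A, A @ x_true, lam=0.1)
    print("largest entries at indices:", np.argsort(-np.abs(x_hat))[:3])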

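Likewise, for the ADMM topics in lectures 5.2 and 5.4, here is a minimal sketch of ADMM on the same LASSO problem under the splitting minimize (1/2)||Ax - b||_2^2 + lam*||z||_1 subject to x - z = 0. Again, rho, lam, and the synthetic data are illustrative assumptions, not prescriptions from the course.

    import numpy as np

    def soft_threshold(v, t):
        return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

    def admm_lasso(A, b, lam, rho=1.0, num_iters=200):
        n = A.shape[1]
        AtA = A.T @ A + rho * np.eye(n)   # factor once; reused every x-update
        Atb = A.T @ b
        x = np.zeros(n)
        z = np.zeros(n)
        u = np.zeros(n)                   # scaled dual variable
        for _ in range(num_iters):
            x = np.linalg.solve(AtA, Atb + rho * (z - u))  # x-update (ridge solve)
            z = soft_threshold(x + u, lam / rho)           # z-update (shrinkage)
            u = u + x - z                                  # dual update
        return z

    # Illustrative use with synthetic data, as above.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[[5, 50, 90]] = [2.0, -1.0, 1.5]
    z_hat = admm_lasso(A, A @ x_true, lam=0.1)
    print("largest entries at indices:", np.argsort(-np.abs(z_hat))[:3])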
