CS774: Topics

The following topics will tentatively be covered in the course. Please refer to the lectures section for the list of lecture topics.

  • Background: convex analysis, linear and matrix algebra, probability theory

  • Preliminaries: applications, optimality and duality conditions

  • First Order Methods

    • Subgradient methods (see the sketch at the end of this list)

    • Proximal methods

    • Multiplicative weights update, mirror descent

  • Second Order Methods

    • Newton's method (see the sketch at the end of this list)

    • Quasi-Newton methods, L-BFGS

  • Stochastic Optimization Problems

    • Notion of regret, online-to-batch conversion

    • Methods offering vanishing regret: online gradient descent (OGD), exponentiated gradient (EG), and online mirror descent (OMD); see the sketch at the end of this list

  • Non-convex Optimization Problems

    • Applications: sparse recovery, affine rank minimization, low-rank matrix completion

    • Convex approaches: relaxation-based methods

    • Non-convex approaches: projected gradient descent, alternating minimization (see the sketch at the end of this list)

  • Special topics (a subset will be chosen depending on interest and available time)

    • Accelerated first order methods

    • Bayesian methods

    • Coordinate methods

    • Cutting plane methods

    • Interior point methods

    • Optimization methods for deep learning

    • Parallel and distributed methods

    • Robust optimization problems and methods

    • Stochastic mini-batch methods

    • Submodular optimization problems and methods

    • Variance reduced stochastic methods

    • Zeroth order methods
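
As a quick illustration of the first-order methods listed above, here is a minimal Python sketch of the subgradient method. It assumes a toy one-dimensional non-differentiable objective f(x) = |x - 3| + 0.5 |x + 1| and the classic diminishing step size 1/sqrt(t + 1); the objective and all values are made up for illustration and are not course material.

    # Minimal sketch (assumed toy example): subgradient descent on
    # f(x) = |x - 3| + 0.5 * |x + 1| with a diminishing step size.
    import numpy as np

    def f(x):
        return abs(x - 3.0) + 0.5 * abs(x + 1.0)

    def subgradient(x):
        # np.sign returns 0 at a kink, which is a valid subgradient there
        return np.sign(x - 3.0) + 0.5 * np.sign(x + 1.0)

    x = best_x = -10.0
    for t in range(2000):
        eta = 1.0 / np.sqrt(t + 1.0)   # diminishing step size
        x = x - eta * subgradient(x)
        if f(x) < f(best_x):           # subgradient steps are not monotone,
            best_x = x                 # so track the best iterate seen so far

    print(f"approximate minimizer: {best_x:.3f}, value: {f(best_x):.3f}")

The minimizer of this toy objective is x = 3; the point of the sketch is only the subgradient-step-plus-diminishing-step-size template.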
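Similarly, a minimal sketch of Newton's method from the second-order methods topic, assuming a small synthetic L2-regularized logistic regression problem (the data, dimensions, and regularization value below are all made up for illustration):

    # Minimal sketch (assumed synthetic data): Newton's method for
    # L2-regularized logistic regression; the key steps are the gradient,
    # the Hessian, and the Newton direction obtained by a linear solve.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, lam = 200, 5, 1e-2
    A = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-A @ w_true))).astype(float)

    def gradient(w):
        p = 1.0 / (1.0 + np.exp(-A @ w))      # predicted probabilities
        return A.T @ (p - y) / n + lam * w

    def hessian(w):
        p = 1.0 / (1.0 + np.exp(-A @ w))
        D = p * (1.0 - p)                     # per-example curvature weights
        return (A * D[:, None]).T @ A / n + lam * np.eye(d)

    w = np.zeros(d)
    for it in range(20):
        step = np.linalg.solve(hessian(w), gradient(w))  # Newton direction
        w = w - step
        if np.linalg.norm(step) < 1e-10:      # near-quadratic local convergence
            break

    print("iterations:", it + 1, "gradient norm:", np.linalg.norm(gradient(w)))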
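For the regret-based view in the stochastic optimization topic, a minimal sketch of online gradient descent (OGD), assuming a made-up stream of quadratic losses f_t(x) = 0.5 (x - z_t)^2 over the interval [-1, 1]; the final lines check empirically that the average regret against the best fixed point in hindsight is small:

    # Minimal sketch (assumed synthetic stream): online gradient descent on
    # losses f_t(x) = 0.5 * (x - z_t)^2 with projection onto [-1, 1], plus an
    # empirical check that the average regret vanishes as T grows.
    import numpy as np

    rng = np.random.default_rng(1)
    T = 5000
    z = np.clip(rng.normal(loc=0.3, scale=0.5, size=T), -1.0, 1.0)

    x, losses = 0.0, []
    for t in range(T):
        losses.append(0.5 * (x - z[t]) ** 2)   # play x_t, then f_t is revealed
        grad = x - z[t]                        # gradient of f_t at x_t
        x = np.clip(x - grad / np.sqrt(t + 1.0), -1.0, 1.0)  # OGD step + projection

    u = z.mean()                               # best fixed point in hindsight
    regret = np.sum(losses) - np.sum(0.5 * (u - z) ** 2)
    print("average regret:", regret / T)       # should be close to zero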
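Finally, for the non-convex approaches topic, a minimal sketch of projected gradient descent for sparse recovery, where the projection is hard thresholding onto the (non-convex) set of s-sparse vectors. The measurement matrix, sparsity level, step size, and noise level below are all assumed purely for illustration:

    # Minimal sketch (assumed synthetic data): projected gradient descent with
    # a hard-thresholding projection for sparse recovery from y = A x* + noise.
    import numpy as np

    rng = np.random.default_rng(2)
    n, d, s = 100, 400, 5                      # measurements, dimension, sparsity
    A = rng.normal(size=(n, d)) / np.sqrt(n)   # random Gaussian design
    x_true = np.zeros(d)
    x_true[rng.choice(d, size=s, replace=False)] = rng.normal(size=s)
    y = A @ x_true + 0.01 * rng.normal(size=n)

    def project_sparse(v, k):
        # keep the k largest-magnitude entries, zero out the rest
        out = np.zeros_like(v)
        idx = np.argsort(np.abs(v))[-k:]
        out[idx] = v[idx]
        return out

    x = np.zeros(d)
    eta = 1.0 / np.linalg.norm(A, 2) ** 2      # conservative step size
    for _ in range(500):
        grad = A.T @ (A @ x - y)               # gradient of 0.5 * ||A x - y||^2
        x = project_sparse(x - eta * grad, s)  # gradient step, then hard threshold

    print("relative recovery error:",
          np.linalg.norm(x - x_true) / np.linalg.norm(x_true))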