First-order methods for large scale optimisation problems – Additional large-scale algorithms

Part of the CCIMI short course ‘First-order methods for large scale optimisation problems’

Instructor: Stephen Becker, University of Colorado

Lecture 4: Additional large-scale algorithms
A variety of useful methods are discussed, with comments but without detailed analysis. Algorithms may include: the simplex method for LPs; classical algorithms for unconstrained problems (non-linear conjugate gradient, quasi-Newton, matrix-free Newton, Levenberg–Marquardt and Gauss–Newton, and active-set approaches); classical algorithms for constrained problems (penalty methods, augmented Lagrangian, ADMM and Douglas–Rachford, coordinate descent and alternating minimisation, interior-point methods, sequential quadratic programming, and Frank–Wolfe/conditional gradient); and possibly primal-dual methods and mirror descent. There will be an interlude on non-convex optimisation and some non-convex algorithms (e.g., cubic regularisation).
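As a small illustration of the flavour of method covered, below is a minimal sketch of the Frank–Wolfe (conditional gradient) iteration for a smooth objective over an l1 ball. The least-squares objective, random data, and classical 2/(k+2) step-size rule are illustrative assumptions only, not part of the course materials.

```python
# Minimal Frank-Wolfe (conditional gradient) sketch on the l1 ball.
# Illustrative only: the objective, data, and step-size rule are assumptions.
import numpy as np

def frank_wolfe_l1(grad, x0, radius=1.0, iters=200):
    """Minimise a smooth f over {x : ||x||_1 <= radius}, given its gradient."""
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        # Linear minimisation oracle over the l1 ball: a signed, scaled basis vector.
        i = np.argmax(np.abs(g))
        s = np.zeros_like(x)
        s[i] = -radius * np.sign(g[i])
        gamma = 2.0 / (k + 2.0)          # classical step-size rule
        x = (1 - gamma) * x + gamma * s  # convex combination keeps x feasible
    return x

# Example: least squares 0.5*||Ax - b||^2 with random data (illustrative).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 10)), rng.standard_normal(20)
x_hat = frank_wolfe_l1(lambda x: A.T @ (A @ x - b), np.zeros(10), radius=1.0)
print(x_hat)
```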
