Computer-aided worst-case analyses and design of first-order methods for convex optimization

With Adrien Taylor (INRIA Paris)


In this presentation, we will provide a high-level overview of recent approaches for analyzing and designing first-order methods using symbolic computations and/or semidefinite programming. A particular emphasis will be given to the "performance estimation" approach, which enjoys comfortable tightness guarantees: the approach fails only when the target results are impossible to prove. In particular, it yields tight worst-case guarantees for fixed-step first-order methods involving a variety of oracles (including explicit, projected, proximal, conditional, mirror, inexact, or stochastic (sub)gradient steps) and a variety of convergence measures.

The presentation will be example-based, as the main ingredients necessary for understanding the methodologies are already present in the analysis of the vanilla gradient method. To further convince the audience, we will provide other examples, including analyses of the Douglas-Rachford splitting and of a variant of the celebrated conjugate gradient method.

The methodology is implemented within the package “PESTO” (for “Performance EStimation TOolbox”, available at https://github.com/AdrienTaylor/Performance-Estimation-Toolbox), which allows using the framework without any tedious semidefinite programming modelling step.

This talk is based on joint works with great collaborators (who will be mentioned during the presentation).
