The role of Random Matrix Theory for the optimization algorithms of Machine Learning

With Gérard Ben Arous (Courant)

I will survey recent progress in our understanding of optimization dynamics for important tasks in Machine Learning and high-dimensional statistics. We will see how these very high-dimensional dynamics are in fact governed by the so-called "effective dynamics" of much lower-dimensional systems. This dynamical dimension reduction is related to the BBP spectral transition of Random Matrix Theory, which appears dynamically along the algorithm's path. I will illustrate these phenomena in multi-spike Tensor PCA, XOR, and the classification of Gaussian mixtures with multi-layer neural networks.

This talk is based on joint work with Reza Gheissari (Northwestern), Jiaoyang Huang (Wharton), and Aukosh Jagannath (Waterloo), and on joint work with Cedric Gerbelot (ENS Lyon) and Vanessa Piccolo (EPFL).

A wine reception in the Central Core will follow the lecture.
