Statistical guarantees for neural operator surrogates
With Sven Wang (EPFL)
In recent years, “operator learning” methodologies for constructing data-driven surrogates of non-linear operators have gained widespread attention. We present statistical convergence results for learning such non-linear mappings between infinite-dimensional spaces, e.g. those arising from PDEs, given noisy input-output pairs from an unknown ground-truth operator $G_0$. We provide convergence results for least-squares-type empirical risk minimizers over general hypothesis classes, in terms of their approximation properties and metric entropy bounds. This generalizes classical results from finite-dimensional nonparametric regression to an infinite-dimensional setting.
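In symbols (a sketch of the standard formulation, not necessarily the exact estimator from the talk): given noisy pairs $(X_i, Y_i)_{i=1}^n$ with $Y_i = G_0(X_i) + \varepsilon_i$, where the $Y_i$ take values in an output Hilbert space $H$, a least-squares-type empirical risk minimizer over a hypothesis class $\mathcal{G}$ of operators is

```latex
% Sketch of a standard least-squares ERM over an operator class
% \mathcal{G}; the precise setup in the talk may differ.
\hat{G}_n \in \operatorname*{arg\,min}_{G \in \mathcal{G}}
  \; \frac{1}{n} \sum_{i=1}^{n} \bigl\| Y_i - G(X_i) \bigr\|_{H}^{2}
```

The prediction risk of $\hat{G}_n$ then trades off the approximation error of $\mathcal{G}$ against its metric entropy, mirroring the bias-variance decomposition familiar from finite-dimensional nonparametric regression.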
Assuming $G_0$ to be holomorphic, we prove convergence rates which are algebraic in the sample size $n$, thereby overcoming the curse of dimensionality. To illustrate the wide applicability of these results, we discuss, as a prototypical example, learning the non-linear solution operator of a parametric elliptic partial differential equation with an encoder-decoder based neural operator architecture.
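For readers unfamiliar with the encoder-decoder pattern, the sketch below illustrates it on synthetic data: a PCA-style linear encoder and decoder with a simple ridge regression on coefficients standing in for the neural network. All names, dimensions, and the toy operator are illustrative assumptions, not the architecture analysed in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the talk): functions are
# discretized on m grid points; we encode to d coefficients; n samples.
m, d, n = 128, 8, 500

def make_encoder_decoder(samples, d):
    """PCA-style linear encoder/decoder fitted to sample functions."""
    mean = samples.mean(axis=0)
    _, _, vt = np.linalg.svd(samples - mean, full_matrices=False)
    basis = vt[:d]                            # top-d principal directions
    encode = lambda f: (f - mean) @ basis.T   # function -> R^d coefficients
    decode = lambda c: c @ basis + mean       # coefficients -> function
    return encode, decode

# Synthetic "noisy input-output pairs": a smooth non-linear toy operator
# acting on random smooth inputs, observed with additive noise.
x_grid = np.linspace(0, 1, m)
inputs = np.array([rng.uniform(0.5, 1.5)
                   * np.sin(2 * np.pi * rng.uniform(1, 3) * x_grid)
                   for _ in range(n)])
outputs = np.tanh(inputs) + 0.01 * rng.standard_normal((n, m))

enc_in, _ = make_encoder_decoder(inputs, d)
enc_out, dec_out = make_encoder_decoder(outputs, d)

# Coefficient-space regression: ridge-regularized least squares on
# polynomial features stands in for the MLP of a neural operator.
def features(c):
    return np.hstack([c, c**2, np.ones((len(c), 1))])

A = features(enc_in(inputs))
B = enc_out(outputs)
W = np.linalg.solve(A.T @ A + 1e-6 * np.eye(A.shape[1]), A.T @ B)

def surrogate(f):
    """Encoder -> coefficient map -> decoder: the learned surrogate."""
    c = enc_in(f.reshape(1, -1))
    return dec_out(features(c) @ W)[0]

# Empirical least-squares risk of the fitted surrogate on the data.
risk = np.mean([np.mean((surrogate(f) - y) ** 2)
                for f, y in zip(inputs, outputs)])
print(f"empirical L2 risk: {risk:.2e}")
```

Loosely, the point of the encoder-decoder factorization is that the regression happens in a fixed finite dimension $d$ rather than in the infinite-dimensional ambient spaces, which is one mechanism by which such architectures can achieve algebraic rates in $n$.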
- Speaker: Sven Wang (EPFL)
- Friday 14 November 2025, 14:00–15:00
- Venue: MR12, Centre for Mathematical Sciences.
- Series: Statistics; organiser: Qingyuan Zhao.