Constrained and Localized Nonparametric Estimation and Optimization
With John Lafferty (U of Chicago)
We present work on two nonstandard frameworks for minimax analysis.
For the first problem, imagine that I estimate a statistical model
from data, and then want to share my model with you. But we are
communicating over a resource-constrained channel. By sending lots of
bits, I can communicate my model accurately, with little loss in
statistical risk. Sending a small number of bits will incur some
excess risk. What can we say about the tradeoff between statistical
risk and the communication constraints? This is a type of rate-distortion
and constrained minimax problem, for which we provide a
sharp analysis in certain nonparametric settings.
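As a rough sketch of how such a tradeoff can be formalized (illustrative notation only, not necessarily the formulation used in the talk: the budget B, the encoder and the decoder below are assumptions for the sake of the example), one can study a minimax risk over schemes that must be transmitted using at most B bits:

\[
  R_n(B;\mathcal{F})
  \;=\;
  \inf_{(\phi_n,\psi_n)\,:\;\phi_n \text{ uses at most } B \text{ bits}}
  \;\sup_{f \in \mathcal{F}}\;
  \mathbb{E}_f\bigl\| \psi_n\bigl(\phi_n(X_1,\dots,X_n)\bigr) - f \bigr\|^2 ,
\]

where \phi_n encodes the data (or a fitted model) into at most B bits, \psi_n decodes those bits into an estimate of f, and \mathcal{F} is the nonparametric class. As B grows, R_n(B;\mathcal{F}) approaches the unconstrained minimax risk; the excess above that rate is the price of communication.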
The second problem starts with the question “how difficult is it to
minimize a specific convex function?” This is tricky to
formalize: traditional complexity analysis is expressed in terms of
the worst case over a large class of instances. We extend the
classical minimax analysis of stochastic convex optimization by
introducing a localized form of minimax complexity for individual
functions. This is based on a computational version of the modulus of
continuity that is central to statistical minimax analysis, a quantity
that serves as a computational analogue of Fisher information.
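A schematic version of such a localized complexity (again, the notation is illustrative rather than the paper's own) compares a given function f against its hardest single alternative:

\[
  \mathcal{R}_n(f)
  \;=\;
  \sup_{g \in \mathcal{F}}\;
  \inf_{\hat{x}_n}\;
  \max_{h \in \{f,g\}}\;
  \mathbb{E}_h\!\left[ h(\hat{x}_n) - \inf_{x} h(x) \right],
\]

where \hat{x}_n ranges over procedures making n stochastic gradient (or function-value) queries. The associated modulus of continuity measures how far apart the minimizers of two functions can sit while the functions remain hard to distinguish from n noisy queries, and it is this quantity that plays the Fisher-information-like role described above.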
Joint work with Sabyasachi Chatterjee, John Duchi, and Yuancheng Zhu.
- Speaker: John Lafferty (U of Chicago)
- Friday 04 November 2016, 16:00–17:00
- Venue: MR12, Centre for Mathematical Sciences, Wilberforce Road, Cambridge.
- Series: Statistics; organiser: Quentin Berthet.