Entropy contraction of the Gibbs sampler under log-concavity

With Giacomo Zanella (Bocconi University)

In this talk I will present recent work (https://arxiv.org/abs/2410.00858) on the non-asymptotic analysis of the Gibbs sampler, which is a canonical and popular Markov chain Monte Carlo algorithm for sampling. In particular, under the assumption that the probability measure π of interest is strongly log-concave, we show that the random scan Gibbs sampler contracts in relative entropy and provide a sharp characterization of the associated contraction rate. The result implies that, under appropriate conditions, the number of full evaluations of π required for the Gibbs sampler to converge is independent of the dimension. If time permits, I will also discuss connections and applications of the above results to the problem of zero-order parallel sampling.
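Here, entropy contraction means that a single application of the Gibbs kernel P satisfies KL(mu P || pi) <= (1 - rho) KL(mu || pi) for every initial law mu, with rho the contraction rate. For concreteness, the following is a minimal sketch (not taken from the paper) of the random scan Gibbs sampler on a toy strongly log-concave target: a zero-mean Gaussian with an illustrative precision matrix Lambda, for which the coordinate-wise conditionals are available in closed form.

    import numpy as np

    def random_scan_gibbs(precision, x0, n_steps, rng):
        """Random scan Gibbs sampler for a zero-mean Gaussian target with the
        given precision matrix (a simple strongly log-concave example).
        Each step resamples one uniformly chosen coordinate from its exact
        conditional distribution under the target."""
        x = np.array(x0, dtype=float)
        d = len(x)
        samples = np.empty((n_steps, d))
        for t in range(n_steps):
            i = rng.integers(d)                      # pick a coordinate uniformly at random
            others = np.delete(np.arange(d), i)
            cond_var = 1.0 / precision[i, i]         # conditional variance of x_i given the rest
            cond_mean = -cond_var * precision[i, others] @ x[others]
            x[i] = rng.normal(cond_mean, np.sqrt(cond_var))
            samples[t] = x
        return samples

    rng = np.random.default_rng(0)
    Lambda = np.array([[2.0, 0.8], [0.8, 2.0]])      # illustrative precision of the target N(0, Lambda^{-1})
    draws = random_scan_gibbs(Lambda, x0=[3.0, -3.0], n_steps=5000, rng=rng)

In this Gaussian toy example each coordinate update is exact and cheap; the results discussed in the talk concern how many such coordinate updates (equivalently, full evaluations of pi) are needed for general strongly log-concave targets.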

Based on joint work with Filippo Ascolani and Hugo Lavenant.
