Bayes Reading Group: Yifan Chen [NYU]

3rd Floor Conference Room (162 Fifth Avenue)
Description

Discussion Lead: Yifan Chen [NYU]

Topic: Convergence of Unadjusted Langevin in High Dimensions: Delocalization of Bias

Link: https://arxiv.org/abs/2408.13115

Abstract: The unadjusted Langevin algorithm is commonly used to sample probability distributions in extremely high-dimensional settings. However, existing analyses of the algorithm for strongly log-concave distributions suggest that, as the dimension d of the problem increases, the number of iterations required to ensure convergence within a desired error in the W2 metric scales in proportion to d or √d. In this paper, we argue that, despite this poor scaling of the W2 error for the full set of variables, the behavior for a small number of variables can be significantly better: a number of iterations proportional to K, up to logarithmic terms in d, often suffices for the algorithm to converge to within a desired W2 error for all K-marginals. We refer to this effect as delocalization of bias. We show that the delocalization effect does not hold universally and prove its validity for Gaussian distributions and strongly log-concave distributions with certain sparse interactions. Our analysis relies on a novel W2,l∞ metric to measure convergence. A key technical challenge we address is the lack of a one-step contraction property in this metric. Finally, we use asymptotic arguments to explore potential generalizations of the delocalization effect beyond the Gaussian and sparse interactions setting.
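For readers unfamiliar with the algorithm under discussion, the unadjusted Langevin algorithm iterates x_{k+1} = x_k + h ∇log p(x_k) + √(2h) ξ_k with ξ_k standard Gaussian noise, and its stationary distribution carries an O(h) bias since no Metropolis correction is applied. The following is a minimal illustrative sketch (not taken from the paper), using a standard Gaussian target in d = 100 dimensions, where the exact 1-marginal variance is 1; the step size and iteration counts here are arbitrary choices for demonstration.

```python
import numpy as np

def ula_sample(grad_log_p, x0, step_size, n_steps, rng):
    """Unadjusted Langevin: x_{k+1} = x_k + h * grad log p(x_k) + sqrt(2h) * noise."""
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + step_size * grad_log_p(x) + np.sqrt(2.0 * step_size) * noise
    return x

# Target: standard Gaussian in d dimensions, log p(x) = -||x||^2 / 2, so grad log p(x) = -x.
d = 100
rng = np.random.default_rng(0)
samples = np.stack([
    ula_sample(lambda x: -x, np.zeros(d), step_size=0.1, n_steps=500, rng=rng)
    for _ in range(200)
])

# Each 1-marginal of the target has variance 1; ULA's estimate is close but
# biased (for this linear iteration the stationary variance is 1 / (1 - h/2)).
print(samples[:, 0].var())
```

The bias of each marginal, rather than of the full d-dimensional state, is exactly the quantity whose dimension dependence the paper analyzes.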
