Discussion Leads: Jakob Robnik and Reuben Cohn-Gordon (UC Berkeley)
Topic: Tuning-free unadjusted gradient-based MCMC
Abstract: Hamiltonian and Langevin Monte Carlo (HMC and LMC) and their microcanonical counterparts (MCHMC and MCLMC) are popular algorithms for sampling in high dimensions. Their numerical discretization errors are typically corrected by the Metropolis-Hastings (MH) accept/reject step. However, as the dimensionality of the problem increases, the stepsize (and therefore the efficiency) must decrease as d^{-1/4} for second-order integrators in order to maintain a reasonable acceptance rate. Unadjusted methods, on the other hand, do not suffer from this scaling, but the difficulty of controlling the asymptotic bias has hindered their widespread adoption. For Gaussian targets, we show that the asymptotic bias can be bounded by the energy error in the integration, independently of the dimensionality and of the parameters of the Gaussian. We numerically extend the analysis to non-Gaussian benchmark problems and demonstrate that most of them obey the same bias bound as the Gaussian targets. Controlling the energy error, which is easy to do, therefore ensures control over the asymptotic bias. We propose an efficient algorithm for tuning the stepsize to a desired asymptotic bias, which enables the use of unadjusted methods in a tuning-free way, and then show some schemes that use unadjusted samplers.
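To make the tuning idea concrete, below is a minimal, hypothetical sketch (not the speakers' algorithm): an unadjusted leapfrog sampler on a standard Gaussian whose stepsize is adapted until the observed energy-error variance per dimension matches a user-chosen target that stands in for the desired asymptotic bias. The target value, the quarter-power update rule, and all function names are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch: adapt the stepsize of an unadjusted leapfrog sampler so
# that the energy-error variance per dimension matches a chosen target, used
# here as a stand-in for the "desired asymptotic bias" knob from the abstract.

def neg_logp(x):
    """-log p(x) for a standard Gaussian target (illustrative choice)."""
    return 0.5 * np.dot(x, x)

def grad_neg_logp(x):
    """Gradient of -log p(x) for the standard Gaussian."""
    return x

def leapfrog(x, p, stepsize, n_steps):
    """One unadjusted leapfrog trajectory; returns new position and energy error."""
    h0 = neg_logp(x) + 0.5 * np.dot(p, p)       # initial energy
    p = p - 0.5 * stepsize * grad_neg_logp(x)   # half kick
    for _ in range(n_steps - 1):
        x = x + stepsize * p
        p = p - stepsize * grad_neg_logp(x)
    x = x + stepsize * p
    p = p - 0.5 * stepsize * grad_neg_logp(x)   # final half kick
    h1 = neg_logp(x) + 0.5 * np.dot(p, p)
    return x, h1 - h0                            # no MH accept/reject: unadjusted

def tune_stepsize(x0, target_eevpd=1e-3, n_windows=30, window=50,
                  stepsize=0.5, n_steps=10, seed=0):
    """Adapt stepsize until energy-error variance per dimension ~ target_eevpd."""
    rng = np.random.default_rng(seed)
    d = x0.size
    x = x0.copy()
    for _ in range(n_windows):
        errs = []
        for _ in range(window):
            p = rng.standard_normal(d)           # resample momentum each trajectory
            x, de = leapfrog(x, p, stepsize, n_steps)
            errs.append(de)
        eevpd = np.var(errs) / d                 # energy-error variance per dimension
        # Heuristic: for a 2nd-order integrator the energy error scales roughly
        # like stepsize^2, so its variance scales roughly like stepsize^4.
        factor = (target_eevpd / eevpd) ** 0.25
        stepsize *= float(np.clip(factor, 0.5, 2.0))  # damp large jumps per window
    return stepsize

if __name__ == "__main__":
    d = 100
    eps = tune_stepsize(np.zeros(d))
    print(f"tuned stepsize: {eps:.3f}")
```

In practice the target energy-error level would be chosen from the tolerable asymptotic bias, as discussed in the talk; the sketch only illustrates the feedback loop of measuring energy error and adjusting the stepsize accordingly.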