Presenter: Tore Selland Kleppe (University of Stavanger)
Title: Adaptive step sizes for HMC
Abstract: Hamiltonian Monte Carlo (HMC) has become an important tool for handling sampling problems within statistics and machine learning. Practical HMC involves the numerical integration of a set of ordinary differential equations. Conventional HMC methods have usually relied on fixed step size integration using the leapfrog / Verlet integrator, as the errors incurred by the numerical integration can then be exactly corrected using a Metropolis accept/reject step (a minimal sketch of this fixed-step baseline is given after the list below). However, such fixed step sizes may lead to slow mixing of the resulting MCMC chain or excessive computational cost. To address these problems, this talk considers two approaches for including adaptive step sizes in HMC-like computations:
- The first relies on general-purpose adaptive ODE solvers, using the solver's error control mechanism to control (but generally not remove) the biases incurred by the numerical integration.
- The second introduces adaptive step sizes while retaining exactness. This approach is a special case of the recently proposed GIST sampler, and it can be used in conjunction with many conventional HMC methods, and even with locally adaptive HMC methods such as the NUTS sampler currently used in Stan.
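For context, the following is a minimal sketch of the fixed-step leapfrog HMC transition that the abstract contrasts against; it is not the speaker's implementation, and the target log-density logp, its gradient grad_logp, and the tuning parameters (step_size, n_steps) are illustrative placeholders.

import numpy as np

def leapfrog(q, p, grad_logp, step_size, n_steps):
    # Fixed-step leapfrog / Verlet integration of the Hamiltonian dynamics.
    q, p = q.copy(), p.copy()
    p += 0.5 * step_size * grad_logp(q)        # initial half step for momentum
    for _ in range(n_steps - 1):
        q += step_size * p                     # full step for position
        p += step_size * grad_logp(q)          # full step for momentum
    q += step_size * p                         # final full step for position
    p += 0.5 * step_size * grad_logp(q)        # final half step for momentum
    return q, p

def hmc_step(q, logp, grad_logp, step_size, n_steps, rng):
    # One HMC transition: the Metropolis accept/reject step exactly corrects
    # the discretization error of the fixed-step leapfrog integrator.
    p = rng.standard_normal(q.shape)           # resample momentum
    q_new, p_new = leapfrog(q, p, grad_logp, step_size, n_steps)
    h_old = -logp(q) + 0.5 * p @ p             # Hamiltonian = potential + kinetic
    h_new = -logp(q_new) + 0.5 * p_new @ p_new
    if rng.uniform() < np.exp(h_old - h_new):  # accept with probability min(1, exp(-dH))
        return q_new
    return q

With a fixed step_size, every proposal uses the same discretization regardless of the local geometry of the target, which is precisely the limitation the two adaptive approaches above aim to address.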
Bio: Tore Selland Kleppe holds a PhD in statistics from the University of Bergen, Norway, and is currently a Professor of mathematical statistics at the University of Stavanger, Norway. His interests lie within computational statistics, mainly MCMC methods for Bayesian hierarchical models, and the application of such models and methods in applied fields.