CCM Colloquium: Matt Hoffman (Google)

3rd Floor Classroom/3-Flatiron Institute (162 5th Avenue)
Description

MEADS: Tuning-Free Generalized Hamiltonian Monte Carlo

Abstract:
Hamiltonian Monte Carlo (HMC) has become a go-to family of Markov chain Monte Carlo (MCMC) algorithms for Bayesian inference problems, in part because we have good procedures for automatically tuning its parameters. Much less attention has been paid to automatic tuning of generalized HMC (GHMC), in which the auxiliary momentum vector is partially updated frequently instead of being completely resampled infrequently. Since GHMC spreads progress over many iterations, it is not straightforward to tune GHMC based on quantities typically used to tune HMC such as average acceptance rate and squared jumped distance. In this work, we propose an ensemble-chain adaptation (ECA) algorithm for GHMC that automatically selects values for all of GHMC's tunable parameters each iteration based on statistics collected from a population of many chains. This algorithm is designed to make good use of SIMD hardware accelerators such as GPUs, allowing most chains to be updated in parallel each iteration. Unlike typical adaptive-MCMC algorithms, our ECA algorithm does not perturb the chain's stationary distribution, and therefore does not need to be “frozen” after warm up. Empirically, we find that the proposed algorithm quickly converges to its stationary distribution, producing accurate estimates of posterior expectations with relatively few gradient evaluations per chain.
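
For context, below is a minimal JAX sketch of the generalized-HMC update the abstract refers to: the momentum is partially refreshed with a damping factor rather than fully resampled, and many chains are advanced in parallel with jax.vmap, as on a GPU. This is an illustration only, not the MEADS algorithm itself: the damping factor alpha, step size eps, and the standard-normal target are placeholder assumptions, whereas MEADS selects such parameters adaptively each iteration from statistics of the chain ensemble.

import jax
import jax.numpy as jnp

def log_prob(x):
    # Illustrative target (assumption): a standard normal distribution.
    return -0.5 * jnp.sum(x ** 2)

def ghmc_step(x, p, key, eps=0.1, alpha=0.9):
    # eps and alpha are fixed placeholders here; MEADS tunes them adaptively.
    key_noise, key_accept = jax.random.split(key)
    # Partially refresh the momentum instead of fully resampling it.
    p = alpha * p + jnp.sqrt(1.0 - alpha ** 2) * jax.random.normal(key_noise, x.shape)
    # One leapfrog step.
    grad = jax.grad(log_prob)
    p_half = p + 0.5 * eps * grad(x)
    x_new = x + eps * p_half
    p_new = p_half + 0.5 * eps * grad(x_new)
    # Metropolis correction; flip the momentum on rejection so the kernel
    # remains reversible while progress is spread over many iterations.
    log_accept = (log_prob(x_new) - 0.5 * jnp.sum(p_new ** 2)
                  - log_prob(x) + 0.5 * jnp.sum(p ** 2))
    accept = jnp.log(jax.random.uniform(key_accept)) < log_accept
    x = jnp.where(accept, x_new, x)
    p = jnp.where(accept, p_new, -p)
    return x, p

# A population of chains can be updated in parallel, e.g. on a GPU.
n_chains, dim = 64, 10
keys = jax.random.split(jax.random.PRNGKey(0), n_chains)
xs = jax.random.normal(jax.random.PRNGKey(1), (n_chains, dim))
ps = jnp.zeros((n_chains, dim))
xs, ps = jax.vmap(ghmc_step)(xs, ps, keys)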

If you would like to attend, please email crampersad@flatironinstitute.org for the Zoom details.
