Description
Chair: Shirley Ho
Deep generative models parametrize highly flexible families of distributions that can fit complicated datasets of images or text. Once trained, these models provide independent samples from complex high-dimensional distributions at negligible cost. On the other hand, sampling exactly from a target distribution, such as a Bayesian posterior or the Boltzmann distribution of a physical system, is typically challenging: because of dimensionality, multi-modality, ill-conditioning, or a combination of these. In this talk, I will discuss recent works proposing to enhance traditional inference and sampling algorithms with learning. In particular, I will present flowMC, an adaptive MCMC sampler that uses normalizing flows.
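To give a flavor of the core idea, below is a minimal Python sketch of the global step in flow-augmented MCMC: a trained normalizing flow serves as an independence proposal inside a Metropolis-Hastings accept/reject, so the chain still targets the exact distribution. Here flow_sample and flow_log_prob are hypothetical stand-ins for a trained flow, not flowMC's actual API; in flowMC such global moves alternate with local MCMC steps, with the flow retrained adaptively on the chains' history.

    import numpy as np

    def flow_mh_step(x, log_target, flow_sample, flow_log_prob, rng):
        """One Metropolis-Hastings step with a flow-based independence proposal.

        The flow plays the role of the proposal density q; the acceptance
        ratio p(x') q(x) / (p(x) q(x')) leaves the target p invariant,
        so samples remain exact even if the flow is imperfect.
        """
        x_prop = flow_sample(rng)  # draw x' ~ q, independent of the current state
        log_alpha = (log_target(x_prop) - log_target(x)
                     + flow_log_prob(x) - flow_log_prob(x_prop))
        if np.log(rng.uniform()) < log_alpha:  # accept with probability min(1, alpha)
            return x_prop, True
        return x, False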