14:30 Simon Byrne
Title: Geodesic Hamiltonian Monte Carlo on Manifolds
Abstract: Statistical problems often involve probability distributions on
non-Euclidean manifolds. For instance, the field of directional
statistics utilises distributions over circles, spheres and tori. Many
dimension-reduction methods utilise orthogonal matrices, which form a
natural manifold known as a Stiefel manifold. Unfortunately, it is
often difficult to construct methods for independent sampling from
such distributions, as the normalisation constants are often
intractable, which means that standard approaches such as rejection
sampling cannot be easily implemented. As a result, Markov chain Monte
Carlo (MCMC) methods are often used; however, even simple methods such
as Gibbs sampling and random walk Metropolis require complicated
reparametrisations and need to be specifically adapted to each
distributional family of interest.

In this talk, I will demonstrate how the geodesic structure of the
manifold (such as “great circle” rotations on spheres) can be
exploited to construct efficient methods for sampling from such
distributions via a Hamiltonian Monte Carlo (HMC) scheme. These
methods are very flexible and straightforward to implement, requiring
only the ability to evaluate the unnormalised log-density and its
gradients.
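The geodesic leapfrog update described above can be sketched on the unit sphere, where the geodesic flow is an exact great-circle rotation. The following is a minimal illustrative sketch, not the speaker's implementation; the function name `geodesic_step` and the step-size parameter `eps` are assumptions, and only the unnormalised log-density gradient is required, as the abstract states.

```python
import numpy as np

def geodesic_step(x, v, grad_logp, eps):
    """One leapfrog step of geodesic HMC on the unit sphere.

    Momentum half-steps use the Euclidean gradient projected onto the
    tangent space at x; the position update follows a great circle.
    """
    # First momentum half-step, projected onto the tangent space.
    v = v + 0.5 * eps * grad_logp(x)
    v = v - np.dot(x, v) * x

    # Exact geodesic (great-circle) flow for position and momentum.
    speed = np.linalg.norm(v)
    if speed > 0:
        x_new = x * np.cos(speed * eps) + (v / speed) * np.sin(speed * eps)
        v = -x * speed * np.sin(speed * eps) + v * np.cos(speed * eps)
        x = x_new / np.linalg.norm(x_new)  # guard against round-off drift

    # Second momentum half-step, projected at the new position.
    v = v + 0.5 * eps * grad_logp(x)
    v = v - np.dot(x, v) * x
    return x, v
```

For example, for a von Mises–Fisher density with unnormalised log-density κμᵀx, the Euclidean gradient is simply κμ; the projection steps keep the state on the sphere and the momentum tangent to it.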
15:30 Coffee break
15:50 Ben Calderhead (Imperial)
Title: A General Construction for Parallelising Metropolis-Hastings Algorithms
Abstract:
Markov chain Monte Carlo methods are essential tools for solving many
modern-day statistical and computational problems; however, a major
limitation is the inherently sequential nature of these algorithms. In
this talk we propose a natural generalisation of the Metropolis-Hastings
algorithm that allows for parallelising a single chain using existing MCMC
samplers, while maintaining convergence to the correct stationary
distribution. We do so by proposing multiple points in parallel, then
constructing and sampling from a finite state Markov chain on the proposed
points that has the correct target density as its stationary distribution.
Our approach is generally applicable and easy to implement. We
demonstrate how this construction may be used to greatly increase the
computational speed of a wide variety of existing MCMC methods, including
Metropolis-Adjusted Langevin Algorithms and Adaptive MCMC. Furthermore, we
show how it allows for a principled way of utilising every integration
step within Hamiltonian based Monte Carlo methods; our approach
significantly increases robustness to the choice of algorithmic parameters
and results in increased accuracy of Monte Carlo estimates with minimal
extra computational cost.
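One step of the multiple-proposal construction can be sketched for the special case of a symmetric Gaussian random-walk kernel, where the finite-state chain's stationary distribution on the candidate points is available in closed form. This is an illustrative sketch under that assumption, not the speaker's code; the names `parallel_mh_step`, `n_prop`, and `scale` are invented for the example.

```python
import numpy as np

def parallel_mh_step(x, log_target, rng, n_prop=4, scale=0.5):
    """One generalised MH step: propose n_prop points in parallel, then
    sample from the finite-state chain on the n_prop + 1 candidates.

    For a Gaussian random-walk kernel kappa, the stationary weight of
    candidate i is proportional to pi(x_i) * prod_{j != i} kappa(x_j | x_i),
    so the new state can be drawn from it directly. The log_target
    evaluations are independent and could be computed in parallel.
    """
    d = x.shape[0]
    # Candidate set: current point plus n_prop symmetric proposals.
    candidates = np.vstack([x, x + scale * rng.normal(size=(n_prop, d))])
    # Pairwise log kernel densities log kappa(x_j | x_i), up to a constant;
    # the diagonal terms are zero and drop out of the product.
    diff = candidates[:, None, :] - candidates[None, :, :]
    logk = -0.5 * np.sum(diff**2, axis=-1) / scale**2
    logpi = np.array([log_target(c) for c in candidates])
    logw = logpi + logk.sum(axis=1)
    # Normalise the stationary weights and sample the next state.
    w = np.exp(logw - logw.max())
    p = w / w.sum()
    idx = rng.choice(n_prop + 1, p=p)
    return candidates[idx], p
```

Because the weight vector `p` is computed for every candidate, all proposed points can also be reused in a weighted (Rao-Blackwellised) Monte Carlo estimate rather than discarded, which is one way the extra density evaluations pay for themselves.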