A general construction for parallelising Metropolis-Hastings algorithms: with application to differential equation modelling of biochemical systems

Markov chain Monte Carlo methods are essential tools for solving many modern statistical and computational problems; however, a major limitation is the inherently sequential nature of these algorithms. In this talk we propose a natural generalisation of the Metropolis-Hastings algorithm that allows a single chain to be parallelised using existing MCMC samplers, while maintaining convergence to the correct stationary distribution. We do so by proposing multiple points in parallel, then constructing and sampling from a finite-state Markov chain on the proposed points that has the correct target density as its stationary distribution. Our approach is generally applicable and easy to implement. We demonstrate how this construction may be used to greatly increase the computational speed of a wide variety of existing MCMC methods, including Metropolis-adjusted Langevin algorithms, adaptive MCMC and Hamiltonian Monte Carlo. As a motivating example, we consider Bayesian inference for biological systems described by nonlinear differential equation models.
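The core idea above (propose several points in parallel, then draw the next state from a finite-state chain over the pool whose stationary law matches the target) can be sketched in a few lines. The code below is an illustrative toy, not the talk's actual implementation: it uses a standard normal target, a Gaussian random-walk proposal, and the simplest valid variant in which the next index is drawn directly from the finite-state chain's stationary weights, w_i ∝ p(x_i) ∏_{j≠i} k(x_j | x_i). All names (`log_target`, `parallel_mh`, `n_prop`, `step`) are invented for this sketch.

```python
import numpy as np

def log_target(x):
    # Illustrative target: log-density of a standard normal (up to a constant).
    return -0.5 * x**2

def parallel_mh(n_iter=2000, n_prop=8, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_iter)
    for t in range(n_iter):
        # Propose n_prop points in parallel from a Gaussian random-walk
        # kernel centred at the current state; the pool also keeps x itself.
        pool = np.concatenate(([x], x + step * rng.standard_normal(n_prop)))
        # Stationary weights of the finite-state chain on the pool:
        # log w_i = log p(x_i) + sum_{j != i} log k(x_j | x_i),
        # where k is the Gaussian proposal density (constants cancel).
        logw = np.array([
            log_target(xi)
            - 0.5 / step**2 * np.sum((np.delete(pool, i) - xi) ** 2)
            for i, xi in enumerate(pool)
        ])
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Draw the next state from the stationary law of the finite chain.
        x = pool[rng.choice(len(pool), p=w)]
        samples[t] = x
    return samples
```

In practice the weight computations for the pool are the expensive step (each requires a target evaluation) and are what one would farm out to parallel workers; here they are written as a plain list comprehension for clarity.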