Patrick Kidger

Abstract: Neural Differential Equations (NDEs) demonstrate that neural networks and differential equations are two sides of the same coin. Traditional parameterised differential equations are a special case, and many popular neural network architectures are discretisations of differential equations. NDEs extend existing physical modelling techniques whilst integrating tightly with current deep learning practice. This talk will focus on recent work on neural stochastic differential equations (SDEs), including their use for modelling time series, their training as GANs, and their links to rough path theory, and on the recently introduced “Diffrax” library, a JAX-based library of differential equation solvers. It will also include a brief overview of the entire field (ordinary, controlled, and stochastic differential equations, plus software) to set things in context. Moreover, if time allows, I will discuss some of the more involved details of neural SDEs, such as the use of reversible SDE solvers, the problem of Brownian reconstruction, and how a Lipschitz discriminator may be obtained in the GAN formulation.
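
As a minimal sketch (not taken from the talk) of the claim that popular architectures are discretisations: one explicit Euler step of dy/dt = f_theta(t, y) has exactly the form of a residual block, y_{n+1} = y_n + h * f_theta(t_n, y_n). The tiny tanh "vector field" below is purely illustrative.

```python
import jax
import jax.numpy as jnp

def vector_field(params, t, y):
    # f_theta(t, y): a one-layer network standing in for a residual block's body.
    w, b = params
    return jnp.tanh(w @ y + b)

def euler_resnet(params, y0, t0=0.0, t1=1.0, steps=10):
    # Repeated explicit Euler steps look like a stack of residual blocks
    # (here with shared weights across "layers").
    h = (t1 - t0) / steps
    y = y0
    for n in range(steps):
        y = y + h * vector_field(params, t0 + n * h, y)  # residual update
    return y

key = jax.random.PRNGKey(0)
params = (jax.random.normal(key, (4, 4)), jnp.zeros(4))
print(euler_resnet(params, jnp.ones(4)))
```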
Paper Links: “On Neural Differential Equations”, https://arxiv.org/abs/2202.02435
Code Links: “Diffrax”, https://github.com/patrick-kidger/diffrax
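
For context, a minimal sketch of solving an ODE with Diffrax, following the library's quick-start pattern (dy/dt = -y integrated with the Tsit5 Runge-Kutta solver); see the Diffrax documentation for the authoritative API.

```python
import diffrax
import jax.numpy as jnp

def vector_field(t, y, args):
    # Right-hand side of dy/dt = -y.
    return -y

term = diffrax.ODETerm(vector_field)
solver = diffrax.Tsit5()
saveat = diffrax.SaveAt(ts=jnp.linspace(0.0, 3.0, 31))

sol = diffrax.diffeqsolve(
    term, solver,
    t0=0.0, t1=3.0, dt0=0.1,
    y0=jnp.array([1.0]),
    saveat=saveat,
)
print(sol.ts, sol.ys)  # the solution should track exp(-t)
```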

Getting here