Imperial College London

Professor Samir Bhatt

Faculty of Medicine, School of Public Health

Professor of Statistics and Public Health
Contact

 

+44 (0)20 7594 5029
s.bhatt

Location

 

G32A, St Mary's Research Building, St Mary's Campus

Publications

Citation

BibTeX format

@article{Mishra:2022:10.1007/s11222-022-10151-w,
author = {Mishra, S and Flaxman, S and Berah, T and Zhu, H and Pakkanen, MS and Bhatt, S},
doi = {10.1007/s11222-022-10151-w},
journal = {Statistics and Computing},
title = {πVAE: a stochastic process prior for Bayesian deep learning with MCMC},
url = {http://dx.doi.org/10.1007/s11222-022-10151-w},
volume = {32},
year = {2022}
}

RIS format (EndNote, RefMan)

TY  - JOUR
AB - Stochastic processes provide a mathematically elegant way to model complex data. In theory, they provide flexible priors over function classes that can encode a wide range of interesting assumptions. However, in practice efficient inference by optimisation or marginalisation is difficult, a problem further exacerbated with big data and high dimensional input spaces. We propose a novel variational autoencoder (VAE) called the prior encoding variational autoencoder (πVAE). πVAE is a new continuous stochastic process. We use πVAE to learn low dimensional embeddings of function classes by combining a trainable feature mapping with generative model using a VAE. We show that our framework can accurately learn expressive function classes such as Gaussian processes, but also properties of functions such as their integrals. For popular tasks, such as spatial interpolation, πVAE achieves state-of-the-art performance both in terms of accuracy and computational efficiency. Perhaps most usefully, we demonstrate an elegant and scalable means of performing fully Bayesian inference for stochastic processes within probabilistic programming languages such as Stan.
AU - Mishra,S
AU - Flaxman,S
AU - Berah,T
AU - Zhu,H
AU - Pakkanen,MS
AU - Bhatt,S
DO - 10.1007/s11222-022-10151-w
PY - 2022///
SN - 0960-3174
TI - πVAE: a stochastic process prior for Bayesian deep learning with MCMC
T2 - Statistics and Computing
UR - http://dx.doi.org/10.1007/s11222-022-10151-w
UR - http://hdl.handle.net/10044/1/100071
VL - 32
ER -
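
The abstract above describes πVAE's core idea: embed draws from a known stochastic process into a low-dimensional latent space, then use the decoder as a cheap function prior for fully Bayesian inference. The toy sketch below illustrates only that prior-encoding idea; it is not the paper's architecture (a linear autoencoder computed via SVD stands in for the trainable feature mapping plus VAE, and the Gaussian process, kernel, and dimensions are illustrative assumptions).

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: draw training functions from a known stochastic process (here a GP
# with a squared-exponential kernel; lengthscale and sizes are arbitrary).
n_grid, n_draws, latent_dim = 50, 500, 5
x = np.linspace(0.0, 1.0, n_grid)
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1 ** 2)
L = np.linalg.cholesky(K + 1e-8 * np.eye(n_grid))
F = (L @ rng.standard_normal((n_grid, n_draws))).T   # each row: one GP draw

# Stage 2: learn a low-dimensional embedding of the function class.
# A linear autoencoder (optimal solution = truncated SVD) stands in for the VAE.
mean = F.mean(axis=0)
U, s, Vt = np.linalg.svd(F - mean, full_matrices=False)
decoder = Vt[:latent_dim]                            # maps latent z -> function values

# Stage 3: the decoder now acts as a cheap prior over functions.
# Decoding a low-dimensional z yields a new function draw on the grid; full
# Bayesian inference would place a prior on z and run MCMC over it (e.g. in Stan).
z = rng.standard_normal(latent_dim) * (s[:latent_dim] / np.sqrt(n_draws))
f_new = mean + z @ decoder
print(f_new.shape)
```

Because inference happens over the `latent_dim`-dimensional `z` rather than the `n_grid` function values, MCMC over the decoded prior stays cheap regardless of how finely the function is discretised, which is the scalability argument the abstract makes.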