BibTeX format

@inproceedings{salimbeni_orthogonally_decoupled,
    author = {Salimbeni, HR and Cheng, C-A and Boots, B and Deisenroth, MP},
    publisher = {Massachusetts Institute of Technology Press},
    title = {Orthogonally decoupled variational Gaussian processes},
    url = {},
}

RIS format (EndNote, RefMan)

AB - Gaussian processes (GPs) provide a powerful non-parametric framework for reasoning over functions. Despite appealing theory, their superlinear computational and memory complexities have presented a long-standing challenge. State-of-the-art sparse variational inference methods trade modeling accuracy against complexity. However, the complexities of these methods still scale superlinearly in the number of basis functions, implying that sparse GP methods are able to learn from large datasets only when a small model is used. Recently, a decoupled approach was proposed that removes the unnecessary coupling between the complexities of modeling the mean and the covariance functions of a GP. It achieves a linear complexity in the number of mean parameters, so an expressive posterior mean function can be modeled. While promising, this approach suffers from optimization difficulties due to ill-conditioning and non-convexity. In this work, we propose an alternative decoupled parametrization. It adopts an orthogonal basis in the mean function to model the residues that cannot be learned by the standard coupled approach. Therefore, our method extends, rather than replaces, the coupled approach to achieve strictly better performance. This construction admits a straightforward natural gradient update rule, so the structure of the information manifold that is lost during decoupling can be leveraged to speed up learning. Empirically, our algorithm demonstrates significantly faster convergence in multiple experiments.
AU - Salimbeni, HR
AU - Cheng, C-A
AU - Boots, B
AU - Deisenroth, MP
PB - Massachusetts Institute of Technology Press
SN - 1049-5258
TI - Orthogonally decoupled variational Gaussian processes
UR -
ER -