Citation

BibTeX format

@inproceedings{Deisenroth:2015,
author = {Deisenroth, MP and Ng, JW},
booktitle = {Proceedings of the 32nd International Conference on Machine Learning (ICML)},
publisher = {Journal of Machine Learning Research},
title = {Distributed Gaussian Processes},
url = {http://jmlr.org/proceedings/papers/v37/},
year = {2015}
}

RIS format (EndNote, RefMan)

TY  - CPAPER
AB  - To scale Gaussian processes (GPs) to large datasets we introduce the robust Bayesian Committee Machine (rBCM), a practical and scalable product-of-experts model for large-scale distributed GP regression. Unlike state-of-the-art sparse GP approximations, the rBCM is conceptually simple and does not rely on inducing or variational parameters. The key idea is to recursively distribute computations to independent computational units and, subsequently, recombine them to form an overall result. Efficient closed-form inference allows for straightforward parallelisation and distributed computations with a small memory footprint. The rBCM is independent of the computational graph and can be used on heterogeneous computing infrastructures, ranging from laptops to clusters. With sufficient computing resources our distributed GP model can handle arbitrarily large data sets.
AU  - Deisenroth, MP
AU  - Ng, JW
PB  - Journal of Machine Learning Research
PY  - 2015///
TI  - Distributed Gaussian Processes
UR  - http://jmlr.org/proceedings/papers/v37/
UR  - http://hdl.handle.net/10044/1/22230
ER  -
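The abstract describes recombining predictions from independent GP experts via a product-of-experts rule. The sketch below illustrates that recombination step numerically; the differential-entropy weights `beta` and a zero prior mean follow the common rBCM presentation and are assumptions here, not details quoted from this page.

```python
import numpy as np

def rbcm_combine(means, variances, prior_var):
    """Combine K independent GP experts' predictions (rBCM-style).

    means, variances: arrays of shape (K, N) -- each expert's predictive
    mean and variance at N test points (zero prior mean assumed).
    prior_var: scalar prior variance of the GP at the test points.
    """
    # Entropy-based weights: experts whose variance is close to the prior
    # (i.e. uninformative) receive weight near zero. (Assumed rBCM choice.)
    beta = 0.5 * (np.log(prior_var) - np.log(variances))
    # Combined precision: weighted expert precisions plus a prior correction
    # term so that the model falls back to the prior when experts know nothing.
    prec = np.sum(beta / variances, axis=0) + (1.0 - np.sum(beta, axis=0)) / prior_var
    var = 1.0 / prec
    mean = var * np.sum(beta * means / variances, axis=0)
    return mean, var
```

Because each expert contributes only its local mean and variance, the sum over experts can be computed independently per unit and merged afterwards, which is what allows the distributed, small-memory-footprint computation the abstract refers to.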