ANC SEMINAR: Marc Deisenroth Chair: Amos Storkey
Large-Scale Gaussian Process Regression
When: Mar 31, 2015, 11:00 AM to 12:00 PM
Where: Room IF 4.31/4.33
Abstract:
Gaussian processes (GPs) are the method of choice for probabilistic
nonlinear regression. A strength of the GP is that it is a fairly
reliable black-box function approximator, i.e., it produces reasonable
predictions without manual parameter tuning. A practical limitation of
the GP is its computational demand: training and prediction scale as
O(N^{3}) and O(N^{2}), respectively, where N is the size of the training
data set.
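To make the scaling concrete, here is a minimal sketch of exact GP regression with a squared-exponential kernel (not the speaker's code; hyperparameter values are illustrative). The Cholesky factorisation of the N x N kernel matrix is the O(N^3) training step, and each batch of predictions costs O(N^2):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two input sets."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def gp_fit_predict(X, y, X_star, noise=0.1):
    """Exact GP regression: O(N^3) training, O(N^2) prediction."""
    K = rbf_kernel(X, X) + noise**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)                       # O(N^3): the bottleneck
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    K_s = rbf_kernel(X, X_star)
    mean = K_s.T @ alpha                            # O(N^2) per test batch
    v = np.linalg.solve(L, K_s)
    var = np.diag(rbf_kernel(X_star, X_star)) - np.sum(v**2, axis=0)
    return mean, var
```

Doubling N therefore multiplies training cost roughly eightfold, which is what motivates the sparse and distributed approximations discussed next.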
To scale GPs to data set sizes beyond 10^{4}, we often use sparse
approximations, which implicitly (or explicitly) use a subset of the
data. Modern sparse approximations scale GPs up to O(10^{6}) data points,
but training these methods is nontrivial.
In this talk, I will introduce a generalised version of Tresp's Bayesian
Committee Machine to address the large-data problem of GPs by
distributed computing. This generalised Bayesian Committee Machine
(gBCM) is a practical and scalable hierarchical GP model for large-scale
distributed nonparametric regression. The gBCM is a family of
product-of-experts models that hierarchically recombines independent
computations to form an approximation of a full Gaussian process. The
gBCM includes classical product-of-experts models and the Bayesian
Committee Machine as special cases, while addressing their respective
shortcomings, such as underestimation of variances or a (more or less)
complete breakdown for weak experts. Closed-form computations allow for
efficient and straightforward parallelisation and distributed computing
with a small memory footprint, but without an explicit sparse
approximation. Since training and prediction are independent of the
computational graph, our model can be used on heterogeneous computing
infrastructures, ranging from laptops to large clusters. We provide
strong experimental evidence that the gBCM works well on large data sets.
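The closed-form recombination at the heart of such committee models can be sketched as follows. This is a hedged illustration, not the speaker's implementation: it combines per-test-point predictive means and variances from independent experts with precision weights beta_k, using the differential-entropy-style weighting described in the linked working paper; the correction term involving the prior variance keeps the combined precision calibrated when experts are uninformative:

```python
import numpy as np

def bcm_combine(means, variances, prior_var):
    """Combine M independent GP expert predictions in closed form.

    means, variances: arrays of shape (M, T) for M experts, T test points.
    beta_k = 0.5 * (log prior_var - log expert_var) down-weights experts
    whose predictive variance is close to the prior (weak experts).
    """
    means = np.asarray(means, dtype=float)
    variances = np.asarray(variances, dtype=float)
    beta = 0.5 * (np.log(prior_var) - np.log(variances))
    # Combined precision: weighted expert precisions plus a prior correction.
    prec = np.sum(beta / variances, axis=0) + (1.0 - np.sum(beta, axis=0)) / prior_var
    var = 1.0 / prec
    mean = var * np.sum(beta * means / variances, axis=0)
    return mean, var
```

Because each expert only touches its own data shard, the experts can be trained and queried in parallel with no communication until this final, cheap merge, which is what makes the approach suit heterogeneous hardware.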
Link to the corresponding working paper:
http://arxiv.org/pdf/1502.02843v1
Lunch will be provided afterwards