Title: Convergence, Robustness and Flexibility of Gaussian Process Regression.

Abstract: We are interested in the task of estimating an unknown function from a set of point evaluations. In this context, Gaussian process regression is often used as a Bayesian inference procedure. However, hyper-parameters appearing in the mean and covariance structure of the Gaussian process prior, such as the smoothness of the function and its typical length scales, are often unknown and learnt from the data, along with the posterior mean and covariance. In the first part of the talk, we will study the robustness of Gaussian process regression with respect to mis-specification of the hyper-parameters, and provide a convergence analysis of the method applied to a fixed, unknown function of interest [1]. In the second part of the talk, we will discuss deep Gaussian processes as a class of flexible non-stationary prior distributions [2].
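For concreteness, here is a standard sketch of the objects involved (the notation is ours and is not taken from [1] or [2]). Given noisy point evaluations y_i = f(x_i) + \varepsilon_i with \varepsilon_i \sim \mathcal{N}(0, \sigma^2), at design points X = \{x_1, \dots, x_n\}, and a Gaussian process prior f \sim \mathrm{GP}(m, k_\theta) with hyper-parameters \theta, the posterior is again a Gaussian process, with mean and covariance

\[
m_\theta^n(x) = m(x) + k_\theta(x, X)\,\bigl[k_\theta(X, X) + \sigma^2 I\bigr]^{-1}\bigl(y - m(X)\bigr),
\]
\[
k_\theta^n(x, x') = k_\theta(x, x') - k_\theta(x, X)\,\bigl[k_\theta(X, X) + \sigma^2 I\bigr]^{-1} k_\theta(X, x').
\]

Since \theta enters both formulas, the accuracy of the posterior mean as an estimator of the fixed function f depends on how \theta is chosen or estimated, which is the robustness question studied in [1]. For the second part, one hierarchical construction of deep Gaussian processes considered in [2] conditions each layer on the previous one, schematically u_{\ell+1} \mid u_\ell \sim \mathcal{N}\bigl(0, C(u_\ell)\bigr), yielding non-stationary prior distributions.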

[1] A.L. Teckentrup. Convergence of Gaussian process regression with estimated hyper-parameters and applications in Bayesian inverse problems. SIAM/ASA Journal on Uncertainty Quantification, 8(4), 1310-1337, 2020.

[2] M.M. Dunlop, M.A. Girolami, A.M. Stuart, A.L. Teckentrup. How deep are deep Gaussian processes? Journal of Machine Learning Research, 19(54), 1-46, 2018.