Fast Laplace approximation for Gaussian processes with a tensor product kernel

Perry Groot, Markus Peters, Tom Heskes, Wolfgang Ketter

Research output: Contribution to journal › Conference article › Academic › peer-review

1 Citation (Scopus)

Abstract

Gaussian processes provide a principled Bayesian framework, but direct implementations are restricted to small data sets due to their cubic time cost in the number of data points. When the kernel function is expressible as a tensor product kernel and the input data lie on a multidimensional grid, it has been shown that the computational cost of Gaussian process regression can be reduced considerably. Tensor product kernels have mainly been used in regression with a Gaussian observation model, since key steps in those algorithms do not easily translate to other tasks. In this paper we show how to obtain a scalable Gaussian process framework for gridded inputs and non-Gaussian observation models that factorize over cases. We empirically validate our approach on a binary classification problem, and our results show a major performance improvement in terms of run time.
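The cost reduction for tensor product kernels on a grid rests on the Kronecker identity (A ⊗ B) vec(X) = vec(B X Aᵀ), which lets one multiply by the full kernel matrix without ever forming it. The sketch below is an illustrative NumPy example of this identity (not the paper's algorithm): all variable names are chosen for the illustration, and the per-dimension factors A and B stand in for one-dimensional kernel matrices.

```python
import numpy as np

# Illustrative sketch: a matrix-vector product with K = A kron B,
# computed without forming K. Forming K costs O((nm)^2) memory and
# time; the structured version costs only O(nm(n+m)) time.
rng = np.random.default_rng(0)
n, m = 4, 5
A = rng.standard_normal((n, n))  # stands in for a 1-D kernel matrix
B = rng.standard_normal((m, m))  # stands in for a second 1-D kernel matrix
v = rng.standard_normal(n * m)

# Dense reference: build the Kronecker product explicitly.
dense = np.kron(A, B) @ v

# Structured version: reshape v into an m x n matrix (column-major,
# matching the vec convention) and apply (A kron B) vec(X) = vec(B X A^T).
X = v.reshape(m, n, order="F")
fast = (B @ X @ A.T).ravel(order="F")

assert np.allclose(dense, fast)
```

The same reshaping trick extends to more than two grid dimensions by applying one factor matrix per axis, which is what makes gridded inputs tractable at scale.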

Original language: English
Pages (from-to): 41-48
Number of pages: 8
Journal: Belgian/Netherlands Artificial Intelligence Conference
Publication status: Published - 2014
Event: 26th Benelux Conference on Artificial Intelligence, BNAIC 2014 - Nijmegen, Netherlands
Duration: 6 Nov 2014 - 7 Nov 2014
