Scalable Gaussian Processes, with Guarantees: Kernel Approximations and Deep Feature Extraction

arXiv preprint arXiv:2004.01584 (2020)

Abstract
We provide a linear-time inferential framework for Gaussian processes that supports automatic feature extraction through deep neural networks and low-rank kernel approximations. Importantly, we derive approximation guarantees bounding the Kullback–Leibler divergence between the idealized Gaussian process and the one resulting from a low-rank approximation to its kernel, under two types of approximations. These yield two instantiations of our framework: Deep Fourier Gaussian Processes, based on random Fourier feature low-rank approximations, and Deep Mercer Gaussian Processes, based on truncating the Mercer expansion of the kernel. We conduct an extensive experimental evaluation of these two instantiations on a broad collection of real-world datasets, providing strong evidence that they outperform a broad range of state-of-the-art methods in terms of time efficiency, negative log-predictive density, and root mean squared error.
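To make the random Fourier feature idea behind Deep Fourier Gaussian Processes concrete, here is a minimal numpy sketch of the standard Rahimi–Recht construction for an RBF kernel. This is an illustration of the generic low-rank approximation technique, not the authors' implementation; the function name, lengthscale, and number of features are illustrative choices.

```python
import numpy as np

def random_fourier_features(X, num_features, lengthscale, rng):
    """Map inputs to a feature space whose inner products approximate
    an RBF kernel (standard random Fourier features construction)."""
    d = X.shape[1]
    # Frequencies sampled from the kernel's spectral density (Gaussian here).
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, num_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)
    return np.sqrt(2.0 / num_features) * np.cos(X @ W + b)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = random_fourier_features(X, num_features=2000, lengthscale=1.0, rng=rng)

# Low-rank kernel approximation vs. the exact RBF kernel.
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq_dists)
err = float(np.abs(K_approx - K_exact).max())
print("max abs error:", err)
```

Because the approximate kernel has rank at most `num_features`, GP posterior computations can use the Woodbury identity on `Z`, reducing the usual cubic cost in the number of data points to a cost that is linear in it, which is the complexity regime the abstract refers to.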