How Are The Centered Kernel Principal Components Relevant To Regression Task? - An Exact Analysis

2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

Abstract
We present an exact analytic expression for the contributions of the kernel principal components to the relevant information in a nonlinear regression problem. A related study was presented by Braun, Buhmann, and Müller in 2008, where an upper bound on the contributions was given for a general supervised learning problem, but with "uncentered" kernel PCA. Our analysis clarifies that the relevant information of a kernel regression under an explicit centering operation is contained in a finite number of leading kernel principal components, as in the "uncentered" kernel-PCA case, provided the kernel matches the underlying nonlinear function so that the eigenvalues of the centered kernel matrix decay quickly. We compare the regression performance of least-squares-based methods using the centered and uncentered kernel PCAs in simulations.
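To make the abstract's claim concrete, the following is a minimal numerical sketch (not the paper's code) of measuring how much of a regression target is captured by the leading centered versus uncentered kernel principal components. The RBF kernel, the sinc target, the sample size, and the choice to mean-center the target in the centered case are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix of the Gaussian RBF kernel exp(-gamma * ||x - x'||^2).
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

def component_contributions(K, y):
    # Eigendecompose the (symmetric) kernel matrix and measure the squared
    # projection of the target y onto each kernel principal direction.
    vals, vecs = np.linalg.eigh(K)
    order = np.argsort(vals)[::-1]        # sort by decreasing eigenvalue
    return vals[order], (vecs[:, order].T @ y) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sinc(3 * X[:, 0]) + 0.05 * rng.standard_normal(200)  # nonlinear target

K = rbf_kernel(X, gamma=5.0)
n = K.shape[0]
H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
Kc = H @ K @ H                            # explicitly centered kernel

# For the centered kernel we also mean-center the target (our choice).
for name, Kmat, t in [("uncentered", K, y), ("centered", Kc, y - y.mean())]:
    vals, contrib = component_contributions(Kmat, t)
    frac = contrib[:10].sum() / contrib.sum()
    print(f"{name}: fraction of target energy in top 10 components = {frac:.3f}")
```

When the kernel is well matched to the underlying function, both spectra decay quickly and the leading few components absorb most of the target energy; this is the qualitative behavior that the paper quantifies exactly for the centered case.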
Keywords
nonlinear regression, kernel PCA, reproducing kernel Hilbert space, spectral decomposition