Sampling Techniques For Kernel Methods

NIPS'01: Proceedings of the 14th International Conference on Neural Information Processing Systems: Natural and Synthetic (2002)

Cited by 65
Abstract
We propose randomized techniques for speeding up Kernel Principal Component Analysis on three levels: sampling and quantization of the Gram matrix in training, randomized rounding in evaluating the kernel expansions, and random projections in evaluating the kernel itself. In all three cases, we give sharp bounds on the accuracy of the obtained approximations. Rather intriguingly, all three techniques can be viewed as instantiations of the following idea: replace the kernel function k by a "randomized kernel" which behaves like k in expectation.
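The first of the three levels, sampling the Gram matrix, can be illustrated with a minimal sketch (this is an assumption-laden illustration, not the paper's exact scheme): keep each entry of the Gram matrix K with some probability p and rescale the survivors by 1/p, so that the sparsified matrix equals K entrywise in expectation — an instance of the "randomized kernel that behaves like k in expectation" idea. The `rbf_kernel` and `sampled_gram` helpers below are hypothetical names chosen for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def sampled_gram(K, p=0.5, rng=rng):
    """Keep each entry with probability p, rescaled by 1/p, so the
    sparsified symmetric matrix equals K entrywise in expectation."""
    mask = rng.random(K.shape) < p
    mask = np.triu(mask)          # sample the upper triangle once...
    mask = mask | mask.T          # ...and mirror it, keeping symmetry
    return np.where(mask, K / p, 0.0)

X = rng.standard_normal((200, 5))
K = rbf_kernel(X)
Khat = sampled_gram(K, p=0.5)
# E[Khat] = K entrywise; the top eigenvectors of Khat (used by kernel
# PCA) then approximate those of K, at a fraction of the entries.
```

The rescaling by 1/p is what makes the estimator unbiased; without it the expectation would be p*K and the leading eigenvalues would shrink accordingly.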