Large Scale Learning Techniques for Least Squares Support Vector Machines

Santiago Toledo-Cortés, Iván Yesid Castellanos Martínez, Fabio A. González

CIARP (2018)

Abstract
Although kernel machines allow non-linear analysis through the transformation of their input data, their computational complexity makes them inefficient in time and memory for the analysis of very large databases. Several attempts have been made to improve the performance of kernel methods, many of which focus on approximating the kernel matrix or its associated feature mapping. Current trends in machine learning demand the capacity to deal with large data sets while exploiting the capabilities of massively parallel GPU-based architectures. This has mainly been accomplished by combining gradient descent optimization with online learning. This paper presents an online kernel-based model built on the dual formulation of the Least Squares Support Vector Machine (LS-SVM) method, using a Learning on a Budget strategy to reduce the computational cost. This extends the algorithm's capability to analyze very large or high-dimensional data without requiring large memory resources. The method was evaluated against two kernel approximation techniques: the Nyström approximation and Random Fourier Features. Experiments on several datasets show the effectiveness of the Learning on a Budget strategy compared with these approximation techniques.
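As a rough illustration of the Learning on a Budget idea described above, the sketch below implements an online kernel learner for the squared loss whose support-vector set is capped at a fixed budget. The class name, the RBF kernel choice, the eviction heuristic (dropping the support vector with the smallest-magnitude coefficient), and all hyperparameter values are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def rbf(x, Z, gamma=1.0):
        # Gaussian (RBF) kernel between one point x and the rows of Z.
        return np.exp(-gamma * np.sum((Z - x) ** 2, axis=1))

    class BudgetedOnlineKernelLSSVM:
        # Hypothetical sketch of learning on a budget: an online kernel
        # learner for the squared loss whose support set never exceeds
        # a fixed budget. Not the authors' exact algorithm.
        def __init__(self, budget=100, lr=0.1, reg=1e-3, gamma=1.0):
            self.budget, self.lr, self.reg, self.gamma = budget, lr, reg, gamma
            self.sv = []      # stored support vectors
            self.alpha = []   # their coefficients

        def predict(self, x):
            if not self.sv:
                return 0.0
            k = rbf(x, np.asarray(self.sv), self.gamma)
            return float(np.dot(self.alpha, k))

        def partial_fit(self, x, y):
            err = y - self.predict(x)  # squared-loss residual
            # Shrink old coefficients (regularization), then add the new
            # example as a support vector with coefficient lr * err.
            self.alpha = [(1.0 - self.lr * self.reg) * a for a in self.alpha]
            self.sv.append(np.asarray(x, dtype=float))
            self.alpha.append(self.lr * err)
            # Enforce the budget: evict the support vector whose
            # coefficient contributes least to the model.
            if len(self.sv) > self.budget:
                j = int(np.argmin(np.abs(self.alpha)))
                self.sv.pop(j)
                self.alpha.pop(j)

Streaming training then amounts to calling partial_fit(x_t, y_t) on each incoming pair, so memory stays proportional to the budget regardless of the data set size. For comparison, the two baselines named in the abstract approximate the kernel explicitly. A minimal sketch of Random Fourier Features for the RBF kernel, following Rahimi and Recht's construction, is shown below; the function name and default parameters are again illustrative.

    import numpy as np

    def random_fourier_features(X, n_features=200, gamma=1.0, seed=0):
        # Monte Carlo feature map for the RBF kernel (Rahimi & Recht):
        # K(x, z) = exp(-gamma * ||x - z||^2) ~= phi(x) . phi(z).
        rng = np.random.default_rng(seed)
        d = X.shape[1]
        W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
        b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
        return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

A linear model such as ridge regression trained on these features then approximates the corresponding kernel machine without storing the full kernel matrix.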
Keywords
Kernel Methods, Least Squares Support Vector Machine, Nyström, Budget, Random Fourier Features