Random Projections for Linear Support Vector Machines

TKDD (2014)

Cited by 53 | Views 32
Abstract
Let X be a data matrix of rank ρ, whose rows represent n points in d-dimensional space. The linear support vector machine constructs a hyperplane separator that maximizes the 1-norm soft margin. We develop a new oblivious dimension reduction technique that is precomputed and can be applied to any input matrix X. We prove that, with high probability, the margin and the minimum enclosing ball in the feature space are preserved to within ε-relative error, ensuring generalization performance comparable to that in the original space in the case of classification. For regression, we show that the margin is preserved to ε-relative error with high probability. We present extensive experiments with real and synthetic data to support our theory.
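The snippet below is a minimal sketch of the workflow the abstract describes, not the authors' exact construction: it uses scikit-learn's GaussianRandomProjection as a stand-in for the paper's precomputed oblivious transform, trains a linear SVM before and after projection, and compares geometric margins (1/||w||). The target dimension k = 256 and the toy dataset are illustrative assumptions; per the abstract, the dimension actually required is governed by the rank ρ and the error parameter ε.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.random_projection import GaussianRandomProjection
from sklearn.svm import LinearSVC

# Toy high-dimensional classification data (illustrative, not from the paper).
X, y = make_classification(n_samples=500, n_features=2000,
                           n_informative=50, random_state=0)

# Oblivious projection: the map is drawn independently of the data and
# labels, so it could be precomputed and applied to any input matrix X.
k = 256  # reduced dimension -- an assumption; theory ties k to rho and eps
proj = GaussianRandomProjection(n_components=k, random_state=0)
X_low = proj.fit_transform(X)

def margin(clf):
    # Geometric margin of a fitted linear SVM is 1 / ||w||_2.
    return 1.0 / np.linalg.norm(clf.coef_)

svm_full = LinearSVC(C=1.0, max_iter=10000).fit(X, y)
svm_low = LinearSVC(C=1.0, max_iter=10000).fit(X_low, y)

# The theory predicts the two margins agree up to (1 +/- eps) relative error
# with high probability, for k large enough.
print("margin (original space): ", margin(svm_full))
print("margin (projected space):", margin(svm_low))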
Keywords
algorithms, experimentation, dimensionality reduction, general, optimization, theory, classification, design methodology, support vector machines