Cost-Sensitive Support Vector Machines

Neurocomputing (2019)

Abstract
Many machine learning applications involve imbalanced class prior probabilities, multi-class classification with many classes (often addressed by a one-versus-rest strategy), or "cost-sensitive" classification. In such domains, each class (or, in some cases, each sample) requires special treatment. In this paper, we use a constructive procedure to extend the SVM's standard loss function to optimize the classifier with respect to class imbalance or class costs. By drawing connections between risk minimization and probability elicitation, we show that the resulting classifier guarantees Bayes consistency. We further analyze the primal and the dual objective functions and derive the objective function in a regularized risk minimization framework. Finally, we extend the classifier to cost-sensitive learning with example-dependent costs. We perform experimental analysis on class imbalance and on cost-sensitive learning with given class and example costs, and show that the proposed algorithm provides superior generalization performance compared to conventional methods. (C) 2019 Published by Elsevier B.V.
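The cost-sensitive treatment described above can be illustrated with the standard weighted-hinge-loss approach available in common SVM implementations: each class's slack penalty is scaled by its misclassification cost. The sketch below uses scikit-learn's `SVC` with its `class_weight` parameter on a synthetic imbalanced dataset; it demonstrates the general class-cost idea only, not the paper's specific loss construction, and the class means, weights, and sample sizes are illustrative choices.

```python
# Sketch: per-class misclassification costs for an SVM via weighted
# hinge loss (scikit-learn's `class_weight`). Illustrative only; not
# the paper's constructive loss-extension procedure.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)
# Imbalanced two-class data: 500 majority samples vs. 25 minority.
X_neg = rng.normal(loc=0.0, scale=1.0, size=(500, 2))
X_pos = rng.normal(loc=2.0, scale=1.0, size=(25, 2))
X = np.vstack([X_neg, X_pos])
y = np.array([0] * 500 + [1] * 25)

# Baseline SVM vs. a cost-sensitive SVM that up-weights errors on the
# rare class by a factor of 20 (matching the 20:1 imbalance ratio).
plain = SVC(kernel="linear").fit(X, y)
weighted = SVC(kernel="linear", class_weight={0: 1.0, 1: 20.0}).fit(X, y)

print("minority-class recall, plain SVM:   ",
      recall_score(y, plain.predict(X)))
print("minority-class recall, weighted SVM:",
      recall_score(y, weighted.predict(X)))
```

Up-weighting the minority class shifts the decision boundary toward the majority class, trading some majority-class accuracy for higher recall on the rare class, which is the behavior cost-sensitive formulations aim to control explicitly.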
Keywords
Cost-sensitive learning, Classification, Class imbalance, SVM, Bayes consistency