Constraint-weighted support vector ordinal regression to resist constraint noises

Inf. Sci. (2023)

Abstract
Ordinal regression (OR) is a crucial task in machine learning. The usual assumption is that all training instances are perfectly labeled. However, when this assumption does not hold, performance degrades significantly. As a widely used ordinal regression model, support vector ordinal regression (SVOR) identifies r-1 parallel hyperplanes to separate r ranks, where each instance is associated with r-1 constraints, one for each of the r-1 parallel hyperplanes. Unlike the traditional classification problem, an instance with an incorrect label may have no influence on certain parallel hyperplanes during SVOR learning. If a constraint induces a deviation of the parallel hyperplane(s), it is termed a constraint noise. To address constraint noises, this paper proposes constraint-weighted support vector ordinal regression (CWSVOR), which introduces a constraint weight vector of length r-1 for each instance to control the influence of its r-1 constraints on the parallel hyperplanes. When an instance is labeled with an incorrect rank, the elements of the weight vector corresponding to constraint noises are driven close to 0, while the others remain close to 1. The proposed constraint-weighted strategy aims to mitigate the detrimental effects of constraint noises while retaining the useful constraints during SVOR learning. Experiments on several datasets demonstrate that CWSVOR outperforms KDLOR, ELMOP, NNOP, SVOR and NPSVOR when the training set is corrupted by noise, and it shows comparable performance to pin-SVOR.
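To make the constraint-weighted mechanism concrete, the sketch below writes a plausible objective in the style of explicitly-constrained SVOR with per-constraint weights. This is only an illustration of the idea described in the abstract, not the paper's exact formulation; the function name `cwsvor_weighted_hinge`, the weight matrix `S`, and the penalty parameter `C` are assumed names.

```python
import numpy as np

def cwsvor_weighted_hinge(w, b, X, y, S, C=1.0):
    """Constraint-weighted SVOR objective (illustrative sketch).

    w : (d,)   shared projection direction.
    b : (r-1,) ordered thresholds b_1 <= ... <= b_{r-1}, one per parallel hyperplane.
    X : (n, d) training instances.
    y : (n,)   integer ranks in {1, ..., r}.
    S : (n, r-1) constraint weights in [0, 1]; S[i, j] close to 0 suppresses the
        j-th threshold constraint of instance i (a constraint noise), while
        S[i, j] close to 1 keeps a clean constraint at full strength.
    """
    scores = X @ w                      # projections onto the shared direction
    loss = 0.5 * np.dot(w, w)           # margin / regularization term
    for j, b_j in enumerate(b):         # j-th parallel hyperplane: w.x = b_j
        rank_j = j + 1
        below = y <= rank_j             # instances that should fall below b_j
        above = ~below                  # instances that should fall above b_j
        # weighted hinge penalties for violating the j-th threshold constraint
        loss += C * np.sum(S[below, j] * np.maximum(0.0, 1.0 + scores[below] - b_j))
        loss += C * np.sum(S[above, j] * np.maximum(0.0, 1.0 - scores[above] + b_j))
    return loss
```

In this sketch, setting a row of S to all ones recovers an ordinary SVOR-style loss for that instance, while zeroing the entries associated with the hyperplanes that a mislabeled instance would pull out of place removes only those noisy constraints and keeps the rest.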
Keywords
Ordinal regression, Constraint-weighted support vector ordinal regression, Constraint weight vector, Constraint noise