Subspace-based minority oversampling for imbalance classification

Information Sciences (2023)

Abstract
In pattern classification, the class imbalance problem occurs when the number of observations in some classes differs significantly from that of other classes, which leads to learning bias in the classifiers. One possible solution to this problem is to re-balance the training set by over-sampling the minority class. However, over-sampling tends to push the classification boundary toward the majority class, so recall increases while precision decreases. To avoid this situation and better handle the class imbalance problem, this paper proposes a new over-sampling method, namely Subspace-based Minority Over-Sampling (abbr. SMO). This approach considers that each category of samples is formed by common and unique characteristics, and such characteristics can be extracted by subspace. To obtain the balanced data, the common part is over-sampled to more accurately depict the minority class, and the unique part can be expanded by generative methods. The balanced data are obtained by restoring the generated products of the subspace to the original space. The experimental results demonstrate that SMO can model complex data distributions and outperforms both classical and newly designed over-sampling algorithms. Also, SMO can be used to generate simple images, and the generation results on MNIST can be clearly identified by both human vision and machine vision. (c) 2022 Elsevier Inc. All rights reserved.
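The decomposition idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual SMO algorithm; the synthetic data, the choice of rank `r = 2` via truncated SVD as the "common" subspace, and the pairwise-interpolation scheme for generating new subspace coordinates are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical minority-class data: 20 samples, 5 features (placeholder, not from the paper).
X_min = rng.normal(size=(20, 5))

# "Common" part: a low-rank approximation via truncated SVD (rank r).
# "Unique" part: the per-sample residual left over after removing the common structure.
r = 2
U, s, Vt = np.linalg.svd(X_min, full_matrices=False)
common = (U[:, :r] * s[:r]) @ Vt[:r]   # shared low-rank structure
unique = X_min - common                # sample-specific residuals

# Over-sample: interpolate subspace coordinates between random minority pairs,
# then add a resampled residual to expand the unique part.
n_new = 30
coords_all = U[:, :r] * s[:r]                          # subspace coordinates of each sample
i, j = rng.integers(0, len(X_min), size=(2, n_new))
lam = rng.uniform(size=(n_new, 1))
coords_new = lam * coords_all[i] + (1 - lam) * coords_all[j]
X_new = coords_new @ Vt[:r] + unique[rng.integers(0, len(X_min), n_new)]

# Restore to the original space and append to the minority set.
X_balanced = np.vstack([X_min, X_new])
print(X_balanced.shape)  # (50, 5)
```

The paper uses low-rank representation and matrix completion (see the keywords) rather than a plain SVD, so this sketch only conveys the common/unique split and the restore-to-original-space step, not the method itself.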
Keywords
Class imbalance, Minority over-sampling, Low-rank representation, Matrix completion