Feature Learning With a Divergence-Encouraging Autoencoder for Imbalanced Data Classification.

IEEE ACCESS (2018)

Abstract
Imbalanced data commonly exists in machine learning classification applications. Popular classification algorithms assume that data in different classes are roughly equally distributed; however, extremely skewed data, in which instances from one class make up most of the dataset, is not exceptional in practice. The performance of such algorithms therefore often degrades significantly on skewed data. Mitigating the problems caused by imbalanced data has been an open challenge for years, and previous research has mostly proposed solutions from the perspectives of data re-sampling and algorithm improvement. In this paper, focusing on two-class imbalanced data, we propose a novel divergence-encouraging autoencoder (DEA) that explicitly learns features from both classes, and we design an imbalanced data classification algorithm based on the proposed autoencoder. By encouraging maximization of the divergence loss between classes in the bottleneck layer, the proposed DEA learns features for the majority and minority classes simultaneously. The training procedure of the proposed autoencoder alternately optimizes the reconstruction and divergence losses. After obtaining the features, we directly compute the cosine distances between training and testing features and compare the median distances between classes to perform classification. Experimental results show that our algorithm outperforms ordinary and loss-sensitive CNN models in terms of both performance evaluation metrics and convergence properties. To the best of our knowledge, this is the first work to address the imbalanced data classification problem from the perspective of explicitly learning representations of different classes simultaneously. In addition, the design of the proposed DEA is itself an innovative contribution, which can improve the performance of imbalanced data classification without data re-sampling and benefit future research in the field.
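The abstract outlines the full pipeline: an autoencoder whose bottleneck codes are pushed apart across the two classes by alternating reconstruction and divergence updates, followed by a median cosine-distance rule for classification. Below is a minimal, hedged sketch of that pipeline, assuming PyTorch; the network sizes, the specific divergence measure (negative squared distance between class-mean bottleneck codes), and the alternation schedule are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of a divergence-encouraging autoencoder (DEA) pipeline.
# Layer sizes, the divergence measure, and the alternation schedule are
# assumptions for illustration, not the authors' released implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DEA(nn.Module):
    def __init__(self, in_dim=784, bottleneck=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, bottleneck),
        )
        self.decoder = nn.Sequential(
            nn.Linear(bottleneck, 256), nn.ReLU(),
            nn.Linear(256, in_dim),
        )

    def forward(self, x):
        z = self.encoder(x)
        return z, self.decoder(z)


def divergence_loss(z, y):
    # Push the two classes apart in the bottleneck space: minimizing the
    # negative squared distance between class-mean codes maximizes divergence.
    mu0 = z[y == 0].mean(dim=0)
    mu1 = z[y == 1].mean(dim=0)
    return -torch.sum((mu0 - mu1) ** 2)


def train_step(model, optimizer, x, y, phase):
    # Alternating training paradigm described in the abstract: one phase
    # optimizes the reconstruction loss, the other the divergence loss.
    z, x_hat = model(x)
    loss = F.mse_loss(x_hat, x) if phase == "recon" else divergence_loss(z, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


def classify(model, x_train, y_train, x_test):
    # Median cosine-distance rule from the abstract: assign each test point to
    # the class whose training codes have the smaller median cosine distance.
    with torch.no_grad():
        z_train, _ = model(x_train)
        z_test, _ = model(x_test)
    z_train = F.normalize(z_train, dim=1)
    z_test = F.normalize(z_test, dim=1)
    dist = 1.0 - z_test @ z_train.T                    # cosine distance matrix
    med0 = dist[:, y_train == 0].median(dim=1).values  # median distance to class 0
    med1 = dist[:, y_train == 1].median(dim=1).values  # median distance to class 1
    return (med1 < med0).long()
```

A training loop under these assumptions would simply alternate phases per mini-batch or per epoch, e.g. train_step(model, opt, x, y, "recon") followed by train_step(model, opt, x, y, "div"); note that each mini-batch must contain samples from both classes for the divergence term to be defined.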
Keywords
Imbalanced data classification, autoencoder, divergence loss, convergence analysis, alternating training paradigm