Positive Unlabeled Learning by Semi-Supervised Learning.

ICIP (2022)

Abstract
Positive and Unlabeled learning (PU learning) trains a binary classifier from only positive (P) and unlabeled (U) data, where the unlabeled data contains both positive and negative samples. Previous importance-reweighting approaches treat all unlabeled samples as weighted negative samples and achieve state-of-the-art performance. However, in this paper we find, surprisingly, that under such weight adjustment the classifier tends to misclassify negative samples in the U data as positive at the late training stage. Motivated by this observation, we leverage Semi-Supervised Learning (SSL) to address this performance degradation and propose a novel SSL-based framework for PU learning. First, we introduce a dynamic increasing sampling strategy that progressively selects both negative and positive samples from the U data. Second, we adopt MixMatch to take full advantage of the unchosen samples in the U data. Finally, we propose a Co-learning strategy that iteratively trains two independent networks on the selected samples to avoid confirmation bias. Experimental results on four benchmark datasets demonstrate the effectiveness and superiority of our approach compared with other state-of-the-art methods.
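To make the three-stage pipeline described above concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): a linearly growing selection schedule stands in for dynamic increasing sampling, each network is trained on the samples pseudo-labeled by its peer to mimic Co-learning, and the MixMatch step on unchosen U samples is omitted for brevity. All names, thresholds, and the selection rule are illustrative assumptions.

```python
# Hedged sketch of the abstract's idea: PU learning via progressive selection
# from U data plus two-network co-learning. Toy data and schedules only.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

def make_net(d=20):
    return nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, 2))

# Toy data: P = labeled positives, U = unlabeled mix of positives and negatives.
d = 20
x_pos = torch.randn(200, d) + 1.0
x_unl = torch.cat([torch.randn(300, d) + 1.0, torch.randn(300, d) - 1.0])

net_a, net_b = make_net(d), make_net(d)
opt_a = torch.optim.Adam(net_a.parameters(), lr=1e-3)
opt_b = torch.optim.Adam(net_b.parameters(), lr=1e-3)

epochs = 50
for epoch in range(epochs):
    # "Dynamic increasing sampling": the fraction of U samples admitted as
    # pseudo-labeled grows over training (linear schedule is an assumption).
    frac = min(1.0, (epoch + 1) / epochs)
    k = int(frac * len(x_unl))

    for net, opt, peer in ((net_a, opt_a, net_b), (net_b, opt_b, net_a)):
        with torch.no_grad():
            # "Co-learning": the peer network selects and pseudo-labels the
            # most confident U samples, so each network trains on the other's
            # selections, which reduces confirmation bias.
            probs = F.softmax(peer(x_unl), dim=1)
            conf, pseudo = probs.max(dim=1)
            idx = conf.topk(k).indices

        x = torch.cat([x_pos, x_unl[idx]])
        y = torch.cat([torch.ones(len(x_pos), dtype=torch.long), pseudo[idx]])

        opt.zero_grad()
        loss = F.cross_entropy(net(x), y)
        loss.backward()
        opt.step()

# Fraction of U data predicted positive by one network after training.
print(F.softmax(net_a(x_unl), dim=1).argmax(dim=1).float().mean().item())
```

A full implementation would additionally apply a MixMatch-style consistency loss to the U samples not selected in the current round, as described in the abstract.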
Keywords
Image Classification, Positive-Unlabeled Learning, Semi-Supervised Learning