Feature Selection via Transferring Knowledge Across Different Classes

ACM Transactions on Knowledge Discovery from Data (TKDD)(2019)

Abstract
The problem of feature selection has attracted considerable research interest in recent years. Supervised information can significantly improve the quality of the selected features. However, existing supervised feature selection methods all require the classes in the labeled data (source domain) and the unlabeled data (target domain) to be identical, which may be too restrictive in many cases. In this article, we consider a more challenging cross-class setting where the classes in the two domains are related but different, a setting that has rarely been studied before. We propose a cross-class knowledge transfer feature selection framework that transfers knowledge from the source domain to guide feature selection in the target domain. Specifically, high-level descriptions, i.e., attributes, are used as the bridge for knowledge transfer. To further improve the quality of the selected features, our framework jointly considers the tasks of cross-class knowledge transfer and feature selection. Experimental results on four benchmark datasets demonstrate the superiority of the proposed method.
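The sketch below illustrates the general idea described in the abstract: attribute predictors trained on the source classes provide transferred supervision for the unlabeled target domain, which then guides feature scoring. It is only a minimal illustration under assumed components (logistic-regression attribute predictors, nearest attribute-signature pseudo-labels, mutual-information feature scoring); it is not the paper's joint optimization, and all names and data are hypothetical.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)

# Source domain: labeled samples whose classes carry binary attribute annotations.
n_src, n_tgt, d, n_attr = 200, 150, 30, 4
Xs = rng.normal(size=(n_src, d))
ys = rng.integers(0, 3, size=n_src)                      # 3 source classes
src_class_attr = np.array([[1, 0, 1, 0],
                           [0, 1, 1, 0],
                           [1, 1, 0, 1]])                # class-attribute matrix
As = src_class_attr[ys]                                  # per-sample attributes

# Target domain: unlabeled samples from related but different (unseen) classes.
Xt = rng.normal(size=(n_tgt, d))
tgt_class_attr = np.array([[0, 0, 1, 1],
                           [1, 0, 0, 1]])                # 2 target classes

# 1) Knowledge transfer: one attribute predictor per attribute, trained on source.
attr_models = [LogisticRegression(max_iter=1000).fit(Xs, As[:, a])
               for a in range(n_attr)]
At_pred = np.column_stack([m.predict(Xt) for m in attr_models])

# 2) Pseudo-label target samples by the nearest target class-attribute signature.
dists = ((At_pred[:, None, :] - tgt_class_attr[None, :, :]) ** 2).sum(axis=-1)
pseudo_y = dists.argmin(axis=1)

# 3) Feature selection on the target domain guided by the transferred supervision.
scores = mutual_info_classif(Xt, pseudo_y, random_state=0)
top_k = np.argsort(scores)[::-1][:10]
print("Selected feature indices:", top_k)
```

This pipelined version performs transfer and selection in two separate stages; the framework described in the abstract instead couples the two tasks in a joint formulation.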
Keywords
Feature selection, dimension reduction, supervision transfer