Self-Taught Semi-Supervised Dictionary Learning with Non-Negative Constraint

IEEE Transactions on Industrial Informatics (2020)

Cited by 26 | Views 42
Abstract
This paper investigates classification by dictionary learning. A novel unified framework, termed self-taught semi-supervised dictionary learning with non-negative constraint, is proposed for simultaneously optimizing the dictionary atoms and a graph Laplacian. Specifically, an atom graph Laplacian regularization is built from the sparse coefficients to capture the underlying manifold structure; because atoms are more concise and representative than training samples, this regularization is more robust to noisy samples and outliers. A non-negative constraint imposed on the sparse coefficients ensures that each sample lies among its related atoms, making the dependency between samples and atoms explicit. Furthermore, a self-taught mechanism is introduced to feed the manifold structure induced by the atom graph Laplacian regularization, together with the supervised information hidden in unlabeled samples, back into learning a better dictionary. An efficient algorithm, combining a block coordinate descent method with the alternating direction method of multipliers, is derived to optimize the unified framework. Experimental results on several benchmark datasets demonstrate the effectiveness of the proposed model.
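The abstract does not spell out the optimization details, so the following is a minimal sketch, assuming a conventional objective of the form 0.5·||X − DZ||_F² + λ·||Z||_1 + β·tr(D L Dᵀ) with Z ≥ 0, where D is the dictionary, Z the sparse coefficients, and L a graph Laplacian over the atoms. The function names (nonneg_sparse_codes, atom_graph_laplacian, dict_learn), the coefficient-similarity atom graph, and the projected ISTA / gradient updates are all illustrative assumptions; the paper's actual solver combines block coordinate descent with ADMM.

import numpy as np

# Minimal sketch (not the authors' algorithm): alternating updates for
#   min_{D, Z >= 0}  0.5*||X - D Z||_F^2 + lam*||Z||_1 + beta*tr(D L D^T)
# where L is a graph Laplacian over the atoms, rebuilt from the current
# coefficients at each outer iteration ("self-taught" feedback, as assumed here).

def nonneg_sparse_codes(X, D, lam=0.1, n_iter=50):
    # Non-negative sparse coding via projected ISTA (stand-in for the ADMM step).
    Z = np.zeros((D.shape[1], X.shape[1]))
    step = 1.0 / (np.linalg.norm(D, 2) ** 2 + 1e-12)   # 1 / Lipschitz constant of the fit term
    for _ in range(n_iter):
        grad = D.T @ (D @ Z - X)
        Z = np.maximum(Z - step * (grad + lam), 0.0)   # soft-threshold, then clip at zero
    return Z

def atom_graph_laplacian(Z, sigma=1.0):
    # Hypothetical atom graph: atoms are linked when their coefficient rows are similar.
    sq = np.sum(Z ** 2, axis=1, keepdims=True)
    d2 = np.maximum(sq + sq.T - 2 * Z @ Z.T, 0.0)      # pairwise squared distances
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W                  # L = Degree - Weight

def dict_learn(X, n_atoms=32, lam=0.1, beta=0.01, n_outer=20, lr=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)
    for _ in range(n_outer):
        Z = nonneg_sparse_codes(X, D, lam)             # coefficient block
        L = atom_graph_laplacian(Z)                    # graph rebuilt from current codes
        grad_D = (D @ Z - X) @ Z.T + beta * (D @ L)    # fit + Laplacian gradients (factors of 2 absorbed into lr)
        D -= lr * grad_D
        D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12   # keep atoms unit-norm
    return D, Z

As a usage example, D, Z = dict_learn(X) on a feature matrix X of shape (n_features, n_samples) returns the learned atoms and non-negative codes; any semi-supervised label-propagation component of the original framework is deliberately omitted from this sketch.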
Keywords
Training, Machine learning, Manifolds, Dictionaries, Laplace equations, Sparse matrices, Task analysis