A Label-Specific Attention-Based Network With Regularized Loss For Multi-Label Classification

Artificial Neural Networks and Machine Learning - ICANN 2019: Deep Learning, Part II (2019)

Abstract
In multi-label text classification, different parts of a document do not contribute equally to predicting labels. Most existing approaches fail to account for this. Several methods do take it into account, but they use only the hidden representations of a neural network as input to the attention mechanism, without combining them with label information. In this work, we propose an improved attention-based neural network model for multi-label text classification that obtains the attention weights by computing the similarity between each label and each word of a document. The model injects label information into the text representation, which allows it to select the most informative words for predicting each label. Moreover, unlike single-label classification, the labels in multi-label classification may exhibit correlations such as co-occurrence or conditional dependence. We therefore also propose a dedicated regularization term for this model that exploits label correlations through the label co-occurrence matrix. Experimental results on the AAPD and RCV1-V2 datasets demonstrate that the proposed model yields a significant performance gain over many state-of-the-art approaches.
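The abstract only sketches the two ingredients at a high level: attention weights computed from label-word similarity, and a regularizer built from the label co-occurrence matrix. The code below is a minimal, hedged reading of that description, not the authors' exact architecture; the class name `LabelSpecificAttention`, the `label_dim` projection, and the `cooccurrence_regularizer` form are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LabelSpecificAttention(nn.Module):
    """Sketch of label-word similarity attention: each label attends over the
    word-level hidden states to build a label-specific document vector."""

    def __init__(self, hidden_dim, num_labels, label_dim):
        super().__init__()
        # Trainable label embeddings, one vector per label (assumed design).
        self.label_emb = nn.Parameter(torch.randn(num_labels, label_dim))
        # Project word hidden states (e.g. BiLSTM outputs) into the label space.
        self.proj = nn.Linear(hidden_dim, label_dim)

    def forward(self, hidden_states):
        # hidden_states: (batch, seq_len, hidden_dim)
        keys = self.proj(hidden_states)                       # (B, T, label_dim)
        # Similarity between every label and every word of the document.
        scores = torch.einsum('ld,btd->blt', self.label_emb, keys)
        weights = F.softmax(scores, dim=-1)                   # attention over words
        # One document representation per label, weighted by its own attention.
        doc_per_label = torch.einsum('blt,bth->blh', weights, hidden_states)
        return doc_per_label                                  # (B, num_labels, hidden_dim)


def cooccurrence_regularizer(label_emb, cooc):
    """One plausible form of the correlation regularizer: labels that co-occur
    frequently are pulled toward similar embeddings. `cooc` is an assumed
    (num_labels, num_labels) co-occurrence matrix normalized to [0, 1]."""
    # Pairwise squared Euclidean distances between label embeddings.
    dist = torch.cdist(label_emb, label_emb).pow(2)
    # Large co-occurrence + large distance => large penalty.
    return (cooc * dist).sum()
```

In this reading, the regularizer would be added to the multi-label classification loss (e.g. binary cross-entropy over the per-label scores) with a tunable coefficient; the exact weighting and loss combination are not specified in the abstract.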
Keywords
Multi-label classification, Attention-based Network, Label correlations