Improving Sentence Representations With Local And Global Attention For Classification

2019 International Joint Conference on Neural Networks (IJCNN), 2019

Cited by 2
Abstract
Representation learning is a key issue for text classification tasks. Few existing representation models learn sufficient text information, i.e., both local semantic information and global structure information. This paper focuses on generating better semantic and structure representations and combining them to obtain a better sentence representation. Specifically, we propose a hierarchical local and global attention network that learns sentence representations automatically: local attention generates the semantic and structure representations respectively, and global attention fuses them into the final representation, which is then used for training and prediction. Experimental results show that our method performs well on several text classification tasks, including sentiment analysis, subjectivity classification, and question type classification, with accuracies of 81.6% (MR), 93.6% (SUBJ), 49.4% (SST-5), and 95.6% (TREC).
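The hierarchical scheme described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the context-vector attention form, the variable names, and the toy dimensions are all assumptions. Local attention pools token embeddings into separate semantic and structure views, and global attention then weights those views into one sentence vector.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attend(vectors, query):
    # Score each row of `vectors` against a learned context vector `query`
    # (hypothetical parameter), then return the attention-weighted sum.
    weights = softmax(vectors @ query)   # (k,) weights summing to 1
    return weights @ vectors             # (d,) pooled representation

rng = np.random.default_rng(0)
n_tokens, d = 5, 8
tokens = rng.normal(size=(n_tokens, d))            # token embeddings for one sentence
q_semantic, q_structure, q_global = rng.normal(size=(3, d))

# Local attention: one pooled view per information type.
semantic_repr = attend(tokens, q_semantic)         # local semantic view, shape (8,)
structure_repr = attend(tokens, q_structure)       # local structure view, shape (8,)

# Global attention: fuse the two views into the final sentence representation.
sentence_repr = attend(np.stack([semantic_repr, structure_repr]), q_global)
print(sentence_repr.shape)                         # (8,)
```

In a trained model the query vectors would be learned parameters and the token embeddings would come from an encoder; here everything is random just to show the data flow.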
Keywords
subjectivity classification, question type classification, sentence representation, global attention, representation learning, text classification tasks, local semantic information, global structure information, semantic and structure representations, hierarchical local and global attention network