Combining Gated Recurrent Unit and Attention Pooling for Sentimental Classification

Proceedings of the 2018 2nd International Conference on Computer Science and Artificial Intelligence (2018)

Cited by 4 | Viewed 114
Abstract
Recurrent Neural Networks (RNNs) are among the most popular architectures for processing variable-length text sequences; they achieve outstanding results on many Natural Language Processing (NLP) tasks and are remarkably effective at capturing long-term dependencies. Many models built on RNNs have achieved excellent results. However, most of them ignore the positions of the crucial words in a sentence and the semantic connections in both directions, and therefore fail to make full use of the available information. We observe that some words strongly affect the meaning of the whole sentence, while others have little influence. To address these problems, we propose a Bidirectional Gated Recurrent Unit (BGRU) network integrated with a novel attention pooling that is combined with max-pooling, which automatically attends to the crucial words and preserves the most meaningful representation of the text, allowing longer sequences to be encoded. This not only prevents important information from being discarded but also filters out noise. We evaluate the proposed model on multiple tasks, including sentiment classification on movie review data and a subjectivity classification dataset, measuring accuracy as the agreement between predicted and correct labels. The experimental results show that our model achieves excellent performance on these tasks.
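The pooling scheme described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name `attention_max_pool`, the learned attention vector `w`, and the use of NumPy in place of a deep-learning framework are all assumptions, and the BGRU hidden states `H` are assumed to be precomputed by a bidirectional GRU encoder.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_max_pool(H, w):
    """Combine attention pooling with max-pooling (illustrative sketch).

    H : (T, d) hidden states from a bidirectional GRU (assumed precomputed)
    w : (d,)   learned attention query vector (hypothetical parameter)
    """
    scores = H @ w                    # (T,) relevance score per time step
    alpha = softmax(scores)           # attention weights over the sequence
    att = alpha @ H                   # (d,) attention-weighted summary
    mx = H.max(axis=0)                # (d,) element-wise max over time
    return np.concatenate([att, mx])  # (2d,) combined sentence representation

# Toy example: a 5-step sequence of 8-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))
w = rng.standard_normal(8)
v = attention_max_pool(H, w)
print(v.shape)  # (16,)
```

The attention half lets the crucial words dominate the summary, while the max-pooled half retains the strongest feature activations, so important information is not discarded when sequences grow long.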
Keywords
Natural language processing, Neural Network, Gated Recurrent Units, Text Classification