Incorporating syntax information into attention mechanism vector for improved aspect-based opinion mining

Neural Computing and Applications (2024)

Abstract
In Aspect-based Sentiment Analysis (ABSA), accurately determining the sentiment polarity of specific aspects within a text requires a nuanced understanding of linguistic elements, including syntax. Traditional ABSA approaches, particularly those leveraging attention mechanisms, have proven effective but often fall short in integrating crucial syntax information. Moreover, while some methods employ Graph Neural Networks (GNNs) to extract syntax information, they face significant limitations, such as information loss caused by pooling operations. To address these challenges, our study proposes a novel ABSA framework that bypasses the constraints of GNNs by directly incorporating syntax-aware insights into the analysis process. Our approach, the Syntax-Informed Attention Mechanism Vector (SIAMV), integrates syntactic distances obtained from dependency trees and part-of-speech (POS) tags into the attention vectors, ensuring a deeper focus on linguistically relevant elements. This not only substantially enhances ABSA accuracy by enriching the attention mechanism but also preserves sequential information, a task handled by Long Short-Term Memory (LSTM) networks. The LSTM's inputs, consisting of the syntactic distances, POS tags, and the sentence itself, are processed to generate a syntax vector. This vector is then combined with the attention vector, yielding a robust model that adeptly captures the nuances of language. Unlike traditional pooling methods, the LSTM's sequential processing minimizes information loss across the text by preserving the context and dependencies inherent in the sentence structure. Our experimental findings demonstrate that this combination of SIAMV and LSTM significantly outperforms existing GNN-based ABSA models in accuracy, setting a new standard for sentiment analysis research. By overcoming the traditional reliance on GNNs and their pooling-induced information loss, our method offers a comprehensive model that captures and analyzes sentiment at the aspect level, marking a significant advancement in the field of ABSA. The syntactic-distance code required to replicate the experiments is available at: https://github.com/Makera86/Syntax-Distance.git .
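To make the described pipeline concrete, below is a minimal sketch, in PyTorch, of how syntactic distances and POS tags might be fed through an LSTM to form a syntax vector that informs the attention weights. This is not the authors' released implementation (see the linked repository); all module names, dimensions, and the specific fusion scheme are illustrative assumptions.

```python
# Hypothetical sketch of the SIAMV idea: embed tokens, POS tags, and
# dependency-tree distances, run an LSTM over them to get a syntax
# vector, and let that vector drive the attention weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SIAMVSketch(nn.Module):
    def __init__(self, vocab_size, pos_size, max_dist, dim=128, hidden=128, classes=3):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, dim)    # word embeddings
        self.pos_emb = nn.Embedding(pos_size, dim)      # POS-tag embeddings
        self.dist_emb = nn.Embedding(max_dist, dim)     # syntactic-distance embeddings
        # The LSTM consumes the sentence together with its syntax features,
        # preserving sequential context instead of pooling it away.
        self.lstm = nn.LSTM(3 * dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)            # scores each token position
        self.out = nn.Linear(2 * hidden, classes)       # sentiment polarity head

    def forward(self, tokens, pos_tags, syn_dist):
        # tokens, pos_tags, syn_dist: (batch, seq_len) integer tensors;
        # syn_dist holds each token's dependency-tree distance to the aspect.
        x = torch.cat([self.tok_emb(tokens),
                       self.pos_emb(pos_tags),
                       self.dist_emb(syn_dist)], dim=-1)
        syntax_vec, _ = self.lstm(x)                    # (batch, seq_len, 2*hidden)
        # Syntax-informed attention: weights come from the LSTM's syntax
        # vector, so syntactically relevant tokens receive more mass.
        weights = F.softmax(self.attn(syntax_vec).squeeze(-1), dim=-1)
        context = torch.bmm(weights.unsqueeze(1), syntax_vec).squeeze(1)
        return self.out(context)                        # polarity logits

# Toy usage with random indices (shapes only; real inputs would come
# from a dependency parser and a POS tagger).
model = SIAMVSketch(vocab_size=1000, pos_size=20, max_dist=16)
logits = model(torch.randint(0, 1000, (2, 12)),
               torch.randint(0, 20, (2, 12)),
               torch.randint(0, 16, (2, 12)))
print(logits.shape)  # torch.Size([2, 3])
```

In this sketch the syntax features enter the attention computation through the LSTM states rather than through a GNN with pooling, which is the paper's central design choice as stated in the abstract.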
Keywords
ABSA, Attention mechanism, Syntax information, GNN, Deep learning