Flexibly utilizing syntactic knowledge in aspect-based sentiment analysis

Information Processing & Management (2024)

Abstract
Aspect-based sentiment analysis (ABSA) aims to determine the sentiment polarity expressed in a text towards a particular aspect. While previous models have utilized dependency graphs and GNNs to facilitate information exchange, they face challenges such as the smoothing of aspect representations and a mismatch between word-based dependency graphs and subword-based BERT. To address these deficiencies, we propose a new approach called SRE-BERT that flexibly utilizes syntactic knowledge to enhance aspect representations by relying on syntax representations. First, we propose a syntax representation encoder to acquire a syntactic vector for each token. Then, we devise a syntax-guided transformer that employs these syntax representations to compute multi-head attention, thereby enabling direct syntactic interaction between any two tokens. Finally, the token-level vectors derived from the syntax-guided transformer are employed to enhance the semantic representations obtained from BERT. In addition, we introduce a Masked POS Label Prediction (MPLP) method to pre-train the syntax encoder. Extensive experiments on datasets from three distinct domains show that SRE-BERT outperforms the second-ranked model by 1.97%, 1.55%, and 1.20% on the Rest14, Lap14, and Twitter datasets, respectively.
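The abstract gives no formulas, but the core mechanism can be sketched. Below is a minimal PyTorch sketch of a syntax-guided multi-head attention layer in which attention weights are computed from per-token syntax vectors while the values carry token semantics, plus an MPLP-style pre-training loss. The class name SyntaxGuidedAttention, the queries/keys-from-syntax and values-from-semantics split, the residual fusion step, and the mplp_loss helper are all assumptions made for illustration, not the paper's confirmed design.

```python
import torch
import torch.nn as nn

class SyntaxGuidedAttention(nn.Module):
    """Hypothetical sketch of the syntax-guided attention described in
    the abstract: attention scores come from per-token syntax vectors,
    so any two tokens can interact syntactically without an explicit
    word-level dependency graph."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        # Assumption: queries/keys are projected from syntax vectors,
        # values from the semantic (BERT) token vectors.
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)

    def forward(self, syntax: torch.Tensor, semantic: torch.Tensor) -> torch.Tensor:
        # syntax, semantic: (batch, seq_len, d_model)
        b, t, _ = syntax.shape
        split = lambda x: x.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        q, k = split(self.w_q(syntax)), split(self.w_k(syntax))
        v = split(self.w_v(semantic))
        # Scaled dot-product attention driven entirely by syntax vectors.
        attn = (q @ k.transpose(-2, -1) / self.d_head ** 0.5).softmax(-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, -1)
        # Assumption: residual fusion enhances the BERT semantic
        # representations with the syntax-guided output.
        return semantic + self.w_o(out)

# MPLP-style pre-training loss (assumed form): mask some positions and
# predict their POS tags from the syntax encoder's outputs.
def mplp_loss(syntax_states: torch.Tensor, pos_labels: torch.Tensor,
              mask: torch.Tensor, pos_head: nn.Linear) -> torch.Tensor:
    logits = pos_head(syntax_states[mask])   # (n_masked, n_pos_tags)
    return nn.functional.cross_entropy(logits, pos_labels[mask])
```

On this reading, MPLP plays the same role for the syntax encoder that masked language modeling plays for BERT: it supplies token-level syntactic supervision before the model is fine-tuned on ABSA.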
Keywords
Aspect-based sentiment analysis, BERT, Syntax representation, Syntax-guided transformer