Cross-Domain Aspect-based Sentiment Classification with Pre-Training and Fine-Tuning Strategy for Low-Resource Domains

Chunjun Zhao, Meiling Wu, Xinyi Yang, Xuzhuang Sun, Suge Wang, Deyu Li

ACM Transactions on Asian and Low-Resource Language Information Processing (2024)

Abstract
Aspect-based sentiment classification (ABSC) is a crucial subtask of fine-grained sentiment analysis (SA), which aims to predict the sentiment polarity of the given aspects in a sentence as positive, negative, or neutral. Most existing ABSC methods are based on supervised learning. However, these methods rely heavily on fine-grained labeled training data, which can be scarce in low-resource domains, limiting their effectiveness. To overcome this challenge, we propose a low-resource cross-domain aspect-based sentiment classification (CDABSC) approach based on a pre-training and fine-tuning strategy. This approach applies the pre-training and fine-tuning strategy to an advanced deep learning method designed for ABSC, namely the attention-based encoding graph convolutional network (AEGCN) model. Specifically, a high-resource domain is selected as the source domain, and the AEGCN model is pre-trained using a large amount of fine-grained annotated data from the source domain. The optimal parameters of the model are preserved. Subsequently, a low-resource domain is used as the target domain, and the pre-trained model parameters are used as the initial parameters of the target domain model. The target domain model is fine-tuned using a small amount of annotated data to adapt the parameters to the target domain, improving the accuracy of sentiment classification in the low-resource domain. Finally, experimental validation on two domain benchmark datasets, restaurant and laptop, demonstrates that our approach significantly outperforms the baselines in CDABSC Micro-F1.
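The transfer procedure described in the abstract (pre-train on the high-resource source domain, preserve the parameters, initialize the target model with them, then fine-tune on a few target labels) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: `SentimentModel` is a hypothetical stand-in for the AEGCN model, and the scalar "training" updates only mimic the shape of the workflow.

```python
# Illustrative sketch of the pre-train/fine-tune transfer strategy.
# NOTE: `SentimentModel` is a toy placeholder, not the real AEGCN;
# the updates below stand in for actual gradient-based training.

import copy

class SentimentModel:
    """Stand-in for the AEGCN classifier; holds named parameters."""
    def __init__(self):
        self.params = {"encoder.w": 0.0, "gcn.w": 0.0, "classifier.w": 0.0}

    def train_step(self, example, lr=0.1):
        # Toy update: nudge every parameter by the example's signal.
        for name in self.params:
            self.params[name] += lr * example

def pretrain_on_source(source_data):
    """Step 1: train on the high-resource source domain; keep the parameters."""
    model = SentimentModel()
    for ex in source_data:
        model.train_step(ex)
    return copy.deepcopy(model.params)  # preserved optimal parameters

def finetune_on_target(pretrained_params, target_data):
    """Step 2: initialize the target model from the source parameters,
    then adapt on a small amount of target-domain labeled data."""
    model = SentimentModel()
    model.params = copy.deepcopy(pretrained_params)  # parameter transfer
    for ex in target_data:
        model.train_step(ex, lr=0.01)  # smaller lr: gentle adaptation
    return model

source = [1.0] * 100  # many labeled source-domain examples (e.g. restaurant)
target = [0.5] * 5    # few labeled target-domain examples (e.g. laptop)

src_params = pretrain_on_source(source)
target_model = finetune_on_target(src_params, target)
```

The key design point mirrored here is that the target model never starts from random initialization: it inherits every source-domain parameter, and the small target set only adjusts them, which is what makes the approach viable when target-domain annotations are scarce.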