Distillation Language Adversarial Network for Cross-lingual Sentiment Analysis

2022 International Conference on Asian Language Processing (IALP)

Abstract
Cross-lingual sentiment analysis aims to address the lack of annotated corpora for low-resource languages by training a common classifier that transfers knowledge learned from a source language to target languages. Large-scale pre-trained language models have achieved remarkable improvements in cross-lingual sentiment analysis, but they still suffer from the scarcity of annotated corpora for low-resource languages. To address this problem, we propose an end-to-end architecture for cross-lingual sentiment analysis, named Distillation Language Adversarial Network (DLAN). Built on a pre-trained model, DLAN combines adversarial learning with knowledge distillation to learn language-invariant features without extra training data. We evaluate the proposed method on the Amazon review dataset, a multilingual sentiment dataset. The results show that DLAN is more effective than the baseline methods in cross-lingual sentiment analysis.
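The distillation component the abstract refers to is typically realized as a temperature-softened KL divergence between teacher and student predictions. The paper does not give its loss in this excerpt, so the sketch below is a generic, assumption-labeled illustration of that standard formulation (temperature `T` and the `T^2` scaling follow common knowledge-distillation practice, not the authors' stated hyperparameters):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; subtract the max for numerical stability.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distill_loss(teacher_logits, student_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    # This is the standard distillation loss, assumed here for illustration.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float((T * T) * np.sum(p * (np.log(p) - np.log(q))))
```

When the student matches the teacher exactly the loss is zero, and it grows as the student's soft predictions drift from the teacher's; in an adversarial setup like DLAN this term would be combined with a task loss and a language-discriminator loss.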
Keywords
Cross-lingual sentiment analysis, Adversarial network, Knowledge distillation, Pre-trained model