Leveraging Multi-lingual Positive Instances in Contrastive Learning to Improve Sentence Embedding
Conference of the European Chapter of the Association for Computational Linguistics (2023)
Abstract
Learning multi-lingual sentence embeddings is a fundamental task in natural
language processing. Recent trends in learning both mono-lingual and
multi-lingual sentence embeddings are mainly based on contrastive learning (CL)
among an anchor, one positive, and multiple negative instances. In this work,
we argue that leveraging multiple positives should be considered for
multi-lingual sentence embeddings because (1) positives in a diverse set of
languages can benefit cross-lingual learning, and (2) transitive similarity
across multiple positives can provide reliable structural information for
learning. In order to investigate the impact of multiple positives in CL, we
propose a novel approach, named MPCL, to effectively utilize multiple positive
instances to improve the learning of multi-lingual sentence embeddings.
Experimental results on various backbone models and downstream tasks
demonstrate that MPCL leads to better retrieval, semantic similarity, and
classification performance compared to conventional CL. We also observe that
in unseen languages, sentence embedding models trained on multiple positives
show better cross-lingual transfer performance than models trained on a single
positive instance.
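The abstract does not spell out the MPCL objective, but the core idea of contrasting an anchor against several positives rather than one can be illustrated with a generic multi-positive InfoNCE-style loss. The sketch below is an assumption-laden illustration, not the paper's exact formulation: the function name, tensor shapes, and temperature value are all hypothetical.

```python
# Illustrative sketch of a multi-positive contrastive (InfoNCE-style) loss.
# NOT the paper's exact MPCL objective; shapes and temperature are assumptions.
import torch
import torch.nn.functional as F


def multi_positive_info_nce(anchor, positives, negatives, temperature=0.05):
    """Contrastive loss with several positives per anchor.

    anchor:    (B, D)    embeddings of anchor sentences
    positives: (B, P, D) P positive instances per anchor (e.g., translations)
    negatives: (B, N, D) N negative instances per anchor
    """
    anchor = F.normalize(anchor, dim=-1)
    positives = F.normalize(positives, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    # Cosine similarities scaled by temperature.
    pos_sim = torch.einsum("bd,bpd->bp", anchor, positives) / temperature  # (B, P)
    neg_sim = torch.einsum("bd,bnd->bn", anchor, negatives) / temperature  # (B, N)

    # Each positive is contrasted against all negatives of the same anchor;
    # the per-positive cross-entropy losses are then averaged.
    logits = torch.cat(
        [pos_sim.unsqueeze(-1),
         neg_sim.unsqueeze(1).expand(-1, pos_sim.size(1), -1)],
        dim=-1,
    )  # (B, P, 1 + N), where index 0 holds the positive similarity
    labels = torch.zeros(logits.shape[:2], dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits.reshape(-1, logits.size(-1)), labels.reshape(-1))
```

Averaging over positives is only one plausible way to aggregate them; the paper may combine multiple positives differently, so this sketch should be read as a baseline illustration of the general technique.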
Keywords
contrastive learning, positive, sentence, multi-lingual