Adaptive Multilingual Representations for Cross-Lingual Entity Linking with Attention on Entity Descriptions

CCKS 2019

Abstract
Cross-lingual entity linking is the task of resolving ambiguous mentions in text to their corresponding entities in a knowledge base, where the query text and the knowledge base are in different languages. Recent methods based on multilingual embeddings have brought significant progress to this task. However, they still face two potential problems: (1) they directly use multilingual embeddings obtained by cross-lingual mapping, which may introduce noise and degrade performance; (2) they rely on pre-trained, fixed entity embeddings, which carry only limited information about entities. In this paper, we propose a cross-lingual entity linking framework built on more adaptive representations. For the first problem, we apply trainable adjusting matrices to fine-tune the semantic representations built from multilingual embeddings. For the second problem, we introduce attention mechanisms over entity descriptions to obtain dynamic entity representations, exploiting more clues about candidate entities according to the query mentions. Experiments on the TAC KBP 2015 Chinese-English cross-lingual entity linking dataset show that our model outperforms state-of-the-art models.
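The abstract only outlines the two mechanisms (trainable adjusting matrices over multilingual embeddings, and mention-conditioned attention over entity descriptions), so the following is a minimal PyTorch sketch of how such a model could be wired up, not the authors' implementation. All module names, dimensions, the average-pooling mention encoder, the bilinear attention scorer, and the cosine ranking score are assumptions made for illustration.

```python
# Illustrative sketch (not the paper's released code): multilingual embeddings
# refined by a trainable adjusting matrix, plus an entity encoder that attends
# over description tokens conditioned on the mention representation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveCrossLingualLinker(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 300):
        super().__init__()
        # Pre-trained multilingual word embeddings (kept frozen here); the
        # adjusting matrix below is what gets fine-tuned for the linking task.
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.embed.weight.requires_grad = False
        # Trainable adjusting matrix, initialized to the identity so training
        # starts from the original cross-lingually mapped embedding space.
        self.adjust = nn.Parameter(torch.eye(emb_dim))
        # Bilinear scorer between the mention vector and description tokens.
        self.att = nn.Linear(emb_dim, emb_dim, bias=False)

    def encode_mention(self, mention_ids: torch.Tensor) -> torch.Tensor:
        # mention_ids: (batch, mention_len) -> (batch, emb_dim)
        x = self.embed(mention_ids) @ self.adjust   # adjusted embeddings
        return x.mean(dim=1)                        # simple average pooling

    def encode_entity(self, mention_vec: torch.Tensor,
                      desc_ids: torch.Tensor) -> torch.Tensor:
        # desc_ids: (batch, desc_len); attention weights depend on the query
        # mention, so the entity representation is dynamic per mention.
        d = self.embed(desc_ids) @ self.adjust      # (batch, desc_len, emb_dim)
        scores = torch.einsum('bld,bd->bl', d, self.att(mention_vec))
        alpha = F.softmax(scores, dim=-1)           # (batch, desc_len)
        return torch.einsum('bl,bld->bd', alpha, d) # weighted description sum

    def forward(self, mention_ids, desc_ids):
        m = self.encode_mention(mention_ids)
        e = self.encode_entity(m, desc_ids)
        # Ranking score for the (mention, candidate entity) pair.
        return F.cosine_similarity(m, e, dim=-1)
```

In use, such a scorer would be run over every candidate entity's description for a given mention, with the highest-scoring candidate returned as the link; the training objective (e.g. a ranking loss over candidates) is likewise an assumption, as the abstract does not specify it.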
Keywords
Cross-lingual entity linking, Multilingual embedding, Attention mechanism