Multi-way, multilingual neural machine translation.
Computer Speech & Language (2017)
Abstract
Highlights:
- The first attention-based neural MT model for multi-way, multilingual translation is proposed.
- The multi-way multilingual model is tested on eight languages (En, Fr, Cz, De, Ru, Fi, Tr and Uz).
- It achieves translation quality comparable to single-pair NMT models with fewer parameters.
- A single attention mechanism supports alignment across multiple language pairs and directions.
- It outperforms conventional SMT systems on low-resource translation tasks.

We propose multi-way, multilingual neural machine translation. The proposed approach enables a single neural translation model to translate between multiple languages, with a number of parameters that grows only linearly with the number of languages. This is made possible by having a single attention mechanism that is shared across all language pairs. We train the proposed multi-way, multilingual model on ten language pairs from WMT15 simultaneously and observe clear performance improvements over models trained on only one language pair. We empirically evaluate the proposed model on low-resource language translation tasks. In particular, we observe that the proposed multilingual model outperforms strong conventional statistical machine translation systems on Turkish-English and Uzbek-English by incorporating the resources of other language pairs.
Keywords
Neural machine translation, Multi-lingual, Low-resource translation
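The key scaling argument in the abstract is that sharing a single attention mechanism makes the parameter count grow linearly with the number of languages, whereas training a separate model per directed language pair grows quadratically. The following is a minimal illustrative sketch of that bookkeeping, not the paper's implementation; the per-component parameter counts are hypothetical placeholders.

```python
# Illustrative sketch (not the paper's code): a multi-way, multilingual
# NMT model keeps one encoder per source language, one decoder per
# target language, and ONE attention module shared by all pairs, so its
# size is linear in the number of languages. Separate single-pair models
# need one full model per directed pair, which is quadratic.

ENC_PARAMS = 1_000_000   # hypothetical parameters per encoder
DEC_PARAMS = 1_500_000   # hypothetical parameters per decoder
ATT_PARAMS = 200_000     # hypothetical parameters of the shared attention

def multiway_params(n_langs: int) -> int:
    """One multi-way model covering all n_langs languages."""
    return n_langs * ENC_PARAMS + n_langs * DEC_PARAMS + ATT_PARAMS

def pairwise_params(n_langs: int) -> int:
    """Separate single-pair models for every directed language pair."""
    n_pairs = n_langs * (n_langs - 1)
    return n_pairs * (ENC_PARAMS + DEC_PARAMS + ATT_PARAMS)

for n in (2, 5, 10):
    print(f"{n} languages: multi-way {multiway_params(n):,} "
          f"vs pairwise {pairwise_params(n):,}")
```

Adding one more language adds only one encoder and one decoder to the multi-way model, while the pairwise approach adds a full model for each new directed pair.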