Future-Aware Knowledge Distillation for Neural Machine Translation

IEEE/ACM Transactions on Audio, Speech, and Language Processing (2019)

Cited by 22
Abstract
Although future context is widely regarded as useful for word prediction in machine translation, it is quite difficult in practice to incorporate into neural machine translation. In this paper, we propose a future-aware knowledge distillation framework (FKD) to address this issue. In the FKD framework, we learn to distill future knowledge from a backward neural language model (teacher) to future-a...
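The abstract is truncated, but the mechanism it describes, distilling the hidden states of a backward language model (teacher) into "future-aware" student representations, can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: all module names, dimensions, and the choice of an MSE distillation loss are assumptions.

```python
# Sketch of the distillation idea from the abstract: a backward language
# model reads the target sentence right-to-left, so its hidden state at
# position t summarizes tokens from t to the end of the sentence; a small
# student network is trained to predict that state from information that
# is available during decoding. All names and sizes here are illustrative.
import torch
import torch.nn as nn

class BackwardLMTeacher(nn.Module):
    """Right-to-left RNN language model; state at t encodes tokens t..T."""
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, tokens):                 # tokens: (B, T)
        rev = torch.flip(tokens, dims=[1])     # run over the reversed sequence
        out, _ = self.rnn(self.embed(rev))     # (B, T, H)
        return torch.flip(out, dims=[1])       # re-align with original order

class FutureAwareStudent(nn.Module):
    """Predicts the teacher's future summary from a decoder-side state."""
    def __init__(self, dec_dim=128, hid_dim=128):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(dec_dim, hid_dim), nn.Tanh(),
            nn.Linear(hid_dim, hid_dim))

    def forward(self, dec_states):             # (B, T, dec_dim)
        return self.proj(dec_states)           # (B, T, H) future-aware vectors

# One distillation step (hypothetical shapes: batch 2, length 5, vocab 100).
teacher = BackwardLMTeacher(vocab_size=100)
student = FutureAwareStudent()
tokens = torch.randint(0, 100, (2, 5))
dec_states = torch.randn(2, 5, 128)            # stand-in for NMT decoder states

with torch.no_grad():                          # teacher is fixed during distillation
    target = teacher(tokens)
loss = nn.functional.mse_loss(student(dec_states), target)
loss.backward()                                # gradients flow only to the student
print(float(loss))
```

In the paper's setting the student's future-aware vectors would presumably feed back into the NMT decoder so that word prediction can condition on an estimate of the future context; the sketch above shows only the teacher-to-student distillation step.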
Keywords
Decoding, History, Context modeling, Predictive models, Computational modeling, Training, Semantics